The 2010s: techno-pessimism and stagnation
In the 2010s, we largely decided that we were in the middle of a technological stagnation. Tyler Cowen’s The Great Stagnation came out in 2011, and Robert Gordon’s The Rise and Fall of American Growth came out in 2016. Peter Thiel declared that “we wanted flying cars, instead we got 140 characters”. David Graeber agreed. Paul Krugman lamented the lack of new kitchen appliances. Some economists asked whether ideas were simply getting harder to find. When the startup Juicero came out with a fancy new kitchen appliance, it was widely mocked as a symbol of what was wrong with the tech industry. “Tech” became largely synonymous with software companies, particularly social media, gig economy companies, and venture capital firms. Many questioned whether those sorts of innovations were making society better at all.
So it’s fair to say that the 2010s were a decade of deep techno-pessimism.
Which is a little odd, because the 2010s weren’t actually that horrible in terms of total factor productivity growth (for the uninitiated, TFP is a measure of productivity that economists think kinda-sorta measures the productive power of technology). TFP is hard to measure, but it looks like it rose 3% in the U.S. between 2010 and 2017. Other measures put private-sector TFP growth at around 1% annually. That’s slower than earlier decades, to be sure, but it’s not nearly as bad as the slowdown of the late 70s and early 80s. In fact, it looks like the worst of the recent stagnation came in the latter half of the 2000s.
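For readers who want a bit more precision: TFP is typically backed out as the “Solow residual”, the part of output growth that isn’t explained by growth in capital and labor inputs. Here’s a minimal sketch of that calculation with made-up illustrative numbers (the 1/3 capital share is a standard textbook assumption, not measured data):

```python
# TFP as the Solow residual: assume Y = A * K^alpha * L^(1 - alpha),
# and back out A = Y / (K^alpha * L^(1 - alpha)).
# All numbers below are hypothetical, not real U.S. data.

ALPHA = 1 / 3  # standard textbook assumption for the capital share of income

def tfp(output, capital, labor, alpha=ALPHA):
    """Infer total factor productivity from a Cobb-Douglas production function."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# Hypothetical economy at two dates: output grows 5%, capital 3%, labor 1%.
a_start = tfp(output=100.0, capital=300.0, labor=150.0)
a_end = tfp(output=105.0, capital=309.0, labor=151.5)

growth = a_end / a_start - 1
print(f"Implied TFP growth over the period: {growth:.1%}")
```

The point of the exercise: because TFP is a residual rather than a direct measurement, it absorbs everything the inputs don’t explain, which is why economists say it only “kinda-sorta” measures technology.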
So it’s a little hard to explain the rampant techno-pessimism of the 2010s just by looking at the economic statistics. Maybe it was the fact that it was the second productivity slowdown in a half century, making slowdowns seem like the rule rather than the exception. Maybe it was a pendulum-like reaction to the wild-eyed techno-optimism of the 1990s and early 2000s. Perhaps it was an indirect way of expressing frustration with inequality, recession, political divisions, social media flame wars, and/or jerky tech founders. Or perhaps people simply weren’t getting the kind of technologies they wanted. I don’t know. Maybe all of these.
But it does seem like things could be in for a big change in the 2020s. A number of developments in science and technology suggest that we could be due for a re-acceleration of technological progress. And not of the “more smartphone apps” variety, either.
Technology vs. “Tech”
At this point it’s important to differentiate technology from “tech” as it’s popularly conceived. “Tech”, “Silicon Valley”, or “the tech industry” has come to mean the software industry, plus anything funded by venture capital, plus anyone who’s perceived to be culturally affiliated with the aforementioned (e.g. Elon Musk). It doesn’t typically include older companies like General Electric or General Motors, biotech/pharma companies, or government-funded science.
Americans could be forgiven for thinking that technological progress mostly comes out of the tech industry. Since 1980, many of the products that changed our lives the most — computers, the internet, smartphones, social media — came out of that industry, as did the Big 5 tech companies (Microsoft, Apple, Amazon, Google, and Facebook) that together now represent about a fifth of the entire U.S. stock market. To some degree, the archetype of Steve Jobs building computers in a garage has replaced the image of a scientist in a lab coat in the popular imagination.
But that tech industry lies downstream of a vast ecosystem of science and innovation. Most published science comes out of universities, which are funded by a combination of government grants, tuition dollars, and corporate joint-venture money. (Big corporate labs used to do a sizeable chunk of research, but this is no longer true, though Google’s efforts in AI might represent a partial return of that trend.)
The private sector does spend a lot on R&D. But the “tech” industry as we know it only accounts for about a third of that spending:
“Tech” is certainly very important, but it’s not the whole ball game.
Also, most product innovation is done by big companies rather than startups. Economists Daniel Garcia-Macia, Chang-Tai Hsieh & Peter J. Klenow recently tried to estimate how much productivity growth comes from improvement of existing products — loosely corresponding to what Clay Christensen called “sustaining innovation” — rather than from creative destruction. They found a roughly 75/25 split in favor of improving existing products.
Furthermore, lots of technological progress gets implemented through channels other than downloading apps from the App Store or buying electronics at Best Buy. As the amazing and apparently successful COVID-19 vaccine effort shows, plenty of innovation makes its impact through the medical system. The rollout of solar and wind power will be done mostly via utility companies, and electric cars will come from big manufacturers. And the military uses a lot of technology as well, though I hope we aren’t reminded of that via a major war anytime soon.
In other words, there’s a good chance that the technological progress of the 2020s will not mostly come in the form of software or electronics startups giving us new gizmos or new ways to talk to each other. The big software companies will still be big, startups will still start up, and VCs will still make money, but my bet is that in the realm of game-changing innovation, Big Science will take over as the big deal.
And that’s fine. Personally, I think the IT revolution has done amazing things for the world — it has allowed us to keep in touch with our old friends, date easily even after divorce, meet people from all over the world, and so on, and it created social movements that have already changed society on a level similar to the advent of radio or the printing press. I think the IT era created much more progress and social change than the productivity numbers capture. But even if you disagree, take heart; technological progress in the 2020s will not necessarily be concentrated in “tech” as we know it.
New veins to mine
Science and technology have always seemed to me like mining for veins of ore. Sometimes you strike it rich. A theoretical discovery like finding the structure of DNA can open up whole new fields of knowledge. A key invention like the transistor can enable entire new classes of inventions. As time goes on, the veins get mined out, and the effort required to mine more ore increases. But the old veins help point the way to new ones. Science doesn’t progress linearly, but in a branching, ramifying tree.
So what new branches are we going to see grow rapidly in the next decade? It’s hard to be sure, but we can make some educated guesses. Caleb Watney has a cool new post where he offers a few of his own. He focuses on vaccines, solar and batteries, lab-grown meat, A.I., self-driving cars, nuclear fusion, and VR.
Personally, I’m skeptical that self-driving cars, fusion, or VR will take the world by storm this decade; my guess, and I’m happy to be wrong, is that these are a little further down the road.
But there’s no denying that vaccine technology just took a huge leap forward. Propelled by the crisis of COVID, mRNA vaccines went from a speculative tool to a proven one. This will probably accelerate the development of all kinds of new vaccines, potentially including vaccines for cancer. It’s worth reading that phrase again and thinking about it for a second: VACCINES FOR CANCER.
A.I. could also have a big effect on the rate of drug discovery, which has been flagging in recent years. Though it’s an exaggeration to say that Google’s DeepMind has “solved protein folding,” it’s undeniable that the company’s machine learning technology is opening up the possibility of rapid advances in the field.
And if self-driving cars are slow to arrive, machine learning will no doubt revolutionize a number of other fields; it’s hard to predict which ones. Warfare, for example, might be revolutionized by autonomous drone swarms. Deepfakes might replace traditional movie production. Algorithms might even be able to write op-eds. And lots of fields of research could be changed by adding machine learning.
Lab-grown meat, of course, would also change the world. The amount of land that could be freed up from livestock grazing is absolutely staggering.
And of course the biggest technological revolution will be in green energy and storage. As I mentioned in a recent post, the price drops for solar and wind over the last decade have been nothing short of revolutionary. But storage is potentially an even bigger deal. It isn’t just lithium-ion batteries that are declining rapidly in cost as deployment increases; it’s a whole bunch of technologies.
This is going to do a lot to help stop climate change and make civilization sustainable. That in and of itself is a form of unmeasured productivity growth. But in addition, cheap solar and storage are going to bring back something that we haven’t seen in many decades — cheaper energy.
Ever since oil stopped getting cheaper and fission didn’t pan out as hoped, humanity has had to make do with the antiquated technology of fossil fuels, which promised to go up in price due to scarcity and to action against climate change. This was a very bad place for the human race to be; the end of cheap energy was probably responsible for the first big productivity slowdown, back in the 1970s, and was the reason that the “Jetsons” sci-fi future never materialized.
But now, for the first time since the 60s, technology is going to make energy cheaper. That has the potential to both supercharge productivity and to help regular workers reap more of the benefits directly (since cheap energy complements human labor). It also promises to give birth to a huge array of other innovations that are hard to predict, like when protesters in Portland used electric leaf blowers to foil tear gas attacks by federal agents over the summer.
And then there are the technologies that will allow humans to modify ourselves — particularly CRISPR. The first CRISPR gene editing of humans caused a huge scandal, but more reasonable efforts are surely not far behind.
There are plenty of other fields with huge potential for the next decade — synthetic biology, additive manufacturing, brain-computer interface, biomechanical engineering, drone warfare, regenerative medicine, and so on. And probably some I don’t even know much about, like single-cell biology.
Note that most of these are things that depend on lab research, on the medical system or the military, and/or on innovations by companies outside the software industry (A.I. being the big exception). If these are truly the areas of advancement we should be excited about, it means that technological progress in the 2020s will be about a lot more than “tech”.
…And then there’s space.
The new Sputnik moment
You may not have noticed, but China just landed a robotic spacecraft on the moon. Human landings aren’t far behind, and a moon base is in the cards as well. The country’s moves have spurred India, Japan, Europe, and the U.S. to plan their own moon missions. Meanwhile, in the U.S., SpaceX has built a gigantic rocket called Starship that can hop short distances and may be able to fly to the Moon and Mars. And back here in Earth orbit, a battle to control space militarily is underway between the U.S., China, and Russia.
The space race is back.
Now, military competition is a scary thing. But if the return of great-power rivalry has to happen — and at this point it seems nigh-inevitable — at least it should give us a new space race.
Because technological progress costs money. And it costs more and more money as time goes on. And it can be very difficult to get Congress to shell out the necessary dollars to keep the fountain of progress flowing. Federal research spending has been falling as a share of GDP since the 60s:
Note that huge increase in the 60s, though. A large part of that was the first space race! The Apollo program was hugely expensive, but the great-power competition with the USSR to control the heavens was sufficient to get Congress to pony up the cash.
Now, the space race pushed forward technological development, giving us far more than freeze-dried food. But the more important principle here is that great-power competition can be used to pressure a short-sighted and narrow-minded political class into funding research at an appropriate level.
Our competition with China won’t just be about moon bases or space weapons; it’ll be a comprehensive contest across a vast array of technologies. Energy, A.I., drones, vaccines, and all kinds of areas with huge civilian and commercial spillovers will also be the focus of intense U.S.-China rivalry. Already, competition with China is being cited as a reason to pass the Endless Frontier Act, a much-needed expansion of federal research funding.
So while it would be nice if we didn’t need international rivalry to drive technological progress, at least its return gives us one more reason to be optimistic about innovation and research in the coming decade.
Onward and upward! Moar science! Discover all the things!!! Let’s put the 2010s behind us, and make the 2020s a decade of techno-optimism.
(By the way, remember that if you like this blog, you can subscribe here! There’s a free email list and a paid subscription too!)