100 Comments
Dec 8, 2022 · edited Dec 8, 2022

Interesting analysis but I don't quite agree with it.

First problem - the decline of corporate R&D is over-stated. These graphs show relative shares, but all that means is that governments flooded universities with money, which they then spent on expanding the quantity of their output. Quality, however, is often missing. I've been in the tech industry for a long time and seen a lot of R&D happen, but the reliance on academic research is pretty minimal even in the AI space. Part of my job involved reading academic papers for a few years, but I eventually gave up because the ROI was zero. Lots of papers that made enticing-sounding claims but, when examined carefully, had caveats that made them useless.

Second problem - the distinction between research and development. Very important for academics, not important at all for companies. When I think about the successful big tech projects I've seen, there was no clear line delineating the two. Experiences and results of development informed research, and research was often the same thing as development, just on more speculative features. The same people would do work that smelled like research for six months, then development of those research ideas into a product launch for another six months, then back again, with no change in job title or function. I myself have done such projects in the past.

Thirdly - the assumption that "papers published at academic conferences" is the same thing as output. Very few people in the corporate world care about publishing papers. It's just a distraction. Unless they came from academia and publishing has become a part of their identity, or they want to keep the door open to returning, papers just aren't going to happen. The only peer reviews you care about are actual reviews by your actual peers, i.e. managers, because that's what determines if your R&D turns into promotions or bonuses. Google AI research is really an odd duck in this respect in that they seem to care a lot about getting published, but that's really rare.

Obviously if you just plot by papers published then universities will look very productive and it'll seem like there's a pipeline or supply chain or something here because that's what universities are paid to do and what they optimize for. If you've actually been in real companies doing cutting edge work with tech, like I have, then it all looks like a shell game over the definitions of arbitrary words. Companies invest a ton into "R&D", just not in a way you can neatly sum and plot on a graph. Often they struggle to even measure it for their own purposes!

Finally - a nationalized energy company that can fund research? I know this is a left-wing academic blog but come on. That would wreck the energy markets and deliver absolutely nothing in the way of usable research output. The West already dumps way too much money into public/academic energy R&D labs, as well as VCs pumping up private firms, and Google also funded lots of energy R&D in the past (see here: https://www.google.org/pdfs/google_brayton_summary.pdf ). There's very little to show for it because doing proper R&D that matters requires experience in the field, so it all gets done by oil companies, wind turbine manufacturers, etc.


Excellent but some observations.

The transistor is not basic research like cosmology. It was based on basic research done a half century earlier by Bohr, Planck and Einstein.

Some misses. The greatest industrial research center in America for 200 years was the DuPont Experimental Station. And they very much capitalized! Nylon. Gunpowder. Lycra. Synthetic fibers like polyester. As well as agriculture.

Sarnoff Labs... TV and radio development, but not the basic research... Marconi et al.

GE Schenectady: basic research on plastics for wire insulation led to polycarbonate and others. Electrical... well, developments, not really basic research.

Still, Bell Labs was the premier research organization of the world.

Bell Labs invented fiber optics in the late '50s, used as a super-top-secret sensor deep underwater for the detection of submarines. The laser, also needed in that scheme, they co-invented. But the basic research for both was based on the basic optics of Newton, Maxwell, and Einstein.


I worked at AT&T Labs/Research for several years as a technical staff member, way back before I went to get my PhD in computer science. (For the uninitiated: AT&T Labs was a spinoff of Bell Labs that took a fraction of the computer science research from Bell, leaving the rest with Lucent.) Most of the people I worked with were old Bell Labs hands.

From my perspective, the main benefit of a large research lab was the huge concentration of brilliant researchers. I'm now a CS professor and love my lab, but I only have a small number of local collaborators. At AT&T you could have lunch with Peter Shor and Bjarne Stroustrup as well as a dozen other subject-matter experts. This produced an amazing number of good new ideas, which resulted in an impressive amount of research productivity. If you put a bunch of smart researchers in that environment, I imagine they're going to thrive, as long as you lay the groundwork so that (1) those people work on problems that are relevant to the organization, and (2) you have a plan to deploy and monetize that work.

The real problem with AT&T is that by the time I worked there (circa 2000) it had absolutely zero ability to monetize most of the research it produced. Silicon Valley was in the grips of the first dot-com boom and firms on the West Coast were doing relatively amazing things. AT&T was hidebound and constantly tossing good research on the floor. The few examples where they did attempt to monetize were even more embarrassing: I worked on a software project trying to sell music over the Internet, and it was obvious that we were completely outclassed in this effort. (Apple eventually licensed some of the audio compression patents AT&T owned, and the rest is history.) In principle we could have been a hotbed incubator for startups, but: (1) we were in New Jersey and not Silicon Valley, and (2) management didn't really know how to handle this. Even an "Intellectual Ventures"-style patent strategy might have been lucrative, but AT&T did not execute that. Thankfully.


Some observations from someone who spent some time in academic research and industry (pharma regulatory affairs, not R&D, just to be clear). A good summary of the academic/industrial nexus can be found in the proceedings of a symposium held at U Penn back in the early 1980s, "Partners in the Research Enterprise: University-Corporate Relations in Science and Technology." There were speakers from both sectors and it was really an all-star cast, with then-Congressman Gore (pre-Internet invention), Kenneth Arrow, Bart Giamatti (pre-Baseball Commish), Jim Wyngaarden (NIH Director) and others. There were some very good case studies presented, including an intriguing agreement between Du Pont and Notre Dame back in the late 1920s that goes to show these types of agreements are not new.

Bell Labs is a special case (and one of the things Noah did not mention was its groundbreaking work in computing; the C programming language was developed there). Ma Bell had a monopoly and had both the research arm, Bell Labs, and the manufacturing arm, Western Electric. They could afford to sponsor a lot of basic research as they did not face the type of competition that other businesses were subject to.

Google has a huge market share with a cash flow that allows them to carry out a lot of fundamental research such as in AI as noted by Noah. They also do a bunch of other stuff that is not necessarily related to their core business. Along with other software companies such as Microsoft, this is perhaps the best version of the industrial research lab that Brad De Long writes about in his recently published book, 'Slouching Towards Utopia.'

This is not to say that all corporate R&D is robust. In the sector that I worked in, biopharma, there has been a shift away from corporate R&D to direct purchase via merger and acquisition. It was more cost-effective for a pharma company to purchase a competitor that might have a blockbuster drug, as the cash flow from sales quickly covers the acquisition cost (Pfizer was among the first to do this big time). The same thing continues to go on with the purchase of biotech assets that originated in academia, were spun out into a company that got early-stage VC funding, and were then subsumed into a pharma company. Since the biotech revolution began in the 1970s, many companies (I don't have a good number, but it's well over 1,000) have been founded, but only a small number remain independent (Amgen, Gilead, and Biogen); the rest either stumble along, get acquired, or go out of business.

Yes, as Noah points out the energy sector is one area that might profit from a better R&D model but it's mainly in the distribution and storage areas. Alternative energy source production seems to be going along just fine under the current system.

In sum, there are already adequate measures in place that facilitate technology transfer. Occasionally, a need to do something above and beyond comes up such as the COVID-19 pandemic that required massive amounts of money and effort to develop both a vaccine and therapeutics. The first of these was accomplished but the second, not so much. There is a continued need to look at lessons learned.


Great article as always. I personally don't think Bell Labs will make a comeback. I like to err on the realistic side and a lot of the tailwinds that enabled Bell Labs are gone.

I have only two contributions. First, is it possible that for the last thirty years we have been operating with a faulty model of innovation? These days, we glorify the young novice and the talented upcomer. What is first-principles thinking, after all, apart from a glorification of the advantages of the amateur? But it seems to me that we have confused one field (software, where that isn't a problem and could even be a boon) with everything else, where real expertise is necessary, the gap between research and development is lengthy, and the marginal cost of distribution is significant.

Of course, this ties in to venture capital too, which is uniquely suited for ICT technology and not so apposite for a lot of other industries. The world of bits and bytes has had quite a lot of innovation. The world of atoms and stuff (a funny contrast, since everything is still atoms and stuff), not so much.

Dec 9, 2022 · edited Dec 9, 2022

Bell Labs kind of sucked.

They had an outsized impact on R&D largely by exploiting AT&T's government-backed monopoly on communication systems. If you wanted to work on (say) fiber optic communications, there was only one place to work. If you talk to old timers they love to talk about how great it was that they got paid to sit around and have lunch with Nobel laureates, but they never seem to recognize how overall bad it was for society to have all those productive researchers locked into an organization that had no incentive to actually ship anything.


I worked for DOE-supported national labs in the 90s and early 2000s: Sandia and Lawrence Livermore. These were products of WW2. The former was managed by Bell Labs. These places were curious in part because you needed a security clearance to work there and in part because of their traditions. On the one hand, when I worked at these places they were quite bureaucratic. On the other hand, scientists greatly resented government oversight of their work despite the fact that almost all the funding came from government. One would hear that once these places were run by scientists for scientists. Or that if you want nuclear weapons, you need to give us whatever funding we need. I always presumed that during WW2 things could not have worked this way. Sometimes the powers that be just need to know when to get out of the way and let folks do their work.

There is a joke that went around about the 3 nuclear weapons labs -- Sandia, Livermore, and Los Alamos. It says a lot. It goes like this. When DOE says jump, Sandia answers, "How high, sir?" Los Alamos says, "Up yours." And Livermore, in 3 months, puts together a $500 million jump management program.


It would be interesting to know how much research funding is being spent on research of persuasion, broadly. I suspect an increasing amount. It's cheaper than a physical lab of course but I see the persuasion ecosystem -- buy this, don't eat that, do this and you'll live an extra 25 years, ask your doctor about this -- ballooning in my lifetime, a good deal of it grant funded, when it comes to public health. But private food consumption persuasion is also big business and there are apps for everything. None of this will build us underwater cities or floating airports.


Standard Oil was perhaps the original Energy Bell.


I worked at Bell Labs from 1969 to 1973 and at Xerox PARC from 1973 to 1991.

Both entities were derived from monopoly.

Bell Labs was part of the AT&T supply chain, which was disassembled by antitrust laws, and the world was in disarray from WWII.

Xerox failed to thrive due to managerial incompetence and a corporate culture, derived from the auto industry, that could not convert PARC inventions into business.

I worked at RCA Labs for a summer in 1961. RCA failed because management made a lot of bad choices and bet the farm on them: GaAs vs. Si, the Capacitance Electronic Disc, and more....

Currently the US does a lot of research but no longer invests in developments that can take decades. That is now done by others outside of the US.

There is no economic basis for recreating the AT&T supply chain, of which Bell Labs was only one part.


โ€œNewโ€ transmission lines should be a non-starter for multiple reason. To name just a handful:

- stringing transmission lines above hundreds of miles of climate-change-induced desiccated forest, when high-wind events are now common, is a recipe for more wildfires.

- the amount of energy lost via transmission of electricity over long distances is a waste we can't afford.

- improvements in battery technologies (silicon anodes for 100% more energy density; Brakeflow technology to prevent fires when Li-ion batteries short-circuit) eliminate the problem of intermittency for solar and wind farms, as well as the danger of fires and the electricity lost via long-distance transmission.

- thousands of miles of transmission lines is a waste of metals (copper, steel, etc.) that should be devoted to substations (wind- and solar-energy farms near major metropolitan areas).

As with the trend in many other business sectors, you move the product/service closer to areas of large populations. To do otherwise is a waste of money, personnel, metals in tight supply, etc., all to purchase a public safety problem (fire) and more-expensive conventional energy.


>>The inflationary theory of cosmology and the invention of the transistor are both "basic research", but their economic value is very very different.

It's important to note, though, that both sprang from Einstein's work on quantum mechanics and relativity, as well as the work of Volta and Tesla and others. We just don't get to have transistors without a bunch of people actually doing the more "useless" kind of basic research. So, if we're building a model of economic value of innovation, we have to not only factor in the proximate invention of a technology, but the more distal innovations as well.

That said, the distal innovations are still probably a relatively low proportion of the "Most-Optimal Gross National Investment Portfolio for Innovation" that we're essentially trying to formulate here. But the vast majority of academic research money isn't going into wacky topics like the evolution of duck genitalia; no, it's going to worthwhile projects.

IOW, Noah, I'm saying we don't really have a maldistribution problem within *topics* of basic research, we have a maldistribution between the more innovation-distal academic research *institutions* and the more valuable innovation-proximal private research *institutions*. And that maldistribution isn't driven by the federal government overallocating to the former and ignoring the latter, it's driven by the private sphere's abject failure to keep investing in their institutions. Which itself is, obviously, a result of an overly financialized economy and badly-counterproductive-to-the-point-of-bordering-on-evil philosophies like shareholder value. Moreover, even during the recent tech boom, all that ZIRP-fueled VC money was notoriously *not* going to your deep-innovation super-startups. It's clear that the private sector can't be trusted to invest in innovation. Like, ever.

I recognize that shareholder value and overfinancialization are complex problems that simply can't be solved with some handwaving and my own pretty rhetoric, but I think there's value in at least diagnosing the problem correctly. At the end of the day, I think we still agree that it's unlikely Bell would be replicated. But I think where we diverge is that I don't think we should necessarily focus on defining all innovation-proximal sectors as public utilities and then creating utility monopolies to serve them as the "One Cool Trick" that will magically make the next Bell for every industry. In fact, it kind of sounds like a fast-track to dystopia.

We're probably better served by working on the hard problems like shareholder value and overfinancialization, and then championing public policies that will simply incubate and empower deep-innovation super-startups to come up with the next innovative business model that, when combined with the right confluence of other circumstances, *will* create another generation of Bells.


Interesting piece. I would argue that another approach might be what the Obama administration tried to do through the SBA: the Regional Innovation Clusters. The idea was to link regional universities and anchor large firms with the regional small and medium-size business ecosystem around a specific set of technologies (e.g., polymers, drones, wood products, logistics) in which it had a comparative (often historical) advantage. This would improve the pipeline of talent, funnel basic research into the market, and facilitate collaborations and client acquisitions (often at different stages of the supply chain).

The evaluations done of this initiative were quite promising and this wasn't a costly endeavor, relative to other actions the Federal government can take (and has taken).


Thanks Noah. I started my career at Bell Labs and worked at Xerox PARC as a grad student. They were great places in their day. I remember one morning at PARC I was struggling with the coffee machine and a nice man came over and showed me how to use it. It turned out he was one of the top physicists in the world.

I gifted a subscription to a very smart friend. His comment was, "He [that's you] makes complex issues easy to understand. "


An orthogonal argument for why the U.S. will likely never see a return of large, centralized for-profit research divisions is that they do not generate a sufficient ROI to shareholders. As cited above, the vast majority of the profits generated by the invention of the transistor did not return to the Bell system, but to downstream companies that successfully monetized the invention. Similarly with Xerox PARC, where Alan Kay invented the modern windowing system. It is a business case issue: the Bell system was in the business of telephone calls, not transistors or the eventual development of microprocessors. Xerox was in the business of selling copiers, not mini- or microcomputers. A further example is Kodak, which, being in the business of selling film, couldn't successfully monetize digital even though it was technologically ahead of Japan, which did manufacture the cameras.

Which leads to a different question about the technological ecosystem. Would these trajectories have been different if the U.S. had a coherent national industrial policy, such as Japan's MITI, in the latter half of the 20th century?


Many of my friends' dads growing up were Bell Labs scientists; others worked for big pharma companies HQ'd in the same area of NJ.

Research funding for universities certainly made a huge difference and provided a career option for scientists interested in research.

However, it was the shift toward shareholder value and lower interest rates and inflation starting in the 1980s that really drove the change. Lots of scientists and R&D depts got laid off. Shareholders didn't want to pay for expensive, risky bets. I worked at a chemical company where we had a 15 percent IRR threshold for projects back when inflation was 5 percent and interest rates were 8 percent. In the 90s, as rates and inflation fell even lower, our IRR threshold was bumped up closer to 20 percent. Completely nonsensical from a CAPM point of view unless we wanted to take massive risk (then again, equity index investors think they will get 15 percent returns in perpetuity while the risk-free rate is 2 percent, all part of the same insanity). Of course, my company didn't want to take massive investment risks. Instead we invested less or went after the low-hanging fruit of cost cuts and outsourcing (and later, share repurchases).
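To make the CAPM point concrete, here is a minimal sketch of the implied-risk arithmetic. The 5 percent equity risk premium and the exact rate levels below are illustrative assumptions, not figures from this comment:

```python
# Minimal CAPM sanity check of the hurdle rates described above.
# All numbers are illustrative assumptions, not data from the comment.

def capm_required_return(risk_free: float, beta: float, market_premium: float) -> float:
    """Required return under CAPM: r_f + beta * (E[R_m] - r_f)."""
    return risk_free + beta * market_premium

def implied_beta(hurdle: float, risk_free: float, market_premium: float) -> float:
    """Project beta that a given hurdle rate implies, inverting the CAPM formula."""
    return (hurdle - risk_free) / market_premium

MARKET_PREMIUM = 0.05  # assumed equity risk premium

# 1980s-style environment: ~8% rates and a 15% hurdle imply a beta of about 1.4
print(implied_beta(0.15, risk_free=0.08, market_premium=MARKET_PREMIUM))    # ~1.4
print(capm_required_return(0.08, beta=1.4, market_premium=MARKET_PREMIUM))  # ~0.15

# 1990s-style environment: rates fall to ~5%, yet the hurdle rises to ~20%,
# implying a beta of about 3.0 -- enormous systematic risk for ordinary projects
print(implied_beta(0.20, risk_free=0.05, market_premium=MARKET_PREMIUM))    # ~3.0
```

Under those assumptions, holding a 20 percent hurdle while rates fall means funding only projects with roughly three times the market's systematic risk, which is the sense in which the threshold stops making sense under CAPM.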

Fortunately, VC and PE in some fields have sponsored innovation. In pharma, the big payoff is still a new drug. In tech, it is largely figuring out how to steal and monetize personal information and sell more ads.

Google has invested a tonne, though - most of it wasted and wasteful. University research is mostly a waste as well and a fairly corrupt/incestuous system to boot.

Also remember that in the 1980s Japan was in fashion, and Japan was about continual improvement and leveraging/implementing others' IP rather than creating from scratch.
