108 Comments
Dec 1, 2022 · Liked by Noah Smith

Looking forward to Noah's posts coming, like, four times a day, with the help of his AI assistant.

Dec 1, 2022 · Liked by Noah Smith

The problem with training an AI on existing code is that it won't necessarily generate _good_ code, and if you as an engineer are not sharp enough to catch that, you can wind up propagating bugs. https://www.spiceworks.com/it-security/security-general/news/40-of-code-produced-by-github-copilot-vulnerable-to-threats-research/

Yep. I am a senior software engineer with tons of experience, and AI code autocomplete is extremely useful -- but not necessarily because of the comparative-advantage factor. It's a search engine.

Using generative code AI, I can answer the question, "How would someone else write this thing that I know how to write?" and get a coherent answer that I can evaluate for effectiveness and possibly incorporate into my own technique.

Similarly, instead of trying to figure out the API of a particular complex package or system, I can just ask the AI to get me started. It might get things wrong, but it'll dump the most commonly used calls, routes, etc. in my lap.

Also, it teaches conventions. If I'm not exactly sure of the canonical way to, say, write a route in a new web API framework I'm presented with, I can just ask Copilot and get a ton of useful context. This can sometimes be better than the official documentation for systems that are poorly maintained.
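To make that concrete (a purely illustrative sketch of my own; FastAPI is just a stand-in for "some framework I've never touched"), the kind of skeleton a completion will drop in your lap looks like this:

```python
# Illustrative only: the sort of boilerplate route a completion tool hands you
# for an unfamiliar framework (FastAPI used here purely as an example).
from fastapi import FastAPI, HTTPException

app = FastAPI()

# In-memory stand-in for a real data store.
ITEMS = {1: {"name": "widget"}}

@app.get("/items/{item_id}")
def read_item(item_id: int):
    """Fetch a single item by id, returning a 404 if it doesn't exist."""
    if item_id not in ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return ITEMS[item_id]
```

Whether every detail is exactly right matters less than seeing the decorator style, the path-parameter syntax, and the idiomatic error handling in one shot.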

I think this holds true basically everywhere. You can ask generative AI to explain something to you and get a distillation of the median internet understanding of that topic as a crash course.

I love posts like this! But who gets the wealth from the added productivity of the tasks that are automated? The governance and wealth distribution of AI at scale leads either to dystopia or utopia, depending on how this is handled.

What you are describing in this post is incredibly dystopian, dressed up with positive language to sound utopian.

Humans, in your telling, will still be around to *press the button* on the machine that does all the thinking, creating, adjusting and producing and then say "LGTM" like a pointy-haired middle-manager.

Then, as a reward for our efforts, we can go home and do an inferior job "for fun" for lack of any other purpose. Essentially this predicts that humans land in the role of permanent consumers, whose most important decision is which hobby will distract us from the gaping meaninglessness of existence.

The analogy of the boss still hiring a secretary even though he can type faster is very telling -- extrapolate that onto computers and people and you're describing a world where the AI is running everything and making all the important decisions and humans are still employed because we are occasionally useful at some minor task.

This post is able to spin this into a utopian story only by describing the process exclusively in terms of economics -- comparative advantage, utility, productivity, jobs, and so on. But I think it's relevant to most people's life satisfaction that you're predicting that every major endeavor that has given human beings meaning since at least the Enlightenment, arguably for the last few millennia -- art, science, math, literature, invention, creativity, craftsmanship -- is about to be unceremoniously handed over to something else while we watch from a distance and get to fiddle with the output.

This is a horribly naive take on jobs. Replacing tens of thousands of good, well-paying jobs as weavers with tens of thousands of wretched, poorly paying jobs as loom operators was not a wonderful thing for those tens of thousands of weavers. There's a reason workers turned to violence in a movement that ended only when the Crown intervened on the side of the mill owners.

If you look at the structure of the US job market, one notable change has been the decline in well-paying, mid-level-skill jobs in urban areas. Those jobs have been replaced by low-paying ones, and our workforce has become increasingly bifurcated -- and what replaced them was automation. I'll cite David Autor's "Work of the Past, Work of the Future" on this.

There used to be a much larger market for trained secretaries, skilled shift managers, paralegals, resource allocation experts and so on. Those jobs have been computerized out of existence without need of artificial intelligence. AI is just going to make it worse, and if it does a much worse job than the humans it replaced, it won't matter to the handful of companies that dominate each industry.

I know you are trying to be a techno-optimist, but new technologies can have costly side effects that impact tens of millions and can lead to political instability. If we are serious about these technological transformations being good things, we need to make sure that they aren't bad things.

One thing that gives me pause about the AI revolution is that the first 100 years of the Industrial Revolution made a few people rich but were absolutely miserable for the working class. Like, worse than subsistence farming.

I write code for a living. Or, at least, my job would be described as "developer" or "software engineer".

The amount of time I spend actually doing tasks at which GitHub Copilot would help is vanishingly small.

For instance, in a large system, changing one line of code often means reading and understanding hundreds more to understand whether your one-line change impacts anything else. GitHub Copilot can't help with that. Nor can it help translate the requirements from stakeholders into something implementable, or decide whether your unit and system tests are adequate.

As such, I don't think GitHub Copilot is going to result in major productivity improvements from developers. It's an incremental bump, no more.

"Autocomplete for everything" is a good way of putting it. But I think we need to take a more nuanced look at the impact on the job market:

- In industries at the cutting edge, and where demand isn't yet saturated, AI will allow us to produce more value and we could see increased employment.

- But in industries where demand is close to saturated (and costs need to come down), I imagine we'll see job cuts. Since AI will tend to make us more efficient rather than replace most jobs entirely, this could take the form of reduced hours, but the way employment is structured in the US (high health insurance costs tied to jobs) is a barrier to that. So in some areas there will be technological unemployment.

It will be interesting to see where the S-curve of generative AI takes us, and how steep technological unemployment could be at its peak. Hopefully we can learn from history and have policies ready to keep the labor market strong if needed -- I took a closer look at what happened in the 70s and 80s here and hope to do a similar speculative analysis of the 2020s and '30s soon. At least this time, demographic trends will help keep the labor market tighter than it would be otherwise: https://2120insights.substack.com/p/wtf-happened-in-1971

Great post!

This is similar to my model of the world. A lot of AI is going to be more like the cotton gin than something that replaces our jobs completely. I 'solved' a science problem three weeks ago and, as is standard in the ML world, have spent the three weeks since then performance-optimizing my code, laboriously testing every component, and fitting it into a DAG. I'd have happily skipped the last week of autistically testing and writing every small component, which I've written a million times at various past jobs.

In the future tech teams won't need 10 engineers on permanent infra work to keep things from crashing. They can each then go and support an entire product on their own, with the tooling to keep things running.

I mean, when you think about it, is the AI revolution a bigger revolution than LAPACK? Eventually, sure, but I've worked with old hands who talked about implementing their own matrix libraries before they could start real work; now we don't do any of that.
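To illustrate what I mean (a toy comparison I'm adding here, not anything from the post), here's the hand-rolled version of a matrix multiply next to the one-liner that quietly delegates to tuned BLAS/LAPACK routines:

```python
import numpy as np

def matmul_by_hand(a, b):
    """Naive O(n^3) matrix multiply, the way you'd write it from scratch."""
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

a = np.random.rand(100, 100)
b = np.random.rand(100, 100)

slow = np.array(matmul_by_hand(a.tolist(), b.tolist()))
fast = a @ b  # dispatches to whatever optimized BLAS numpy was built against

assert np.allclose(slow, fast)
```

Nobody writes the first version anymore, and nobody mourns it.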

The only people who are scared are those who stopped thinking a decade ago, and get by only on their esoteric knowledge of some old framework that they're paid to keep running. But those people are cringe and need to be automated away anyway.

"The increased wealth that AI delivers to society should allow us to afford more leisure time for our creative hobbies."

And yet, that wealth never seems to go to the workers, and so we don't ever seem to have more leisure time.

I work at a content marketing company that's starting to introduce AI services to clients. On the front-end, they'll be getting more articles for less money. As far as I can tell, on the back end, that means the writers patching up Jasper's writing - it's prone to circuitous logic and inventing facts - will be working more hours, not fewer.

As they say, we sure live in interesting times.

I thought I would share that my partner, an artist, has adopted Midjourney exactly as described and is churning out artwork, with the AI handling the composition of each piece for her. However, she fears we're only months away from AI advancing to the point of ironing out the weird kinks like the extra fingers and other uncanny-valley aspects. At that point, surely any paying customer will just source their artwork straight from the AI?

For my own part I’m an actuary and would appreciate the Excel autocomplete. I wonder if the AI will be as prone to the catastrophic spreadsheet error as the human.

Interesting, but I think I agree more with the caveat than the rest of the piece. The technology seems to be improving so rapidly that what's predicted here might be a very brief phase before the AI takes over everything.

Also: tell us the truth. Is roon a person or something else?

Wow, I can't wait to pick out my favorite statistically likely choice for 8 hours a day! Oops, I meant 2 hours, because we'll all have more leisure time and get paid 4x as much :)

It's also very important to me that everyone has a job, because otherwise we couldn't get work done, which is necessary. So I'm very glad AI will allow me to keep working for the rest of my life, because I like working, but who doesn't? In fact, I hope AI creates more jobs so I can get two of them. My dream job is to generate uncontroversial and inoffensive listicles using AI, but I would settle for generating terms of service for software. Secretly, I'd love to make AI generated music that has lyrics that no reasonable person in a normal culture would find upsetting.

The future sounds so great! I can't wait to race down it with you all!

Comparative advantage doesn't always save you.

> “[T]here was a type of employee at the beginning of the Industrial Revolution whose job and livelihood largely vanished in the early twentieth century. This was the horse. The population of working horses actually peaked in England long after the Industrial Revolution, in 1901, when 3.25 million were at work. Though they had been replaced by rail for long-distance haulage and by steam engines for driving machinery, they still plowed fields, hauled wagons and carriages short distances, pulled boats on the canals, toiled in the pits, and carried armies into battle. But the arrival of the internal combustion engine in the late nineteenth century rapidly displaced these workers, so that by 1924 there were fewer than two million. There was always a wage at which all these horses could have remained employed. But that wage was so low that it did not pay for their feed.” - A Farewell to Alms

Dec 1, 2022 · edited Dec 1, 2022

I like the sandwich model of work a lot. I think the next question is how much it improves productivity and what that means for employment in different fields. If an illustrator can now produce 10X as much output, does that mean we need 10X fewer illustrators? Maybe. I think having enough context to do a particular project well will still be a scarce, less scalable commodity.

I do think we'll be awash in generic text and image outputs (think SEO-style content and stock images). The value/market price of that type of media will go down massively, and that will lead to dislocations. It's also going to be extremely annoying to search for things on the web as the results get clogged up with 95 different AI-generated versions of an article, none of which have real differentiated insight. I could see generative AI hastening the decline of the current generation of search ranking algorithms. If I were Google, I would be hard at work on algorithms that can detect novel information in a piece of writing compared to superficially similar articles.
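To sketch what that could look like mechanically (a toy illustration of my own, nothing Google actually does), you could score a new article by how close it sits to what's already been indexed:

```python
# Toy novelty check: flag an article as "low novelty" if it is nearly a
# cosine-duplicate of articles already seen, using plain TF-IDF vectors.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

seen_articles = [
    "How to bake sourdough bread at home with a simple starter.",
    "A beginner's guide to baking sourdough bread in your kitchen.",
]
new_article = "Baking sourdough at home: an easy starter-based guide."

vectorizer = TfidfVectorizer().fit(seen_articles + [new_article])
seen_vecs = vectorizer.transform(seen_articles)
new_vec = vectorizer.transform([new_article])

# Highest similarity to anything already indexed; near 1.0 means "nothing new here".
novelty_score = 1 - cosine_similarity(new_vec, seen_vecs).max()
print(f"novelty score: {novelty_score:.2f}")
```

A real ranking system would need far more than bag-of-words similarity, but even a crude score like this separates "the 95th rewrite of the same article" from something that actually adds information.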

I'm less convinced by the point about GPU/AI scarcity. We don't see web searches being rationed now, and while AI auto-completion is more likely to be a paid service, I have a hard time seeing us "running out" of AI capacity.
