18 Comments

I am not an economist, so below are my two cents based on my observations, reading, and experience; since this is a sample size of one, I could be way off:

1. I do not judge a new technology only by the number of jobs it creates. Will these newly created jobs pay as much as, or better than, the jobs that disappear? What will the quality of that employment be? Are these jobs going to motivate people to go to work, or are they the kind of "bullshit jobs" described in this book (https://tinyurl.com/msckmhxd)? Note: I'm not 100% in agreement with the book, but I think it is mostly true.

2. If the new jobs require either little to no skill or very high skill, the former will be boring for skilled people, and the latter may leave us without enough people to do the work even after AI augmentation.

3. Doing more with fewer people, or doing more with the same number of people, is a good way to think about AI's benefits if they are ever realized. In the short term, however, it may push unemployment higher, since companies will hire fewer people or not look for replacements once someone leaves.

4. My other fear is that companies will stop hiring people with little or no experience if they get very high productivity from experienced people. There will be little incentive to hire low- or no-experience people, creating a long-term situation in which we no longer have a pipeline of experienced people.

5. We may see a lot of pushback from people who feel AI threatens their jobs. I recently read in the Washington Post (https://tinyurl.com/4jk32red) that even doctors do not want to use AI in their work, because they think it will become an excuse for insurers and hospital administrators to cut staff in the name of innovation and efficiency.

6. As someone said, "As soon as it works, no one calls it AI anymore; it's just software." It will become part of the software and be introduced without people knowing it exists.

7. I agree that we will eventually need AI and robots to keep the world growing as the birth rate continues to drop and there are not enough working-age people. However, we are not there yet. And do we still need very high growth if we have fewer people, given that older people consume much less anyway?

8. How will we compensate people who lose their jobs? Yes, universal basic income is an option. However, most of us work not only for money but also because it gives us a purpose and a reason to wake up in the morning. Some people say you can travel and work on your hobbies, but I do not think that is for everyone, or even for most people. You can only do those things for so long before you want something resembling a job again, because you want to feel valued and to add value to your organization, community, and society.

9. Very few companies will say their product will replace people's jobs, because they do not want people to see it as a threat, even though they know this is the case.

10. Drawing parallels between AI and past technological revolutions may be misleading. Previous innovations like the printing press, steam engine, or computers primarily automated physical tasks or streamlined existing workflows; AI represents something fundamentally different: it is the first technology designed to replicate, and potentially surpass, human cognitive abilities. Unlike past transitions, where humans could shift to roles requiring higher-level thinking, AI targets those very cognitive tasks. New jobs may emerge, as they did with previous innovations, but we cannot rely on historical patterns when facing a technology that competes with human intelligence itself. As uncomfortable as this conclusion may be, we are in uncharted territory. I do not like to say it, but it may be different this time.

To summarize, we need to be ready for a worst-case scenario in which there is a lot of unemployment and many people cannot find purpose without jobs. We should plan for it and hope we never see it.

Thoughts?

Nov 1 · Liked by Jurgen Gravestein

Nothing wrong with your optimism, but so far there is scant evidence that genAI is amplifying human abilities. It was designed to replace them: its founders talk about replacing people, and the first target for replacement was creative work, something we actually enjoy and don't want to replace. Market forces and CEO greed are turning this into a race to the bottom, with no plan in place for how to deal with the fallout.

Author · Nov 1 · edited Nov 1

I agree. But if you believe, like me, that real creative work (done by humans) isn't so easily substituted, I think even on that front there's hope. That doesn't mean certain people won't lose their jobs or see their roles change in the coming years; that, I'm afraid, is inevitable.


You and I can agree that human creative work is worth supporting and paying for, but if we end up being the minority while art is commodified, guess what? A minority of patrons won't keep the majority of art alive, and we will lose some of it forever. There will be little incentive for future artists. That's just one angle; now do the same exercise for any other job. Sure, if it's a bullshit job I don't care whether it gets automated, but most jobs are not bullshit. AI isn't like the computer or the motor car; it's almost like a new species that plays by completely different rules, and it might end up controlled by a tech oligarchy that wants to take us into an era of neo-feudalism.

Oct 29 · Liked by Jurgen Gravestein

That Neopeople pricing screen really illustrates the allure for software businesses. You're telling me that instead of creating software that has to produce output in real time, where I might get 10% of users to pay a $5/month subscription while the rest use a free tier, I can dress it up as an "agent", charge 500x more as a "salary", and pass muster with up to 8 hours of latency per "deliverable"?
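To make the allure concrete, here is a rough back-of-the-envelope sketch using only the figures from that comment ($5/month, ~10% conversion, a 500x "salary" markup); the user base size and the 1% agent conversion rate are made-up assumptions for illustration, not Neopeople's actual numbers.

```python
# Back-of-the-envelope comparison of the two business models described above.
# Every number here is an illustrative assumption, not real pricing data.

users = 1_000                        # hypothetical addressable user base

# Conventional SaaS: free tier plus a $5/month subscription, ~10% conversion.
saas_price = 5
saas_conversion = 0.10
saas_mrr = users * saas_conversion * saas_price

# "Agent" framing: the same capability dressed up as a digital employee,
# priced as a 500x "salary" and sold to a much smaller share of the base.
agent_price = saas_price * 500       # $2,500/month per "agent"
agent_conversion = 0.01              # assume only 1% of the base buys in
agent_mrr = users * agent_conversion * agent_price

print(f"SaaS MRR:  ${saas_mrr:,.0f}")              # $500
print(f"Agent MRR: ${agent_mrr:,.0f}")             # $25,000
print(f"Agent / SaaS: {agent_mrr / saas_mrr:.0f}x")  # 50x
```

Presumably the 8 hours of acceptable latency per "deliverable" adds to the appeal: output that can wait does not need real-time infrastructure.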


I find that Dr. Autor's and Lamanna's arguments overlook some important aspects of AI: the speed of job disruption from AI seems much more rapid than with previous general-purpose technologies, and AI is not merely a tool but an agent (able to act autonomously, make decisions, and create new content).

Their argument also does not touch on how the automation of knowledge work could make our lives worse off: https://econ.st/3YEXRhH

Anton Korinek and Daron Acemoglu seem to have a better understanding of AI compared to Autor:

https://www.nber.org/system/files/working_papers/w32980/w32980.pdf

Even Keynes might disagree with Autor's assessment. In his famous 1930 essay, Economic Possibilities for Our Grandchildren, John Maynard Keynes defined "technological unemployment" as the situation where the pace of automation exceeds the pace of new job creation.

This is already happening in my current job when it comes to task displacement by AI: https://dmantena.substack.com/p/is-this-time-different

Author

Thank you for sharing your thoughts, Dan. I will definitely look into those sources; they may well change my mind. I won't pretend to know anything with certainty.

I could see a world where the number of jobs automated by AI outpaces any new roles/jobs created. If that happens, it would be bad for society, especially if the wealth generated through this automation isn't redistributed. (And given how slow governments are to act, redistribution is likely to happen too late or not at all, basically evaporating the middle class.)


Let me rephrase "AI is designed to replace humans."

to

"This AI is designed to replace knowledge workers." (aka white-collar, yup)

But then again, ultimately, we wanted machines to help with brain work, didn't we?


who is "we" in your last sentence? :)


Well, I'm not going to impose this on you, Dan.

Feel free to be excluded from that particular "we"


I do want machines to help with brain work! But I would prefer narrow AI tools (like Perplexity AI or NotebookLM by Google) over general LLM agents intended to do all of my work. I do enjoy some of it, haha.

The idea of AGI seems to be a desire from the AI community that is being pushed on us knowledge workers imo.


The idea of AGI also seems to be what you need to continuously point to, since that supposedly, maybe undoubtedly, is unlocking the jackpot.

Gotta justify these investments.

Also, OpenAI mentions AGI right in the mission …

"OpenAI is an AI research and deployment company. Our mission is to ensure that artificial general intelligence benefits all of humanity."

vs Anthropic …

"We believe AI will have a vast impact on the world. Anthropic is dedicated to building systems that people can rely on and generating research about the opportunities and risks of AI."


See, Dan, we are pumping engagement double-handedly here.

Jurgen doesn't even have to kick this off.


agreed.

From a rationalist perspective (which I have heard Silicon Valley VCs strongly believe in), the asymmetric payoff of AGI can help justify almost any amount of investment.
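For what it's worth, the expected-value logic behind that asymmetric-payoff argument can be sketched in a few lines; every figure below is an invented assumption, not anyone's actual estimate, and the point is only that a tiny probability times a large enough payoff can dominate the upfront cost.

```python
# Toy expected-value sketch of the "asymmetric payoff" argument.
# The probability, payoff, and investment figures are invented for illustration.

p_agi = 0.01                 # assumed chance the AGI bet pays off
payoff = 100_000e9           # assumed payoff if it does: $100 trillion
investment = 100e9           # assumed upfront investment: $100 billion

expected_value = p_agi * payoff
print(f"Expected payoff: ${expected_value:,.0f}")   # $1,000,000,000,000
print(f"Investment:      ${investment:,.0f}")       # $100,000,000,000
print("Rational under these assumptions:", expected_value > investment)  # True
```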

Author

Love the discussion you got going on here!


I couldn't easily come by data about what people are using ChatGPT for. My hunch is that the majority of users are using it mostly for very trivial stuff. And by trivial, I don't mean stupid (but yes, that is probably another major sub-segment within that majority segment), but, you know, not very productivity-related.

Even those who report using AI for work, I mean, what are they doing? What is always mentioned is "writing email", but email is in itself a sinkhole of (or for) productivity. So maybe people are churning out more emails now. Fantastic 🫣

Oct 28 · Liked by Jurgen Gravestein

Completely agree, and I think it's the only way it will make financial sense for these businesses and the only way OpenAI etc. can ever make it a viable model. Is it right for us as humans and as a society? I doubt they ever stopped to ask that.

Author · Oct 28 · edited Oct 28

But we have to generate value for the shareholders, Stephen...
