Not a day goes by without someone claiming that software development jobs as we know them will soon disappear. GitHub's Copilot is an undeniable AI-app success, contributing significantly to GitHub's ARR as well as to the lines of code committed to repositories. Other AI startups like Cognition Labs and Poolside have raised large funding rounds at multi-billion-dollar valuations, both promising end-to-end AI-powered development agents.
As the founder of a software developer marketplace, I already have my mother bothering me about finding a different job.
It is undeniable that disruption will occur in the knowledge economy, but will it be a net destroyer of human jobs? And if so, what are the long-term implications?
In this post, I don't offer solutions, but rather a perspective on the key factors that may determine that tipping point, and on some long-term consequences that keep me awake at night.
Can human intuition be replaced?
Most people in the software industry know that applications evolve as customer needs become better understood, and as customers' use of the product surfaces needs nobody anticipated. Predicting all of this before building an app is a true art form, and an area of expertise that Coding Agents may take some time to acquire.
One would expect Coding Agents to detect common scalability patterns and program solutions for them off the bat, but more left-field optionality may be harder to predict and pre-build. The world is filled with companies that started with a very different product from the one that really took off.
On the other hand, as a counter-argument: if the cost of “rebuilding the house” goes down thanks to Coding Agents, then maybe there's a case to be made for brute-forcing app development, reducing the criticality of having a strong Product Manager or Technical Lead who can “see into the future” and design the product's architecture in a way that supports future unknowns. In other words, pivoting may become easier and less costly.
Can you delegate a task if you don't precisely know what the task is?
One of Antonio Machado's most famous poems reads:
Walker, your footsteps are the road, and nothing more.
Walker, there is no road, the road is made by walking.
Walking you make the road, and turning to look behind you see the path you never again will step upon.
Walker, there is no road, only foam trails on the sea.
There is something to be said about the value of working towards an outcome as a key ingredient of getting to the “right” outcome. What I mean is that while there may be parts of a job you want to delegate, delegating all of it may prevent you from reaching a truly satisfactory result.
For instance, when planning a vacation, there's certainly value in using an assistant, but sometimes you don't really know what trip you want to take until you start researching and planning it. It's a chicken-and-egg problem brought down to the level of the task itself. It may be that we can't fully delegate a task if we really want it to be successfully completed.
What is the cost (effort and time) of delegation?
One final thought is around the concept of taste and preferences. I believe we are underestimating the amount of contextual information embedded in the hundreds of thousands of decisions we've made in the past. Fully offloading that context onto an agent can be time-consuming, and it is where a key breakthrough will be needed for mass adoption of agents to take place.
The reason is the same one that prevents delegation today: sometimes it simply takes more time to explain to someone what you need and, more importantly, how you need it done, than to do it yourself.
What do career paths look like?
Fast forward 50 years. If you believe there will still be Senior Partners at law firms, or Executives at corporations, I wonder where their careers will have started. Many of the roles that serve as entry points into those professional careers are the roles AI is starting to disrupt first.
Consider this: as a company grows, so does the complexity of what it does and how it does it. The organization needs help understanding the situation at a deeper level, and that creates the opportunity for a new job: the analyst. You see them everywhere: sales analyst, marketing analyst, data analyst, and so on. These are great entry points to put a young brain to chew on small (and gradually larger) problems. The same is true in programming: the simple, menial task goes to the junior developer, giving them low-risk exposure to the end-to-end engineering process. Beyond expertise, it helps the junior individual gain confidence.
What happens when an AI agent takes on that job-to-be-done? It may be easier to ask an AI agent for the answer to a “simple” problem than to pass it on to “an unreliable human in need of training”. But if that opportunity never reaches the trainee, what happens when time goes by and the humans at the top start to retire? Who replaces them?
Will the Jevons paradox play out in this case?
I like to play tennis with a group of friends. It started as a friendly, casual way to find someone to play with, but quickly devolved into an ATP-tour-like mini league. So much so that one player decided to use AI to build a web app to track our results and calculate our rankings. Despite not being a developer, he got it up and running in a day, but soon hit an infrastructure blocker. After some serious frustration, he requested help from a developer. Until now, this use case (an amateur tennis group commissioning custom software) simply didn't exist. Instead of replacing developers, AI created new demand for developer resources.
I believe that AI Agents will make developers more efficient, which will accelerate the creation of more software, which in turn will generate more demand for experienced developers. This seems to fit the definition of the Jevons paradox:
In economics, the Jevons paradox (sometimes Jevons effect) occurs when technological progress increases the efficiency with which a resource is used (reducing the amount necessary for any one use), but the falling cost of use induces increases in demand enough that resource use is increased, rather than reduced. Governments, both historical and modern, typically expect that energy efficiency gains will lower energy consumption, rather than expecting the Jevons paradox.
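To make the mechanics concrete, here is a toy back-of-the-envelope sketch in Python. The numbers are entirely made up and only illustrate how a per-app efficiency gain can coexist with rising total demand for developer-hours:

```python
# Toy illustration of the Jevons paradox applied to developer-hours.
# Every figure below is invented purely for the sake of the example.

hours_per_app_before = 1000   # developer-hours to build a typical app today
apps_demanded_before = 100    # apps the market asks for at that cost

# Suppose coding agents make developers 4x more efficient per app...
hours_per_app_after = hours_per_app_before / 4

# ...and the lower cost of software unlocks far more demand for it
# (amateur tennis leagues included), say 6x as many apps.
apps_demanded_after = apps_demanded_before * 6

total_hours_before = hours_per_app_before * apps_demanded_before  # 100,000
total_hours_after = hours_per_app_after * apps_demanded_after     # 150,000

print(total_hours_before, total_hours_after)
# Each app takes fewer hours, yet total developer-hours demanded went up.
```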
If this paradox plays out, developers will be even more needed in the future, not less, and the same could potentially be applied to many other functions.
A final confession
No AI was used to write this post. But I did feed it to an AI in order to get a “spicy” title suggestion. I was surprised that the AI offered some unsolicited advice. Funnily enough, the AI seemed to like the reference to the Jevons Paradox as a strong conclusion that humans should not fear the rise of AI.
Paradoxically, I am now a bit more scared.