> Previous industrial revolutions externalised their costs onto environments that seemed infinite until they weren't. Software ecosystems are no different: dependency chains, maintenance burdens, security surfaces that compound as output scales. Technical debt is the pollution of the digital world, invisible until it chokes the systems that depend on it. In an era of mass automation, we may find that the hardest problem is not production, but stewardship. Who maintains the software that no one owns?
This whole article was interesting, but I really like the conclusion. I think the comparison to the externalized costs of industrialization, which we are finally facing without any easy out, is a good one to make. We've been on the same path for a long time in the software world, as evidenced by the persistent relevance of that one XKCD comic.
There's always going to be work to do in our field. How appealing that work is, and how we're treated as we do that work, is a wide open question.
This essay, like so many others, mistakes the task of "building" software for the task of "writing" software. Anyone in the world can already get cheap, mass-produced software to do almost anything they want their computer to do. Compilers spit out new builds of any program on demand within seconds, and you can usually get both source code and pre-compiled copies over the internet. The "industrial process" (as TFA puts it) of production and distribution is already handled perfectly well by CI/CD systems and CDNs.
What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production and LLMs generally don't help with that.
It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.
You're getting caught up on the technical meaning of terms rather than what the author actually wrote.
They're explicitly saying that most software will no longer be artisanal - a great literary novel - and will instead become industrialized - mass-produced paperback garbage books. But also saying that good software, like literature, will continue to exist.
I guess two things can be true at the same time. And I think AI will likely matter a lot more than detractors think, and nowhere near as much as enthusiasts think.
Perhaps a good analogy is the spreadsheet. It was a complete shift in the way that humans interacted with numbers. From accounting to engineering to home budgets - there are few people who haven't used a spreadsheet to "program" the computer at some point.
It's a fantastic tool, but has limits. It's also fair to say people use (abuse) spreadsheets far beyond those limits. It's a fantastic tool for accounting, but real accounting systems exist for a reason.
Similarly AI will allow lots more people to "program" their computer. But making the programming task go away just exposes limitations in other parts of the "development" process.
To your analogy I don't think AI does mass-produced paperbacks. I think it is the equivalent of writing a novel for yourself. People don't sell spreadsheets, they use them. AI will allow people to write programs for themselves, just like digital cameras turned us all into photographers. But when we need it "done right" we'll still turn to people with honed skills.
This was already true before LLMs. "Artisanal software" was never the norm. The tsunami of crap just got a bit bigger.
Unlike clothing, software always scaled. So, it's a bit wrongheaded to assume that the new economics would be more like the economics of clothing after mass production. An "artisanal" dress still only fits one person. "Artisanal" software has always served anywhere between zero people and millions.
LLMs are not the spinning jenny. They are not an industrial revolution, even if the stock market valuations assume that they are.
Agreed, software was always kind of mediocre. This is expected given the massive first mover advantage effect. Quality is irrelevant when speed to market is everything.
This thought-provoking essay does not consider one crucial aspect of software: the cost of a user developing a facility with a given software product. Historically monopolistic software producers can force these costs to be borne because the user has no alternative to upgrading to the latest version of, for example, Windows, or gmail, or the latest version of the github GUI. A significant portion of the open source / free software movement is software providing stable interfaces (including for the user), so that resources otherwise spent on compulsory retraining to use the latest version of something proprietary can be invested in configuring existing resources to better suit the user's problem domain. For example, programs like mutt or vim, or my latest discovery, talon.
I don't think the dividing line runs along the open-source front here. Windows has historically offered some of the most stable APIs, meanwhile there are plenty of examples of popular open-source software with a lot of breaking changes.
The comment you replied to said "significant portion of" and I believe it is clear which portion that refers to: the culture around c, linux, vim and bash, not things like nodejs, java and (semi-open-source) elasticsearch which are culturally separate.
I've never found a term I liked for this particular concept at the intersection of education & business so I made one up a while back:
A Knowledge Pool is the reservoir of shared knowledge that a group of people have about a particular subject, tool, method, etc. In product strategy, knowledge pools represent another kind of moat, and a form of leverage that can be used to grow or maintain market share.
Usage: Resources are better spent on other things besides draining the knowledge pool with yet another new interface to learn and spending time and money filling it up again with retraining.
You could say the same things about assemblers, compilers, garbage collection, higher level languages etc. In practice the effect has always been an increase in the height of a mountain of software that can be made before development grinds to a halt due to complexity. LLMs are no different
In my own experience (and from everything I’ve read), LLMs as they are today don’t help us as an industry build a higher mountain of software because they don’t help us deal with complexity — they only help us build the mountain faster.
I've been thinking about this for a while, and largely agree that industrialization of software development is what we are seeing. But the emphasis on low quality is misplaced.
Industrial systems allow for low quality goods, but also they deliver quality way beyond what can be achieved in artisanal production. A mass produced mid-tier car is going to be much better than your artisanal car.
Scale allows you not only to produce more cheaply, but also to take quality control to the extreme.
I generally agree. Industrialization puts a decent floor on quality, at low cost. But it also has a ceiling.
Perhaps an industrial car is better than your or my artisanal car, but I'm sure there are people who build very high-quality cars by hand (over the course of years). Likewise fine carpentry vs mass-produced stuff vs Ikea.
Or I make sourdough bread and it would be very impractical/uncompetitive to start selling it unless I scaled up to make dozens, maybe hundreds, of loaves per day. But it's absolutely far better than any bread you can find on any supermarket shelf. It's also arguably better than most artisanal bakeries who have to follow a production process every day.
> but also they deliver quality way beyond what can be achieved in artisanal production
I don't think this is true in general, although it may be in certain product categories. Hand-built supercars are still valued by the ultra-wealthy. Artisanal bakeries consistently make better pastries than anything mass produced... and so on
As a developer for almost 30 years now, if I think where most of my code went, I would say, quantitatively, to the bin.
I processed much data, dumps and logs over the years. I collected statistical information, mapped flows, created models of the things I needed to understand.
And this was long before any "big data" thing.
Nothing changed with AI. I keep doing the same things, but maybe the output has colours.
Heh... I've worked for 25 years and I've basically yet to put code into production. Mostly projects that were cancelled or scrubbed either during development or shortly after, or just downright never used since they were POCs/prototypes.
I think I've had just 2 or 3 projects overall where anyone has actually even tried the thing I've been working on.
That holds true for a tailor, even expensive clothing items eventually wear out and get thrown away. They are cared for better, repaired a few times, but in the end, disposed of. I’d say that analogy holds up for 'traditionally' created software vs. AI-created software. Handmade clothes vs. fast fashion.
These damn articles. Software moved into an industrial revolution when you could write in a high level language, and not in assembly. This has already happened.
The article makes this very point. From the article: “software has been industrialising for a long time: through reusable components (open source code), portability (containerisation, the cloud), democratisation (low-code / no-code tools), interoperability (API standards, package managers) and many other ways”
You either see what codex and opus are capable of and extrapolate the trendline or you don’t; the author clearly saw and extrapolated.
Not that I disagree: I’m on record agreeing with the article months ago. Folks in labs have probably seen it coming for years.
Yes we’ve seen major improvements in software development velocity - libraries, OSes, containers, portable bytecodes - but I’m afraid we’ve seen nothing yet. Claude Code and Codex are just glimpses into the future.
Huh. Your statement was probably hyperbole? But just back of the napkin:
If we use about 20 TW today, in a thousand years of 5% growth we’d be at about 3x10^34. I think the sun is around 3.8x10^26 watts? That gives us about 8x10^7 suns worth of energy consumption in 1000 years.
If we figure 0.004 stars per cubic light-year, we end up in that ballpark in a thousand years of uniform spherical expansion at C.
But that assumes millions (billions?) of probes traveling outward starting soon, and no acceleration or deceleration or development time… so I think your claim is likely true, in any practical sense of the idea.
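If it helps, here is the same napkin arithmetic as a small Python sketch, using only the figures already quoted above (roughly 20 TW today, 5% annual growth, ~3.8x10^26 W per sun, ~0.004 stars per cubic light-year); these are the commenter's rough assumptions, not precise astrophysics:

```
# Back-of-the-napkin check of the numbers above.
import math

current_use_w = 20e12          # ~20 TW of global energy use today
growth = 1.05                  # 5% annual growth
years = 1000

future_use_w = current_use_w * growth ** years
print(f"use after {years} years: {future_use_w:.1e} W")      # ~3e34 W

sun_output_w = 3.8e26          # approximate solar luminosity
suns_needed = future_use_w / sun_output_w
print(f"suns' worth of output: {suns_needed:.1e}")           # ~8e7 suns

star_density = 0.004           # stars per cubic light-year (rough local value)
radius_ly = years              # light-sphere after 1000 years expanding at c
stars_reachable = star_density * (4 / 3) * math.pi * radius_ly ** 3
print(f"stars within {radius_ly} ly: {stars_reachable:.1e}") # ~2e7, same ballpark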
Economic growth is not directly proportional to energy consumption. A major feature of any useful tool is that it (often dramatically) reduces energy consumption.
Thing is: Industrialization is about repeating manufacturing steps. You don't need to repeat anything for software. Software can be copied arbitrarily for no practical cost.
The idea of automation creating a massive amount of software sounds ridiculous. Why would we need that? More games? They can only be consumed at the pace of the player. Agents? They can be reused once they fulfill a task sufficiently.
We're probably going to see a huge amount of customization where existing software is adapted to a specific use case or user via LLMs, but why would anyone waste energy re-creating the same algorithms over and over again?
The "industrialisation" concept is an analogy to emphasize how the costs of production are plummeting. Don't get hung up pointing out how one aspect of software doesn't match the analogy.
> The "industrialisation" concept is an analogy to emphasize how the costs of production are plummeting. Don't get hung up pointing out how one aspect of software doesn't match the analogy.
Are they, though? I am not aware of any indicators that software costs are precipitously declining. At least as far as I know, we aren't seeing complements of software developers (PMs, sales, other adjacent roles) growing rapidly, which would indicate a corresponding supply increase. We aren't seeing companies like Microsoft or Salesforce or Atlassian or any major software company reduce prices due to a supply glut.
So what are the indicators (beyond blog posts) this is having a macro effect?
In fact this is a counterargument to the point of the article. You're not making 'just more throwaway software' but instead building usable software while standing on the shoulders of existing algos and libraries.
> You don't need to repeat anything for software. Software can be copied arbitrarily for no practical cost.
...Or so think devs.
People responsible for operating software, as well as people responsible for maintaining it, may have different opinions.
Bugs must be fixed, underlying software/hardware changes and vulnerabilities get discovered, and so versions must be bumped. The surrounding ecosystem changes, and so, even if your particular stack doesn't require new features, it must be adapted (a simple example: your React front end breaks because the nginx proxy changed its subdirectory).
A question that was not addressed in the article, and that contrasts software with industrialized products from the past, is: who are the consumers of software produced at industrial scale? Stitching of clothes by machines accelerated garment production only because there was demand and consumption tied to population. But software is not tied to population the way food and clothes are. It doesn't wear out, and it is not exclusively consumed by persons.
Another common misconception is that it is now easier to compete with big products, as the cost of building those products will go down. Maybe you think you can build your own office suite and compete with MS Office, or build an SAP with better features and quality. But what went into this software is not just code, but decades of feedback, tuning and fixing. The industrialization of software cannot provide that.
Software was never coded in a big-bang, one-shot fashion. It evolves through years of interacting with the field. That evolution takes almost the same time with or without AI. Remember, a version release has many tasks that need to go at human speed.
User base comes from the value you provide. Value comes from product features. Features come from code. If code is easy, anyone with 10K bucks in their pocket can provide those features and that product. The only thing missing is: is the product battle-tested? That fortunately remains out of reach for AI.
Hmm, I'm not sure I see the value in "disposable software". In any commercial service people are looking for software solutions that are durable, dependable, extensible, maintainable. This is the exact opposite of disposable software.
The whole premise of AI bringing democratization to software development and letting any layperson produce software signals a gross misunderstanding of how software development works and the requirements it should fulfill.
I play several sports across several teams and leagues. Each league has their own system for delivering fixtures. Each team has its own system of communication.
What I want is software that can glue these things together. Each week, announce the fixture and poll the team to see who will play.
So far, the complete fragmentation of all these markets (fixtures, chat) has made software solutions uneconomic. Any solution's sales market is necessarily limited to a small handful of teams, and will quickly become outdated as fixtures move and teams evolve.
I'm hopeful AI will let software solve problems like this, where disposable code is exactly what's needed.
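To make "disposable code" concrete, here is roughly the kind of throwaway glue script I have in mind - a minimal sketch where the league endpoint, the JSON shape, and the team chat webhook are all hypothetical stand-ins, since every league and chat app would differ:

```
# Hypothetical weekly fixture-announcement glue script.
# The URLs and response format below are placeholders, not real APIs.
import datetime
import json
import urllib.request

LEAGUE_FIXTURES_URL = "https://example-league.test/api/fixtures"  # stand-in
TEAM_CHAT_WEBHOOK = "https://example-chat.test/webhook/team"      # stand-in


def fetch_this_weeks_fixture():
    # Assume the league returns a list of {"date", "opponent", "venue"} objects.
    with urllib.request.urlopen(LEAGUE_FIXTURES_URL) as resp:
        fixtures = json.load(resp)
    today = datetime.date.today()
    week_end = today + datetime.timedelta(days=7)
    for fixture in fixtures:
        date = datetime.date.fromisoformat(fixture["date"])
        if today <= date <= week_end:
            return fixture
    return None


def post_availability_poll(fixture):
    # Announce the fixture and ask who can play.
    message = {
        "text": f"Fixture this week: vs {fixture['opponent']} at {fixture['venue']} "
                f"on {fixture['date']}. Who's in?"
    }
    req = urllib.request.Request(
        TEAM_CHAT_WEBHOOK,
        data=json.dumps(message).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    fixture = fetch_this_weeks_fixture()
    if fixture:
        post_availability_poll(fixture)
```

Something like this is worthless to sell, breaks the moment a league changes its site, and is exactly the sort of thing cheap generated code could keep patched up.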
Yes, software needs to be secure. If we accept the premise that software is going to be churned out in bulk, then the mechanisms for securing software must evolve rapidly... I don't see a world where there is custom software for everything but all insecure in different ways.
Too many articles are written comparing LLMs to high-level languages. Sure, if you squint enough, both have to do with computers. But that comparison misses everything that is important about LLMs.
High-level languages are about higher abstractions for deterministic processes. LLMs are not necessarily higher abstractions but instead about non-deterministic processes, a fundamentally different thing altogether.
What I've been pondering is what makes the user interface of some software "industrial" versus merely "complicated."
The difference I return to again and again isn't tech depth. It's constraints.
Rough framework I'm using lately:
Consumer software aims at maximizing joy.
Enterprise software is all about coordination.
Industrial software operates in the "mess" of the real world, and appears to be more concerned with:
failure modes
long-term maintenance
predictable behavior vs cleverness
But as soon as software is involved with physical processes, the tolerance for ambiguity narrows quickly.
Curious how others see it: What's your mental line between enterprise and industrial? What constraints have shaped your designs? Any instances where "nice abstractions" failed the test of reality?
The article kind of misses that cost has two axes: development cost and maintenance cost.
Low-cost/low-value software tagged as disposable usually means development cost was low but maintenance cost is high; that's why you get rid of it.
On the other hand, the difference between good and bad traditional software is that, while development cost is always going to be high, you want maintenance cost to be low. This is what industrialization is about.
But I think the important part of this is the reach that the Industrial Revolution had: consumer-facing software, or rather the end users who were able to "benefit" from the Industrial Revolution, and their individual needs for all of these mass-produced goods.
The important thing is that goods =/= software. I, as an end user of software, rarely need specialized software. I don't need an entire app generated on the spot to split the bill and remember the difference if I have a calculator.
So, yes, we are industrializing software, but this reach that people talk about (I believe) will be severely limited.
The industrial revolution was constrained by access to the means of production, leaving only those with capital able to actually produce, which led to new economic situations.
What are the constraints with LLMs? Will an Anthropic, Google, OpenAI, etc, constrain how much we can consume? What is the value of any piece of software if anyone can produce everything? The same applies to everything we're suddenly able to produce. What is the value of a book if anyone can generate one? What is the value of a piece of art, if it requires zero skill to generate it?
And also false. Good programmers are always aware of the debt. It's just not easily quantifiable, as part of it can only be estimated once a change request has been made, and truly known only when implementing the change.
It's always a choice between taking more time today to reduce the cost of changes in the future, or getting results fast and being less flexible later. Experience is all about keeping the cost of changes constant over time.
Not convinced. There is an obvious value in having more food or more products for almost anybody on Earth. I am not sure this is the case for software. Most people's needs are completely fulfilled with the amount and quality of software they already have.
> There is an obvious value in having more food or more products for almost anybody on Earth
Quite the opposite is true. A large proportion of people would increase both the number of years they live and their quality of life by eating less.
I think the days where more product is always better are coming to an end - we just need to figure out how the economy should work.
But how about some silly software for just a giggle. Like 'write website that plays fart sound when you push button'? That can be a thing for the kids at school.
So many fallacies here: imprecise, reaching arguments, attempts at creating moral panic, insistence that most people create poor-quality garbage code in stark contrast to the poster; the difference between his bespoke excellence and the dreck produced by the soulless masses is gracefully omitted.
First, the core of the argument - that 'industrialization' produces low-quality slop - is not true: industrialization is about precisely controlled and repeatable processes. A table cut by a CNC router is likely dimensionally more accurate than one cut by hand; in fact, many industrial processes and machines have trickled back into the toolboxes of master craftsmen, where they increased productivity and quality.
Second, from my experience of working at large enterprises, and smaller teams, the 80-20 rule definitely holds - there's always a core team of a handful of people who lay down the foundations, and design and architect most of the code, with the rest usually fixing bugs, or making bullet point features.
I'm not saying the people who fall into the 80% don't contribute, or somehow are lesser devs, but they're mostly not well-positioned in the org to make major contributions, and another invariable aspect is that as features are added and complexity grows, along with legacy code, the effort needed to make a change, or understand and fix a bug grows superlinearly, meaning the 'last 10%' often takes as much or more effort than what came before.
This is hardly an original observation, and in today's ever-ongoing iteration environment, what counts as the last 10% is hard to define, but most modern software development is highly incremental, often is focused on building unneeded features, or sidegrade redesigns.
I would say comparing the making of software to a working factory is where the analogy goes wrong. Complete software is analogous to a running factory; making software is like building the factory: the specialised tooling, layouts, supply chain, etc. When you have all of this, your factory runs at industrial scale and produces things, just as your software produces value once it is completed and used by the end user.
If that is true, we will live in a funny world where you lose all your money because you were running some outdated, hole-riddled software written by an LLM on some old router or cheap camera. Or some software will stop working after an update because a fix was written by an LLM and nobody checked or tested it. Or there will be three outages of big internet services in two months.
I spent 15 years writing literal industrial software (manufacturing, test, and quality systems for a global high-tech manufacturing company, parts of which operated in regulated industries).
One of the things that happened around 2010, when we decided to effect a massive corporate change away from both legacy and proprietary platforms (on the one hand, away from AIX & Progress, and on the other hand, away from .Net/SQL Server), was a set of necessary decisions about the fundamental architecture of systems, and which -- if any -- third party libraries we would use to accelerate software development going forward.
On the back end side (mission critical OLTP & data input screens moving from Progress 4GL to Java+PostgreSQL) it was fairly straightforward: pick lean options and as few external tools as possible in order to ensure the dev team all completely understand the codebase, even if it made developing new features more time consuming sometimes.
On the front end, though, where the system config was done, as well as all the reporting and business analytics, it was less straightforward. There were multiple camps in the team, with some devs wanting to lean on 3rd party stuff as much as possible, others wanting to go all-in on TDD and using 3rd party frameworks and libraries only for UI items (stuff like Telerik, jQuery, etc), and a few having strong opinions about one thing but not others.
What I found was that in an organization with primarily junior engineers, many of whom were offshore, the best approach was not to focus on ideally "crafted" code. I literally ran a test with a senior architect once where he and I documented the business requirements completely; he translated the reqs into functional tests, then handed the tests over to the offshore team to write code to pass. They mostly didn't even know what the code was for or what the overall system did, but they were competent enough to write code to pass tests. This ensured the senior architect received something that helped him string everything together, but it also meant we ended up with a really convoluted codebase that was challenging to interpret holistically if you hadn't been on the team from the beginning. I had another architect, a lead in one of the offshore teams, who felt very strongly that code should be as simple as possible: descriptive naming, single-function classes, etc. I let him run with his paradigm on a different project, to see what would happen. In his case, he didn't focus on TDD and instead just on clearly written requirements docs. But his developers had a mix of talents and experience, and the checked-in code was all over the place. Because of how atomically abstract everything was, almost nobody understood how pieces of the system interrelated.
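To illustrate the requirements-to-tests handoff (with entirely invented names and a made-up business rule; the real system's modules and rules were different), the architect would turn a written requirement into a functional test like the sketch below, and the offshore devs wrote whatever code made it pass. A minimal stand-in implementation is included so the sketch actually runs under pytest:

```
# Hypothetical example: "A work order cannot move to QA until every
# routing step is complete." The test is the functional spec.
import pytest


class IncompleteRoutingError(Exception):
    """Raised when a work order is promoted before all routing steps finish."""


class WorkOrder:
    # Minimal stand-in for what the offshore team would have written
    # to make the functional test pass.
    def __init__(self, steps):
        self.steps = {name: False for name in steps}
        self.status = "OPEN"

    def complete_step(self, name):
        self.steps[name] = True

    def move_to(self, status):
        if status == "QA" and not all(self.steps.values()):
            raise IncompleteRoutingError("routing steps incomplete")
        self.status = status


def test_work_order_cannot_reach_qa_with_unfinished_steps():
    order = WorkOrder(steps=["cut", "assemble", "inspect"])
    order.complete_step("cut")

    # Promotion with unfinished steps must be rejected.
    with pytest.raises(IncompleteRoutingError):
        order.move_to("QA")

    order.complete_step("assemble")
    order.complete_step("inspect")
    order.move_to("QA")
    assert order.status == "QA"
```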
Both of these experiments led to a set of conclusions and an approach as we moved forward: clearly written business requirements, followed by technical specifications, are critical, and so is a set of coding standards the whole group understands and has confidence to follow. We set up an XP system to coach junior devs who were less experienced, ran regular show & tell sessions where individuals could talk about their work, and moved from a waterfall planning process to an iterative model. All of this sounds like common sense now that it's been standard in the tech industry for an entire generation, but it was not obvious or accepted in IT "Enterprise Apps" departments in low-margin industries until far more recently.
I left that role in 2015 to join a hyperscaler, and only recently (this year) have moved back to a product company. What I've noticed now is that the collaborative nature of software engineering has never been better... but we're back to a point where many engineers don't fully understand what they're doing, either because there's a heavy reliance on code they didn't write (common 3P libraries) or because of the compartmentalization of product orgs where small teams don't always know what other teams are doing, or why. The more recent adoption of LLM-accelerated development means even fewer individuals can explain the resultant codebases. While software development may be faster than ever, I fear as an industry we're moving back toward the early aughts, when the graybeard artisans had mostly retired and their replacements were fumbling around trying to figure out how to do things faster & cheaper and decidedly un-artisanally.
Personally I think AI is going to turn software into a cottage industry; it will make custom software something the individual can afford. AI is a very long way off from letting the average person create the software they want unless they are willing to put a great deal of time into it, but it is almost good enough that a programmer can take the average person's idea and execute it at an affordable price. We're probably only a year or two from when a capable programmer will be able to offer any small business a completely customized POS setup for what a canned industrial offering costs today: I will design your website and build you a POS system tailored to your needs and completely integrated with the website, and for a little more I can throw in the accounting and tax software. A bright dishwasher realizing they can make things work better for their employer might be the next billionaire revolutionizing commerce and the small business.
I have some programming ability and a lot of ideas, but would happily hire someone to realize those ideas for me. The idea I have put the most time into took me the better part of a year to sort out all the details of, even with the help of AI; most programmers could probably have done it in a night, and with AI could write the software in a few nights. I would have my software for an affordable price, and they could stick it in their personal store so others could buy it. If I am productive with it and show its utility, they will sell more copies of it, so they have an incentive to work with people like me and help me realize my ideas.
Programming is going to become a service instead of an industry, the craft of programming will be for sale instead of software.
I find it interesting but not surprising that this got downvoted. Sure, my idea of the craft is different from the article's and from many people's, but if the craft is only there when it is purely hand-written code, then it is a craft the vast majority cannot afford. I can pay a luthier a few thousand and get my dream guitar, and would happily spend that sort of money on getting custom software, but that is not going to happen if I insist on 100% hand-written code, just as getting my dream guitar would not happen if I insisted on the luthier only using hand tools.
> and for a little more I can throw in the accounting and tax software
As someone who has worked in two companies that raised millions of dollars and had a hundred people tackling just half of this, the tax software, you are in for a treat.
Sure, that is still a ways off, but being able to hire a programmer to meet my personal, modest software needs is almost there. Also, the needs of any company that required a hundred people and millions of dollars are very different from the needs of a small restaurant or the like; anyone with enough ambition to run a small restaurant can manage the accounting and taxes for that restaurant. The same cannot be said for the sort of business you are describing. You are comparing an apple to an orange orchard.
Edit: Just noticed I said "any business" when it was supposed to be "any small business." Edited the original post as well.
Another AI entrepreneur who writes a long article about inevitability, lists some downsides in order to remain credible, but all in all just uses neurolinguistic programming on the reader so that the reader, too, will think the "AI" revolution is inevitable.
Tldr; initially I thought we might be onto something, but now, I don't see much of a revolution.
I won't put intention into the text because I did not check any other posts from the same guy.
That said, I think this revolution is not revolutionary yet.
Not sure if it will be, but maybe?
What is happening is that companies are going back to a "normal" number of people in software development. Before, it was because of the adoption of custom software, later because of labour shortage; then we had a boom because people caught on to it as a viable career, but then it started scaling down again because one developer can (technically) do more with AI.
There are huge red flags with "fully automated" software development that are not being fixed, but to those outside the area of expertise they don't seem relevant. With newer restrictions related to cost and hardware, AI will be an even worse option unless there is some sort of magic that fixes everything related to how it does code.
The economy (all around the world) is bonkers right now.
Honestly, I saw some Jr Devs earning 6-figure salaries (in USD) and doing less than what my friends and I did when we were Jrs. There is inflation and all, but the numbers don't seem to add up.
Part of it all is a re-normalisation, but part of it is certainly a lack of understanding of software and/or engineering.
Current tools, and I include even the likes of kiro, anti-gravity and whatever, do not solve my problems; they just make my work faster.
Easier to look for code, find data and read through blocks of code I haven't seen in a while.
Writing code, not so much. If it is simple and easy it certainly can do it, but for anything more complex it seems faster and more reliable to do it myself (and probably cheaper).
I think the idea is interesting, but immensely flawed.
The following is just disingenuous:
>industrialisation of printing processes led to paperback genre fiction
>industrialisation of agriculture led to ultraprocessed junk food
>industrialisation of digital image sensors led to user-generated video
Industrialization of printing was the necessary precondition for mass literacy and mass education. The industrialization of agriculture also ended hunger in all parts of the world which are able to practice it and even allows for export of food into countries which aren't (Without it most of humanity would still be plowing fields in order not to starve). The digital image sensor allows for accurate representations of the world around us.
The framing here is that industrialization degrades quality and makes products into disposable waste. While there is some truth to that, I think it is pretty undeniable that there are massive benefits which came with it. Mass produced products often are of superior quality and superior longevity and often are the only way in which certain products can be made available to large parts of the population.
>This is not because producers are careless, but because once production is cheap enough, junk is what maximises volume, margin, and reach.
This just is not true and goes against all available evidence, as well as basic economics.
>For example, prior to industrialisation, clothing was largely produced by specialised artisans, often coordinated through guilds and manual labour, with resources gathered locally, and the expertise for creating durable fabrics accumulated over years, and frequently passed down in family lines. Industrialisation changed that completely, with raw materials being shipped intercontinentally, fabrics mass produced in factories, clothes assembled by machinery, all leading to today’s world of fast, disposable, exploitative fashion.
This is just pure fiction. The author is comparing the highest-quality goods at one point in time, which people took immense care of, with the lowest-quality stuff people buy today, which is not even close to the mean clothing people buy. The truth is that fabrics have become far better, far more durable and more versatile. The products have become better; what has changed is people's attitude towards their clothing.
Lastly, the author is ignoring the basic economics which separate software from physical goods. Physical goods need to be produced, which is almost always the most expensive part. This is not the case for software, distributing software millions of times is not expensive and only a minuscule part of the total costs. For fabrics industrialization has meant that development costs increased immensely, but per unit production costs fell sharply. What we are seeing with software is a slashing of development costs.
I agree with you on all of this, and found myself wondering if the author had actually studied the Industrial Revolution at all.
The Industrial Revolution created a flywheel: you built machines that could build lots of things better and for less cost than before, including the parts to make better machines that could build things even better and for less cost than before, including the parts to make better machines... and on and on.
The key part of industrialisation, in the 19th-century framing, is that you have in-built iterative improvement: by driving down cost, you increase demand (the author covers this), which increases investment in driving down costs, which increases demand, and so on.
Critically, this flywheel has exponential outputs, not linear. The author shows the Jevons paradox, and the curve is right there - note the lack of a straight line.
I'm not sure we're seeing this in AI software generation yet.
Costs are shifting in people's minds, from developer salaries to spending on tokens, so there's a feeling of cost reduction, but that's because a great deal of that seems to be heavily subsidised today.
It's also not clear that these AI tools are being used to produce exponentially better AI tools - despite the jump we saw around GPT-3.5, quantitative improvement in output seems to remain linear as a function of cost, not exponential. Yet investment input seems to be exponential (which makes it feel more like a bubble).
I'm not saying that industrialisation of the type the author refers to isn't possible (and I'd even say most industrialisation of software happened back in the 1960s/70s), or that the flywheel can't pick up with AI, just that we're not quite where they think it is.
I'd also argue it's not a given that we're going to see the output of "industrialisation" drive us towards "junk" as a natural order of things - if anything we'll know it's not a junk bubble when we do in fact see the opposite, which is what optimists are betting on being just around the corner.
It's the article's analogy, not mine.
And, are you really saying that people aren't regularly mass-vibing terrible software that others use...? That seems to be a primary use case...
Though, yes, I'm sure it'll become more common for many people to vibe their own software - even if just tiny, temporary, fit-for-purpose things.
Take this for example:
> Industrial systems reliably create economic pressure toward excess, low quality goods.
This has never been true for "artisanal" software. It could be used by nobody or by millions. This is why the economic model OP proposes falls apart.
Time to short the market lol.
I'm personally doing just that, because I want an algorithm written in C++ in an LGPL library working in another language.
On the contrary, this is likely the reason why we can disrupt these large players.
Experience from 2005 just doesn't hold that much value in 2025 in tech.
That would be why a significant portion of the world's critical systems still run on Windows XP, eh?
There is a difference between writing for mainstream software and someone's idea/hope for the future.
Software that is valued high enough will be owned and maintained.
Like most things in our world, I think ownership/stewardship is like money and world hunger, a social issue/question.
Your consumer/enterprise/industrial framework is orthogonal to the article's focus: how AI is massively reducing the cost of software.
Oh wait. It is already a thing.
The mass production of unprocessed food is not what led to the production of hyper processed food. That would be a strange market dynamic.
Shareholder pressure, aggressive marketing and engineering for super-palatable foods are what led to hyper processed foods.