Some of it is the fault of AI and the belief that software is now super easy to create and that maintenance won't be a problem later. That belief is held mainly by folks with little to no experience writing software, who attempt it anyway through the many AI-assisted routes available these days.
Then there's humanity's general shiny-object syndrome. Even if we just look at websites, they have gone through so many cycles: plain HTML, Flash, everything built with bootstrap.css, then the frameworks, then back to SSR/SSG, and so on.
Both of those are just symptoms of a larger disease: enthusiasm in general has fallen. A lot of that has to do with how demanding day-to-day software jobs have gotten, and with how financially unstable the younger generations feel, so they rarely set aside time for creative endeavors and passion projects.
Human beings are ephemeral. They're born, they die.
Everything human beings create is ephemeral. That restaurant you love will gradually drop standards and decay. That inspiring startup will take new sources of funding and chase new customers and leave you behind, on its own trajectory of eventual oblivion.
When I frame things this way, I conclude that it's not that "software quality" is collapsing, but the quality of specific programs and companies. Success breeds failure. Apple is almost 50 years old. Seems fair to stipulate that some entropy has entered it. Pressure is increasing for some creative destruction. Whose job is it to figure out what should replace your Apple Calculator or Spotify? I'll put it to you that it's your job, along with everyone else's. If a program doesn't work, go find a better program. Create one. Share what works better. Vote with your attention and your dollars and your actual votes for more accountability for big companies. And expect every team, org, company, country to decay in its own time.
Shameless plug: https://akkartik.name/freewheeling-apps
[2] Vanity of vanities, says the Preacher,
vanity of vanities! All is vanity.
[3] What does man gain by all the toil
at which he toils under the sun?
[4] A generation goes, and a generation comes,
but the earth remains forever.
[5] The sun rises, and the sun goes down,
and hastens to the place where it rises.
(Ecclesiastes 1:2-5)
Agreed, but if I can add one more angle: creative destruction is strangled when engineers are glued to Internet debates about “how software should be built.”
I've published a blog post urging [0] top programmers to quit for‑profit social media and rebuild better norms away from that noise.
[0] https://abner.page/post/exit-the-feed/
My hot take is that quality is inversely proportional to income. The more someone is paying for something, the more bloodsucking mercenaries are attracted to it who have less consideration for the quality of the output than for their own enrichment. (A corollary is that the more a job pays, the more it will suck: the only way they can get people to come help and keep them there is to offer high compensation.)
Look at Trappist brewers. Long tradition of consistent quality. You just have to devote your life to the ascetic pursuit of monkhood. It attracts a completely different kind of person.
It's certainly a provocative thought. But I think it's too blunt. In our commercial world sometimes the cheaper thing works better and sometimes the more expensive thing works better. So the lesson I take away is that price is not a great signal in the absence of other context. Trappist brewers have some other cultural norms going for them, and the focus should be on those norms rather than price. The people attracted to it aren't thinking much about the money. If you value them, why would you?
I think there’s often a misalignment of incentives when annual perf reviews are judged on feature work delivered, not quality. Engineers who spend any time polishing or finding and fixing bugs wind up rated mid, while folks who quickly crank out mediocre or bad code that does something new are on the fast track for promotion. This creates a current in the whole org where PMs, engineers, managers, etc., are all focused on new things all the time. Any quality work has to accompany a new feature proposal to get traction, and the quality items in that project will never get cross-functional support like the new feature work does.
This resource allocation strategy seems rational though. We could consume all available resources endlessly polishing things and never get anything new shipped.
Honestly it seems like another typical example of the “cost center” vs “revenue center” problem. How much should we spend on quality? It’s hard to tell up front. You don’t want to spend any more than the minimum to prevent whatever negative outcomes you think poor quality can cause. Is there any actual $ increase from building higher quality software than “acceptable”?
I would prefer to pay 20% more for a product which works perfectly than for a product which almost works. But no one asked me.
You shouldn't, because that 20% more will not go to the engineers; it will go to parasitic management. Money is better if paid for services than for software. I would find well-maintained open source software instead, perhaps contribute to one, or put in the effort to develop one.
If I'm choosing between 2 products on the market, if I pay for the higher quality product, that's going to increase that company's revenue. That will either allow them to pay the existing engineers more, or hire more engineers. The lower quality product won't get my revenue, and thus won't be able to do that.
More revenue -> company grows
Less revenue -> company shrinks
Open source software has no revenue at all, yet it often is superior in many ways.
That's a different situation from what this conversation was about. The conversation was about if there are 2 for-profit products, one costing 20% more than the other, would paying for the more expensive one provide money to engineers or not.
Regarding the new conversation topic: some open source software does have revenue. Forms of revenue include: donations, selling support, selling licenses that are less restrictive than the original open source license, ads, and selling addons. Yes, revenue for open source software is generally less than for-profit software, and despite that the open source software is often higher quality. I didn't claim that a higher quality product will always have more revenue than a lower quality product. I just made a claim about where the money goes.
I think we're getting somewhere now. The solution could be to fund the developers of open source software, actively paying them to implement the desired features that otherwise don't get picked up (while maintaining good quality and their continued ownership over the software). Micromanaging them as employees wouldn't work.
I'm not sure this really matters at this point. It's like filming yourself giving food to the homeless. Is it better if you didn't? Yeah, probably. But at the end of the day does that person have food when otherwise they wouldn't? Also yeah.
I'd rather take a step in the right direction than none at all. If the management can be convinced that there's more money to be made this way, then that gives us engineers more power to convince them to solve other such problems. If they care about quality, then that gives us back negotiating power. You don't outsource to a third world software mill or AI when your concern is quality. But you do when you're trying to sell the cheapest piece of shit that people will still buy. So yeah, I'm okay with this.
You can be okay with it, but it's not going to solve the problem. Management will fix the issue, then soon revert to enshittification and exploitation, so your next major issue will stay unfixed. In the best case, your software will become an annual subscription where you have to keep paying an obscene amount for no new features at all. Overall, it would be a step in the right direction, but only a single step.
> You don't outsource to a third world software mill or AI when your concern is quality.
That's a disastrously fallacious set of presuppositions. A good engineer will use AI well to improve their software, whereas a bad engineer will use it to produce junk.
I want to stress that this is a highly complex problem, and that means we need to break it down into smaller, manageable tasks. You're not going to change everything overnight; a single person won't change things, nor will a single action change things. There's no clear, definitive objective that needs to be met to solve this problem. Nor is there a magic wizard in the tower that needs to be defeated.
In other words, I gave you my explanation for why I think this can be a step in the right direction (in a sister comment I said even more if you want to read that). But you have complained and given no alternative. Your only critique is that it does not solve the problem in one fell swoop. That was never an assumption I made, it is not a reasonable assumption to make (as you yourself are noting), and I explicitly said it is not an assumption being made. Do not invent problems to win an argument. All you've done is attempt to turn a conversation into an argument.
> it would be a step in the right direction, but only a single step.
So don't stop after one step.
> That's a disastrously fallacious set of presuppositions.
Read more carefully. I did not say "use AI", I said "outsource to AI". There is a huge difference between these two things.
Do we need to fight, or can we actually have a discussion to help figure out this problem together? You do not need to agree with me, and contention can be beneficial to the process, but you do need to listen. I have no interest in fighting, so I'll leave the choice to you.
That's the problem with lemon markets though. They are a feedback loop and usually dependent on asymmetric information.
As a simple version think about it this way: if a customer can't tell the difference in quality at time of purchase then the only signal they have is price.
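A toy Akerlof-style simulation (numbers invented purely for illustration, not from any real market) makes that death spiral concrete: buyers bid the average quality they expect, so above-average sellers walk away, which lowers the average, which drives out the next tier, and so on.

    # Toy "lemons" loop: quality is hidden from buyers, so buyers bid the
    # average quality of whatever is currently for sale; sellers whose
    # (privately known) quality exceeds the bid withdraw from the market.
    import random

    qualities = [random.uniform(0, 100) for _ in range(1000)]  # hidden quality
    for step in range(8):
        if not qualities:
            break
        bid = sum(qualities) / len(qualities)            # price = expected quality
        qualities = [q for q in qualities if q <= bid]   # above-average sellers exit
        print(f"step {step}: bid {bid:.1f}, sellers left {len(qualities)}")

Each round the honest tail of the market walks away, expected quality falls, and the bid follows it down until only lemons remain.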
I think even here on HN, if we're being honest with ourselves, it's hard to tell quality prior to purchase, let alone for the average nontechnical person. It's crazy hard to evaluate software even hands-on, given how much effort you need to put in these days: the difficulty of differentiating sponsored "reviews" from legitimate ones, all the fake reviews, or how Amazon allows changing a product and inheriting the reviews of the old one.
No one asks you because all the sellers rely too heavily on their metrics. It's not just AI that people treat like black boxes; it's algorithms and metrics in general. But you can't use any of that effectively without context.
As engineers, I think we should be a bit more grumpy. Our job is to find problems and fix them. Be grumpy to find them. Don't let the little things slip, because even though a papercut isn't a big deal, a thousand are. Go in and fix bugs without being asked to. Push back against managers who don't understand. You're the technical expert, not them (even if they were once an engineer, those skills atrophy, and you get disconnected from a system when you aren't actively working on it). Don't let them make you argue about some made-up monetary value for a feature or a fix. It's management's job to worry about money and our job to worry about the product.
There needs to be a healthy adversarial process here. When push comes to shove, we should prioritize the product over the profit, while they should do the opposite. This contention is a feature, not a bug. Because if we always prioritize profits, well, that's a race to the bottom. It kills innovation. It asks "what's the shittiest, cheapest thing we can sell that people will still buy?" It enables selling hype rather than selling products.
So please, be a grumpy engineer. It's in the best interest of the company. Maybe not for the quarter, but it is for the year and the decade. (You don't need to be an asshole or even fight with your boss. Simply raising concerns about foreseeable bugs can be a great place to start. Filing bug reports for errors you find, too! Or bugs your friends and family find. Or even helping draft them with people, like those here on HN who raise concerns about a product your company works on. It doesn't need to be your specific team's product; file the bug report for someone who can't.)
And as the techies, we should hold high standards. Others rely on us for recommendations. We need to distill the nuances and communicate better with our nontechnical friends and family.
These won't solve everything but I believe they are actionable, do not require large asks, and can push some progress. Better something than nothing, otherwise there will be no quality boots to buy
https://en.wikipedia.org/wiki/Boots_theory
Exactly. It's an industry shift, and one person can't reverse it alone.
But I disagree with "better something than nothing" when it comes to quality. That's how we normalized catastrophes in the first place.
The lemon market problem you described is real—users can't evaluate quality, so price becomes the only signal. But engineers can evaluate quality. We're the ones who should refuse to ship garbage, even if management pushes back.
Being grumpy works locally. It won't fix the industry, but it fixes your team. And when enough teams refuse to normalize this, the pattern shifts.
This just makes it sound like software engineering hasn't yet matured enough to realize we're building real-world systems. It's still a pretty new field, comparatively, but big companies can't run like startups. They should have groups that look like that, but most groups shouldn't.
My view is fairly simple - demand for technology is always increasing at a rate which far outstrips supply of _good_ engineers (by a significant factor). The lure of a well paid career tempts many to the world of software engineering even if they're not very good at it.
Look at the construction industry. Many buildings on this planet were built hundreds, sometimes a thousand or more years ago. They still stand today because their build quality was excellent.
A house built today of cheap materials (i.e., poor-quality software engineers) as quickly as possible (i.e., urgent business timelines) will fall apart in 30 years, while older properties will continue to stand tall long after the "modern" house has crumbled.
These days software is often about being first to market with quality (and cough security) being a distant second priority.
However occasionally software does emerge as high quality and becomes a foundation for further software. Take Linux, FreeBSD and curl as examples of this. Their quality control is very high priority and time has proven this to be beneficial - for every user.
> Look at the construction industry. Many buildings on this planet were built hundreds, sometimes a thousand or more years ago. They still stand today because their build quality was excellent.
True. And yet, far more buildings built then are not standing. We just don't notice them, because they aren't still here for us to notice.
So don't think that things were built better then. A few were; most weren't.
It's a simple, timeless, inescapable law of the universe that failures, while potentially damaging, are acceptable risks. The Pareto principle suggests that addressing only the most critical 20% of issues yields a disproportionate 80% of the benefits, while the remaining issues yield diminishing marginal returns.
We're seeing bugs in bigger slices because technology is, overall, a bigger pie. Full of bugs. The bigger the pie, the easier it is to eat around them.
Another principle at play might be "induced demand," most notoriously illustrated by widening highways, but might just as well apply to the widening of RAM.
Are we profligate consumers of our rarefied, finite computing substrate? Perhaps, but the Maximum Power Transfer Theorem suggests that anything less than 50% waste heat would slow us down. What's the rush? That's above my pay grade.
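(For anyone rusty on that theorem: a source with internal resistance R_s driving a load R_L delivers load power P_L = V^2 * R_L / (R_s + R_L)^2, which is maximized when R_L = R_s; at that operating point the efficiency R_L / (R_s + R_L) is exactly 50%, so half the power is heat by construction.)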
I guess what I'm saying is that I don't see any sort of moral, procedural, or ideological decay at fault.
In my circles, QA is still very much a thing, only "shifted left" for tighter integration into CI/CD.
Edit: It's also worth reflecting on "The Mess We're In."[0] Approaches that avoid or mitigate the pitfalls common to writing software must be taught or rediscovered in every generation, or else wallow in the obscure quadrant of unknown-unknowns.
0. https://m.youtube.com/watch?v=lKXe3HUG2l4
>… are acceptable risks.
Close. Failure-free is simply impossible. And believing the opposite fails even harder and dies out.
This is not "acceptable", because there is no alternative, there is no choice or refutation (non-acceptance). It is a fact of life. Maybe even more so than gravity and mechanical friction.
Where I’m at, needless complexity is forced upon us. At the same time, we are constantly pushed to deliver new capabilities on timelines that are dictated to us, devoid of any grounding in reality. There is no room to even have the conversation about proper design or solving the right problems. It’s all about hitting arbitrary dates with “features” no one really cares about, while ignoring the foundation it all has to sit on.
The more loudly someone speaks up, the faster they are shown the door. As a result, most people keep their head down, pick their battles carefully, and try to keep their head above water so they can pay the rent.
I wouldn't work at a place that doesn't value my perspective either.
The problem is that most engineers realize this 6-12 months in, after they're already invested, and leaving means starting over.
This is why "keep your head down to pay rent" becomes the default. The system is designed to make speaking up too expensive.
You are in a completely normal dev shop. What's happened is that the start up mentality of "ship something - anything, and ship it NOW" has infected everything. Maybe over time you can make it better. But educating management can be a slow and frustrating process. Good luck!
I believe it's a mix of three factors: (a) lack of transfer of institutional knowledge, (b) weaker fundamental incentives for people to get better at fundamental skills and close gaps, and (c) a rise in hotfixes as we deal with timescales that operate much faster, burn faster, and want to expand faster.
All of the above is multiplied 1.3x-1.5x by accelerating ways to get up to speed via iterative indexing of knowledge with LLMs. I believe we are reliant on those early engineers whose software took a while to build (like a marathon), not the short-sprinted recyclable software we keep shipping on top of it. The problem is that not a lot of people want to be in those shoes (responsibility/comp tradeoffs).
Shoulda posted your link as a link instead of hiding it in text where we can’t click on it. If your blog post doesn’t stand on its own without an explanation you should rewrite it.
Thank you for your advice, but I'm new here and do not know how things work here yet. And I'm probably not allowed to post links yet because of my new account.
Ah welcome to HN. This particular etiquette is oddly not covered in the guidelines but instead in the FAQ:
> How do I make a link in a text submission?
> You can't. This is to prevent people from submitting a link with their comments in a privileged position at the top of the page. If you want to submit a link with comments, just submit the link, then add a regular comment.
https://news.ycombinator.com/newsfaq.html
"Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity." https://news.ycombinator.com/newsguidelines.html
The first and only activity of your account, created today, is to promote your own blog post.
You said, "I’m not allowed to post links yet, because of new account". There's a reason for that, and you're trying to bypass that restriction by misusing Ask HN instead.
Oh, I see your point, I didn't think from this angle.
If you turn on the "showdead" flag on your profile you will get a lot of insight into how HN works. Particularly there are a lot of people who post a link to their blog and another link to their blog and yet another link to their blog and these always start out [dead].
If, on the other hand, you regularly post links to other people's blogs and participate in discussion, you won't have any trouble slipping in a link to your own blog. The key is that you're expected to be part of the community.
Thank you, I appreciate your advice.
Also, part of the etiquette is to only post when you actually have something to add to the discussion. If you just want to say "thank you", that's what the upvote button is for.
One possible factor is the proliferation of LLMs to write code for us. I noticed that a few versions after JetBrains implemented LLM integration, the bugs in their products skyrocketed. Also, in my own job, it's often tempting to use LLMs where I really shouldn't, by which I mean where I can't easily check the code to ensure that it's fully and subtly correct. For example, if I'm working with an unfamiliar library I might ask an LLM what the incantation is to do something, and then check it by reading the docs of each thing that it references. I regularly find issues when doing so, but a different developer who didn't check these docs may miss this and ship a subtle bug.
It began way earlier than the LLM era, but yes, with LLMs it became even worse.
Speaking for myself, about my own software that I write alone: building new things is more exciting. Even with all financial incentives aside, I just like building new things more than I like polishing old ones.
I could improve the quality infrastructure, write more tests and clean up the code, but the work is not as fulfilling.
To me, a big reason is that people who don't understand how it works have a say in how it should be done.
It's not the case in traditional engineering fields: when you build a dam, the manager cannot say "hmm just use half as much concrete here, it will be faster and nobody will realise". Because people can go to jail for that. The engineers know that they need to make it safe, and the managers know that if the engineers say "it has to be like that for safety", then the manager just accepts it.
In software it's different: nobody is responsible for bad software. Millions of people need to buy a new smartphone because software needs twice as much RAM for no reason? Who cares? So the engineers will be pushed to make what is more profitable: often that's bad software, because users don't have a clue either.
Normal people understand the risks if we talk about a bridge or a dam collapsing. But privacy, security, efficiency, not having to buy a new smartphone every 2 years to load Slack? They have no clue about that. They just want to use what the others use, and they don't want to pay for software.
And when it's not that, it's downright enshittification: users don't have a choice anymore.
I agree there is this problem, which is why I try to write software that is actually good (and use software that is actually good, if it is available). It is also one reason why I still use some DOS programs (even with emulation, which is slow, it is still better than the bad quality they have in newer computers).
I do not use any of the software mentioned in that article, and I also do not have that much RAM in my computer.
This is the core philosophical divide in modern software development.
The "vibe coding MVP" crowd treats code as disposable—ship fast, validate the idea, and discard it if it doesn't work.
The problem: most MVPs never get thrown away. They get customers, then funding, then "we'll refactor later" becomes "we can't afford downtime to refactor."
That's how you end up with production systems built on prototypes that were never meant to handle real load.
I'm with you: if you're not willing to build it properly, don't build it at all. Technical debt compounds faster than most founders realize.
The Twitter mob defends vibe coding because it worked once for someone. However, for every success story, there are thousands of companies struggling with unfixable codebases.
Do it right or don't do it. There's no "we'll fix it later" in production.
Short answer: because lock-in disables competition, and cloud-based business models enable almost perfect lock-in.
Software quality only matters when users can switch.
I'm in the same position, deep in Apple's ecosystem. Switching means losing iMessage, AirDrop, years of purchases. The cost is $2,000+, or even more combined across all devices, plus dozens of hours.
And not only this: JetBrains, which consumes tons of RAM, Chrome, and so on. When did we decide that resources are free?
Did JetBrains and Chrome apps leak too? Or were they just RAM-hungry and didn't leak?
No, I don't think it's leaking; it is just "normal" behavior. But why did it become normal? When did optimization stop being mandatory?
Perhaps computer resources are cheaper than developer time spent on optimizations.
Computer resources aren't cheaper, they're externalized (to users, the environment, etc.)
This is definitely not true for the client-side software under discussion. Millions of devices require more resources and energy. The problem is that's an externality to the developer.
> energy
My MacBook Pro M1 16" seems to be averaging about 13 watts of power, about the same as previous i7. My house idles at around 200 watts (lots of smart devices, etc). Hardly worth obsessing over it.
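(Back-of-envelope, assuming roughly $0.15/kWh: 13 W × 8760 h/yr ≈ 114 kWh/yr, i.e. about $17 of electricity per year.)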
They are not, if you consider all the instances of the bloatware running.
How many users does Spotify have? Multiply that by the 79GB mentioned above. Is it still cheaper?
Irrelevant because Spotify doesn't pay for, nor do they have to care about user's resources.
If a user doesn't have enough ram to use Spotify, Spotify doesn't care. That user canceling their service is lost in the normal user churn. Spotify most likely has no idea and doesn't care if resource wastage affects their customers. It isn't an immediate first-order impact on their bottom line so it doesn't matter
But let's agree that you will not be able to scale just with resources forever. Basically, AI already struggles because of a lack of resources.
This is antithetical to capitalism's founding principles. Resources (profit potential) will always increase unbounded forever. That's the only way the scam works
IDEs, I think, are a bit of a special case. Under the hood, an IDE is constantly re-compiling and re-analyzing everything, and that truly does take a lot of memory and CPU. Probably not as much as if it were super aggressively optimized, sure, but still very heavy.
Full featured IDEs like this have always been heavy, as far as I know. It's only the pure-text editors without advanced full code analysis that can get away with low resources.
Ok, fair enough on IDEs, but for instance I was using the WebStorm IDE for years and sometimes struggled with memory issues. When I switched to Cursor, I forgot about them entirely.
My work machine is so full of bloatware that its 64gb of RAM is always full.
Exactly. I'm using a Mac M3 with 32gb of RAM and constantly receive notifications from CleanMyMac that RAM is full.
Quality is not immediately economically useful in a way an average MBA would understand and be able to communicate to shareholders. Also, we have spent many decades wrapping layer after layer of complexity on top of the CPU. That is starting to show; nobody really understands what's going on any more.
Exactly. That's why I wrote this—to confirm we're not crazy for noticing the pattern.
Spotify and Youtube Music on desktop are so bad (esp CPU) that I vigilantly shut them down the second I'm not listening to music which isn't something I've done wrt computer resources since I had a 10" Atom netbook fifteen years ago with 2gb RAM.
I'm sure they're no better on my iPhone but I don't even have the appropriate tools to gauge it. Except that sometimes when I use them, another app I'm using closes and I lose my state.
There's no pressure to care. Most users can't tell that it's your app that's the lemon. The only reason I know anything about my Macbook is because I paid for iStatMenus to show me the CPU/RAM usage in the global menubar that can quickly show me the top 5 usage apps.
This basic info should be built in to every computer and phone.
I love the subscription model for music because I have wild tastes that swing so much but I'm about done with it just because the players are too frustrating/horrible to use.
I think one big issue is that companies got massive. Apple's revenue, for instance, would make them the 40th largest country in the world by GDP - larger than Portugal, Hungary, Greece, and the overwhelming majority of countries in existence. And it seems to be essentially a rule that as companies reach a certain threshold of size, the quality of what they produce begins to trend downward until they're eventually replaced by an upstart who then starts their trek towards reaching the threshold of fail.
But in modern political and economic times where number must always go up, too big to fail is a thing and anti-trust enforcement isn't (to say nothing of the FTC mostly just ¯\_(ツ)_/¯ with regards to basically any merger/acquisition of big tech), the current batch of companies just keeps growing and growing instead of being naturally replaced. To say nothing of the fact that a lot of startup culture now sees being acquired as the endgame, rather than even dreaming of competing against these monstrosities.
Look at Intel—an absolute monopoly for decades, everyone said they'd never fall. But they did. The government bailed them out with billions because they're "too important to fail."
Apple is even bigger. When they start truly failing, governments won't let them collapse—too much infrastructure depends on them.
But "not allowed to fail" doesn't mean "allowed to thrive." Intel is a zombie now—alive but not really competing. Apple will eventually become the same: massive, declining, propped up by governments, but no longer innovating.
You're right—nobody falls anymore. They simply become permanently mediocre monopolies that extract rent while quality continues to degrade.
I was going to say that, from my anecdotal experience, it hasn't collapsed much. macOS 15 basically doesn't crash, while Mac OS 7 on my first Mac crashed roughly daily.
I guess a lot depends on which software you focus on.
Not really answering your question, but:
One category of sloppy software that is, imo, completely unnecessary: Electron apps.
It's totally ridiculous how few resources are put into alternatives like Tauri, given that most desktop apps run on Electron and we know how bad it is.
Oh dear, starting a conversation about software quality on HN.
Sadly, it won't fare well. You'll get a mix of flags and downvotes, along with "There's no problem! This is Fine!".
I feel that software has become vastly more complex, which increases what I call "trouble nodes." These are places where a branch, API junction, abstraction, etc., give space for bugs.
The vast complexity means that software does a lot more, but it also means that it is chock-full of trouble nodes, and that it needs to be tested a lot more rigorously than in the past.
Another huge problem is dependence on dependencies. Abstracting trouble nodes does not make them go away. It simply puts them into an area that we can't test properly and fix.
Ok, I agree with most of what you are saying, but most of those issues can be found with proper testing. For instance, the Steve Jobs era at Apple was much more focused on quality and attention to detail. Under Tim Cook, Apple shifted from “it just works” to “ship fast, fix in updates.”
The difference isn’t complexity, it’s priorities. Jobs-era Apple had smaller teams building fewer products with obsessive quality standards. Cook-era Apple has massive teams shipping constantly with “good enough” as the bar.
You’re right that testing helps. But when quality becomes optional, no amount of testing infrastructure fixes the cultural problem. We test for “does it work?” not “is it excellent?”
These issues passed all automated tests. They just didn’t pass the “would we be embarrassed to ship this?” test. That test doesn’t exist anymore at scale.
There are far, far too many sloppy devs who slip through the cracks. They never get the mentoring they need, nor are they shown the door.
All sense of teamwork was murdered about a decade ago by people with clipboards and other dead weight staff who don't give a rat's ass about anything.
Most devs under 30 don't have the same enthusiasm previous generations did because the opportunity being proposed just isn't the same. The room for creativity isn't there, and neither is the financial reward. Do more with less and these problems tend to go away.
Well, Agile said that we don’t need testers (because everybody owns quality and slogans are magic). DevOps said we don’t have time for testers (because we reaaaally feel like shipping now). AI people said AI will do all the testing (because of course they did).
Nobody likes thinking critically and admitting that they haven’t achieved a responsible standard of care. If they aren’t forced to do it, why bother?
You must not have actually experienced computing in the past.
In the dark, distant past, we wrote programs that ran in kilobytes of memory on a double-digit-MHz CPU. Multiple cores or threads did not exist.
Today, the same program requires gigabytes of RAM and takes multiple seconds to do the same work with 32 4GHz CPUs.
This is truly not an exaggeration. Everyone who actually handled a Windows 95 machine in its natural environment will tell you that the experience of using a computer today is ten times slower and forty times more frustrating. Computers are slower than they ever have been, despite having hardware that is fast beyond the limits of anything we even dared to dream of in the 90s.
I'm willing to bet you are just not exposed to bleeding-edge tech, and thus don't understand the need for it.
It's definitely overused in certain circumstances, when you could just roll out a monolithic code base on a single server, but in many cases now, systems get built that were impossible to build in the past.
Some of it is the fault of AI and the belief that software is super easy to create now and maintenance won't be an issue in the future, mainly by folks who have very little to no experience with writing software but attempt it via the many ways you can do it with AI these days.
Then there's the shiny object syndrome of humanity in general even if we just look at websites they went through so many different cycles, plain html, flash, everything built with bootstrap.css, then came the frameworks, then back to SSR/SSG, etc... etc..
Both of those are just symptoms of a larger disease , namely lack of enthusiasm in general has fallen, a lot of it has to do with how demanding day to day software jobs have gotten, or how financially unstable the younger generations feel so they rarely set aside any time for creative endeavors and passion projects
Human beings are ephemeral. They're born, they die.
Everything human beings create is ephemeral. That restaurant you love will gradually drop standards and decay. That inspiring startup will take new sources of funding and chase new customers and leave you behind, on its own trajectory of eventual oblivion.
When I frame things this way, I conclude that it's not that "software quality" is collapsing, but the quality of specific programs and companies. Success breeds failure. Apple is almost 50 years old. Seems fair to stipulate that some entropy has entered it. Pressure is increasing for some creative destruction. Whose job is it to figure out what should replace your Apple Calculator or Spotify? I'll put it to you that it's your job, along with everyone else's. If a program doesn't work, go find a better program. Create one. Share what works better. Vote with your attention and your dollars and your actual votes for more accountability for big companies. And expect every team, org, company, country to decay in its own time.
Shameless plug: https://akkartik.name/freewheeling-apps
Ecclesiastes 1:2-5
Agreed, but If I can add one more angle: creative destruction is strangled when engineers are glued to the Internet about “how software should be built.”
I've published a blog post urging [0] top programmers to quit for‑profit social media and rebuild better norms away from that noise.
[0] https://abner.page/post/exit-the-feed/
My hot take is that quality is inversely proportional to income. The more someone is paying for something, the more bloodsucking mercenaries are attracted to it that have less consideration for the quality of the output than for their own enrichment. (A corollary to this is that the more a job pays the more it will suck: the only way they can get people to come help and keep them there is to offer high compensation).
Look at trappist brewers. Long tradition of consistent quality. You just have to devote your life to the ascetic pursuit of monkhood. It attracts a completely different kind of person.
It's certainly a provocative thought. But I think it's too blunt. In our commercial world sometimes the cheaper thing works better and sometimes the more expensive thing works better. So the lesson I take away is that price is not a great signal in the absence of other context. Trappist brewers have some other cultural norms going for them, and the focus should be on those norms rather than price. The people attracted to it aren't thinking much about the money. If you value them, why would you?
I think there’s often a misalignment of incentives when annual perf reviews are judged on feature work delivered not quality. Engineers who spend any time polishing or finding and fixing bugs wind up rated mid, while folks who quickly crank out mediocre or bad code that does something new are on a fast track for promotion. This creates a current in the whole org where PMs, engineers, managers, etc., are all focused on new things all the time. Any quality work has to accompany a new feature proposal for traction, and the quality items in that project will never get cross functional support like the new feature work does.
This resource allocation strategy seems rational though. We could consume all available resources endlessly polishing things and never get anything new shipped.
Honestly it seems like the another typical example of the “cost center” vs “revenue center” problem. How much should we spend on quality? It’s hard to tell up front. You don’t want to spend any more than the minimum to prevent whatever negative outcomes you think poor quality can cause. Is there any actual $ increase from building higher quality software than “acceptable”?
I would prefer to pay 20% more for product which works perfectly, than for product which almost works. But no one asked me.
You shouldn't, because that 20% more will not go to the engineers; it will go to parasitic management. Money is better if paid for services than for software. I would find well-maintained open source software instead, perhaps contribute to one, or put in the effort to develop one.
If I'm choosing between 2 products on the market, if I pay for the higher quality product, that's going to increase that company's revenue. That will either allow them to pay the existing engineers more, or hire more engineers. The lower quality product won't get my revenue, and thus won't be able to do that.
More revenue -> company grows
Less revenue -> company shrinks
Open source software has no revenue at all, yet it often is superior in many ways.
That's a different situation from what this conversation was about. The conversation was about if there are 2 for-profit products, one costing 20% more than the other, would paying for the more expensive one provide money to engineers or not.
Regarding the new conversation topic: some open source software does have revenue. Forms of revenue include: donations, selling support, selling licenses that are less restrictive than the original open source license, ads, and selling addons. Yes, revenue for open source software is generally less than for-profit software, and despite that the open source software is often higher quality. I didn't claim that a higher quality product will always have more revenue than a lower quality product. I just made a claim about where the money goes.
I think we're getting somewhere now. The solution could be to fund the developers of open source software, actively paying them to implement the desired features that otherwise don't get picked up (while maintaining good quality and their continued ownership over the software). Micromanaging them as employees wouldn't work.
I'm not sure this really matters at this point. It's like filming yourself giving food to the homeless. Is it better if you didn't? Yeah, probably. But at the end of the day does that person have food when otherwise they wouldn't? Also yeah.
I'd rather take a step in the right direction than none at all. If the management can be convinced that there's more money to be made this way then that gives us engineers more power to convince them to solve other such problems. If they care about quality then that gives us back negotiating power. You don't outsource to a third world software mill or AI when your concern is quality. But you do when you were trying to sell the cheapest piece of shit that people will still buy. So yeah, I'm okay with this
You can be okay with it, but it's not going to solve the problem. Management will fix the issue, then soon revert to enshittification and exploitation, so your next major issue will stay unfixed. In the best case, your software will become an annual subscription where you've to keep paying an obscene amount for no new features at all. Overall, it would be a step in the right direction, but only a single step.
> You don't outsource to a third world software mill or AI when your concern is quality.
That's a disastrously fallacious set of presuppositions. A good engineer will use AI well to improve their software, whereas a bad engineer will use it to produce junk.
I want to stress that this is a highly complex problem that needs to be solved and that means we need to break it down into smaller manageable tasks. You're not going to change everything overnight, a single person won't change things, nor will a single action change things. There's no clear definitive objective that needs to be solved to sole this problem. Nor is there a magic wizard in the tower that needs to be defeated.
In other words, I gave you my explanation for why I think this can be a step in the right direction (in a sister comment I said even more if you want to read that). But you have complained and given no alternative. Your only critique is that it does not solve the problem in one fell swoop. That was never an assumption I made, it is not a reasonable assumption to make (as you yourself are noting), and I explicitly said it is not an assumption being made. Do not invent problems to win an argument. All you've done is attempt to turn a conversation into an argument.
So don't stop after one step. Read more carefully. I did not say "use AI" I said "outsource to AI". There is a huge difference in these two things.Do we need to fight or can we actually have a discussion to help figure out this problem together? You do not need agree with me, contention can be beneficial to the process, but you do need to listen. I have no interest in fighting, so I'll leave the choice to you.
As a simple version think about it this way: if a customer can't tell the difference in quality at time of purchase then the only signal they have is price.
I think even here on HN if we're being honest with ourselves it's hard to tell quality prior to purchase. Let alone the average nontechnical person. It's crazy hard to evaluate software even hands on. How much effort you need you put in these days. The difficulty of differentiating sponsored "reviews" from legitimate ones. Even all the fake reviews or how Amazon allows changing a product and inheriting the reviews of the old product.
No one asks you because all the sellers rely too heavily on their metrics. It's not just AI people treat like black boxes, it's algorithms and metrics in general. But you can't use any of that effectively without context.
At engineers I think we should be a bit more grumpy. Our job is to find problems and fix them. Be grumpy to find them. Don't let the little things slip because even though a papercut isn't a big deal, a thousand is. Go in and fix bugs without being asked to. Push back against managers who don't understand. You're the technical expert, not them (even if they were once an engineer, those skills atrophy and you get disconnected from a system when you aren't actively working on it). Don't let them make you make arguments about some made up monetary value for a feature or a fix. It's managements job to worry about money and our job to worry about the product. There needs to be a healthy adversarial process here. When push comes to shove, we should prioritize the product over the profit while they should do the opposite. This contention is a feature, not a bug. Because if we always prioritize profits, well, that's a race to the bottom. It kills innovation. It asks "what's the shittiest cheapest thing we can sell but people will still buy". It enables selling hype rather than selling products. So please, be a grumpy engineer. It's in the best interest of the company. Maybe not for the quarter, but it is for the year and the decade. (You don't need to be an asshole or even fight with your boss. Simply raising concerns about foreseeable bugs can be a great place to start. Filling bug reports for errors you find too! Or bugs your friends and family find. Or even help draft them with people like on HN that raise concerns about a product your company works on. Doesn't need to be your specific team, but file the bug report for someone who can't)
And as the techies, we should hold high standards. Others rely on us for recommendations. We need to distill the nuances and communicate better with our nontechnical friends and family.
These won't solve everything but I believe they are actionable, do not require large asks, and can push some progress. Better something than nothing, otherwise there will be no quality boots to buy
https://en.wikipedia.org/wiki/Boots_theory
Exactly. It's an industry shift, and one person can't reverse it alone. But I disagree with "better something than nothing" when it comes to quality. That's how we normalized catastrophes in the first place. The lemon market problem you described is real—users can't evaluate quality, so price becomes the only signal. But engineers can evaluate quality. We're the ones who should refuse to ship garbage, even if management pushes back. Being grumpy works locally. It won't fix the industry, but it fixes your team. And when enough teams refuse to normalize this, the pattern shifts.
This just makes it sound like software engineering hasn't matured yet to realize we're building real world systems. It's still a pretty new field, comparatively, but big companies can't run like startups. They should have groups in them that look like that, but not for most groups
My view is fairly simple - demand for technology is always increasing at a rate which far outstrips supply of _good_ engineers (by a significant factor). The lure of a well paid career tempts many to the world of software engineering even if they're not very good at it.
Look at the construction industry. Many buildings on this planet were built hundreds, sometimes a thousand or more years ago. They still stand today as the quality of their build quality was excellent.
A house built today of cheap materials (i.e poor quality software engineers) as quickly as possible (i.e urgent business timelines) will fall apart in 30 years while older properties will continue to stand tall long after the "modern" house has crumbled.
These days software is often about being first to market with quality (and cough security) being a distant second priority.
However occasionally software does emerge as high quality and becomes a foundation for further software. Take Linux, FreeBSD and curl as examples of this. Their quality control is very high priority and time has proven this to be beneficial - for every user.
> Look at the construction industry. Many buildings on this planet were built hundreds, sometimes a thousand or more years ago. They still stand today as the quality of their build quality was excellent.
True. And yet, far more buildings built then are not standing. We just don't notice them, because they aren't still here for us to notice.
So don't think that things were built better then. A few were; most weren't.
It's a simple, timeless, inescapable law of the universe that failures, while potentially damaging, are acceptable risks. The Pareto principle suggests that addressing only the most critical 20% of issues issues yields a disproportionate 80% of the benefits, while the rest of the big bounties yield diminishing marginal returns.
We're seeing bugs in bigger slices because technology is, overall, a bigger pie. Full of bugs. The bigger the pie, the easier it is to eat around them.
Another principle at play might be "induced demand," most notoriously illustrated by widening highways, but might just as well apply to the widening of RAM.
Are we profligate consumers of our rareified, finite computing substrate? Perhaps, but the Maximum Power Transfer Theorem suggests that anything less than 50% waste heat would slow us down. What's the rush? That's above my pay grade.
I guess what I'm saying is that I don't see any sort of moral, procedural, or ideological decay at fault.
In my circles, QA is still very much a thing, only "shifted left" for tighter integration into CI/CD.
Edit: It's also worth reflecting on "The Mess We're In."[0] Approaches that avoid or mitigate the pitfalls common to writing software must be taught or rediscovered in every generation, or else wallow in the obscure quadrant of unknown-unknowns.
0. https://m.youtube.com/watch?v=lKXe3HUG2l4
>… are acceptable risks.
Close. Failure-free is simply impossible. And believing the opposite fails even harder and dies out.
This is not "acceptable", because there is no alternative, there is no choice or refutation (non-acceptance). It is a fact of life. Maybe even more so than gravity and mechanical friction.
Where I’m at, needless complexity is forced upon us. At the same time, we are constantly pushed to deliver new capabilities on timelines that are dictated to us, devoid of any grounding in reality. There is no room to even have the conversation about proper design or solving the right problems. It’s all about hitting arbitrary dates with “features” no one really cares about, while ignoring the foundation it all has to sit on.
The more loudly someone speaks up, the faster they are shown the door. As a result, most people keep their head down, pick their battles carefully, and try to keep their head above water so they can pay the rent.
I wouldn't work at a place that doesn't value my perspective either. The problem is that most engineers realize this 6-12 months in, after they're already invested, and leaving means starting over. This is why "keep your head down to pay rent" becomes the default. The system is designed to make speaking up too expensive.
You are in a completely normal dev shop. What's happened is that the start up mentality of "ship something - anything, and ship it NOW" has infected everything. Maybe over time you can make it better. But educating management can be a slow and frustrating process. Good luck!
I believe its a mix of three factors, (a) lack of transfer of institutional knowledge (b) lesser fundamental incentives for people to get better at fundamental skills/gaps (c) rise in hotfixes as we deal with time/scales that operate much faster, burn faster, and want to expand faster.
All of the above is multiplied 1.3x-1.5x with accelerating ways to get upto speed with iterative indexing of knowledge with llms. I believe we are reliant on those early engineers whose software took a while to build (like a marathon), and not short-sprinted recyclable software we keep shipping on it. The difference is not a lot of people want to be in those shoes (responsibility/comp tradeoffs.
Shoulda posted your link as a link instead of hiding it in text where we can’t click on it. If your blog post doesn’t stand on its own without an explanation you should rewrite it.
Thank you for your advice, but I’m new here and do not know how the things works here yet. And probably I’m not allowed to post links yet, because of new account
Ah welcome to HN. This particular etiquette is oddly not covered in the guidelines but instead in the FAQ:
> How do I make a link in a text submission?
> You can't. This is to prevent people from submitting a link with their comments in a privileged position at the top of the page. If you want to submit a link with comments, just submit the link, then add a regular comment.
https://news.ycombinator.com/newsfaq.html
"Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity." https://news.ycombinator.com/newsguidelines.html
But it is, and according to what I see in discussion, I’m not lonely at that feeling. And I’m really glad that I started this conversation
The first and only activity of your account, created today, is to promote your own blog post.
You said, "I’m not allowed to post links yet, because of new account". There's a reason for that, and you're trying to bypass that restriction by misusing Ask HN instead.
Oh, I see your point, I didn’t think from this angle
If you turn on the "showdead" flag on your profile you will get a lot of insight into how HN works. Particularly there are a lot of people who post a link to their blog and another link to their blog and yet another link to their blog and these always start out [dead].
If, on the on the other hand you regularly post links to other people's blogs and participate in discussion you won't have any trouble slipping in a link to your own blog. The key is you're expected to be part of the community.
Thank you , I appreciate your advice
Also, part of the etiquette is to only post when you actually have something to add to the discussion. If you just want to say "thank you", that's what the upvote button is for.
One possible factor is the proliferation of LLMs to write code for us. I noticed that a few versions after Jetbrains implemented LLM integration, the bugs in their products skyrocketed. Also, in my own job, it's often tempting to use LLMs where I really shouldn't, by which I mean where I can't easily check the code to ensure that it's fully and subtly correct. For example, if I'm working with an unfamiliar library I might ask an LLM what the incantation is to do something, and then check it by reading the docs of each thing that it references. I regularly find issues when doing so, but a different developer who didn't check these docs may miss this and ship a subtle bug.
It began way more earlier then llm era, but yes , with llm it became even worse
Speaking for myself, about my own software that I write alone: building new things is more exciting. Even with all financial incentives aside, I just like building new things more than I like polishing old ones.
I could improve the quality infrastructure, write more tests and clean up the code, but the work is not as fulfilling.
To me, a big reason is that people who don't understand how it works have a say in how it should be done.
It's not the case in traditional engineering fields: when you build a dam, the manager cannot say "hmm just use half as much concrete here, it will be faster and nobody will realise". Because people can go to jail for that. The engineers know that they need to make it safe, and the managers know that if the engineers say "it has to be like that for safety", then the manager just accepts it.
In software it's different: nobody is responsible for bad software. Millions of people need to buy a new smartphone because software needs twice as much RAM for no reason? Who cares? So the engineers will be pushed to make what is more profitable: often that's bad software, because users don't have a clue either.
Normal people understand the risks if we talk about a bridge or a dam collapsing. But privacy, security, efficiency, not having to buy a new smartphone every 2 years to load Slack? They have no clue about that. They just want to use what the others use, and they don't want to pay for software.
And when it's not that, it's downright enshittification: users don't have a choice anymore.
I agree there is this problem, which is why I try to write software that is actually good (and use software that is actually good, if it is available). It is also one reason why I still use some DOS programs (even with emulation, which is slow, it is still better than the bad quality they have in newer computers).
I do not use any of the software mentioned in that article, and I also do not have that much RAM in my computer.
This is the core philosophical divide in modern software development.
The "vibe coding MVP" crowd treats code as disposable—ship fast, validate the idea, and discard it if it doesn't work.
The problem: most MVPs never get thrown away. They get customers, then funding, then "we'll refactor later" becomes "we can't afford downtime to refactor."
That's how you end up with production systems built on prototypes that were never meant to handle real load.
I'm with you: if you're not willing to build it properly, don't build it at all. Technical debt compounds faster than most founders realize.
The Twitter mob defends vibe coding because it worked once for someone. However, for every success story, there are thousands of companies struggling with unfixable codebases.
Do it right or don't do it. There's no "we'll fix it later" in production.
Short answer: because lock-in disables competition, and cloud-based business models enable almost perfect lock-in.
Software quality only matters when users can switch.
I’m in the same position deep in Apple’s ecosystem. Switching means losing iMessage, AirDrop, years of purchases. The cost is $2,000+ or even more if combined all devices and dozens of hours.
And not only this, Jetbrains, which consumes tons of RAM, Chrome, and so on. When did we decide that resources are free?
Did JetBrains and Chrome apps leak too? Or were they just RAM-hungry and didn't leak?
No, I don’t think it’s leaking; it is just “normal” behavior. But why did that become normal? When did optimization stop being mandatory?
Perhaps computer resources are cheaper than developer time spent on optimizations.
Computer resources aren't cheaper, they're externalized (to users, environment, etc.)
This is definitely not true for the client-side software under discussion. Millions of devices requiring more resources and energy. The problem is that’s an externality to the developer.
> energy
My MacBook Pro M1 16" seems to be averaging about 13 watts of power, about the same as my previous i7. My house idles at around 200 watts (lots of smart devices, etc.). Hardly worth obsessing over.
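For scale, a rough annual cost of that draw (the electricity price is an assumption; ~$0.13/kWh is a typical US residential rate):

    watts = 13
    kwh_per_year = watts * 24 * 365 / 1000        # ~114 kWh
    print(f"~${kwh_per_year * 0.13:.0f}/year")    # ~$15 at the assumed rate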
They are not, if you consider all the instances of the bloatware running.
How many users does Spotify have? Multiply that by the 79GB mentioned above. Is it still cheaper?
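As a purely illustrative back-of-envelope calculation (the user count is an assumption; Spotify publicly reports an audience in the hundreds of millions):

    users = 600_000_000        # assumed, for illustration
    gb_per_user = 79           # the figure cited above
    total_eb = users * gb_per_user / 1e9   # gigabytes -> exabytes
    print(f"~{total_eb:.0f} exabytes across the user base")  # ~47 EB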
Irrelevant, because Spotify doesn't pay for, nor has to care about, users' resources.
If a user doesn't have enough RAM to use Spotify, Spotify doesn't care. That user canceling their service is lost in the normal churn. Spotify most likely has no idea, and doesn't care, whether resource wastage affects its customers. It isn't an immediate first-order impact on the bottom line, so it doesn't matter.
But let’s agree that you won’t be able to scale on raw resources forever. AI is already struggling because of a lack of resources.
This is antithetical to capitalism's founding principles. Resources (profit potential) will always increase unbounded forever. That's the only way the scam works
IDEs I think are a bit of a special case. Under the hood, it's constantly re-compiling and re-analyzing everything. That truly does take up a lot of memory and CPU. Probably not as much as if it were super aggressively optimized, sure, but still very heavy.
Full featured IDEs like this have always been heavy, as far as I know. It's only the pure-text editors without advanced full code analysis that can get away with low resources.
OK, fair enough on IDEs. But, for instance, I used the WebStorm IDE for years and sometimes struggled with memory issues. When I switched to Cursor, I stopped thinking about it entirely.
My work machine is so full of bloatware that its 64 GB of RAM is always full.
Exactly. I’m using an M3 Mac with 32 GB of RAM and constantly receive notifications from CleanMyMac that RAM is full.
Quality is not immediately economically useful in a way an average MBA would understand and be able to communicate to shareholders. Also, we have spent many decades wrapping layer after layer of complexity on top of the CPU. That is starting to show; nobody really understands what’s going on any more.
Exactly. That's why I wrote this—to confirm we're not crazy for noticing the pattern.
Spotify and YouTube Music on desktop are so bad (especially on CPU) that I vigilantly shut them down the second I'm not listening to music, which isn't something I've done with regard to computer resources since I had a 10" Atom netbook with 2 GB of RAM fifteen years ago.
I'm sure they're no better on my iPhone but I don't even have the appropriate tools to gauge it. Except that sometimes when I use them, another app I'm using closes and I lose my state.
There's no pressure to care. Most users can't tell that it's your app that's the lemon. The only reason I know anything about my Macbook is because I paid for iStatMenus to show me the CPU/RAM usage in the global menubar that can quickly show me the top 5 usage apps.
This basic info should be built into every computer and phone.
I love the subscription model for music because I have wild tastes that swing so much but I'm about done with it just because the players are too frustrating/horrible to use.
I think one big issue is that companies got massive. Apple's revenue, for instance, would make it the 40th-largest country in the world by GDP - larger than Portugal, Hungary, Greece, and the overwhelming majority of countries in existence. And it seems to be essentially a rule that, as companies reach a certain threshold of size, the quality of what they produce begins to trend downward until they're eventually replaced by an upstart, who then starts their own trek toward that same threshold of failure.
But in modern political and economic times, where the number must always go up, too big to fail is a thing and antitrust enforcement isn't (with the FTC mostly just ¯\_(ツ)_/¯ at basically any big-tech merger or acquisition), so the current batch of companies just keeps growing and growing instead of being naturally replaced. And much of startup culture now sees being acquired as the endgame, rather than even dreaming of competing against these monstrosities.
Look at Intel—an absolute monopoly for decades, everyone said they'd never fall. But they did. The government bailed them out with billions because they're "too important to fail." Apple is even bigger. When they start truly failing, governments won't let them collapse—too much infrastructure depends on them. But "not allowed to fail" doesn't mean "allowed to thrive." Intel is a zombie now—alive but not really competing. Apple will eventually become the same: massive, declining, propped up by governments, but no longer innovating. You're right—nobody falls anymore. They simply become permanently mediocre monopolies that extract rent while quality continues to degrade.
The 'upper' end of 'society' isn't holding itself to account. Who is left to look up to?
Old software also sucked.
I was going to say that, from my anecdotal experience, it hasn't collapsed much. macOS 15 basically doesn't crash, while Mac OS 7 on my first Mac crashed roughly daily.
I guess a lot depends on which software you focus on.
The majority of old software was arguably much worse. I agree the survivorship bias is strong.
Not really answering your question, but: one category of sloppy software that is, IMO, completely unnecessary is Electron apps. It's ridiculous how few resources are put into alternatives like Tauri, given that most desktop apps run on Electron and we know how bad it is.
Totally agree, lol. Especially when it's forced as the solution where it isn't necessary at all. Windows UI, for instance.
Oh dear, starting a conversation about software quality on HN.
Sadly, it won't fare well. You'll get a mix of flags and downvotes, along with "There's no problem! This is Fine!".
I feel that software has become vastly more complex, which increases what I call "trouble nodes." These are places where a branch, API junction, abstraction, etc., give space for bugs.
The vast complexity means that software does a lot more, but it also means that it is chock-full of trouble nodes, and that it needs to be tested a lot more rigorously than in the past.
Another huge problem is dependence on dependencies. Abstracting trouble nodes does not make them go away. It simply puts them into an area that we can't test properly and fix.
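A contrived sketch of that last point (my example, not any real codebase): wrap a dependency behind a convenience function and the trouble node is still there, just harder to see and test:

    import json

    def load_settings(path):
        """Convenience wrapper around a config-file dependency."""
        try:
            with open(path) as f:
                return json.load(f)
        except Exception:   # the trouble node: which failures land here?
            return {}       # missing file? corrupt JSON? permissions?

    # Every caller sees an empty dict either way, so the callers' tests
    # can all pass while the real failure (and its cause) is hidden in
    # a layer we never exercise directly.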
OK, I agree with most of what you're saying, but most of those issues can be found with proper testing. For instance, Steve Jobs-era Apple was much more focused on quality and attention to detail. Under Tim Cook, Apple shifted from “it just works” to “ship fast, fix in updates.”
The difference isn’t complexity; it’s priorities. Jobs-era Apple had smaller teams building fewer products with obsessive quality standards. Cook-era Apple has massive teams shipping constantly with “good enough” as the bar.
You’re right that testing helps. But when quality becomes optional, no amount of testing infrastructure fixes the cultural problem. We test for “does it work?” not “is it excellent?”
These issues passed all automated tests. They just didn’t pass the “would we be embarrassed to ship this?” test. That test doesn’t exist anymore at scale.
I’m not a huge fan of automated-only testing, especially for GUIs and device control.
I tend to prefer test harnesses: https://littlegreenviper.com/various/testing-harness-vs-unit...
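A toy contrast, sketched by me rather than taken from the linked article, with a hypothetical Volume widget standing in for a real GUI component:

    class Volume:
        """Hypothetical widget standing in for a GUI control."""
        def __init__(self):
            self.level = 5
        def up(self):
            self.level = min(10, self.level + 1)
        def down(self):
            self.level = max(0, self.level - 1)

    # Unit-test style: one assertion, one code path.
    def test_up():
        v = Volume()
        v.up()
        assert v.level == 6

    # Harness style: drive the component through a scripted interaction
    # and record everything for a human (or later tooling) to inspect.
    def run_harness(script):
        v = Volume()
        for step in script:
            getattr(v, step)()
            print(f"{step:>4} -> level={v.level}")

    if __name__ == "__main__":
        run_harness(["up", "up", "down", "up", "down", "down", "down"])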
There are far, far too many sloppy devs who slip through the cracks. They never get the mentoring they need, nor are they shown the door.
All sense of teamwork was murdered about a decade ago by people with clipboards and other dead weight staff who don't give a rat's ass about anything.
Most devs under 30 don't have the same enthusiasm previous generations did because the opportunity being proposed just isn't the same. The room for creativity isn't there, and neither is the financial reward. Do more with less and these problems tend to go away.
Well, Agile said that we don’t need testers (because everybody owns quality and slogans are magic). DevOps said we don’t have time for testers (because we reaaaally feel like shipping now). AI people said AI will do all the testing (because of course they did).
Nobody likes thinking critically and admitting that they haven’t achieved a responsible standard of care. If they aren’t forced to do it, why bother?
I might be going further than most, but my personal take is that it happened when Woz stopped being involved: https://lists.sr.ht/~vdupras/duskos-discuss/%3CZ4p_GHsw5arWG...
The rest is just a downhill trend.
We keep buying the stuff.
It's not; software quality is better than ever, far more sophisticated than the simple programs of the past.
You must not have actually experienced computing in the past.
In the dark, distant past, we wrote programs that ran in kilobytes of memory on a double-digit-MHz CPU. Multiple cores or threads did not exist.
Today, the same program requires gigabytes of RAM and takes multiple seconds to do the same work with 32 4GHz CPUs.
This is truly not an exaggeration. Everyone who actually handled a Windows 95 machine in its natural environment will tell you that the experience of using a computer today is ten times slower and forty times more frustrating. Computers are slower than they ever have been, despite having hardware that is fast beyond the limits of anything we even dared to dream of in the 90s.
I'm willing to bet you're simply not exposed to bleeding-edge tech, and thus don't understand the need for it.
It's definitely overused in certain circumstances, when you could just roll out a monolithic code base on a single server, but in many cases now, systems get built that were impossible to build in the past.
Yes, that is exactly how I feel
You are claiming sophistication is quality; it is not.