If you’ve used pretty much any consumer software or internet search engine in recent years, you’ve noticed a decline in user friendliness and utility. What was once a simple matter of navigating intuitive interfaces and obtaining relevant information has become a frustrating ordeal.
From constant updates that seem to break more than they fix to search engines returning an avalanche of ads instead of relevant answers, the slow death of corporate software has left many asking what went wrong.
Let’s dig into the mystery of tech companies’ baffling self-sabotage and try to unearth some answers.
One of the primary culprits is the obsession with change for sheer novelty’s sake. Tech companies, in their desire to stay ahead, often push changes that don’t improve quality of life for users. Instead, these updates are driven by worship of innovation or attempts to chase trends with no thought for whether or not they benefit users.
The result? Features that once worked smoothly are altered beyond recognition or removed entirely, sowing confusion that frustrates even seasoned users.
How often has a simple software function that’s become second nature to use been saddled with needless complexity in the name of “improvement”? Time and again, familiar, reliable functions are buried under pointless clutter. And more often than not, these changes are superficial, lacking any real benefit. Nonetheless, you are forced to re-learn tasks you’d once mastered. Rather than making your job easier, these updates introduce more friction into everyday use, and for no good reason.
Another major factor in the decline of user-friendly software is the loss of institutional knowledge. Many of the original programmers who built the foundations of today’s software and search engines have retired or moved on without passing down their expertise to the next generation. The engineers who created pioneering systems had intimate knowledge of the architecture and purpose behind their design choices. They focused on utility and usability because they understood the needs of the user.
As newer developers take over, this crucial knowledge is increasingly lost. Without a proper transfer of insight and context, software becomes prone to inefficiencies and errors. Newer developers, lacking a deep understanding of why certain design decisions were made in the first place, may inadvertently sabotage important functions. This erosion of expertise results in bloated, less efficient software that breaks more often and becomes a nightmare to use.
But that’s not even getting into search engines.
Once go-to sources for information, internet pillars like Google, Yahoo, and Amazon have fallen victim to declining utility, thanks in large part to the glut of paid advertising.
Google’s search results, for example, have become overrun with ads, sponsored links, and SEO-driven results that push the answers you actually want farther down the page. What used to be a straightforward search for information now requires sifting through several layers of irrelevant or sales-driven noise.
This problem extends beyond search engines to platforms like Amazon, where users often find that the top-listed products aren’t the best or most relevant, but those from sellers who’ve paid the most to be there. The overwhelming presence of ads has warped search algorithms, undermining their original purpose of bringing users the most useful and accurate information with speed and efficiency.
And it would be professional negligence not to point out the digital elephant in the virtual room: big tech censorship.
Because somewhere along the line, software companies started by nerds in garages appointed themselves the Internet Police. Under the guise of protecting users from misleading information, the billionaires in charge of these megacorps have launched a crusade to suppress ideas they disagree with. And since their insular worldview clashes so brazenly with reality, the results are ever clumsier and weirder. Take the recent fracas over Google’s Gemini A.I.
With search engines becoming less reliable by the second and software growing ever more unwieldy, people are seeking alternatives. Perhaps surprisingly, large language models like ChatGPT are gaining popularity. The rising preference for consulting LLMs, despite the much-publicized Google Gemini debacle, testifies to just how ad-cluttered and unusable old-style search engines have become.
Contra the popular predictions of doom, we’re seeing rapid improvement in A.I. models akin to Google’s boom phase twenty years ago. If internet users’ past behavior is any indication, the shift toward LLMs will not only continue but grow. As search engines hemorrhage user confidence, and software disappears farther into its own navel, A.I. will increasingly draw users away. And while valid criticism has been lodged against the accuracy of A.I., in the final analysis people want tools that make their lives easier. The lessons of DoorDash, on-demand streaming, and online shopping dictate that convenience beats quality.
The decline of consumer software and search engines isn’t a passing phase. It’s part of a broader trend fueled by unnecessary changes, the loss of foundational knowledge, the pervasiveness of paid advertising, and tech censorship. As users face mounting frustrations, they are seeking out alternatives—most notably large language models—to find easier answers. No matter that the easy way and the right way aren’t always the same.