Eduardolucas1 wrote: ↑2024-04-25, 10:30
This answer may sound condescending to those who take it in a light it was not intended in, so I will try to be careful:
The reason Windows, Linux, and their interfaces have gone to the lowest common denominator with regard to UI design, architecture-level decisions, OS design, userland decisions, and code comes down to:
1. Cutting costs to improve margins, and financialization
2. Attracting a public that suffers from a worldwide decline in public education
I have a slightly different perspective on this, but I won't discount yours entirely. Here's what I see as the two main problems.
1. Security is touted as a feature rather than recognized as something that introduces inconvenience and degrades the user experience. Most of what gets in a hacker's way also gets in the average user's way. "Security" is just a nicer-sounding word for "not letting people do stuff," and people don't like being told no, whether they're users or hackers, so security pisses people off more often than it makes them happy. I'm talking about things like "reducing the attack surface" by removing code for less commonly used features, paring down the number of ways to do something so everyone is forced through one path regardless of what they're used to, and stripping out any functionality that could help a potential attacker, even when plenty of people found it helpful or convenient, whenever the risk is deemed too great. Then there's the extreme stuff, like effectively gutting speculative execution in processors and regressing to 2000s-era designs because of vulnerabilities like Spectre.

Overall, the days of prioritizing user experience seem to be over; the demands of OEMs, governments, enterprise, and ISPs now take precedence in how computers and operating systems are designed. ISPs don't want hacked computers on their networks, governments worry about cyberattacks, OEMs want to control which features their hardware exposes to improve product differentiation, and enterprise wants to cut IT costs and make maintenance simpler and possible with less-skilled personnel. So it no longer matters what you, the user, want. It's all about what the bean counters and risk-assessment experts at big institutions, the ones who feel the impact of bad user decisions, want, and they want everyone's choices limited so they have fewer variables to deal with and their jobs are easier. They're slowly moving us toward a world of slow, laggy processors that need heavy parallelism to be remotely useful, and locked-down, limited, always-online applications with fixed user interfaces that can barely do anything beyond the bare minimum. In other words, everything is becoming a smartphone, from your vending machine, to your ATM, to your desktop computer.
2. I see the other part of the problem as the growing desire to reach people in developing countries with low literacy rates who have never used a computer before. Vendors can't rely on words in their interfaces, because a lot of the people they want using their products, the untapped market, come from the poorest countries, where many potential customers can't read well, and it's easier to make everything as visual as possible to cater to people with no computer experience and limited literacy. So everything is done with overly simplified, culturally non-specific graphics, which cuts translation costs and lets toddlers and first-time users alike operate the product. In other words, I don't think public education in first-world countries has declined by quite that much, at least not relative to society as a whole. The real problem is that the primary untapped market for new smartphones and PCs is in third-world countries, and it's easier to dumb down the system than to make potential new customers more educated. Meanwhile, the existing markets are saturated, competition is dead, and we mostly have no choice but to sit around and take it while vendors focus on the needs of people who aren't us.
So basically, it's a confluence of three things: the people who control infrastructure steering operating system design in a direction users don't like; hackers getting too smart and forcing everyone's hand; and first-world countries being taken for granted as a market while the technology is dumbed down so it can be sold more easily in third-world countries rife with illiteracy. The incentives to make software that users like are mostly gone, because the money now comes from interests with very different goals than the average user, like protecting investments or giving the least literate, least experienced people in the world a passable computing experience. Earlier in computing history there was far more competition, fewer established interests getting in the way, and room for the market to grow among people who were excited about computers and wanted to learn more about them. Now most of the growth comes from people with little literacy or computing background, and every further gain in market share comes at the expense of making the product less suitable for average or advanced users.
I don't know if my analysis is 100% correct, but this is how I see it.
"The Athenians, however, represent the unity of these opposites; in them, mind or spirit has emerged from the Theban subjectivity without losing itself in the Spartan objectivity of ethical life. With the Athenians, the rights of the State and of the individual found as perfect a union as was possible at all at the level of the Greek spirit." -- Hegel's philosophy of Mind