...anyway, that rate of change in software innovation seems to me to have slowed drastically about 20 years ago. There has been no real innovation on the desktop. In mobile there was one great leap, the widely adopted touch-optimised UI that arrived with the iPhone, but after that? Nothing really.
Have computers (desktop or mobile) *really* become 50 times more useful, or even 5 times more useful?
I cannot think of a single function on an app that I regularly use now that I didn't have 20 years ago on a desktop or 10 years ago on a mobile.
I really think that, given enough time and resources, a "clean sheet redesign" of our software ecosystems would give us truly useful computing experiences at 10% of the clock speed and maybe 5% of the RAM. It would certainly be more affordable and efficient.
@msh Totally agree with this line of thought. Almost all of what we all do regularly could be done with a fraction of the resources if the software were just not so bloated and unoptimized. I mean I can accomplish 90% of the stuff I do on a PocketCHIP in the terminal. That's probably an extreme that most wouldn't want to move to but there is certainly an achievable middle ground much closer to that than needing an i7 and 16GB RAM to do word processing.
@kelbot I think the majority of computing resources used today go to supporting rapid development infrastructure rather than actual useful functionality or user experience (even not counting cryptocurrency proof of "work", which is yet another tragedy).
We used to rely on hardware+BIOS+OS/drivers to provide enough abstraction for compatibility. Now we layer virtual machines, containers and frameworks on top of that. All of these layers larded on top of each other to do what, exactly? It certainly doesn't make the app more functional or easier to use.
Servers are all "clouded up". Desktops have everything from Java VMs to the .NET Framework to Electron, plus apps distributed in self-contained packages like Snap and Flatpak--many of which may bundle the same dependencies but ship their own copies.
All of this cruft exists to paper over developer issues like dependency hell...NONE of it makes apps more useful to end users.
That said, containers and virtual machines have astoundingly low overhead. We might be able to make chips ~10% faster if we cut out all the hardware virtualization support, but in actual use the cost of those layers is amazingly small.
We really should try to share resources better, and stop taking the easy route. I had a long, long thread on this a week ago.
As someone who had a not-so-great Internet connection till a year ago, I can definitely see that websites have become unreasonably bloated too. I don't know *what* they're loading, but a lot more MBs go into rendering even a simple web page. So I can well imagine this happening for desktop software as well!
@msh If we didn’t need to keep upgrading then we could just keep our devices longer and that just wouldn’t do!
@TinBee Planned obsolescence does play a major role right alongside the "move fast and break things" ethos. Gotta make New Shinies and Make Them Fast. The lack of thoughtfulness massively drives inefficiency.
People are catching on though, and I think things have gotten *so* bloated that we are reaching a limit where software would just collapse if it kept going. My servers are 9 years old and I still use them, and I have a 12-year-old laptop that is still usable save for a battery with no capacity left. People would get twice the life or more out of mobiles if they weren't forced to upgrade their locked-down hardware to get security updates for the bloated, buggy crap they have to run.
But we can still do better...FAR better.
@msh I've been musing about how Apple's enormous success with the iPod/iMac marketing model has kind of doomed the company (and by extension the rest of the tech industry because everyone else decided to try the same thing)
The tipping point for me was around macOS Lion, or maybe Mountain Lion--they reached a point where OS X was a solid, mature, well-functioning operating system, but they locked themselves into this pattern of doing a big marketing song and dance every year.
@msh And so they keep having to come up with more nonsense to put on the WWDC slides every year, whether it helps the operating system or not, and keep making change for change's sake--while the regular and infuriating bug where unlocking the computer sometimes doesn't kill the screen saver has been hanging around for literally 10 years now.
Microsoft have fallen into the same trap post-Windows 7 (maybe 8, depending on whether or not one likes Metro).
@msh and smartphones are just repeating that cycle, except it's not just the software but the hardware too
@msh I can create a high-quality 300 page PDF from Org-Mode within a few minutes.
My home server gives me a direct comparison to 10 years ago: with its lower performance, I could not do the same.
Ripgrep can search 150k lines of code interactively (thanks to SSDs and Rust).
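For contrast, here is roughly the job a code-search tool does, as a minimal single-threaded Python sketch (this is nothing like ripgrep's actual parallel, SIMD-accelerated implementation, and the function name and pattern are purely illustrative):

```python
import os
import re

def search_tree(root, pattern):
    """Naive recursive grep: yield (path, line_no, line) for every match."""
    rx = re.compile(pattern)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    for i, line in enumerate(f, 1):
                        if rx.search(line):
                            yield path, i, line.rstrip()
            except OSError:
                pass  # unreadable file; skip it
```

ripgrep gets its interactive speed on 150k+ lines by doing this with parallel directory traversal, memory-mapped reads and .gitignore filtering, which this sketch omits.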
@msh However, I normally restrict my CPUs to 1.5GHz, because that cuts power consumption from 120W to 30W. I only need more when compiling or during software development (otherwise the codebase at work is too big to work with efficiently).
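On Linux, that kind of frequency cap can be set with the cpupower utility; a sketch, assuming the linux-tools package is installed and you have root (the exact frequencies are the poster's numbers, and your hardware's supported steps will differ):

```shell
# Cap the maximum CPU frequency at 1.5GHz to save power
sudo cpupower frequency-set --max 1.5GHz

# Check the currently active limits and governor
cpupower frequency-info

# Lift the cap again before a big compile (use your CPU's real maximum)
sudo cpupower frequency-set --max 4.0GHz
```

This is system configuration rather than a program, so treat it as a starting point and check `cpupower frequency-info` for the limits your driver actually exposes.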
@msh I mean, sure, I guess.
There isn't a system in existence that wouldn't benefit from a redesign with the benefit of hindsight.
But since we do have all this compute power and RAM....what do we gain exactly by a complete redesign?
Software grows to take over all available resources, just like people's expenses and expenditures tend to grow to match their paychecks.
I'm all for better software. But I'm not sure smaller is the virtue I'm looking for.
Executing a filter on a photo wasn’t something you could do live with a slider. It was something you would hit OK on and then wait several minutes.
Windows didn’t really have built-in search worth using; pervasive desktop search only appeared in the late 2000s.
GUIs lacked compositing. Resolutions were lower, so you could fit far less on your screen.
Distributed version control was still fledgling.
Copying photos with USB 1.1 also wasn’t fun. And forget about any automatic kind of geotagging.
SSDs were also a massive leap.
Laptops were about thrice as thick and heavy.
I think what the most experienced s/w devs are bemoaning here is that we know that underneath our code are GBs of complexity that contribute bupkis to what we are trying to do in most of our functions.
The riffs on how WP 5.1 running on our DOS boxes formatted our README docs just fine thank you very much are just the cheese with our whine. 🧀 🍷 😀
@puzzled @chucker @mood yeah there is some of that I guess, given I seldom do things like render 300 page PDFs or photo editing. But I'm not really against powerful hardware as such--what irks me the most is the *incredible* inefficiency. I still think that 80% of what we do on computing devices could be done with 20% of the hardware (maybe less).
We are really at the point where a dollar-store pocket computer device could be capable of browsing, email, simple documents etc. but there is so. much. crap. all the way down.
I also think that people forget that 20 years ago wasn't all that bad. I'm not talking about 40 years ago here. The "wait several minutes" claim, for example, is a bit of an exaggeration. In the late 1990s I was applying image-processing filters and they only took a few seconds on 24-bit 1-megapixel images, and MS Office 2000 performed similarly to today's most recent version and is still capable enough for what I use office apps for today.
@msh @puzzled @mood I did exaggerate, especially for 2001, but I don’t think the way I can just freely move sliders around and have that apply to a 10bpc 12MP image, no sweat, with compression already applied, oh and reversible at any time because the effects are metadata rather than hard-burned was feasible then.
You’re right about Word. I just fought it today, and some of its sluggishness is puzzling.
@msh @puzzled @mood in defense of Word (LOL), though: its collaborative features have improved vastly in recent years, and again, that’s something that was really more of a tech demo / pipe dream 20 years ago. Today, it’s an actual viable mass-market feature.
I, despite being a dev, was positively surprised how well video conferencing worked last year (y’know… 🦠). Part of that is iteration, but I bet much increased baseline specs help.
@msh I used to have a list of features that I would kill to have. Got them all years ago ;)
Now, better battery life (back to the future replaceable batteries?) and maybe a better camera.
@msh I would say that in some ways computing has even gone backward, because so much of what we do now involves "treadmills" of various kinds that at best waste our time and brainpower but more likely make us feel anxious and sad. I once told my 9 year old that the whole point of marketing is to make you feel dissatisfied or inadequate because you don't have whatever thing they're trying to sell you.
@msh I think the root cause of everything that's wrong with computing is IP law. IP law turned computers from a tool the user was expected to program into an appliance whose main purpose is to run commercial "applications". Everything since then has just been further evolution along that line, to the point that most devices are bundled with a single "app store" that it's difficult or impossible to avoid.
@msh I think that if it were difficult or impossible to make money selling bits because they were too easy to copy, people would still find plenty of ways to make money off of computers. But the experience of computing would be very, very different than it is today.