Hacker News

Perhaps I'm misunderstanding you, but what you're saying doesn't match up with my experience. Pretty much any computer built in the last decade has been fast enough to do everything I need without making me wait. Screen resolution has improved somewhat, lately, but new machines feel just like old ones when it comes to performance, and that definitely was not the case from 1985-2005.


Computers have reached the same state as cars and household appliances: nobody needs an 800 hp car. Having 200+ is nice and fun, but most people can probably get by with 150 for daily use. A model from 10 years ago will fulfill the same needs as one from today; the new one might be a bit more efficient, quiet, and comfortable, but in essence they both take you from point A to point B in the same time. </inevitable car analogy>


In point of fact I'm using a computer from ~2002ish, and it works more or less fine. The biggest issue is websites that pull in like 1000 external resources and constantly run scripts.


Yeah, that's been my experience too. The NoScript plugin helps; I'm generally not interested in dealing with complicated websites full of scripts anyway, and prefer to opt-in rather than leave everything running by default.


A machine from 10 years ago will run software developed 10 years ago with perfectly adequate performance -- probably true even prior to 2005 as well as any reasonable point in the near future. But the bit I quoted is breaking out of that completely, basically claiming that we're at some sort of hockey stick curve where that software-hardware vintage parity won't be required anymore.


Well, yes, that's exactly what I was saying: we reached that point on the curve somewhere close to a decade ago, and old machines are perfectly capable of doing everything I need them to do now, today, with current software.

The changes which have actually made a difference to my computing experience over the last decade were the arrival of affordable SSDs and a jump upward in LCD pixel density.


I certainly agree that we can do most of the same things if our needs don't change, but usually our needs do change. Even if the software isn't simply more bloated than it used to be, we usually have new needs: say, creating or consuming something in a crazy new format. I guess we've pretty much maximized the number of pixels that fit on a hand/pocket-sized screen, but if filling our entire living-room wall with 16K video becomes the norm someday, and we're showing off a video we just took on our pocket/wrist device, we'll need serious horsepower. This example is hyperbole, but the point is that, for the most part, I find it hard to believe we won't find a way to use serious increases in processing power whenever they're available.


Well, sure we CAN use it, but what he's saying is that we no longer notice. 15 years ago, ads for new computers and components had kids with their hair slicked back because of how BLAZING FAST they were going, because that's how it felt when you upgraded. Now, it's more like 'oh hey, it doesn't run like a slug anymore because this device is brand new'. People don't get new laptops because they need a faster device nowadays; they do it because they actually wore out the old one. The meme that computers are obsolete within 6 months of purchase is no longer valid: they're good pretty much until the physical components break down. Most people won't replace their smartphone until the screen, charging port, or digitizer breaks (barring the bleeding-edge crowd, which holds less ground these days).

Everything is just...adequate. People no longer log on to an older machine and lament how slow it is, they don't balk at how huge the new hard drives are, and consumer-grade disk speeds haven't gone up since 7200 rpm drives were introduced. Video hardware is pretty much the only technology in most computers that still has reason to march on, because game developers can always add higher-resolution textures and cooler shaders.


Consumer-grade disk speeds have gone up a ton since 7200 rpm disks with the introduction of SSDs, which make a huge difference.

Furthermore, while people aren't really after bigger screens, getting to a point where a laptop or desktop has similar DPI to a smartphone would require vastly higher resolutions for which there isn't the technology yet.


Until we gladly accept brute force algorithms as usable solutions, I don't think I can agree with the sentiment.

Almost nothing I want to do on computers runs fast enough for me, and I'm too stupid to invent good algorithms. But I haven't given up on solving hard problems, so naturally I will look around and see machines incapable of even getting started.
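To make the point concrete, here's a minimal sketch (my own illustration, not from the thread) of a brute-force algorithm whose cost explodes with input size. The problem (subset sum) and the function name are just examples; the takeaway is that an O(2^n) search is "usable" only for tiny inputs, and no plausible hardware improvement changes that:

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Exhaustive search: try every subset of nums. O(2^n) subsets."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return combo
    return None

# 20 items -> ~1 million subsets: instant on any recent machine.
# 60 items -> ~10^18 subsets: the machine can't even get started,
# and a 10x faster CPU doesn't meaningfully move that boundary.
print(subset_sum_bruteforce([3, 9, 8, 4, 5, 7], 15))  # -> (8, 7)
```

This is why "adequate" hardware doesn't settle the question: for problems without a clever algorithm, available compute is effectively always the bottleneck.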




