
A basic phone blows away the early computers. 40 MHz single-core CPUs (SPARC, MIPS, 80486...) used to do a lot of work and feel fast. What has changed is bloat.


We also seem to have picked up a few features along the way: rendering screen resolutions beyond 640x480, network speeds above 9600 baud, video playback, displaying images that would each fill one of the hard drives of that age, video and music editing, running programs with feature sets that were unthinkable back then. Clearly, inefficiencies have crept in, but it's not as if today's software weren't vastly more capable than what we had at the time.


We had much better specs than this available at an aerospace company in the mid 90s, not to mention LAN storage and direct T1 to the internet.


LAN storage at 10 Mbps? A T1 is only 1.544 Mbps. Those are incredibly slow by today's standards.


We often had 100BASE-T and it was quite snappy for text-heavy comms and reasonably sized images from the net. 9600 baud was already faster than you can read; this was orders of magnitude faster still. Quake deathmatch was incredible, I remember getting almost twenty folks on a server once. :D

At some point I procured a 1600x1200 monitor… but it wasn't until a later job, after 2000, with an SGI that my computer could handle that resolution comfortably.


> was already faster than you can read,

You make an interesting point: this *is* a sort of breakthrough point: the moment when "average human textual input bandwidth" was matched ...

(This might be comparable to AI: now we are trying to match "average human processing capacity" ...)
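
Rough back-of-the-envelope numbers for that crossover, as a quick Python sketch (all figures are my assumptions, not from the thread: ~250 words per minute reading speed, ~6 characters per word including the space, 10 bits per character on a serial link):

    # Human reading bandwidth vs. a 9600 baud modem (all figures are assumptions)
    words_per_min = 250        # comfortable reading speed
    chars_per_word = 6         # average word length, counting the space
    bits_per_char = 10         # 8 data bits + start/stop bits on a serial line

    reading_bps = words_per_min / 60 * chars_per_word * bits_per_char
    modem_bps = 9600

    print(f"reading:   ~{reading_bps:.0f} bit/s")           # ~250 bit/s
    print(f"9600 baud:  {modem_bps} bit/s")
    print(f"ratio:     ~{modem_bps / reading_bps:.0f}x")     # ~38x faster than reading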


Yep, I think my first modem was 2400 baud and even that was slightly faster than I could read.

Sluggish modern websites have no one to blame but themselves.


I bet those cost more than today’s cellphones…


Couple thousand per, in the 90s.


> images that each would fill one of the hard drives of that age

This really puts things in perspective ...
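
A quick Python sketch of that perspective (the numbers are my assumptions, not from the thread: a ~5 MB phone JPEG and a ~40 MB consumer drive circa 1990):

    # One modern photo vs. an early-90s hard drive (assumed sizes)
    photo_mb = 5           # a single JPEG from a current phone camera
    old_drive_mb = 40      # typical consumer hard drive around 1990

    print(f"one photo = {photo_mb / old_drive_mb:.1%} of the whole drive")  # 12.5%

    # a raw, uncompressed 1920x1080 24-bit frame is ~6 MB all by itself
    print(f"raw 1080p frame: {1920 * 1080 * 3 / 1e6:.1f} MB")               # ~6.2 MB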


So much this. I could edit documents on my first 8088 PC with 512K memory. And people wrote novels on computers like this.


People also used to write novels with pen and paper, later typewriters, then word processors. What we have now is far, far better in every conceivable way.

I think nostalgia gives us rose-colored glasses; if you put these things side by side, you'd never actually go back.


No, what changed is our expectation of what such a device should be capable of doing. You're not gonna load a 1080p YouTube video on 400 MHz.


The parent wrote 40 MHz. You could probably do 1080p on 400 MHz if you're clever.


We could and did. DVDs played on 400 MHz machines.


DVDs are 480p


Damn. You're right.

They felt like 1080p at the time.

That said, I could have and should have said Blu-ray, which came out in 2006 and did do 720p and 1080p.
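
For scale, a quick Python sketch of the pixel rates involved (assuming NTSC DVD resolution and ~30 fps for both; real decode cost also depends heavily on the codec, MPEG-2 for DVD vs. H.264/VC-1 for Blu-ray):

    # Pixel throughput: DVD (NTSC, 480p) vs. Blu-ray (1080p), assuming 30 fps
    fps = 30
    dvd = 720 * 480            # 345,600 pixels per frame
    bluray = 1920 * 1080       # 2,073,600 pixels per frame

    print(f"DVD:     {dvd * fps / 1e6:.1f} Mpixel/s")      # ~10.4
    print(f"Blu-ray: {bluray * fps / 1e6:.1f} Mpixel/s")   # ~62.2
    print(f"ratio:   {bluray / dvd:.0f}x more pixels per frame")  # 6x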


Some would say 1080p is bloat when you can watch videos perfectly fine in lower resolution ;)


I remember when I got a 1080p monitor and watched some slightly old content (~480p) on it. The experience was very lackluster. Now I'm getting a similar feeling with a recently-bought 4k-capable laptop, and watching 720p content that looked perfectly fine on the 1080p monitor. I don't even want to think about the 480p archives. Almost makes me not want to upgrade devices.


(Can totally see some AI-based solutions performing some heavy "interpolation" to "upsample" content in real time ...)



I would, perhaps ...

PS. I quite distinctly remember the point in time when I finally admitted my then-desktop machine couldn't handle video, and it really was time to consider an upgrade ...

(My 500 MHz K6-2 - DAE remember those AMD chips? - had finally become too slow. Video was 'unstoppable' ...)




