This is still cool, but has anything been updated lately? I don't see anything newer than the Atari 2600 simulation, but I also can't get the "Recent Changes" page on the wiki to show anything, so I might be missing something.
One of the most interesting parts I found was that it wasn't created from a spec, but by scanning and etching an actual chip. That's useful work for all the chips without complete documentation.
The 6502 only had a few layers of metal connections. I'm not sure how well the etching would work on modern processors (though in theory of course it should be possible).
I found this while searching for any kind of material that could help me understand how computers work, in the sense of how the flow of electrons becomes stuff on my screen. It's a question that neither my academic nor my personal studies have answered, so any suggestions on what to read/watch/study would be more than appreciated!
It really depends on how deep you want to go. This is a topic that requires tunneling through surprisingly few layers before you hit quantum mechanics. I haven't gone through it, but "From NAND to Tetris" [0] is often recommended these days. My own education in these topics was a lot less organized and took quite a few years to really click.
Thinking about "the flow of electrons" is more of an analog topic, and that class of behavior is usually treated as non-ideal inefficiency for purposes of digital logic systems. This stuff is extremely important in figuring out how fast you can make things go when designing a real chip/board, but the gory details are fairly unimportant to computer architecture concepts beyond accepting the more abstract models of gate delay (each gate a signal passes through delays it) [1] and fanout (the more inputs you drive with a given output, the slower those inputs are switched) [2].
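The two abstract models mentioned above can be captured in a few lines. This is a toy sketch with made-up numbers (the per-gate and per-load delays are illustrative assumptions, not values from any real process), just to show how gate delay accumulates along a path and how fanout makes it worse:

```python
# Toy timing model: each gate adds a fixed intrinsic delay, plus extra delay
# proportional to how many downstream inputs its output has to drive (fanout).
# Both constants are hypothetical, for illustration only.

BASE_DELAY_PS = 50   # assumed intrinsic delay per gate, in picoseconds
PER_LOAD_PS = 20     # assumed extra delay per driven input

def gate_delay(fanout: int) -> int:
    """Delay of one gate whose output drives `fanout` inputs."""
    return BASE_DELAY_PS + PER_LOAD_PS * fanout

def path_delay(fanouts: list[int]) -> int:
    """Total delay along a chain of gates, one fanout value per gate."""
    return sum(gate_delay(f) for f in fanouts)

# A signal passing through 4 gates, each driving a single input:
print(path_delay([1, 1, 1, 1]))   # 4 * (50 + 20) = 280 ps

# Same chain, but the last gate fans out to 8 inputs: the whole path slows down.
print(path_delay([1, 1, 1, 8]))   # 3 * 70 + (50 + 8 * 20) = 420 ps
```

Real static timing analysis is far more involved (RC wire delay, slew, per-edge arcs), but for computer architecture purposes this linear picture is usually enough.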
I wouldn't call my "education" on the topic organized either; I wouldn't really call it education at all, to be honest. "From NAND to Tetris" sounded like what I wanted, but once I skimmed through the material, it looks like it doesn't strip away the abstractions as far down as I want to go.
First time I've seen this. I understand jack about it, but it's really amazing. On one side it seems not too different from an abacus (it's just a machine!); on the other, there's a direct connection to the processor I'm using right now. Very cool.
Digital electronics means thinking of a transistor as a switch. It's got three terminals: if there's a positive voltage on the control terminal, then the other two are connected and current can flow through it. That's basically it.
If you connect billions of switches in the right pattern you get a computer. Billions of little switches clicking on and off, no more, no less. No pixies or magic dust involved.
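To make the "switches all the way down" idea concrete, here's a toy switch-level sketch in Python. The pull-up is implicit and the function names are mine, so treat it as an illustration of the idea rather than a faithful circuit model:

```python
# Toy model of an NMOS-style transistor as a switch: the control terminal
# decides whether the other two terminals conduct.

def transistor(gate: bool) -> bool:
    """True means the switch is closed (it conducts)."""
    return gate

def nand(a: bool, b: bool) -> bool:
    # Two transistors in series pull the output low only when both are on;
    # otherwise an (implicit) pull-up keeps the output high.
    pull_down = transistor(a) and transistor(b)
    return not pull_down

# NAND is universal: every other gate can be built out of it.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

print(and_(True, False))  # False
print(or_(True, False))   # True
```

Stack enough of these into adders, multiplexers, and flip-flops and you eventually reach a CPU; no step along the way involves anything but switches.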
HN does perform duplicate URL checks, where duplicated submissions count as an upvote towards the original submission rather than a new thread. However, HN also allows re-submissions after a certain time period has passed, which I think is what happened here, since the next-newest instance I can find is ~4 months old.
I think the time limit is longer, but the exact same URL was last posted a year ago; the submissions in between weren't recognized as dupes because they had different URLs.
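The dedup behavior described in these two comments amounts to a simple policy, which can be sketched like this. Note the window length, the exact-string URL match, and all the names here are my assumptions, not HN's actual implementation:

```python
# Hypothetical sketch of a duplicate-submission policy: an exact-URL match
# inside the re-submission window counts as an upvote on the original;
# anything past the window (or any URL variant) becomes a new thread.

from datetime import datetime, timedelta

RESUBMIT_WINDOW = timedelta(days=365)  # assumed; the real value isn't public

submissions: dict[str, dict] = {}  # url -> {"posted": datetime, "points": int}

def submit(url: str, now: datetime) -> str:
    existing = submissions.get(url)
    if existing and now - existing["posted"] < RESUBMIT_WINDOW:
        existing["points"] += 1          # dupe counts as an upvote
        return "upvoted-original"
    # Past the window (or never seen before): allowed as a fresh thread.
    submissions[url] = {"posted": now, "points": 1}
    return "new-thread"

t0 = datetime(2011, 1, 1)
print(submit("http://example.org/6502", t0))                        # new-thread
print(submit("http://example.org/6502", t0 + timedelta(days=120)))  # upvoted-original
print(submit("http://example.org/6502", t0 + timedelta(days=400)))  # new-thread
# A trailing-slash variant slips past an exact-string match entirely:
print(submit("http://example.org/6502/", t0 + timedelta(days=1)))   # new-thread
```

This also shows why differently-formatted URLs for the same page don't get caught: an exact-match check has no notion of URL normalization.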