Playboy Interview: Steve Jobs (1985) (longform.org)
284 points by gscott on Oct 28, 2019 | 99 comments


People talk about CEOs with "vision", and usually it's a substitute for a more concrete explanation of what the value-add of non-technical upper management is.

Reading this exchange is fascinating. Jobs doesn't spell out the future - he doesn't know it either - but he has an unwavering faith that computers will be central to it, and can even quite effectively explain how and why. The interviewer seems in retrospect hopelessly naive, but he no doubt represented the mainstream opinion of the time. For Jobs to understand this and bet his career on it (remember, most people betting their careers on computers would have been able to find work even if they had never left the science lab or big corporations - Jobs was speculating on a completely different future) shows, I think, that he deserves the place and high-regard in technological history he has been given.


"Jobs: The most compelling reason for most people to buy a computer for the home will be to link it into a nationwide communications network. We’re just in the beginning stages of what will be a truly remarkable breakthrough for most people—as remarkable as the telephone."

Steve saw the internet coming in 1985. Wow.


>Steve saw the internet coming in 1985. Wow.

This will probably blow your mind:

https://youtu.be/Ao9X1GsUPys?t=125

1988 presentation where Alan Kay confidently predicts media-driven websites that became the norm only around 2010. He also talks about virtual reality and real-time 3d graphics.

We still haven't caught up with his vision for exploratory interactive computing and user-driven simulations.

The fact that computers can/will/should be networked, and used for interactive collaboration rather than just raw number crunching, was evident to some people as far back as 1962. Maybe even earlier.

http://www.dougengelbart.org/content/view/138/

Augmenting Human Intellect is not an easy read, but it is well worth it, and not just as a matter of historic curiosity.


> 1988 presentation where Alan Kay

Great find! He nailed it.

"Man is the animal that learned to shape tools, and then the tools shaped man. We're very aware of the first part as ourselves as tool shapers, but we’re very unaware of how the tools come back and shape us..."

Then ties that into what will become the internet:

"That communications net is going to take us from a situation where we use computers from the outside, to a situation where in effect we'll be living inside one vast computer.

It sounds awful, but in fact we already live inside a number of grids. We live inside a power grid, where we wouldn't know what to do if it was taken away. Soon we’re going to be living inside of an information and knowledge grid. Such that wherever we go, we’re always going to be in contact, with power to make things happen, communication to get people to do things and ways of processing the information that comes around."


Excellent links, haven't seen the Kay presentation before :)

For anyone interested in going down this (extraordinarily interesting) rabbit hole: the book What the Dormouse Said [0] by John Markoff paints a vivid picture of early (1945-1984ish) computer history. Featuring Engelbart, Jobs, Jim Fadiman, Stanford, MIT, Xerox, Tim Leary, and more. A light yet informative read; highly recommended.

[0] https://www.amazon.com/What-Dormouse-Said-Counterculture-Per...


> https://youtu.be/Ao9X1GsUPys?t=125
>
> 1988 presentation where Alan Kay [...]

I'm getting no sound throughout the whole video, and youtube lists it as having music from a particular Debussy recording, which is then also unavailable (https://www.youtube.com/watch?v=vhogGOTnT9I)

Muting the sound is the usual response to streams and videos that play audio someone claims to own the rights to. Is this another casualty of how easy YouTube makes it for copyright trolls (or people indistinguishable from copyright trolls)?


It plays fine for me, but it does have a 3D demo with some classical music at the end, so maybe YouTube disabled sound based on your location.



Many of us (I was 12 at the time) thought this was fairly obvious in 1985. I had a 2400 baud modem, and an account on a local university minicomputer. It didn't take long to learn how to telnet to other computers all over the world, play MUDs, etc, all for free (whereas just calling the next town over cost $$$).

At the time he said that, he almost certainly had access to Stanford's infrastructure and PARC, and could see what people were doing with internetworking. I think what was interesting at the time was the comparison of the walled gardens (CompuServe, GEnie, etc.) to the wild west (the Internet), where (at least to me) it was "obvious" that the internet was superior because it was open and free and made it easy for kids like me to learn how to write code/apps that used it (as an example, the internet remembers my email address and code from that time, http://mudmagic.com/codes/download/other/DINK20.com)


Besides the fact that the internet was already around by 1985, and e-mail and Usenet groups were a known thing in nerd circles, in France a network for the masses, Minitel, had already been rolled out. Jobs was not so much prescient as simply aware of what was going on around him.


I recall circa 1985 I really wanted to sign up for QuantumLink, aka Q-Link, which I had seen advertised on the back of a C64 video game package (maybe M.U.L.E.?). The idea of chatting with other people through my computer was fascinating. I was six. I asked my mother about it and with splendid foresight of the modern world, she said no, that I would probably get molested or something.

"Quantum Link users could send electronic mail, use on-line chat rooms, download files, read the news, send instant messages to other users, and play a number of multiplayer games, including classic board games and games inspired by contemporary TV game shows. In late 1986, Q-Link expanded, adding casino games, a database dedicated to rock music, an on-line auction service and the ability for users to have customised avatars digitised from photos they sent in to the service."

https://paleotronic.com/2018/07/01/a-1980s-quantum-link-to-a...


> the internet was already around by 1985

Sort of.

https://en.wikipedia.org/wiki/History_of_the_Internet

"In 1986, the NSF created NSFNET, a 56 kbit/s backbone to support the NSF-sponsored supercomputing centers."

That was a year after the interview. I think you have to give the man some credit for forecasting that this particular technology would be the one to break out of the geek circle to really change the world for everyone.


NSFNET was just a long-haul backbone, part of the underlying physical network at the time.

The Internet is really defined by the inter-networking protocols, since the very term "internet" implies it can ride on different physical networks. To be a part of the Internet you just have to speak the TCP/IP protocol stack. The same wikipedia page points out these were standardized around 1982.


yep, I remember the email that went out at the time my local university switched over to NSFNET. Before that they had a bunch of leased lines talking to various other local universities. Still Internet.


Which is still leaps and bounds ahead of today's C-level exec


Every C-level tech executive schedules two hours a day of reading HackerNews, right?


that depends on the exec, but on average? sigh.


In my experience, it's a rare and desired trait.


Arthur C Clarke did the same thing in 1974 [1]. The reprint of the Steve Jobs interview is a good read nevertheless.

[1] http://youtu.be/OIRZebE8O84


On one hand, yes, he did (in a way) and that helped him when he came back to Apple with the iMac.

In another, a lot of people did. I had a TRS80 Model 100 and was blown away by CompuServe and BBSs in the mid 80s. I managed to finagle a SLIP connection to a local university in 1990, and knew that was the future. Anyone who had access to it did, because those were the people building Netscape and Yahoo, etc. It wasn't hard to see that was the future once you knew it existed. Even Bill Gates did, but he got distracted by CD ROMs and their huge local storage capacity and took his eyes off the ball. MS had an Internet connection very early on.


>For Jobs to understand this and bet his career on it ... he deserves the place and high-regard in technological history he has been given

I'm sure Jobs was brilliant, but this kind of praise still reminds me of this Enrico Fermi story [1]:

During the Manhattan Project (the making of the nuclear bomb), physicist Enrico Fermi asked General Leslie Groves, the head of the project, what the definition of a "great" general was. Groves replied that any general who had won five battles in a row might safely be called great. Fermi then asked how many generals are great. Groves said about three out of every hundred. Fermi conjectured that if the chance of winning one battle is 1/2, then the chance of winning five battles in a row is 1/2^5 = 1/32 ≈ 3%. "So you are right, General, about three out of every hundred. Mathematical probability, not genius."

[1] https://arxiv.org/pdf/physics/0607109.pdf


I can't believe that a brilliant man like Fermi would actually say that.

The outcomes of battles are not random events. It's equivalent to saying that Floyd Mayweather, who so far has won 50 out of 50 of his fights, is not a great boxer, because, out of the millions of people boxing, there mathematically ought to be one person who just won 50 times in a row. The thing is, winning battles (or boxing matches) is not flipping a coin; it involves a little more skill than that.


What Fermi is saying is that if 3 out of every 100 generals are great, these 3 should be able to win more than 5 consecutive battles in order to beat the statistical noise.

In your example, Mayweather's chance of winning 50 fights by sheer luck is 10^-15. We don't have 10^15 fighters around, so his achievement cannot be attributed to chance.


Unlike a general, Jobs didn't enter into a series of battles with binary outcomes. He built a company - Apple - from scratch which changed the shape of the world.

I'm not saying that Jobs was the only person with that vision and that he was rewarded accordingly, but that his vision and insight were obvious to anyone who cared to listen, and that he followed through on that vision with incredible success.

Imagine instead of a general a tank designer whose tanks went on to change the face of warfare. That's Steve Jobs.


I'd probably use the term quaint over naive. In 1985, only the most hardcore of households had an Apple IIe, IIc, IBM PC, etc. And no cell phones, so it really was early days. Consumer tech hadn't really come into its own yet. It pre-dated the CD (for music), and a Sony Walkman with cassettes and an Atari/Intellivision game console were about the extent of most people's exposure to consumer tech. I'm not even sure if there was traditional tech journalism at the time? I mean, my family had some Apple-centric magazine that came every now and then....


Not sure what your context of "traditional" is, but Computer Shopper and Dr. Dobb's were around then.

1985 is also the year the Amiga came out. Got chills when I saw the commercial.


I love the analogy between early command-line computer interfaces and the telegraph, versus graphical computer interfaces and the telephone.

It's also funny that when he elaborates on this by saying we prefer not to describe where something is but just point to it, he did so with his finger. Of course, in context he meant that on a computer it's better to point with a mouse than enter commands, but it so nicely foreshadows his intuitive embrace of touch interfaces decades later.


> It's also funny that when he elaborates on this by saying we prefer not to describe where something is but just point to it, he did so with his finger

"One evening, Master Foo and Nubi attended a gathering of programmers who had met to learn from each other. One of the programmers asked Nubi to what school he and his master belonged. Upon being told they were followers of the Great Way of Unix, the programmer grew scornful.

'The command-line tools of Unix are crude and backward,' he scoffed. 'Modern, properly designed operating systems do everything through a graphical user interface.'

Master Foo said nothing, but pointed at the moon. A nearby dog began to bark at the master's hand.

'I don't understand you!' said the programmer.

Master Foo remained silent, and pointed at an image of the Buddha. Then he pointed at a window.

'What are you trying to tell me?' asked the programmer.

Master Foo pointed at the programmer's head. Then he pointed at a rock.

'Why can't you make yourself clear?' demanded the programmer.

Master Foo frowned thoughtfully, tapped the programmer twice on the nose, and dropped him in a nearby trashcan.

As the programmer was attempting to extricate himself from the garbage, the dog wandered over and piddled on him.

At that moment, the programmer achieved enlightenment."


Yes, his vision was consistently very clear. In this 1985 interview he already had:

"The most compelling reason for most people to buy a computer for the home will be to link it into a nationwide communications network. We’re just in the beginning stages of what will be a truly remarkable breakthrough for most people—as remarkable as the telephone."


I don't agree with the analogy. Telephones derive their power from the use of language just like command-line interfaces. Point and click interfaces will never surpass the power of language because it is language that makes us human.


You're completely missing the point. Language is about an awful lot more than words; it's about expressiveness and emotional content as well as just textual information. He talked about telephones letting people sing, express themselves through voice beyond mere words.

A command line interface isn't a great medium for singing, but graphical interfaces enabled sophisticated music production software, desktop publishing using beautiful fonts and images (examples he used). It enables expressiveness beyond what can be captured in mere text. But it can do plain text just fine too.

But the main point he made was not that these things can't be done using a command line, it's that a graphical interface democratises access by removing the need to learn special incantations to interact creatively with content.


I am fully aware of what he's trying to say, but it's wrong. We're talking about communication of ideas between humans and computers. The claim that GUIs enabled music production is nonsense given the vast amount of music produced without them. Same for desktop publishing. I can use beautiful fonts and images with TeX, for example.

A command-line interface is no more a "magic incantation" than pointing and clicking in the right places is. In fact, I would argue that language is less magic than pointing and clicking because the rules of language are available for anyone to learn, while the rules of pointing and clicking are designed and only understood by the application developer.

Damn me for having a different opinion, though.


No analogy is perfect, and I agree that this one falls apart in one important way: if you know what you're doing (!), a good CLI can be legitimately faster than even the best GUIs for certain tasks.

But that's a big caveat.

I've been volunteering at Girls Who Code. My club is doing a project that involves SSH'ing into a Raspberry Pi, and I've spent a significant amount of time getting the kids used to just moving through folders within the command line. Using cd and ls to navigate a file system is actually really tricky to wrap your head around if you're not used to it!
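Having taught the same thing, the tricky part is the invisible notion of a "current directory". The whole session really boils down to a handful of commands (paths below are made up for illustration):

```shell
pwd              # print the current directory, e.g. /home/pi
ls               # list what's inside it
cd projects      # step down into a subfolder (relative path)
cd ..            # step back up one level
cd /home/pi      # or jump anywhere with an absolute path
ls -l projects   # peek inside a folder without entering it
```

Nothing on screen reminds you where you are unless you ask, which is exactly what trips up people raised on graphical file managers.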


I can relate. One of the things that bothers me most about the command line is the poor visualization, especially of hierarchical stuff (file system, git history, ...).


> In fact, I would argue that language is less magic than pointing and clicking because the rules of language are available for anyone to learn, while the rules of pointing and clicking are designed and only understood by the application developer.

I guess your command line uses natural language processing and is accessible to anyone who never read a man page or heard of a shell?

The command line interface is no less opaque than "the rules of pointing and clicking".


I think you're still missing the point. It's not that talking on the phone or using a GUI is more efficient, it's that it allows people to use technology in a way that feels more natural to them, and requires much less training to be able to utilize it.

To your point about clicking vs knowing commands, a GUI is more discoverable for someone, because they can try things and see what happens. Trying to guess commands is much harder, because there are no clues to guide you.


> A command line interface isn't a great medium for singing, but graphical interfaces enabled sophisticated music production software, desktop publishing using beautiful fonts and images (examples he used). It enables expressiveness beyond what can be captured in mere text.

Mere text[0][1].

CLI vs GUI in the context of this thread is a false dichotomy, because we're imagining (i.e., inventing) scenarios where GUIs are better. "Look at this virtual rack of equipment in Reason and see how democratic that is."[2]

Sure, or a mess that begs to be put into a table in a text processor. Where it’s more truly democratic because a screen reader can be employed if necessary. Jobs was selling sparkles, successfully, not “computing”.

“Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?”, indeed. Why pick only one?

Text isn’t dead or outmoded.

[0] https://drive.google.com/file/d/1D6ZzTeUlk6mwqyZ0GIrhYIFeRkK... - my own, developed in console with vi, tmux, CSound, ffmpeg, custom software for compositing, Makefiles

[1] https://tex.stackexchange.com/questions/1319/showcase-of-bea...

[2] https://youtu.be/PjeFRLcioqk


There’s an awful lot of folk music notated in ABC (the musical equivalent of markdown.)

IMO unless you’re actually drawing or looking at an image GUIs really kind of suck but are very sexy and help people sell a lot of software.


Your examples are not very convincing. Desktop publishing powers tabloids, while a great book remains a great book even in pure text format. Expressiveness is not limited to a medium, and if you had ever used a command-line interface you would understand that the concept of pipes and arguments is way more expressive than a round of 100 clicks to do the same thing through a GUI.

GUIs do democratise access to capabilities, but unless you work only with pixels, GUIs are inferior in every single way.
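To make the pipes-and-arguments point concrete, here's the kind of one-liner that would take many clicks in a graphical file manager (assumes a `sort` with the `-h` human-numeric flag, as in GNU coreutils):

```shell
# Ten largest files/dirs under the current directory, human-readable,
# built by composing three single-purpose tools with pipes
du -ah . | sort -rh | head -n 10
```

The expressiveness comes from composition: each tool does one thing, and the pipe is the glue.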


>GUIs do democratise access to capabilities, but unless you work only with pixels, GUIs are inferior in every single way.

Except in the way that matters: symbolic representation and manipulation (and discoverability).

>Expressiveness is not limited to a medium, and if you had ever used a command-line interface you would understand that the concept of pipes and arguments is way more expressive than a round of 100 clicks to do the same thing through a GUI.

Actually, a GUI can trivially include a pipe system (tons of VFX and color grading apps, for example, offer one, as do Max/MSP and others).

A GUI can also trivially include a text console, and have all the capabilities of the command line.

The command line, on the other hand, can't have all the capabilities of the GUI. Even auto-completion is implemented crudely with curses interfaces (which, in any case, are GUIs too).

(Oh, and I've been using the command line since the pre-Solaris days of Sun OS).


"If you ever used a command lien interface".

That's a good one, I've been using Unix/Linux systems daily since the early 90s and test people on their Bash skills in job interviews.


"Command lien interface" - a modality of interaction in which the empowered party (lienor) exerts control over someone with valuable property (lienee) via commands backed by a ephemeral interface that does not need to exist at compile time (lien), but is practical for eliminating bugs that arise from causality in economic systems.


Very strange that bash still hasn't disappeared in 2019 if GUIs are so superior to everything a command line interface can do.


You can think of Jobs' argument this way: what `ncurses` is to you, computers are for normal people.

You are essentially arguing the same point Jobs is emphasising. If Jobs had done it your way, he would basically be leaving behind the 95% of normal people who are not tech savvy (writers, artists, accountants, etc.) who could benefit from a computer without knowing its complicated inner workings. It's exactly the same way you benefit from `ncurses` instead of writing all the obscure logic of text rendering from scratch every time just so you can work on the command line... You've misinterpreted this whole discussion as an argument of GUIs vs. terminals, which is not the point at all.


I agree with you, though I did know some non-tech people that were amazingly good with software like Wordstar, Visicalc, etc. Jobs makes fun of Wordstar in the interview, but it was a really productive tool, since your hands stayed on the keyboard.


You're comparing an 18-wheeler to a compact car and saying the 18-wheeler is superior in every way.

Most people need ease of use more than they need raw power. Both of them have their place.


Some of us remember running applications from the command line. These were not the “does one thing well”, POSIX compliant, programmer/admin facing tools you still find in *nix these days. These were typically very limited, weird REPL sort of things where you might have eight or ten or fifty very domain specific commands that didn’t approach a Turing complete language. There was no other way to use the program other than the weird REPL that was provided.


So it's right there in 1985 (emphasis mine):

"Jobs: That’s inevitably what happens. That’s why I think death is the most wonderful invention of life. It purges the system of these old models that are obsolete." (He was speaking about the computer hardware).

20 years later "2005 Stanford Commencement Address":

https://singjupost.com/steve-jobs-2005-stanford-commencement...

"About a year ago I was diagnosed with cancer. (...) Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept: No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new.

Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary."

There is also a coda: in 2010, at D8 conference (Jobs died 2011), Jobs got a question:

"Q: A few years ago you gave a commencement speech at Stanford (...) Now, a few years later and a couple of years wiser, would you add anything to that speech?"

"Jobs: Oh, I have no idea. Probably I would have just turned out the volume on it because the last few years have reminded me that life is fragile."

(Or was it "turn up"? Do we hear what we want to hear there? https://www.youtube.com/watch?v=i5f8bqYYwps&t=4400 )


Definitely sounds like "turn up" to me. Plus "turn out the volume" isn't idiomatic in Standard American English.


> "turn out the volume" isn't idiomatic

Indeed, thanks.


> The famous story about the boxes is when Woz called the Vatican and told them he was Henry Kissinger. They had someone going to wake the Pope up in the middle of the night before they figured out it wasn’t really Kissinger.

These were really different times.



This interview offers so many insights, it’s unbelievable.

Who are some visionaries/industry leaders with this kind of realistic future insight that are worth paying attention to?


Elon Musk. How many companies has he created or is running? Zip2, PayPal, SpaceX (a rocket company? that takes guts), Tesla (a new car company to compete against GM, Ford, Mercedes, BMW etc? that's nuts!), SolarCity, The Boring Company. And now he is promising self-driving cars by next year. (That's nuts!)


Donald John Trump


"Jobs: When I went to school, it was right after the Sixties and before this general wave of practical purposefulness had set in. Now students aren’t even thinking in idealistic terms, or at least nowhere near as much. They certainly are not letting any of the philosophical issues of the day take up too much of their time as they study their business majors. The idealistic wind of the Sixties was still at our backs, though, and most of the people I know who are my age have that engrained in them forever."

This was pretty striking for me, the goal of becoming rich, or being 'a millionaire' has been pounded into me from a young age to the point where I have difficulty picturing myself obtaining any kind of happiness without some level of financial success. Hopefully the current generations can find fulfillment in different areas. Tying happiness to money is kind of a cruel road to send entire generations down.


Playboy: That’s what critics charge you with: hooking the enthusiasts with premium prices, then turning around and lowering your prices to catch the rest of the market.

Seems like old tricks are the best tricks, eh?


His response is priceless:

"As to overpricing, the start-up of a new product makes it more expensive than it will be later. The more we can produce, the lower the price will get"

"As soon as we can lower prices, we do. It’s true that our computers are less expensive today than they were a few years ago, or even last year. But that’s also true of the IBM PC. Our goal is to get computers out to tens of millions of people, and the cheaper we can make them, the easier it’s going to be to do that. I’d love it if Macintosh cost $1000."


Wow. And to think that his vision has now put a mini computer in the palm of millions of humans. The early competition of iPhone vs Android really sparked a ton of innovation and a culture shift.


Apple was pretty much always premium though. Way more expensive than equivalent hardware.


Define equivalent.

Every time I go to find an "equivalent" laptop from a PC vendor I end up in the same ballpark of money, give or take (and often give) $200 or so.

As soon as I add the Hi-Dpi screen, 16GB RAM, equally fast SSD, equal GPU, equal CPU, USB-C ports, try to match the size (not some "desktop replacement" monstrosity), and go for a vendor like Lenovo, Dell, Alienware, Razer, etc, it's usually a tie.

And usually I'm still getting a worse build quality, a slightly worse display (gamut, etc), and a worse trackpad.

At least today though you get a better keyboard with PC vendors.


Price wasn't the main driver, either. Dealer network mattered a LOT.

Lots of kids who grew up away from larger cities ended up with computers from Radio Shack, because they were EVERYWHERE, while Apple or Commodore dealers were harder to find.

But I'm still curious, because I don't remember: How did (e.g.) Apple II pricing compare to equivalent-ish, contemporaneous models from Commodore or Tandy?


> How did (e.g.) Apple II pricing compare to equivalent-ish, contemporaneous models from Commodore or Tandy?

Apple ][: $1,298

Commodore PET: $795

Tandy TRS-80: $600

At these prices, in 1977, only the Apple had color graphics.


I was one of the young kids (age 15) who saved up and bought a TRS-80 in 1979. Apple IIe was definitely better, but more expensive. PET was a bit of an odd bird, with chiclet square calculator keys and built in small monitor, no real keyboard. Apple II was the only one that was color, and if I recall it had bitmapped graphics with 256 x 192 resolution. TRS-80 had crude 48 x 128 graphics. My school had an Apple II for use by some science students, and I did some fooling around on it. It was definitely much nicer. But TRS-80 was mostly a tool to learn about computers; somehow I scrounged up resources (probably guided by magazines), got a CPU manual, and did some simple Z80 machine language programming, later got an assembler.


More like good tricks are timeless. Lots of old tricks are obsolete because information is so easy to access.


RIP, Steve. I have ached since he died not only for the loss of what he was, but also for the loss of what could have been. Whatever your view of Apple now, I feel strongly that it is a fraction of what it would have been if Steve had been able to continue on for another 10 years.


Tech revolutions are often ushered in by people in their early 20s, one pop economist noted. That is when they have some education, lots of energy and few family responsibilities. The PC revolution was by people born in the mid-1950s, dot-com by the mid-1970s, mobile by the early 1990s ...


Wars have also been fought by that same age group since the beginning of time.


1985 haha, that's when I was born!

When I grew up, I had the feeling computers weren't much of a thing in private.

Sure, I got a C64 when I was 8 and my first PC at 11, but until I got my PC I knew basically nobody in any age group who had a computer.

To me, the dream Jobs had, that everyone would have a computer, became reality at the end of the '90s and the beginning of the '00s.

Which is kinda funny. The time when everyone really had a computer at home probably didn't even last more than 10 years before it was superseded by smartphones.

Life is very crazy in retrospect...

I remember talking to some friends at school when I was 16. They told me I should become an administrator and not a developer, because everything we need had already been developed. They had games, videos, mp3s; what more was there?


"Sure, I got a C64 when I was 8 and my first PC at 11, but until I got my PC I knew basically nobody in any age group who had a computer."

Wild. Where did you grow up?

I ask because I was born in 1970, and by the time I got a computer (a TRS-80 Color Computer 2; 16K of RAM, baby!) in about 1982, I was far from the first kid I knew to have one (though I definitely was on the leading edge). I wrote all my high school papers on it, and by the time I graduated in 1988 I'd say at least half the papers being turned in had those tell-tale rough edges from tractor-feed paper run through dot-matrix printers. By then, several friends had moved up from CoCo/C64/Apple II style machines to PCs (or, in one case, an early Mac).

My family was firmly middle/upper-middle class, but we were in a very poor state (Mississippi).

Now, most of these were household computers, shared by everyone in the house, but still: the upshot was "lots of PC penetration in homes in south Mississippi by 1988."


AFAIK the US got computers earlier than most places, even other First World countries. Large disposable incomes and a large, relatively homogeneous market. My family were middle class (teachers) in the UK, and moderately early adopters; we still only got our first PC in the mid-90s.


I guess I knew that. The UK also had some pre-PC home computers that were homegrown, right? Acorn, et al.?


The UK seemed to have all of the good stuff in spades. I wouldn't be surprised if a larger percentage of people had computers at home in the UK than in the US in the '80s; so much software and hardware was made there, and there were lots of publications like Amiga Format and Commodore User. In terms of total market, the US was probably bigger back then, and probably nothing would compare to larger hubs like SV, NY, or even Seattle.


I grew up in Germany.

Yes, the C64 was already an old machine by then, but many of my friends had one and "it has all the games (TM)".

When I moved to a new school at 10 (1995), I met a bunch of people who had PCs, but they only had them because their parents worked in IT.

I grew up in a small town. My family is all working class. I was the first to study at a university, and nobody in my family, even in my generation, got out of the working class.


> Which is kinda funny. The time when everyone really had a computer at home probably didn't even last more than 10 years before it was superseded by smartphones.

To be fair, smartphones are computers. They're just a different form factor.


True.

It's just more "in their pocket" than "at home".


> Then there was the time Wozniak made something that looked and sounded like a bomb and took it to the school cafeteria.

Oh my. That's funny.


In 2019 that would have ended with a SWAT team guns blazing.


And, if not death, prison time for the 'prankster.'


>> When you’re a carpenter making a beautiful chest of drawers, you’re not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it. You’ll know it’s there, so you’re going to use a beautiful piece of wood on the back. For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through.

Ironically, the first US IKEA store opened in June that same year.


I really enjoyed the description of the environment where Jobs grew up. Now kids see electronics only as the screen of a smartphone or tablet. There's no space for technical creativity unless your parents are doing something in this domain. Selling a printed circuit board in the European Union is a great adventure, taking care of CE and WEEE, even for an experienced adult.


It is kind of annoying that there's a two-tier market: as a consumer, you can buy all sorts of extremely cheap nifty boards to connect to an Arduino, by having them posted from China to circumvent all the rules. But you can't circumvent the rules yourself.

(Well, you can; the enforcement is pretty ad hoc, but there is still a nonzero chance of getting spotted and fined, and they can actually collect against you in the EU.)


Do they really? I'm not sure I agree; these days we have things like the Arduino and Raspberry Pi, which have been a huge success, and I'd argue they attract a much wider range of children than middle-class white boys who just happened to grow up near innovation centres.


Even the ability to access code has been all but taken away, by the sheer force of financial and intellectual-property investment.


I used to buy the magazine for the articles, honest.


Not gonna lie. Bought for the pics, stayed for the articles.


At least that was a great interview with good questions/answers.


A jerk, but a visionary jerk.


You say jerk; I love the fact that he's honest, unfiltered, and usually dead right.


What's really cute is how Jobs expresses contempt for money and for being a millionaire, praising Sixties idealism (needless to say, the very spirit of the early internet), at the same time as he builds up the most closed, proprietary-obsessed company possible.


At the end of the interview, Jobs says he loves sushi. What's the deal here? I thought he only ate fruit, or at least certainly not meat/fish. Can someone explain?


Eating sushi wasn't something normal people in the US did in 1985. It was a bit of a class marker that he was sophisticated and did crazy things.

I realize that sushi is pretty commonplace now, but in 1985, for white Americans, it was probably the craziest thing you could think of doing. It really freaked people out. Raw fish?


Not so much "crazy": it was available in multiple places in my suburban area at that time. The 'Claire' character in the 1985 movie "The Breakfast Club" brought sushi in (what I remember to be) a bento box, and I only read her as spoiled, not crazy.


Time traveler from the 80s here to confirm, people freaked out about sushi.


There's plenty of sushi that doesn't involve fish or meat. Sashimi is raw fish; sushi is more about the rice.


And do you think people in 1985 understood that distinction?


Pescatarian: fish was the only meat he would eat.


He is said to have been a fruitarian for some time during his youth, but not later.


It really saddens me that the Apple I read about here is an Apple that would be saddened by the Apple of today.



