Hacker News

Why is EE so bad? On the surface it looks like a degree with a strong grounding in mathematics, exposure to programming, the discipline of engineering, and a diverse range of applications across industry. It kind of seems like the ideal degree if you want to hire someone. I'm always baffled when I hear stories like this.


I have some hypotheses. (My degree is in physics, which is perhaps a similar situation, though less widespread.) There are a number of things happening.

Most engineers in all disciplines lose their math ability after graduating. The workplace itself allows this to happen: They get so busy with regular design work that they forget their math and theory. A lot of the analysis work is handled by their CAD tools. The work that does need math or deep domain knowledge is handled by one or two experts within the department.

There are practical limits to the size and complexity of hardware, which limit the amount of hardware work. An electronic board might be designed and tested once, and then a million copies made. The software supporting that board is maintained constantly. This is partly due to a conscious choice to move functionality from hardware to software. When hardware is obsolete, it's abandoned. When software is obsolete, it's augmented with new software on top of the old software.

There's a strong message from above that software is more important than hardware. Sparkly software is what management sees when they are shown the product. The people who find that they can program well enough to do it for money have moved into software development.

It's harder for an individual hardware person to capitalize on their own innovation, because they need the infrastructure to test and manufacture new hardware. So we can only move at the pace of the businesses that employ us.

Programming can inflate its own demand through technical debt, and can organize itself to a level just short of full-blown collective bargaining.

Note that I'm not talking about pure software businesses, but those businesses don't need hardware engineers at all. ;-)


I don't fully disagree.

Though I don't agree with the negative takes on software, like "sparkly software for management", "inflating its own demand through technical debt", or "organizing itself for collective bargaining".

Sparkly software usually gets you only as far as it is useful; less sparkly software sells worse, but you still need it on the hardware to get the job done. Hardware alone is not enough.

Inflated demand: people just suck at organizing big projects. There's no need to artificially inflate demand; it happens on its own as the business needs more and more features.

Software developers are bad at organizing and bargaining, because they all think they're better than everyone else, and other people's code always sucks :)

My hypothesis:

Hardware has obvious physical limitations: even if you build millions of boards, they take storage space, you need copper and aluminum, and you can't shrink transistors forever. You can only sell as many phones as there are buyers.

Software, on the other hand, is now limited mostly by the number of developers in the world. There's an infinite number of programs you can run on a finite amount of hardware. There's an infinite amount of software to be built, let alone maintained; that's why software developer salaries are going through the roof.

While I can sell one phone only once, I can build a SaaS solution that brings cash flow and monthly payments; it's not even a question of whether an individual can capitalize on his own innovation. The basically infinite revenue stream from the SaaS model is just so attractive to any businessman.


Brilliant comment. It applies to most engineering or scientific fields. Develop, discover, once. Maintain or improve forever. Explains the massive difference in demand between truly imaginative, innovative thinkers and the maintainers. Come to think of it, other fields too.


Well, for me the degree was of mediocre usefulness. At the job I had, I sat in an office on a factory floor with no windows, fully airgapped, on my own seven hours a day, mostly doing test automation and designing test fixtures. That was the entry position for us back then, really. I had to wear full static gear head to toe, which was hot and uncomfortable and smelled weird. This was broken up twice a day by some shitty machine coffee and sitting in the canteen, staring out the two small windows at the people outside the company over the road, all smoking. All while being paid just about enough to eat for the month and keep up with my games habit. There were also nine layers of bureaucratic nightmare everywhere, and as the company was large and everyone knew or was related to everyone else (they all lived in the same area), it was politics galore.

It was just sooooo depressing.


Why would test automation / design require static gear?

Did you do the testing on actual hardware?


Board-level test, i.e. boundary scan, burn-in, and ageing on actual hardware. The assemblies I was testing were up to $40k apiece.


A lot of negative responses here, but my own experience, having graduated with an EE degree from an average university (in the UK) 3 years ago, has been pretty good so far. I was able to get 3 internships at 3 different semiconductor companies over the course of my degree, and got a job at another semiconductor company immediately after graduating, doing digital chip design, so I don't think the job market, at least in my niche, can be quite that bad. The pay is pretty good: well above average for a recent graduate in the low-cost-of-living city I'm based in, and a bit higher than my software engineering friends' in the same city. Still considering doing a masters in computer science to expand my career options, though.


An EE degree IS awesome. It gives a generalizable mathematical foundation which applies in just about any other field of work. It gives a huge competitive edge. For example:

* Circuits are physical implementations of differential equations, and EE gives a unique way to intuitively think about dynamics, which applies to finance, epidemiology, and just a really diverse range of domains.

* With a rigorous EE background, you can rapidly pick up most domains of engineering (think Elon Musk), since you've got all the mathematical foundations. The reverse isn't true. The way math is taught in EE is broader than e.g. mechanical, civil, or other engineering disciplines, where it tends to be more domain-specific. EE gives you a lot of depth in probability, error analysis, signal processing, controls, calculus, linear algebra, etc. I think the only things missing are statistics and discrete math, and I picked those up elsewhere.

High-performance board-level EE is insanely fun. Incredibly creative. You get to build stuff, design stuff, do math, and it's just a huge diversity of intellectually-fulfilling stuff.

IC design is a bit less fun, due to the many-month turn cycles (develop, wait months and spend hundreds of thousands of dollars, then test/debug), but not bad.

However, the EE industry sucks:

- Pay is not bad, but much worse than other jobs you can get with an EE degree.

- Work culture has all the worst excesses of the eighties -- think Office Space-style cubicle farms, dress codes, conservative management, ISO processes, and paperwork.

- Yet it's somehow adopted some of the worst excesses of the 2010s; it no longer feels like work is a family or a community.

- And it's hard to get into. There are virtually no jobs for junior-level EEs (which isn't just BSes -- in the Great Recession, I knew bright newly-minted Stanford/MIT/Caltech/etc. Ph.Ds who couldn't find jobs).

- Even at the senior-level, there's a bit too much of a boom-and-bust cycle, without the booms ever getting that boomy, but the busts being pretty busty.

I spent maybe five years doing EE work after my EE degree, and I think that was enough. I've been out for a long time now. I still do EE as a hobby, and I enjoy it, but the industry culture isn't one I remember with fondness.

I suspect a lot of this stuff will continue to disappear from the US into Asia; that transition is rapidly in progress. US firms maintain specialized knowledge in some areas (e.g. particular types of MEMS devices), but there are plenty of places we've fallen behind. I don't see us on the path to regain leadership. I think some of this is cyclical. Declining industries don't make for good employers, and poor employers don't make for growth industries.


EE is one of those "engineering is really applied physics" disciplines. There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.

But career-wise, it's a mediocre choice in most Western countries. (Possible exception is Germany, where engineers have a similar status to doctors and lawyers.)

Most people have no clue what EE even is, or just how much math and engineering goes into building everyday devices and services.

(A friend of the family said "Great! You'll be able to get a job repairing TVs!" when I got my course offer.)


> There's a slant towards standardised EE-specific solutions for PDEs, but it's still much more abstract and mathematical than any field of CS, apart from maybe cryptography and data science.

Here's the thing, though: solving PDEs is computationally hard in general. There isn't a generalizable way to model dynamics. On the other hand, dynamics come up everywhere:

- How is the pandemic going to evolve?

- How will incentive structures skew cultures?

- How do I build a suspension for my car?

- How does heat leak from my house?

- How does my understanding evolve with learning?

... and so on.
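The first of those can at least be sketched with the same differential-equation machinery. Here's a minimal, purely illustrative SIR epidemic model, integrated with explicit Euler steps; beta, gamma, dt, and the initial conditions are made-up values, not fitted to any real outbreak.

```python
def sir_step(s, i, r, beta=0.3, gamma=0.1, dt=0.1):
    """One Euler step of ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i, dr/dt = gamma*i."""
    new_infections = beta * s * i * dt
    new_recoveries = gamma * i * dt
    return s - new_infections, i + new_infections - new_recoveries, r + new_recoveries

s, i, r = 0.99, 0.01, 0.0    # fractions of the population
for _ in range(2000):        # 200 time units
    s, i, r = sir_step(s, i, r)

print(s + i + r)  # ~1.0: the model only moves people between compartments
```

The point isn't the model itself (real epidemiology is far messier); it's that "how will the pandemic evolve?" becomes a question about coupled first-order dynamics, which is exactly the kind of object an EE curriculum drills.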

What EE does -- and I think uniquely -- is give intuitive, graphical tools for thinking about differential equations: Laplace transforms, Bode plots, Nyquist plots, root locus, and so on.

EE also gives a lot of applied experience in using those, including in contexts with nonlinearities. An op amp will clip on both sides, which you model as a linear differential equation (which is easy enough to reason about) plus a memoryless, time-invariant nonlinearity. You squint. You kinda ask yourself how it would work if it /were/ linear and the nonlinearity just cut gain. And at some point, after doing it enough, you have intuition for what it will do.
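That squint-and-linearize picture can be sketched in a few lines. This is a hedged illustration, not a real op-amp model: a first-order linear lag (time constant tau) followed by a memoryless saturation standing in for the rails, with all values made up.

```python
def clip(x, lo=-1.0, hi=1.0):
    """The memoryless, time-invariant nonlinearity: the output rails."""
    return max(lo, min(hi, x))

def simulate(u, tau=1.0, dt=0.01):
    """Euler-integrate the linear ODE tau*dy/dt = u - y, then clip the output."""
    y, out = 0.0, []
    for uk in u:
        y += dt * (uk - y) / tau   # linear dynamics, easy to reason about
        out.append(clip(y))        # the nonlinearity only limits the swing
    return out

# Drive it with a step well beyond the rails: the linear model alone would
# settle at 5.0, but the saturated output sits at the +1.0 rail.
response = simulate([5.0] * 1000)
print(response[-1])  # prints 1.0
```

Reasoning about the linear part and the saturation separately, exactly as the comment describes, tells you everything this simulation shows.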

With the EE-specific stuff, I can intuitively reason about these things and think them through to a design.

EE is all about modeling -- building simpler equations which approximate more complex ones in ways which give intuition -- so this is also usually correct or almost correct. Indeed, if you go on to grad-level courses in control theory, you'll see formalizations of this intuition, where, for example, a time-variant or nonlinear system is modeled as a linear time-invariant system together with a bounded error.

A lot of the mathy stuff -- which I've learned a fair bit of as well -- is more general in the abstract, but in practice gives much less intuition.

My experience with the real world is that there are rarely actual differential equations handed to me. I kinda get that we've set up some pricing structure, or some incentive design, or whatnot, but I can't model it formally. I know which way things push, and whether those integrate or not. I can draw a block diagram and reason about how it will behave, in a way the math side doesn't let me do.


>Germany, where engineers have a similar status to doctors and lawyers

Errr, no they don't. In terms of pay and status, doctors and lawyers trump engineers every day of the week in Germany; the only exception is engineers with PhDs who are tech leads at some well-known research institute or big-brand company like Audi or Porsche.


I think you oversell the utility of the formal methods taught in the degree.

Not once did I use Laplace in EE. It was all cheat sheets, or applying data sheets and doing some ad hoc calculations in Excel, or taking a wild guess and iterating.

After twenty years of doing IT related stuff I’ve forgotten how to even differentiate stuff.


I've used them all the time in my career, but:

1. I did learn them well, and so I did use Laplace quite often in EE.

2. I jumped careers not into IT, but into tracks which leveraged both programming and a mathematical skill set.

I'll mention: I'd be bored out of my wits doing just IT. The intersection of IT and math includes computer graphics, visualizations, robotics, machine learning, fintech, image processing, and a ton of other stuff I find much more fun and fulfilling.

That's not an implicit commentary on your path, by the way, just an explanation of mine. We all have different goals, desires, values, constraints, etc.


Yeah, I have an EE degree but have only worked in software, and I have to say you can bring any kind of mathematics to the job if you have the imagination to find the places where it is an advantage.

I sought out areas where I could learn more math and use it to stand out from the crowd, albeit with mixed results: if your manager can't read your analysis paper, he may not be impressed either. And sometimes the reverse.


I feel like there's an uncanny valley.

If you know a little bit of math, there's no benefit.

If you know enough math to jump to e.g. medical imaging, robotics, controls, simulators, image processing, ML, or similar, there's a ton of benefit.

An EE degree ought to give enough background to get there, although it may involve a year or two of study in a particular domain, and a side project to prove you have the skills.


That’s exactly why I never used it. When you’re writing software you need to pretend your successor is an axe murderer.


I think some of it must be "sticky" culture. I'm another former EE who left the field to develop software for much better pay, so the inter-discipline competition does exist in some form. The EE jobs I saw had neither the dress codes nor the comfortable offices.


Same answer as any underpaid field: an oversupply of labor in that field vs demand.

The absolute worst fields are the "sexy" or trendy ones. Unless you are strongly driven to enter a field for non-monetary reasons, always look into employment opportunities first.


Market shrank as it matured and software generally took over most workloads.


Yeah, when I took my Analog EE class, most people’s projects ended up being a race to the microcontroller


"The ideal degree if you want to hire someone" is different from "the ideal degree if you want a high salary".

To get a high salary, you need to be in a good negotiating position, such that if company X won't hire you for US$150,000(/year), then company Y will hire you for US$145,000 (a strong BATNA). There are three possible reasons company Y might not hire you for US$145,000:

1. They are badly managed and making irrational decisions.

2. It would be unprofitable for them to hire someone like you for US$145,000; in an engineering position, this is because their revenues minus cost of sales would go up by less than US$145,000, risk-adjusted.

3. They can hire someone else like you for a lower cost, such as US$130,000.

Item #1 can mostly be discounted as a difference across fields; there are badly managed companies in every field, but generally they aren't the ones who hire a lot of people, and they aren't the ones providing your BATNA. However, in the case of programming, company Y might be you and your college roommate setting up Pinboard or Tarsnap, so there is perhaps a relevant difference here.

A thing about items #2 and #3 is that "someone like you" means "like you" from the company's perspective before they hire you. The fact that you can solve hard leetcode puzzles during the interview in ten minutes figures into this, because that's something they can observe before they hire you, unless they go through a recruiter, in which case it doesn't. If you can do a board layout with 166 MHz DDR and it will work right on the first spin with no signal integrity issues, that doesn't figure into "like you", because that takes at least a week, so you can't do it as part of the interviewing process.

The bigger difference, though, about item #2, is that the return on NRE work in either EE or programming depends on volume. If your EE innovations give them a working board that costs $3.80 to produce instead of the other guy's $4.30, then if they're producing 100 units, you've produced $50 of value for the company. But if they're producing 100,000 units, that same amount of work on your part has produced $50,000. And similarly if the product brings a $0.50 higher price instead of having a $0.50 lower cost. And similarly if we're talking about lowering the cost per user of operating a server farm, or increasing the ad spend per user.
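As a back-of-the-envelope with those same numbers (the $3.80 board versus the other guy's $4.30): the identical design effort is worth 1,000x more at 1,000x the production volume.

```python
def value_of_nre(per_unit_savings, units):
    """Value to the company of a fixed NRE effort that saves a given amount per unit."""
    return per_unit_savings * units

savings = 4.30 - 3.80   # per-unit cost reduction from the design work
print(value_of_nre(savings, 100))      # ~$50 at 100 units
print(value_of_nre(savings, 100_000))  # ~$50,000 at 100,000 units
```

The function is trivial on purpose: the leverage comes entirely from the volume term, which is set by the employer's capital investment, not by the engineer.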

So why is that a difference if they're both NRE work? Because producing 100,000 units of an electronic device requires a huge amount of up-front capital investment. You can't produce 10 units one day, then 15 units the next day, then 25 units the day after that. So if Company X is investing $3 million in your project and Company Y is investing $0.3 million, the Company X devices are likely going to get produced in ten times higher volume, so every design decision you make is worth ten times as much money.

Now, of course if you're working on Google's backend systems that serve, ultimately, 5 billion people, an 0.1% improvement produces more value than an 0.1% improvement in the systems of a company with only 1 million clients. So FAANG can afford to pay hackers more than the average health plan, ISP, or venture-funded satellite imagery startup. But, even in those cases, the capital investment needed to get a lot of value out of a programmer's work is fairly small, maybe US$10k to US$100k, rather than the US$1M or more that is common for EE projects. This puts programmers in a much better negotiating position.

So, though the two fields have similar diversity of applications, type of work, and intellectual difficulty, electrical and electronic engineers are reduced to begging for scraps from rich employers and then have to sit in bunny suits on factory floors with shitty coffee, while programmers get free massages, incentive stock options, and private offices. Except in countries where companies hire programmers through recruiters.

(Why do they do that? I think it's mostly an industrywide case of #1: companies in England and Australia use recruiters because everybody else uses recruiters, and so the ambitious programmers leave the country, reducing the large advantages obtainable by black-sheep companies who hired directly and could thus hire only competent programmers, to whom they could profitably offer twice as much money. But maybe there are legal reasons or something.)


Because when you sell physical things, your margins will diminish.



