I think one of the reasons programmers are not paid as much is that by and large, we are seen as interchangeable cogs. There are definitely exceptions, but I think the companies that think this way are large enough to set the market rate. For example, I used to work for a large defense company that probably had more software engineers than every startup in the world combined (we hired 10% of all CS graduates in the country every year). The prevailing attitude was to pay as little as they could get away with because there is always some new naive college CS grad to replace those who left.
The thing is that programmers largely are interchangeable. Of course, you are a special snowflake, etc.; but in the end, the majority of programmers do repeatable, simple work: writing line-of-business Java applications, maintenance work like adapting COBOL code to new tax rules, writing generic GUIs that work just well enough to justify the expense given that the app has only a few tens or hundreds of users, and so on.
For all of these things, when managed properly (i.e., rotate programmers enough, take care of documentation, have strict coding standards etc), you can get a new programmer who has had a 2 year degree in programming and 6 months of training on the specific technology up to speed in such a project in a week. That's not quite as interchangeable as a guy turning screws in a car factory, but it's not that far off, either.
I realize that everybody likes to think that they are so special and that without them the world (and certainly their company) would grind to a halt, but for the vast majority of programmers, it's simply not true. And where it is true, it's often only true because management allowed one person to become so entrenched in one place that they've made themselves indispensable, not because of the nature of the work.
> you can get a new programmer who has had a 2 year degree in programming and 6 months of training on the specific technology up to speed in such a project in a week.
Sure, but it works well enough for many organizations. Very seldom is there a business case to be made for a beautifully architected, extensible, maintainable, usable (with usability testing, etc.) expense tracking application. Throw something together that works well enough to produce the output required and is integrated with other systems, write a 10-page manual that explains the input formats (because sane input checking and error reporting is for pussies - just display 'data input error' when the user enters a date as dd-mm-yy instead of dd-mm-yyyy), write a 'policy' that people can only get paid expenses if they use the application, and voila, you're done.
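For what it's worth, the "sane input checking" being mocked there is cheap. A minimal Python sketch (the function name `parse_expense_date` and the exact messages are my own invention, not from any real system) of telling the user *what* was wrong with their date instead of a generic 'data input error':

```python
from datetime import datetime

def parse_expense_date(raw: str) -> datetime:
    """Parse a date field in the documented dd-mm-yyyy format,
    with specific error messages instead of 'data input error'."""
    # Happy path: the format the manual asks for.
    try:
        return datetime.strptime(raw, "%d-%m-%Y")
    except ValueError:
        pass
    # Common mistake: a two-digit year (dd-mm-yy). Detect it and
    # name the problem explicitly rather than rejecting opaquely.
    try:
        datetime.strptime(raw, "%d-%m-%y")
    except ValueError:
        raise ValueError(
            f"Could not read date '{raw}'; expected dd-mm-yyyy, e.g. 05-03-2014."
        )
    raise ValueError(
        f"Date '{raw}' uses a two-digit year; please enter it as dd-mm-yyyy."
    )
```

A dozen lines like this versus a 10-page manual is exactly the trade-off being described.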
And when you need changes in 3 years time - hire a contractor on fixed fee, who cares that he will age 10 years in 10 days time; he'll move on to the next pile of crap soon anyway.
> "Sure, but it works well enough for many organizations."
Which is to say: organizations derive such fantastic value from software that even a slow, buggy, late, over-budget project, or indeed a parade of such projects, is not enough to make them reconsider their approach.
Which is the answer to the original question.
Q: If software delivers so much value, why are programmers typically paid so little and treated so poorly?
A: Because even bad software delivers value far faster than most organizations can incorporate it and there's no shortage of bad programmers.
Following that logic, most doctors and lawyers are also interchangeable for the most part. The majority of work in those fields is applying what you've been taught and trying not to screw it up.
And of course there are exceptions, but from what I've seen it seems like there are more opportunities for programmers to bring personal creativity to their work than doctors and lawyers, just because the field is less established and there are more open questions.
Sure, they are. That's the way hospitals are managed here nowadays - you make an appointment with a doctor with specialization xyz, and you don't know until you show up who's going to be there. They read through your medical history a few minutes before they call you in, and that's that. Centralized medical history tracking through automation is quickly making being a doctor a lot more routine than most doctors would like it to be.
But technically doctors and lawyers are interchangeable as well. Barring specialization, they all have similar educational background and they differ more by reputation rather than basic skills.
Of course, there's the union/doctor's association thing that keeps the supply short whereas programmers and "programmers" turn up almost everywhere.
Similarly, programming skills vary a lot. Doctors and lawyers execute more and innovate less, and they all have a basic level of knowledge that people are willing to pay for. Hiring a programmer is more like hiring an artist: a good one will create a lot of wealth, while many can only fill in the blanks with something.
There's also the fact that many doctors and lawyers run their own clinic or law office, or are shareholders in their "employer", while most programmers are merely on the payroll. Being on the payroll alone doesn't make you rich unless you can negotiate your salary or compensation to match your perceived personal capability. This hints that doctors and lawyers might be more fairly compared to entrepreneurs.
"Programmers are interchangeable" seems to come up quite a lot, but I don't think we have a monopoly on it: the only non-interchangeable example that comes to mind is celebrities, where it is the individual's brand that sells.
One thing that may disadvantage us is that programmers are still one big blob: we don't define our specialities strongly enough to the outside world (or to ourselves). For example, teachers have their subjects: one might consider two maths teachers interchangeable but not a maths teacher and a French teacher, and that is obvious to a non-teacher. You and I might see the absurdity of swapping a web programmer for an embedded safety-critical systems programmer, but I don't think it is at all obvious to the outside world.
I think this attitude arises in part because upper management in a non-software firm has no way to measure the quality of the programmers they employ. It may be that they don't use the software themselves, that their process minimizes the influence of any individual programmer, or simply that they have no appreciation for all the mistakes avoided during development.
If management can't see the difference in value-add between a seasoned hacker and a new CS graduate, they won't pay for it.