
I think Zed is right to point these errors out - there are quite a lot of issues with edge case inputs and undeclared assumptions, that people definitely need to hear about.

That said, he appears to be dealing in absolutes too much. If you care about performance (and let's face it, if you're using C you do; otherwise you probably shouldn't be using C), then sometimes you can't afford as much error checking or error correction as you'd like.

In games (where most of my experience is), it's common to have functions that are 'unsafe' by his definition, but that are hidden in compilation units and not exposed in the header, so that the programmer can control exactly where they're called from. If you have a limited number of 'gatekeeper' interface functions that are actually called from outside the module, and these check/sanitise/assert on their inputs, then the internal private functions can safely assume that they have valid input and just run as quickly and as simply as possible.



There are many other good reasons to use C besides performance.

For instance:

  - cross platform portability

  - predictability

  - reliability

  - long term stability / maintainability

  - low run-time overhead, fine grained control of memory


Don't forget the people.

Long-time C coders are among the best programmers around. It's hard to understand just how damn good they are until one has gotten a chance to work with one or more of them.

The good ones know all levels of the hardware and software stack they're working with. Coupling this knowledge with the raw power of C, they can put together amazing and resource-efficient software in very little time, yet without sacrificing maintainability, security, portability and the other factors you've listed.

These are truly the people who make the impossible become possible.


If this is true, then I'm going to take C more seriously. I have to move myself forward from Turbo C programming.


It is true. How much of the truth is credited to "longtime" and how much is credited to "C" is up for debate, though. Personally I'd put most of it on the "longtime" side, and observe that if you want to learn how to write really efficient, performant code quickly, there are a number of other options developing where you can skip the part where you stab yourself with the language for five years learning where the sharp bits are. C is not dead yet, but I'm feeling increasingly confident we've finally entered the era where its days are now numbered.


I'm a long time C programmer, and in the embedded space, which is a burgeoning development area, C is indispensable. No libraries or external linkages, just raw close to the metal power.

News of C's demise is premature.


For the foreseeable future there will be application domains in which there will never be enough processing power to satisfy the subject matter experts that work in these domains. A very short list includes: atmospheric modeling, CFD applications, HPC simulations/optimization tools ... And that's in the science world. You also have OSs, DBs, web servers, HFT apps ... So, I'm not sure why you'd say that C's days are numbered. Do you think the next high performance OS or MPI implementation will be written in some interpreted language?


Based upon what? From what I've seen C is doing great; it has its particular domains, which in my opinion are low-level programming and performant, portable code with a small footprint.

I can't recall seeing a new language challenging C in the aforementioned areas.


"it has its particular domains which in my opinion are low-level programming and performant, portable code with a small footprint."

I realize you're defending C, here, but I'd just like to use you for a moment to ask why this sort of thing gets stated over and over and over on HN.

C is good for low-level, systems software, embedded systems, and speed.

You know what else it's good for? Hundreds and hundreds of desktop applications, userspace tools, web backends, and everything else.

It's like we keep saying it's for low level stuff, but thousands upon thousands of developers haven't got the memo. Who would dream of writing a media framework in C? The folks who do GStreamer, ffmpeg, MLT. User interfaces? Nah, it's for low-level stuff. Except, you know, GTK and Clutter. It's not for Web programming either, except maybe Apache and Nginx. You wouldn't check your email with it (mutt), or edit code with it (Vim), or make music with it (CSound, SuperCollider), or draw pictures with it (Gimp).

Maybe you're right, and C is a bad language for all this type of stuff, but good grief -- there are a lot of developers shipping working, high-level systems in this language. And it's not like they haven't heard of Haskell or Java or Python -- or C++. In fact, we might ask when Java or Python is seriously going to challenge C in these areas.


Agreed, but I was mainly focusing on the aforementioned domains in which C is the 'de facto' standard due to its properties.

However, I believe there are lots of applications written in C because that was either the 'language du jour' back when they were written, or because the author was very proficient in it; these could be rewritten in something like Go, Java, C#, or even Python and similar without any perceptible loss of performance.

Looking at things where C dominates to this day and shows no sign of weakening, I see operating-system-level code (kernel, drivers, userland<->kernel interface code), libraries for CPU-intensive workloads (audio/video codecs, compression/decompression, low-level game framework components, etc.), and of course CPU-intensive applications (encoders, archivers, graphics manipulation software for 2D/3D, etc.). I guess I should squeeze in version control software here as well :)

However, there exists a wide range of applications outside of these areas where I think C isn't particularly competitive these days, as its strengths are dwarfed by its weaknesses.

That's not to say that there is anything wrong with writing such applications in C, particularly considering aspects such as proficiency and familiarity; if you know C very well and feel very comfortable programming in it, then it's less likely you'd want to switch to something else for 'convenience'.


Based upon the increasing number of languages showing up that combine high performance with higher levels of abstraction than C, and the increasing number of serious languages gunning for C specifically, like Rust.

C has ridden for a long time on the fact that we didn't know how to combine high performance and systems functionality with high abstraction languages, so you had your choice of C or C-like and fast, or high abstraction and slow, like Python or Perl or Ruby. This gap is closing, fairly quickly now, and once it does C will start experiencing a rapid decline into a niche language, rather than the lingua franca of computing. It has advantages that have kept it going for a long time, but it has terrible disadvantages too, and once the two are separable, people are going to want to separate them.

Already a great deal of what was once automatically C or C++ has gone to some sort of bytecode (Java or .Net), even on things you once would have automatically assumed to be "embedded", like your cell phone. The decline has already started.

Of course it won't die. Computer languages never really die. You can still get a job doing COBOL, and in fact get paid quite well. But 10 years from now, I think on the systems considered "modern" at the time, it will not be the "default" language anymore.


Rust is the sole _potential_ C competitor I've come across, but the language isn't even finalized last I checked, so it hardly constitutes a challenge to C as of yet. A new language showing up doesn't mean it will gain any traction.

As for the higher level languages closing the gap, that is certainly not my experience, and I've seen no benchmarks to this effect. And I'm not talking about Python or Ruby and the likes, which are in an entirely different category; I'm talking about C#, Java, etc. They are still a lot slower on CPU-intensive code, and they certainly don't fulfill other important properties of C, like a small memory footprint.

The notion that there was some 'magic piece of the puzzle' missing which has now been solved, making high level languages perform like low level languages such as C, comes across as nonsense to me. Higher level languages sacrifice speed for convenience and safety; depending on your demands that can often be a very wise sacrifice, but in the areas where C excels it's often not.

Also, cell phones (which are really the upper end of 'embedded') run a kernel and system-level code written in C doing the heavy lifting system-wise, and performance-demanding applications (higher-end games) are often still written in native code on these phones.

There's certainly room for a more 'modern' C, maybe something else has replaced C in 10 years, but I don't see it being any of the languages we are discussing today, not even Rust. As it stands, I think C will remain dominant in the aforementioned domains.

Guess we are going to have to disagree, history will prove one of us right I suppose :)


ATS is far better than Rust and C.


I don't know, Rust has Mozilla behind it, and lots of active development and (for a new programming language) a pretty decent community.

ATS is hosted on SourceForge, and looks to be someone's research project. That's totally fine, and it may actually be _better_, but the real world often doesn't care about better.

If I was a betting man, I'd be putting chips down on Rust. (and I sort of am...)

That said, the more programming languages the merrier!


Rust may be more successful, but at this time ATS is a better C replacement. Mainly because Rust is currently tied to a runtime and garbage collector, but also because ATS has better features for describing the API boundaries when calling into C and back (i.e. FFI).

Rust will grow into these areas I'm sure because Mozilla will need them for Servo and safer programming.


Seems fair.


Care to elaborate? I'm curious; I don't know ATS.


The positives about C:

It's high-level assembly language. Thus, it allows you to write code with a very small barrier between you and the machine instructions, but with enough of a barrier that you can reuse and organize your code logically. At the same time, it gives you most of the speed and raw bit manipulation of machine code.

The negatives about C:

You're almost as likely to introduce completely silent catastrophic memory corruption with every operation as you are to actually accomplish the task you're trying to do. Most of the things which would normally be bugs are also now major security breaches. C's structure also makes it extremely difficult to do static analysis, making it difficult to find bugs, security vulnerabilities, and make performance optimizations.

How ATS helps:

ATS addresses a lot of these issues by emitting efficient C code. However, at a higher level ATS combines C with an extremely strong type system (dependent typing) which allows you to do things like verify statically that there are no out-of-bound memory accesses through typing. It can do even stronger things like prove your code correct.


Here's an example of how ATS makes a C API safer, similar to the API in the K&R example (it deals with memory/string copying): http://www.bluishcoder.co.nz/2012/08/30/safer-handling-of-c-...

ATS provides all the low level hackery that C can do, but allows more information to be encoded in types to ensure that it is safe. It also provides high level features from functional programming languages, like higher order functions, pattern matching, etc.


Any new language will have a maturity gap to make up for that will be hard to bridge.

Java has to some extent displaced C++ (and the .net framework has done some more of that), but I haven't seen any language that displaced C in an arena where C is strong.

For Rust to make this happen they'll have to finalize the language spec, gain a decade-plus of experience in what the quirks are (these things only come out over time, it seems), and sway a generation of programmers to adopt it rather than the incumbent.

Go is shooting for the same space and it already lost the plot in several aspects (for instance: newer Go versions break older code).


Go is not shooting for the same space. Go is garbage collected.


Pre-version 1, there were no promises that your code would not be broken. In fact, quite the opposite is true. It is hard for me to classify this as a failing of Go when it was exactly according to plan. Now, if future versions break version 1 code...


Acknowledged, however I also think that C's inertia will keep it going for a long time. Because System X was written in C, the next iteration will be written in C, goto 10. Also, for scientists (and others, I'm sure), it is hard to convince them to learn a new language when they are comfortable with their current language and are not convinced of the benefits of switching.


If you're going to take C more seriously, then I suggest you take Lua and the Lua VM more seriously, too. The reasons why:

* You can put the Lua VM in any C project, quite easily.

* The Lua VM is exceptionally easy-to-understand C, and is highly portable to boot (extreme platform plasticity), thus a great project to learn from.

* The field of scriptable VM-hosting has a bright future in software development.


Not to mention:

- lightweight ABI, accessible from just about any other language on the planet.

- huge existing base of liberally-licensed code available for re-use

- basically the only language that is not a) legacy-stamped or b) bloated beyond learnability after more than two decades of use (or, in C's case, twice that.)

I seldom use C myself, but its strong points are undersold by the "garbage collection is to slooooow" kiddies.


Though it might come across as 'dealing in absolutes,' Zed is making the broader point that when one starts learning programming, it's not good to put any book on a pedestal and blindly follow their style - without working through the code.

It's similar to literature, where people tend to apply the word 'classic' to works they rarely bother to read, understand or appreciate.

Your example of unsafe functions might work for you but might not be the best example in a book which is being revered as "the classic book on C" and which influenced almost every major language and book thereafter.


As an interesting analogy, some great thinkers have heavily criticised Shakespeare for some terrible imagery - notably Wittgenstein and Tolstoy.

http://bloggingshakespeare.com/do-we-really-like-shakespeare


I agree - that's why I started by saying that Zed is right to point them out, as I realise that he is teaching people new to C (and potentially to programming). I added my caveats just to point out to the slightly more experienced programmers here that it's not quite the whole story.


I'm basically replying to this comment so I can find it again.

I think your point about unsafe functions being fine as long as access to them is only possible through controlled access points that verify sane input is one of the most interesting points I've seen in a while.


This is exactly my experience working with engine code--there are some very clean, fast, precise, and horrifically unsafe functions in our rendering system and other places that are acceptable because we can make guarantees about what data gets there from elsewhere.

The mental image is an access panel, behind which lie hundreds of whirling razor-sharp gears--presumably if you open it you know what you're doing and are careful.

This is what people pushing for abstraction are about, and why they're correct in some cases.



