Why C and C++ are Awful Programming Languages (2010) (radford.edu)

It's pretty funny that the article starts with "[an error occurred while processing this directive]" repeated three times. Maybe the language isn't the only cause of mistakes.
More seriously, the author makes the pretty decent point that certain classes of errors should be caught automatically. He uses buffer overflows as an example (near the bottom). I agree with his conclusion, but want to call out a hidden assumption. Does "automatically" include only the compiler? Of course not! We also have static analyzers, automatic test generators, and other kinds of programs that can help with this. We can and should use those, to improve correctness without either cluttering up the language or adding performance-robbing artifacts in the executable code. While it's true that C and C++ make it harder than it should be to write and use such tools, the languages and the tools need only a small nudge to improve that situation dramatically. They don't need to change their essential natures.
This "unbundling" of other language-related functionality from the compiler toolchain is an important possibility that should not be overlooked. "Do one thing and do it well" and let the user decide which things to do in which order. There are plenty of other good reasons to develop or prefer other higher-level languages. The kinds of problems the author cites are almost irrelevant.
Where in this article does the author explain why C and C++ are awful? He only gives one or two examples. The main argument seems to be that Scheme is easier to teach, but I don't see how that makes C or C++ bad programming languages. I agree that C++ is a horrible mess, but C is not...and it's not hard to learn either (in fact, it was the language I learned programming with). I do agree that Scheme is a very nice language, but that doesn't make every other language out there bad.
Also I don't agree with the notion that programming is all about high level algorithms and abstraction. In the end you're programming a physical computer and you should be aware of that. In some (or even many) cases Scheme or Python or whatever (even C perhaps) may be too abstract for what you're doing, so use what's appropriate.
Yes, what the author doesn't realize is that C is still an extremely important language no matter what, if only to make computers run properly and fast.
If there were good alternatives then maybe we could discard C. But if you want to become a professional developer you SHOULD know C, even if you don't use it regularly. The same way I believe every professional should know basic OS concepts, threads, memory management, architecture, etc.
I think the author should have called it "Why C and C++ are awful _teaching_ languages".
I think it is very important that computer science students learn C early on. Learning concepts like memory management and word alignment (which he mentions) is very critical imo.
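As a concrete illustration of the word-alignment point, here is a minimal C++ sketch of my own (not the article's); the exact sizes and padding are implementation-defined, so the numbers in the comments are only typical:

    #include <cstdio>
    #include <cstddef>

    struct Sample {
        char tag;   // 1 byte
                    // typically 3 bytes of padding so that x is 4-byte aligned
        int  x;     // 4 bytes
    };

    int main() {
        std::printf("sizeof(Sample)      = %zu\n", sizeof(Sample));      // usually 8, not 5
        std::printf("offsetof(Sample, x) = %zu\n", offsetof(Sample, x)); // usually 4
        return 0;
    }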
For other majors which require some programming, it is best if they stick to something like Python, etc.
But the author does make a lot of good points regarding how many holes there are in these languages, and how much of a time sink they can be. In fact I spent the last few days trying to track down this obscure bug where an object allocated in C++ was randomly being freed (according to gdb, a valid address would become 0x1 all of a sudden) and I couldn't figure it out. I implemented some nasty WAR which makes the program run now...
A small aside: I'd be interested to hear how you track down the bug, obviously once you have solved the root cause.
In the embedded world, I would put a breakpoint on a write operation for that address. Once the breakpoint is triggered, I would inspect the trace.
This is in the embedded world, unfortunately the gdb for this platform is rather limited and watch breakpoints don't work as expected.
Essentially, here is what was happening:

- Create a C++ object.
- Try to point some reference to this object, but the object's address is suddenly 0x1 (just by doing one step in gdb) when trying to access one of its properties.
- Tried to run it with valgrind, and it works fine with valgrind, which leads me to believe it is some memory allocation issue with C++ on the heap.
- I modified the C++ class to have a uint64_t variable before the variable declaration. Now the program works fine!
I believe the issue is probably heap corruption at some point, where something overwrites certain addresses. Having that extra unneeded 64-bit int in the heap keeps the object looking valid.
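For what it's worth, here is a hypothetical C++ sketch of that theory (my own reconstruction, not the actual code from this bug): a buffer overrun on a neighbouring heap allocation clobbers the object's first member, and an extra leading uint64_t merely moves the pointer out of the overwritten region. Whether the two allocations actually end up adjacent depends entirely on the allocator, which is also why valgrind's redzones can make the symptom disappear. Everything after the overrun is undefined behaviour by construction, so this may or may not reproduce on any given platform.

    #include <cstdint>
    #include <cstring>
    #include <cstdio>

    struct Obj {
        // Uncommenting this extra leading field mimics the "fix" described
        // above: the stray writes would land in pad instead of name.
        // std::uint64_t pad;
        const char *name = "ok";   // the member that mysteriously becomes 0x1
    };

    int main() {
        char *buf = new char[8];
        Obj  *obj = new Obj;       // may (or may not) land right after buf on the heap

        // Simulated bug elsewhere in the program: writes 8 bytes past the end
        // of buf. If the allocator placed obj adjacent to it, those bytes land
        // on obj->name; under valgrind the heap layout differs, hiding the symptom.
        std::memset(buf, 0x01, 16);

        std::printf("obj->name is now %p\n", (void *)obj->name);

        delete obj;
        delete[] buf;
        return 0;
    }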
> This is in the embedded world, unfortunately the gdb for this platform is rather limited and watch breakpoints don't work as expected.
Thanks for sharing. There is nothing like a JTAG debugger for this kind of bug. Good luck, and hopefully it does not resurface later on.
I kinda agree with the author if you do your software development purely with the programming language. That would better be called hacking or prototyping, though, and it is not exclusive to C or C++. This could also happen with Java, Python or <insert your favourite language here>.
With software engineering, there are a plethora of other tasks and checks that need to be done before releasing it into the wild, e.g. requirements engineering, verification, reviews, etc.
At Ensimag, a leading Grande École in computer science and applied maths, the teaching of algorithms starts with Ada 95!