Why I don't like C

This is in response to my last blog post, where I commented that I grew up with C as the lingua franca of computer programming. Despite that handicap, I prefer to avoid C and C-like languages, and for a very good reason. Instead of just explaining why, I am going to quote Bruce Sterling:

As it happened, the problem itself--the problem per se--took this form. A piece of telco software had been written in C language, a standard language of the telco field. Within the C software was a long "do. . .while" construct. The "do. . .while" construct contained a "switch" statement. The "switch" statement contained an "if" clause. The "if" clause contained a "break." The "break" was SUPPOSED to "break" the "if clause." Instead, the "break" broke the "switch" statement.

Without even explaining what went wrong, it's easy to see the problem with a language that leaves room for this kind of ambiguity. I can't deny the benefits one gains from C, and in this case, writing telco programs to drive some very high-end real-time equipment requires a language with those benefits.
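For anyone who hasn't been bitten by this one, here is a minimal sketch of the construct Sterling describes. Every name in it is invented for illustration; the real telco code is obviously not public. The author of the inner break apparently expected it to end only the "if" and continue with the rest of the case, but in C a break always exits the nearest enclosing switch or loop, so the rest of the case is silently skipped:

#include <stdio.h>

/* Sketch of the construct Sterling describes. All names here are
 * invented for illustration; none come from the actual telco code. */
static void handle(int msg, int line_busy)
{
    do {
        switch (msg) {
        case 1:
            if (line_busy) {
                printf("line busy, logging it\n");
                break;  /* meant to end only the "if" and continue with
                         * the status update below; in C, break exits the
                         * nearest enclosing switch or loop, so the update
                         * is silently skipped */
            }
            printf("updating line status\n");  /* never runs when busy */
            break;
        default:
            printf("ignoring message %d\n", msg);
            break;
        }
        msg = 0;  /* one pass through the loop is enough for the sketch */
    } while (msg != 0);
}

int main(void)
{
    handle(1, 1);  /* busy line: the status update is skipped */
    handle(1, 0);  /* free line: the status update runs */
    return 0;
}

The two intended behaviours compile to very different control flow, and nothing on the page distinguishes them.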

I also can't claim that Haskell is free of these problems. Seeing the kind of complexity that monads bring, I can't imagine it's impossible to write obfuscated code. (Actually, I would love to see some horribly obfuscated Haskell code; it would be an interesting learning experience.) As actual program size grows, though, it's important to reduce the error density. People like to measure projects and code by Source Lines of Code (SLOC). Assuming one bug for every thousand lines, a one-million-line banking project would plausibly have 1,000 bugs. There are two approaches to managing and mitigating this problem. The first falls under 'Design Patterns'. The goal of this approach is to create patterns for putting ideas together so that the chance of error per line of code goes down. After all, if a good design philosophy could bring the error rate down to one bug per ten thousand lines of code, the bug count would drop tenfold, and a manager could get a raise.

The other solution is abstraction, which itself comes in two forms that amount to the same thing. The first is the use of verified standard libraries. One example of this is the proliferation of web frameworks. All the data marshalling, network connections, session management, runtime system access, and connectivity to other services can be abstracted cleanly away with libraries that are tested, packaged, and sometimes sold. In some cases there's only so much that can be done through a library, so many languages grow convenient structures for handling more esoteric duties, such as concurrency, object-oriented programming, aspect-oriented programming, certain design patterns, and orthogonality between data structures and language features, all of which require clear support from the syntax. Ultimately, the duty to make it all come together falls on the shoulders of compiler writers, whether the result is a linked-in library or assembler code generated for a standard language construct.

I want to propose a new measurement of code complexity: Logical Lines of Code (LLOC). Writing concurrent code in Erlang, with the runtime handling multiple processes automatically, a program that frobnicates foobars amounts to 70 SLOC. Considering that each of those lines represents 20 lines of thread-safe goodness, the program amounts to 1400 LLOC. A bug rate of 1 bug per 500 SLOC turns into a bug rate of 1 bug per 10,000 LLOC. Suddenly, someone is getting a very big raise. Similarly, each use of a thread-safe data container library in Java stands in for 10 lines of non-thread-safe container code that would have the end developer playing with his own locks and mutexes.
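To make that container point concrete in the C terms this post started with, here is a rough sketch of the lock-and-mutex boilerplate that a single call to a thread-safe container hides. The names and the line counts are mine, invented for illustration, not measured from any real project:

#include <pthread.h>

/* A hand-rolled, mutex-protected counter: the kind of code a
 * thread-safe container library lets you stop writing. */
struct counter {
    pthread_mutex_t lock;
    long            value;
};

static void counter_init(struct counter *c)
{
    pthread_mutex_init(&c->lock, NULL);
    c->value = 0;
}

static void counter_add(struct counter *c, long n)
{
    pthread_mutex_lock(&c->lock);    /* every access needs this pair,
                                      * and forgetting one is a bug the
                                      * compiler will never report */
    c->value += n;
    pthread_mutex_unlock(&c->lock);
}

static long counter_get(struct counter *c)
{
    pthread_mutex_lock(&c->lock);
    long v = c->value;
    pthread_mutex_unlock(&c->lock);
    return v;
}

Multiply that by every shared structure in a large program and the SLOC-versus-LLOC gap in the Erlang example above stops looking like an exaggeration.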

As a side point, Lispers have been claiming for years that they don't need to wait for language designers to invent new features before they can use them: through prudent use of defmacro, any new language feature can be designed on the spot. Without going into what is sometimes called the 'autistic programmer' problem, this leads to a lot of fragmentation in design. LISP seems to have as many design patterns as Java, though they focus a lot more on reducing the SLOC count. It also creates a funny problem where no two people's code ever looks alike. I wouldn't quite call this balkanization, but it makes LISP the interesting black sheep of the language evolution family tree.

Ultimately, languages like Erlang, Haskell, and even F# aren't free of these nasty syntax issues that have bitten the best of the best programmers at least once. Every program is doomed to fail at something, but when trying to reduce the epicness of the fail, you don't want low-level code hanging around your neck.

4 flames:

Kevin Kofler said

Where's the ambiguity? break; can never break out of an if. This was just the programmer either not understanding the language or just not being careful.

Yankee said

It's a matter of language orthogonality. I'm not trying to blame the language, or the programmer per se, but just trying to make the case for using advanced languages for everyday programming.

Again, I don't know whether this particular problem could happen in Haskell, and I almost hope it could, so that no one is ever lulled into a false sense of security.

Anonymous said

You shouldn't be blaming the language. Breaks don't "get you out" of ifs.

And if the code is illegible, that again is the programmer's fault.

Yankee said

I'm not blaming the language, but the human tendency to make silly errors. Using a high-level language takes away those chances for silly human errors and instead gives us silly computer errors, otherwise known as high code quality.