world– by Robert Merkel at Larvatus Prodeo. Published October 14, 2011 at 09:02 AM

The technology world has just lost another giant, though one without the towering public persona of Steve Jobs.

If you’re not actually a programmer, you’ve probably never heard of Dennis Ritchie. But the vast majority of software you use was built using a tool that he originally designed, and the rest by tools that sample very liberally from his.

The “native language” that the central processor in a computer understands is an ornery beast. For one thing, back in the 1970s every two-bit computer company (if you’ll pardon the techy pun) had its own native language; these days, there still remain two very common ones, and dozens of less common examples out there. More importantly, it’s almost incomprehensible, even to most programmers. Take this little snippet, the heart of a very simple program that does nothing but print the message “hello, world” to the screen:

_start:
        mov    eax, 4         ; system call number 4: "write"
        mov    ebx, 1         ; file descriptor 1: standard output
        mov    ecx, str       ; address of the text to print
        mov    edx, str_len   ; length of the text
        int    80h            ; trap into the operating system to do it

Writing long and complicated bits of software with such unhelpful notation is extremely slow and error-prone.

“High-level” languages, which allowed the logic of software to be expressed in more compact and readable notation, had existed since the 1950s; Grace Hopper was responsible for one of the first. Over time, more and more of the scientific and business software run on the large computers of the era was written in FORTRAN, COBOL, and other high-level languages. However, the “operating systems”, the software plumbing that joined those applications to the hardware, were invariably written in the machine language of specific systems.

In the late 1960s, Ritchie, working with Ken Thompson at Bell Laboratories, hacked together their own little operating system for an obsolete computer nobody was making use of. It was small, but it worked, and it was one of the first practical systems to support “timesharing” – the ability for multiple users to run multiple programs simultaneously and interactively. Fairly early on, they had another brainwave: they would rewrite as much of the system – which became known as Unix – as possible in a high-level programming language, to speed development. But first, they needed a suitable high-level language. The resulting language, an evolution of earlier languages named BCPL and B, was called “C”.
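For contrast with the machine-level fragment above, here is a complete “hello, world” program in C (in a slightly modernised dialect; the version Ritchie himself published differs only in small details):

#include <stdio.h>

int main(void)
{
    printf("hello, world\n");   /* print the message, then finish */
    return 0;
}

Even if you have never programmed, you can take a reasonable guess at what each line is for; that readability, multiplied across millions of lines of software, is the whole point.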

Both C and Unix were raging successes, partly because of their inherent strengths. The use of C allowed Unix to be “ported” to many different computer systems, a process that continues today as its spiritual successor Linux, written in C, runs on everything from IBM mainframes (and the amorphous “Googleplex” of Google’s servers which, reportedly, draw 240 megawatts of power), to virtually every smartphone on the planet (the iPhone’s operating system is not Linux, but it is also a derivative of Unix, and substantial parts of it are written in C). They also had the good fortune, as with the Internet and World Wide Web which owe so much to both, that they gradually escaped the crush of intellectual property law to become part of the intellectual commons of the field.

Almost as important was the elegance and economy that Ritchie, along with Brian Kernighan, brought to teaching the language. Their textbook The C Programming Language remains the best programming language textbook ever written, in my view, and the one that I still strongly recommend to my students.

Much of the Windows operating system, and of Mac OS X, is implemented in C. Those parts that aren’t are implemented in computer languages directly derived from it – C++ and Objective-C. Most of the software that runs on those systems is also written in C or its successor languages. And perhaps the most pervasive “new” high-level language of the last 20 years – Java – retains so much of C’s “look and feel” that it often takes a second glance to tell which language a piece of code is written in.
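To see what I mean, take a throwaway function (my own example, nothing of Ritchie’s) that adds up the even numbers below n. It is ordinary C, yet the statements inside its braces are, character for character, also legal Java:

#include <stdio.h>

/* Add up the even numbers below n.  Paste the body of this function
   into a Java method with the same signature and it compiles
   unchanged -- that is how much of C's surface syntax Java kept. */
int sum_of_evens(int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++) {
        if (i % 2 == 0) {
            sum += i;
        }
    }
    return sum;
}

int main(void)
{
    printf("%d\n", sum_of_evens(10));   /* prints 20 */
    return 0;
}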

Neither C nor Unix was by any means perfect. While some of their design faults have been eliminated in their successors, others remain, and will likely continue to bamboozle neophyte (and, all too often, experienced) programmers for generations to come. But there was so much he and his colleagues got right. Fate did play its part, but there are very good reasons that generations of software developers not yet born will express themselves in notations largely based on Ritchie’s.

RIP, dmr.