In search of a good programming language: C
If there's one thing to take away from my very brief CS curriculum at Yale, it's that C is a terrible programming language and should die in a fire. All these years after the next generation of systems programming languages came out, it appears that nothing has changed, and we are still in the same place, asking the same question. Enough points have been made by everyone in the PL-sphere and dev-sphere about how bad C is, and I don't feel I can add new fuel to the fire. I do want to share some opinions about using C pedagogically, though.
First, recall the triangle of impossibility (fast, safe, easy: pick at most two):
It is well-known that no language can do all three at once. If you are a fast and safe language, you won't be easy to use—you will necessarily have complex guardrails and arcane concepts. If you are an easy and fast language, you won't be safe—users will be able to shoot themselves in the foot if they write whatever they want without being called out. If you are a safe and easy language, you won't be fast—instead of forcing users to fix mistakes, you fix mistakes for them at runtime with GC, dynamic cast, etc., causing slowdown.
Most modern languages we can name lie somewhere on an edge of the triangle. Rust is on the "fast and safe" edge; Python is on the "safe and easy" edge; ironically, few languages come to mind that lie on the "fast and easy" edge; perhaps C++ is the closest. But C, being the granddaddy in the room, sits squarely in the "fast" corner:
- There's no doubt that C is fast. There's basically a one-to-one mapping between C constructs and machine-code semantics (barring optimizations), so there's very little hidden complexity (think: iterators, dispatch tables, etc.).
- C is not safe. Everyone talks about this, but some don't realize how bad it is. The lack of lifetime management is only part of the story: C also lacks type safety, thread safety, bounds checking, null safety, and the list goes on. No, this is not a skill issue; it's a language issue.
- C is not easy. It's easy to learn your `if` and `for` and `struct` and `malloc`, but it gets ugly very quickly: the arcane syntax for pointers and function pointers; the lack of standard libraries, especially for data structures; chasing down memory leaks and bugs; the lack of a build system; and more. (Again, explaining each of these points would be its own piece, but there are already ample discussions about this.)
The question to ask is: What do we want to teach? What impression do we want to give the next generation of programmers? (And remember that these are students who already know "how to program" thanks to AP CS, Python, etc.; they just haven't seen project-scale programming yet.) My answer is: everything other than "write fast code". "Being fast" is a niche engineering concern, not a core educational one; safety, feature richness, and abstraction, on the other hand, should be the norm.
Let us first hear some reasons why C is still taught in the core CS curriculum:
1. It has primitive syntax and semantics, so there are relatively few prerequisites before one can write meaningful and idiomatic code.
2. Its lack of standard libraries is a feature, because it forces students to implement data structures and algorithms from scratch.
3. Its lack of a build system is a feature, because it forces students to learn how to compile and link code.
4. It exposes low-level concepts such as memory management, pointers, calling conventions, and stack space, which shed light on how computers work.
5. It is a common language in systems programming.
I believe that point 1 has been objectively proven irrelevant. Countless people start programming with Python, JavaScript, or even Java, and they can write meaningful and idiomatic code within a few weeks, picking up additional concepts like OOP, HOFs, and iterators along the way. After all, it's increasingly rare to find languages without any of these features, so it's fair to say that the core subset of PL features one must master to be proficient is already beyond what C offers. In that sense, C is actually a hindrance to learning PL concepts, because the idioms (i.e., workarounds for the lack of a feature) you learn there may already be obsolete in other languages.
Point 2 is emblematic of the many practices in education that I hate because they work against the students. They first presuppose that there are certain skills students must have, and then they force students to acquire those skills by making them do things the hard way.1 But if students need to be "forced" into learning something by taking existing tools away from them, it's time to reconsider whether that thing is worth learning at all. When was the last time someone working on real software had to implement their own array list, hash map, or quicksort? Demonstrating the implementation of these data structures and algorithms can be done in any language; why do we have to do it under the pretense of real-life necessity?
There's another problem with point 2: the "reinvent the wheel" mindset is actively dangerous in today's programming world, because nothing actually works the way you were taught and trained to implement it. For example, the state-of-the-art sorting algorithm is usually not plain quicksort but a hybrid sort that provides (1) good performance on small arrays, (2) stability, (3) in-place operation, (4) good cache behavior, and (5) good asymptotic performance. If you write quicksort from muscle memory, you have accidentally introduced overhead into both your code and your development process. The same goes for data structures: it's far too easy to write linked lists or hash maps with bugs.
Points 3 and 4 should be taken together, because they use C as an embodiment of the underlying machine architecture. I can't deny that compilers, systems, and computer architecture are a must-learn for CS majors, but I doubt that C is the best introduction to these concepts, especially if it's used for non-demonstration purposes. I acknowledge that C is the canonical vehicle for examples of memory addressing, call stacks, linking, etc., and I can't find a better alternative. But again, let us separate "demonstration" from "practice". Think of C like those fancy Latin phrases in law: they can embody abstract concepts, but there shouldn't be a need for students to write full articles in Latin, because 99% of the time would be spent on irrelevant, frivolous details of the language.
Point 5, finally, is a chicken-and-egg problem. It's not that we haven't successfully phased out other "mainstream industrial languages": think of COBOL, Fortran, Visual Basic, etc. C's status today is largely due to path dependency: because we have raised generations of engineers to think of C as the gospel of systems programming, they continue to produce, maintain, and consume C code. If they had been conditioned to consider alternatives, perhaps the outcome would be different. In fact, many "systems programming" tasks today are already transitioning to Rust (despite some resistance); it's no longer a revolutionary idea to write a kernel, browser, database, or compiler in a non-C language. And very, very few students today actually begin their software engineering careers in systems programming; most will work on web development, data science, machine learning, etc., where C is not the lingua franca.
I'm not arguing that CS education should be vocational training, though. I need to make it clear that I'm in the opposite camp: CS education should teach exactly those skills one can't pick up on the job. For example, I'm a big proponent of teaching formal languages, automata theory, and functional programming in the core curriculum. I have taken classes whose entire projects were written in Haskell, Racket, or Rocq. The problem with doing the same with C is that it offers no real insight into programming paradigms and software design; students' time is spent wrangling with errors preventable by better PL design, under the belief that "this is a necessary rite of passage for good programmers". But skill issues should be solved by programming languages, not programmers; learn to push your burden onto the language and library designers, not onto yourself.
To recap, let me state my position clearly: C should have no place in the core CS curriculum as a working language for projects. C consumes too much cognitive bandwidth on avoidable errors, crowding out deeper conceptual learning. It can be used for demonstration purposes, but any attention beyond that implies it has merit in today's programming world, which paints the wrong picture. Let us introduce the next generation of programmers to well-abstracted, feature-rich languages suited to the task, be that Python, Rust, or OCaml. And perhaps in another decade, we will be able to bid farewell to C once and for all.
Footnotes
1. In 2026, I can add "forcing students to write code without AI" to this list (I will probably write a separate piece on this topic). ↩