How to get a quick primer on Information Theory

Elad Verbin
Aug 29, 2020


I was helping a colleague get up to speed on Information Theory. I realized this might be of general interest, so I’m posting it here. This primer is aimed at people with a general STEM background who want to spend 5–15 hours getting familiar with Information Theory.

To get a primer, pick up the standard reference, Cover and Thomas’s Elements of Information Theory. (If you want a quick impression of the book, you can find a PDF with a quick Google search.)

Start by reading Chapters 1 and 2. Then definitely skip Chapters 3 and 4 — they’re too theoretical and too rigorous. Then read Chapter 5 and/or Chapter 7. (Chapter 6 is super fun, applying information theory to finance and gambling, but probably won’t be very useful.) If you read Chapters 1, 2, 5, and 7, I think you’ll have a great overview of the field, probably enough for most needs.
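Chapters 1 and 2 revolve around Shannon entropy, H(X) = −Σ p(x) log₂ p(x), the core quantity of the whole field. As a minimal illustration of what that number means (my own toy example, not from the book):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum of p(x) * log2(p(x))."""
    # Terms with p == 0 contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(entropy([0.9, 0.1]))  # biased coin: ~0.469 bits, i.e. more predictable
```

The punchline of Chapter 5 is that this number is exactly the best achievable compression rate, in bits per symbol.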

Then, if you want to see some real magic with information theory, you want to read about:
1. the construction of Reed-Solomon codes (which are not in the book at all, but are simple to learn about from other sources);
2. Slepian-Wolf coding, which is covered in Chapter 15, though the treatment there is far too technical;
3. Wikipedia’s timeline of information theory, to discover other gems.
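To see the flavor of (1) before hunting down those other sources: Reed-Solomon encoding just evaluates the message polynomial at many points, and any k of the n evaluations pin the polynomial down again via interpolation. A toy sketch over the prime field GF(257) — the field and the evaluation points here are my illustrative choices; real codecs typically work over GF(2^8):

```python
P = 257  # a small prime, so arithmetic mod P forms a field

def rs_encode(msg, n):
    # Treat msg as coefficients of a degree < k polynomial (low-order first)
    # and evaluate it at the points 0, 1, ..., n-1 (mod P).
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
            for x in range(n)]

def rs_recover(points, k):
    # Lagrange-interpolate the degree < k polynomial through any k known
    # (x, y) pairs, and read back its coefficient list (the message).
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis = [1]  # basis polynomial L_i as coefficients, low-order first
        denom = 1
        for j, (xj, _) in enumerate(points):
            if i == j:
                continue
            # Multiply the basis polynomial by (x - xj).
            new = [0] * (len(basis) + 1)
            for m, a in enumerate(basis):
                new[m] = (new[m] - xj * a) % P
                new[m + 1] = (new[m + 1] + a) % P
            basis = new
            denom = denom * (xi - xj) % P
        scale = yi * pow(denom, P - 2, P) % P  # modular inverse via Fermat
        for m, a in enumerate(basis):
            coeffs[m] = (coeffs[m] + scale * a) % P
    return coeffs

msg = [5, 42, 7]                    # k = 3 message symbols
code = rs_encode(msg, 7)            # n = 7 codeword symbols
survivors = [(1, code[1]), (4, code[4]), (6, code[6])]  # 4 erasures
print(rs_recover(survivors, 3))     # recovers [5, 42, 7]
```

Any 3 of the 7 symbols suffice, which is why these codes tolerate so many erasures — the "magic" is just that k points determine a degree < k polynomial.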

If you want to explore beyond that, be aware that information theory spans electrical engineering, mathematics, and computer science, each with somewhat different nomenclature and focus areas. The materials above are fairly neutral across these sub-cultures, but if you dive deeper you’ll have to decide which one you’re following. In my case, I’m naturally in the computer science camp.

A bit of flavor on why this excites me: I witnessed (and played a tiny part in) a resurgence of information theory in computer science that began around 2000 and is still ongoing. For a sampler, check out these lecture notes from an entire computer science course devoted solely to applications of information theory that give super-strong results in computer science (including the celebrated Direct Sum Theorems and Parallel Repetition Theorems). The impact of information theory on computer science has been staggering. Through this, I became convinced that information theory is an incredibly potent tool. The big challenge is figuring out how to apply it in smart and flexible ways. It’s not a hammer so much as a workshop that makes hammers. I always hope to find new ways to use information theory for all kinds of exciting applications.

As always, if you are working on some computer-science-based idea for a startup, and certainly if it relates to information theory, please drop me a line.
