EC203: Information, Meaning & Representation
Compression = Understanding
If you can compress it, you truly grok it
Coverage
- Bits, entropy, surprise: information theory explained through everyday psychology
- Encoding numbers, text, media; Unicode realities
- Compression as meaning extraction
- Ambiguity, context, semantics, grounding
- Symbolic vs distributed representations (LLMs)
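To make "bits, entropy, surprise" concrete: surprise (self-information) is how unexpected one event is, and entropy is the average surprise over a whole distribution. A minimal sketch (the function names are illustrative, not from any course materials):

```python
import math

def surprise(p: float) -> float:
    """Surprise (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy: the average surprise over a distribution."""
    return sum(p * surprise(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per flip.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each flip carries less information.
print(entropy([0.9, 0.1]))   # ~0.469
```

The psychological framing: rare events (low p) are more surprising and therefore carry more information when they happen.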
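One of the "Unicode realities" the encoding module points at: a string's length depends on what you count (code points, bytes, or user-perceived characters), and visually identical text can have different underlying representations. A quick illustration:

```python
import unicodedata

s = "café"
# len() counts code points, not bytes.
print(len(s))                   # 4 code points
# In UTF-8, 'é' (U+00E9) takes 2 bytes.
print(len(s.encode("utf-8")))   # 5 bytes

# The same visible text can also be 'e' + a combining accent (NFD form).
decomposed = unicodedata.normalize("NFD", s)
print(len(decomposed))          # 5 code points
print(s == decomposed)          # False until both sides are normalized
```

This is why naive string comparison and "character counting" both break without normalization.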
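The "compression as meaning extraction" idea can be demoed with any off-the-shelf compressor: structured text has patterns a compressor can model, while random noise has none, so compressed size is a rough proxy for how much structure the data contains. A sketch using the standard library's zlib:

```python
import os
import zlib

structured = b"the cat sat on the mat " * 100
random_bytes = os.urandom(len(structured))

# Repetitive text compresses dramatically; noise barely compresses at all.
print(len(structured), len(zlib.compress(structured)))
print(len(random_bytes), len(zlib.compress(random_bytes)))
```

The compressor "understands" the repeated phrase well enough to replace it with a short description; it can find no such shortcut for the random bytes.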
Goals
- Speak about information precisely without jargon
- Design representations that serve your product + users
- Understand how LLM tokens, embeddings, and vector representations relate
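The token/embedding/vector relationship above can be sketched in a few lines: an embedding maps each token to a vector, and semantic similarity shows up as geometric closeness. The three-dimensional vectors below are made up for illustration; real model embeddings have hundreds or thousands of dimensions.

```python
import math

# Toy embedding table (invented values, not real model weights).
embeddings = {
    "cat":  [0.9, 0.1, 0.0],
    "dog":  [0.8, 0.2, 0.1],
    "bond": [0.0, 0.1, 0.9],
}

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(embeddings["cat"], embeddings["dog"]))   # high: related meanings
print(cosine(embeddings["cat"], embeddings["bond"]))  # low: unrelated
```

The same mechanism underlies embedding-based search and retrieval: tokens become vectors, and "meaning" becomes distance.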
Modules
- Information Theory, but Friendly
- Encoding Everything
- Compression & Cognition
- Semantics & Meaning-Making
- Representations in AI Systems
Connections
Difficulty: ⭐⭐⭐⭐☆ | Time: 12 hours | Prereq: EC103