Stanford CME 295 Lecture 1 Notes: Transformers and LLM Foundations

Lecture notes and key concepts from Stanford's CME 295 course on Transformers and Large Language Models. Summarizes NLP fundamentals, tokenization, embeddings, sequence models, attention, and the Transformer architecture.