YSDA Natural Language Processing course

  • This is the 2025 iteration of the course; materials are added as we prepare them. For the full 2025 course materials, go to this branch
  • Lecture and seminar materials for each week are in ./week* folders; see README.md for materials and instructions
  • For any technical issues, ideas, bugs in course materials, or contribution ideas, add an issue
  • Installing libraries and troubleshooting: this thread.

Syllabus

  • week01 Word Embeddings

    • Lecture: Word embeddings. Distributional semantics. Count-based (pre-neural) methods. Word2Vec: learn vectors. GloVe: count, then learn. Evaluation: intrinsic vs extrinsic. Analysis and Interpretability. Interactive lecture materials and more.
    • Seminar: Playing with word and sentence embeddings
    • Homework: Embedding-based machine translation system (a toy embedding-similarity sketch follows the syllabus)
  • week02 Language Modeling

    • Lecture: Language Modeling: what does it mean? Left-to-right framework. N-gram language models. Neural Language Models: General View, Recurrent Models, Convolutional Models. Evaluation. Practical Tips: Weight Tying. Analysis and Interpretability. Interactive lecture materials and more.
    • Seminar: Build an N-gram language model from scratch (a minimal bigram sketch follows the syllabus)
    • Homework: Neural LMs & smoothing in count-based models.
  • TBU ./week03_attention Seq2seq and Attention

    • Lecture: Seq2seq Basics: Encoder-Decoder framework, Training, Simple Models, Inference (e.g., beam search). Attention: general, score functions, models. Transformer: self-attention, masked self-attention, multi-head attention; model architecture. Subword Segmentation (BPE). Analysis and Interpretability: functions of attention heads; probing for linguistic structure. Interactive lecture materials and more.
    • Seminar: Basic sequence to sequence model
    • Homework: Machine translation with attention (a plain-NumPy attention sketch follows the syllabus)
  • TBU ./week04_transfer Transfer Learning

    • Lecture: What is Transfer Learning? Great idea 1: From Words to Words-in-Context (CoVe, ELMo). Great idea 2: From Replacing Embeddings to Replacing Models (GPT, BERT). (A Bit of) Adaptors. Analysis and Interpretability. Interactive lecture materials and more.
    • Homework: Fine-tuning a pre-trained BERT model (a short fine-tuning sketch follows the syllabus)
  • TBU ./week06_llm Large Language Models

    • Lecture: Scaling laws. Emergent abilities. Prompting (aka "in-context learning"): techniques that work; questioning whether the model "understands" prompts. Hypotheses for why and how in-context learning works. Analysis and Interpretability.
    • Homework: Manual prompt engineering and chain-of-thought reasoning (an example prompt follows the syllabus)
  • Additional lectures to be announced!
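
Illustrative sketches

The snippets below are minimal, hedged sketches of a few techniques named in the syllabus above. They are not taken from the course notebooks; every model name, data sample, and number in them is a placeholder chosen for illustration only.

week01: the embedding-based translation homework builds on two basic operations, cosine similarity between word vectors and nearest-neighbour lookup in an embedding table. A toy NumPy version (random vectors stand in for real pre-trained embeddings):

```python
import numpy as np

# Toy embedding table; in practice the vectors would come from pre-trained
# embeddings (e.g. word2vec / fastText), not from a random generator.
vocab = ["cat", "dog", "car"]
emb = {w: np.random.randn(50) for w in vocab}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def nearest(query_vec, table):
    """Word whose embedding is closest (by cosine) to query_vec."""
    return max(table, key=lambda w: cosine(query_vec, table[w]))

# A word is its own nearest neighbour; with two aligned embedding spaces the
# same lookup gives a crude word-by-word translation.
print(nearest(emb["cat"], emb))
```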
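week02: a count-based bigram language model with add-k smoothing, the kind of model the seminar builds from scratch, fits in a few lines (toy two-sentence corpus, illustrative only):

```python
from collections import Counter

# Toy corpus with sentence boundary tokens; illustrative only.
corpus = [["<s>", "the", "cat", "sat", "</s>"],
          ["<s>", "the", "dog", "sat", "</s>"]]

unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    unigrams.update(sent)
    bigrams.update(zip(sent, sent[1:]))

V = len(unigrams)  # vocabulary size, used for smoothing

def prob(word, prev, k=1.0):
    """P(word | prev) with add-k smoothing."""
    return (bigrams[(prev, word)] + k) / (unigrams[prev] + k * V)

print(prob("cat", "the"))   # seen bigram
print(prob("fish", "the"))  # unseen bigram still gets non-zero probability
```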
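week03: the lecture's scaled dot-product attention, softmax(Q K^T / sqrt(d_k)) V, written in plain NumPy (single head, no masking):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (n_queries, n_keys)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted sum of the values

Q = np.random.randn(2, 8)    # 2 query positions, d_k = 8
K = np.random.randn(5, 8)    # 5 key positions
V = np.random.randn(5, 16)   # value vectors of dimension 16
print(attention(Q, K, V).shape)  # (2, 16)
```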
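week04: the homework is built around fine-tuning a pre-trained BERT model; one gradient step with the Hugging Face transformers library looks roughly like this (model name, toy batch, and hyperparameters are placeholders, not the homework's actual setup):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Pre-trained encoder plus a freshly initialised classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

texts = ["great movie", "terrible movie"]        # toy batch
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)  # the model computes the loss itself
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```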
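week06: the prompting homework works with prompts like the chain-of-thought example below, where a worked demonstration nudges the model to produce intermediate reasoning before its answer (a standard illustration, not the homework's actual prompt):

```python
# A few-shot chain-of-thought prompt: the demonstration answer spells out its
# reasoning, and the model is expected to continue in the same style.
prompt = """Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 balls.
5 + 6 = 11. The answer is 11.

Q: The cafeteria had 23 apples. They used 20 to make lunch and bought 6 more.
How many apples do they have?
A:"""

print(prompt)  # a good continuation ends with "The answer is 9."
```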

Contributors & course staff

Course materials and teaching performed by
