3. Course Objectives
Large language models (LLMs) have transformed natural language processing over the past few years; they are used not only for NLP tasks such as question answering, translation, and generation, but also for multimodal data analysis. This course studies recent LLM-related research topics, including the technical foundations (BERT, GPT, T5, MoE), emergent abilities (few-shot learning, in-context learning), fine-tuning and adaptation, and advanced topics (trustworthy LLMs, RAG, calibration). We will discuss important papers on each topic in depth. Students are expected to regularly present research papers and to complete a research project at the end of the semester.
Learning goals:
1. Prepare you to conduct research on topics related to LLMs.
2. Practice research skills, including reading and critiquing research papers, giving oral presentations, and writing research papers.
4. Prerequisites
Students are expected to have previously taken machine learning and NLP courses, or to be familiar with deep learning models such as the Transformer.
5. Grading
Class participation (30%): In each class, we will cover 2-3 papers. You are required to read one of these papers in depth and submit 1-3 paragraphs of comments (questions, weaknesses, or new ideas) by 11:59pm on the day before the lecture. These comments are designed to test your understanding and stimulate your thinking on the topic, and they count toward class participation. In the last 20 minutes of each class, we will review and discuss these comments. (Each student may skip comments up to three times during the course.)
Presentations (30%): Each student will be asked to deliver a 50-60 minute lecture on an assigned topic. The goal is to educate the rest of the class about the topic, so think carefully about how best to cover the material (be sure to include background), prepare good slides, and be ready for many questions. Topics and papers will be posted on PLMS, but you may pick your own papers with the consent of the instructor. (We recommend choosing papers recently published at top conferences such as ACL, EMNLP, and NAACL.)
You are also required to upload your slides (ppt/pdf) to PLMS by 11:59pm on the day before the lecture.
You are expected to present 1-2 times and you will receive feedback on your presentation from 2-3 classmates.
Final project (40%): Everyone is required to complete a class project related to LLMs and submit a final paper. You may work in a team of 1-2. (A two-person team is expected to undertake a larger project or achieve a higher level of completion.) Each team must submit a proposal and a final paper. In-class project presentations will be scheduled at the end of the semester.
6. Textbook
No textbook.
7. References and Materials
Jurafsky & Martin's Speech and Language Processing, 3rd edition draft (https://web.stanford.edu/~jurafsky/slp3/), is an NLP textbook you can consult for specific topics in NLP.
We recommend studying the Hugging Face NLP course (https://huggingface.co/learn/nlp-course/chapter1/1) if you are not familiar with NLP programming.
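As a rough illustration of the kind of NLP programming assumed in this course, the minimal Python sketch below runs a small pretrained model with the Hugging Face transformers library; the model name ("gpt2"), prompt, and generation settings are illustrative choices, not course requirements.

    # Minimal sketch: generating text with a pretrained model via transformers.
    # The model name ("gpt2") and prompt are illustrative only.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")
    result = generator("Large language models are", max_new_tokens=20)
    print(result[0]["generated_text"])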
8. Weekly Course Plan
W1: Course Introduction, LLM and Transformers
W2: BERT, Encoder-decoder (T5) vs decoder-only models (GPT-3)
W3: MobileLLM, LoRA (No Tue class)
W4: Parameter-Efficient Fine-Tuning (PEFT), In-context learning
W5: LLM calibration (No Thu class)
W6: Scaling, RAG 1
W7: RAG 2, RAG 3
W8: No midterm exam
W9: RAG 4, Knowledge editing
W10: Knowledge distillation, MoE 1
W11: MoE 2, Multimodal LLMs
W12: LLMs for Robotics, Time-aware LLMs
W13-W15: Project presentations
W16: No final exam
11. Learning Support for Students with Disabilities
- Coursework: real-time text interpretation (hearing), course assistance (developmental), note-taking support (all types), etc.
- Exams: extended exam time (all types, as needed), enlarged exam papers (visual), etc.
- For any other requests, contact the Support Center for Students with Disabilities (279-2434).