3. Course Objectives
This course provides a comprehensive introduction to the fundamental optimization theory and algorithms essential for modern data science. Students will explore how optimization underpins machine learning, statistical modeling, and large-scale data analysis. The course covers unconstrained and constrained optimization techniques, starting with core concepts like convexity, gradients, and Hessians. Key topics include Gradient Descent, Stochastic Gradient Descent (SGD), Newton's Method, and quasi-Newton methods (like L-BFGS). Furthermore, the course will delve into Lagrange multipliers and the Karush-Kuhn-Tucker (KKT) conditions for constrained problems, with practical applications in models like Support Vector Machines (SVMs) and regularized linear regression (Lasso and Ridge). Emphasis will be placed on the practical implementation, computational efficiency, and scalability of these methods in a high-dimensional data context, using popular data science programming environments.
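To give a concrete flavor of these methods, the short sketch below applies plain gradient descent to ridge-regularized linear regression, two of the topics named above. It is only an illustrative example, not course material: it assumes NumPy as the working environment, and the function name ridge_gradient_descent, the step size, and the iteration count are arbitrary illustrative choices.

```python
# Illustrative sketch only: gradient descent for ridge regression.
# Assumes NumPy; the function name and hyperparameters are hypothetical choices.
import numpy as np

def ridge_gradient_descent(X, y, lam=0.1, lr=0.1, n_iters=500):
    """Minimize (1/2n)||Xw - y||^2 + (lam/2)||w||^2 by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n + lam * w  # gradient of the ridge objective
        w -= lr * grad                          # one gradient step
    return w

# Small synthetic example
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(100)
print(ridge_gradient_descent(X, y))
```

Because ridge regression also admits a closed-form solution, it is a convenient toy problem for checking the behavior of iterative solvers like the one sketched here.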
4. Prerequisites / Enrollment Requirements
Basic knowledge of calculus, linear algebra, probability theory, and machine learning principles
5. Grading
| Midterm | Final | Attendance | Assignments | Project | Presentation/Discussion | Lab/Practice | Quiz | Other | Total |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 50% | 50% | | | | | | | | 100% |

Remarks:
- Midterm exam 50%
- Final exam 50%
6. Textbooks
| Title | Author | Publisher | Year | ISBN |
| --- | --- | --- | --- | --- |
| Convex Optimization for Machine Learning | Changho Suh | | | |
| Convex Optimization | Stephen Boyd and Lieven Vandenberghe | | | |
7. References and Materials
- Convex Optimization: Algorithms and Complexity, Sébastien Bubeck
- Introductory Lectures on Convex Optimization: A Basic Course, Yurii Nesterov
- Numerical Optimization, Jorge Nocedal and Stephen J. Wright
8. Course Schedule
Week 1: Introduction to Optimization
Week 2: Overview of Convex Optimization
Week 3: Linear Programming (LP), Least Squares (LS)
Week 4: Quadratic Programming (QP)
Week 5: Second-Order Cone Programming (SOCP)
Week 6: Semi-Definite Programming (SDP)
Week 7: Strong/Weak Duality
Week 8: Midterm exam
Week 9: Gradient descent
Week 10: Adaptive methods
Week 11: Second-order methods
Week 12: Dual methods
Week 13: Distributed optimization
Week 14: Nonconvex optimization
Week 15: Advanced topics
Week 16: Final exam
9. Course Operation
- This is a standard lecture-based course.
- Students will take two exams; there is no term project.
- Students will be evaluated using letter grades.
11. Learning Support for Students with Disabilities
- Course-related: real-time text interpretation (hearing impairments), course assistance (developmental disabilities), note-taking (all disability types), etc.
- Exam-related: extended exam time (all types, when needed), enlarged exam papers (visual impairments), etc.
- For any other requests, contact the Center for Students with Disabilities (279-2434).