my notes

some notes, cheatsheets and summaries

A subset of my notes for a subset of the courses I took.

Large Language Models (ETH Zürich Spring 2024)

The course is split into three parts. The first part covers the probabilistic foundations of language models, including what formally and theoretically constitutes a language. The second part covers architectures, prompting, transfer learning, RLHF, RAG, and Parameter-Efficient Finetuning. The third part discusses LLM security.

Please note that these notes are heavily based on the lectures and lecture notes. All credit goes to ETH Zürich and the original authors. These are simply my notes used for studying the course. Feel free to email me if you want them removed.

Formal Methods and Functional Programming (ETH Zürich Spring 2024)

This course is split into Functional Programming (FP) and Formal Methods (FM). The FP part focuses on designing and reasoning about functional programs (in Haskell), covering the lambda calculus, higher-order programming, typing, and proofs of correctness. The FM part focuses on deductive and algorithmic validation of programs modeled as transition systems (big/small-step semantics, axiomatic semantics, model checking, linear temporal logic).

Please note that these notes are heavily based on the lectures and lecture notes. All credit goes to ETH Zürich and the original authors. These are simply my notes used for studying the course. Feel free to email me if you want them removed.

Machine Perception (ETH Zürich Spring 2024)

This course begins with the fundamental concepts of deep learning, reviewing backpropagation, convolutional neural networks, and recurrent neural networks. We then move on to Generative Models (the core of the course), diving into Variational Autoencoders, autoregressive models such as MADE and NADE, and Attention & Transformers, followed by Generative Adversarial Networks and Diffusion Models. Next, we go deeper into deep learning for computer vision: Neural Implicit Surfaces, NeRFs, Gaussian Splatting, and representing human bodies with Parametric Human Models (SMPL). Finally, the course concludes with Reinforcement Learning.

Probabilistic Artificial Intelligence (ETH Zürich Fall 2023)

This course teaches how to represent uncertainty via Bayesian Learning (Gaussian Processes, Bayesian Linear/Logistic Regression, Bayesian Deep Learning). It also covers how to deal with intractable distributions (Variational Inference, Markov Chain Monte Carlo). These concepts are then applied to Active Learning, Bayesian Optimization, Markov Decision Processes, and Reinforcement Learning.

Natural Language Processing (ETH Zürich Fall 2023)

This course introduces the fundamental concepts of Natural Language Processing (NLP). We started with backpropagation and multi-layer perceptrons for sentiment classification, language modelling, and Recurrent Neural Networks. Then we moved into the more algorithmic parts of NLP: the Viterbi algorithm for part-of-speech tagging, transliteration with Lehmann's algorithm and WFSAs, CKY for constituency parsing with a CFG in CNF, dependency parsing with the CLE algorithm and the Matrix-Tree Theorem, and semantic parsing with CCGs and LIGs. We concluded the course with Attention and Transformers. These concepts are explored in much greater depth in the graded assignments, which involve semiring theory.

Visual Computing (ETH Zürich Fall 2023)

This course is split into a Vision part and a Graphics part. The vision part covers classical edge detection algorithms, segmentation, the Fourier Transform, optical flow, image/video compression, PCA, Convolutional Neural Networks, and the Radon Transform. The graphics part covers the graphics pipeline, and also goes into Bézier/B-spline curves for animation, surfaces, and ray tracing.