Second Semester at KAUST as a CEMSE/CS Ph.D. Student
My second semester (Spring 2021) at KAUST CEMSE/CS.
At KAUST, classes are small, and the professors are world-class scientists. This allows direct dialogue with professors on the subject, which yields deep insights and inspiration for papers.
Classes that I took (CS332, CS323, CS398):
As part of my academic load, I took classes at KAUST during my second semester that count toward the Ph.D. qualification requirements and, at the same time, are important for my academic work.
In this post, I would like to share some small insights into each of them.
My academic load during the first semester, which may be of interest to people applying to the CEMSE/CS Ph.D. program at KAUST, is described in the previous note: First-Semester-at-KAUST-post/
Federated Learning, CS 332 with prof. Peter Richtarik
The course was taught by Prof. Peter Richtarik, a professor in the CEMSE division, and it is almost entirely paper-based.
During the class, students, jointly with the professor, prepared high-quality lecture notes on various subjects, including information theory, local optimization methods, second-order optimization methods, software and systems for FL, cryptography, the Internet of Things, and background materials needed to go deep into rigorous mathematical proofs of various statements.
Another part of the class targeted creating publishable work, which in my case actually happened. The course provides a hot start for such work.
If you are a student who wants to go deep into a truly multidisciplinary direction of AI, FL can be a good choice. Due to its working regime (training across billions of devices in the cross-device setting), FL especially requires understanding what we are doing. Heuristic methods in this field can serve as temporary solutions, but in the long term, everything should have a strict mathematical form. We will probably still have tunable parameters when modeling and connecting to reality, but at least these parameters should be decoupled.
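To give a flavor of the local optimization methods mentioned above, here is a minimal toy sketch of FedAvg-style training, where clients run a few local gradient steps and the server averages the resulting models. This is my own illustration in Python/NumPy, not code from the course; the function names and the quadratic toy losses are mine:

```python
import numpy as np

def local_sgd(w, grad, lr, local_steps):
    # Run a few local gradient steps on one client's objective.
    for _ in range(local_steps):
        w = w - lr * grad(w)
    return w

def fedavg_round(w_global, client_grads, lr=0.1, local_steps=5):
    # Each client starts from the current global model and trains locally;
    # the server then averages the resulting local models.
    local_models = [local_sgd(w_global.copy(), g, lr, local_steps)
                    for g in client_grads]
    return np.mean(local_models, axis=0)

# Toy problem: two clients with quadratic losses f_i(w) = 0.5 * ||w - a_i||^2.
a = [np.array([1.0]), np.array([3.0])]
client_grads = [lambda w, ai=ai: w - ai for ai in a]

w = np.zeros(1)
for _ in range(50):
    w = fedavg_round(w, client_grads)
print(w)  # approaches [2.], the minimizer of the average of the client losses
```

In real cross-device FL, the averaging step also has to deal with partial participation, communication costs, and privacy constraints, which is where the mathematical machinery from the course comes in.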
The course allows enough freedom to choose among various topics. It has taught me that preparing good lecture notes on complex subjects in a compact form is challenging.
Prof. Peter Richtarik is open to practical and theoretical work in Federated Learning. He always had time to discuss, help, and initiate projects for the students in his class.
One essential aspect of the course is that you will present lectures. Giving a good live presentation is not an easy skill either, but you can build it by applying yourself in this course.
Deep Learning for Visual Computing, CS323 with prof. Bernard Ghanem
Even though I had encountered Deep Learning (DL) methods during my career at NVIDIA and while taking several classes at Stanford University, this class was very useful.
The course was taught by Prof. Bernard Ghanem from the Image and Video Understanding Lab. As of 2021, Prof. Bernard Ghanem is also the KAUST AI Initiative Leader and Deputy Director of the AI Initiative.
This was a beautiful, intensive course in terms of the amount of material and homework assignments. Prof. Bernard Ghanem encourages questions and discussion during lectures. He accurately highlighted exciting results and helped filter them from temporary heuristics for problems the community cannot yet solve exactly. This course is beneficial for people who want to catch up with state-of-the-art image and video DL methods.
The course is a mixture of classical computer vision methods and DL works from 2012, like AlexNet, up to recent models, including Transformers, PointNet++, ResNeSt, GANs (which can be viewed as a deterministic two-player game), and VAEs, in which the output of the encoder is plugged into the decoder in a probabilistic way. Nowadays, VAEs are competitive with GANs, e.g., the NVAE paper.
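To make the VAE remark concrete, here is a minimal sketch of how the encoder output is plugged into the decoder in a probabilistic way via the reparameterization trick. This is my own toy illustration in PyTorch (the TinyVAE class, layer sizes, and loss wiring are mine, not course code):

```python
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(h_dim, z_dim)  # log-variance of q(z|x)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization: z = mu + sigma * eps with eps ~ N(0, I),
        # so gradients can flow through the random sampling step.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        x_rec = self.dec(z)
        # Closed-form KL(q(z|x) || N(0, I)) for diagonal Gaussians.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1)
        return x_rec, kl

vae = TinyVAE()
x = torch.rand(4, 784)  # a toy batch of flattened images
x_rec, kl = vae(x)
rec = nn.functional.binary_cross_entropy(x_rec, x, reduction='none').sum(dim=1)
loss = (rec + kl).mean()  # negative ELBO
loss.backward()
```

In contrast, a GAN replaces this single reconstruction objective with a two-player game between a generator and a discriminator.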
One excellent and unusual thing, compared to the education model at Stanford, where I took CS229 and CS230 with Prof. A. Ng, is the following.
In reading assignments, students have to read the original papers, answer questions in detail, and prove that they understand the scope and limitations of the approaches in those papers. The course opened my eyes to the fact that in Applied Machine Learning, in 90% of cases it is far easier to read the original papers. Of course, this is not always the case across Applied Math in general, but here it is. In Stanford classes, students were encouraged to read the original papers, but it was not strictly required.
Another special thanks goes to a group of fantastic teaching assistants. The homework starts from first principles, using PyTorch as a computation backend and simple models. By the middle of the course, it reaches pretty complicated schemes for image, video, and point cloud classification. I recommend this course.
The homework assignments were in Python using PyTorch.
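As a flavor of the first-principles style, here is a minimal sketch of fitting a tiny model with a manual SGD update in PyTorch. This toy example (the data, learning rate, and variable names) is mine, not an actual assignment:

```python
import torch

# Toy data: learn y = 2x + 1 from noisy samples.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# Parameters tracked by autograd, updated by hand instead of torch.optim.
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1

for step in range(200):
    loss = ((x * w + b - y) ** 2).mean()  # mean squared error
    loss.backward()                       # autograd fills w.grad and b.grad
    with torch.no_grad():                 # manual SGD step, outside the graph
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # close to 2 and 1
```

The same mechanics underlie the torch.optim optimizers and nn.Module models that the more complicated architectures rely on.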
There is no problem if you have not used PyTorch before, because the homework is structured pretty clearly, and one of its goals is to build your ability to use PyTorch. During the course, I read the official PyTorch documentation, and it is written nicely and clearly. I have organized links to the relevant documentation (my mental picture of the course) in the following table.
If you have never used Python as a programming language, I recommend the following materials:
- These days, the original book on the language, written by its author Guido van Rossum, has been converted into the Language Tutorial: https://docs.python.org/3/tutorial/index.html. It will take 5-7 days to read.
- To pick up at least some technical nuances, I recommend looking into the Language Reference: https://docs.python.org/3/reference/index.html
- A good resource for finding Python programming language nuances is the Table of Contents, available here: https://docs.python.org/3.8/contents.html.
- In the course, you will use the Conda package manager and Jupyter notebooks to prepare reports. These tools are straightforward. My random remarks about these tools: https://sites.google.com/site/burlachenkok/python-relative-things.
CS398 Graduate Seminar organized by prof. Ivan Viola
The Graduate Seminar is a non-credit weekly seminar in various fields connected to the CEMSE division, where the speakers are professors from KAUST or other universities who share their research.