Code Confessions Digest #5: News and Resources from Last Week
In this week's digest: GPU Puzzles, Embeddings, Writing a VM in C, Lua in Rust, Real World OCaml, and Probability for Computer Scientists
Hey everyone, welcome to Confessions of a Code Addict. Quite a few new people have joined us since the last article on GPU computing, which was widely appreciated. I'm very grateful that so many people read and enjoyed it.
This post is something new I'm trying: I'll share a few updates about what's coming up at Confessions of a Code Addict, along with some resources that might interest you.
My regular educational articles are free and open to everyone. However, if you find them valuable, consider becoming a paid subscriber to support my writing.
Updates from Confessions of a Code Addict
I was completely wiped out mentally after writing the article on GPU computing, and it has taken a while for me to get back in the saddle. I'm currently working on a couple of new ideas for articles.
There were a few topics related to GPUs that I had to leave out of the last article because of space and time constraints. These topics aren't necessary for understanding the fundamentals of GPU computing, but they are good to know about. I'm tentatively calling the article "GPU Computing: The Missing Parts". I plan to discuss topics such as:
Advanced grid and thread block layouts
Thread block clusters (a feature newly introduced with the Nvidia H100)
Control divergence: How warps execute in the presence of if/else conditions
Independent thread scheduling
Unified virtual memory between CPUs and GPUs
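As a small teaser for the control-divergence topic above, here is a toy Python model of why divergent branches cost extra on a GPU. This is my own illustrative sketch, not material from the upcoming article, and it greatly simplifies real SIMT hardware: it only counts how many branch paths a warp must execute serially when its lanes disagree on an if/else condition.

```python
# Toy model of SIMT control divergence (a simplification, not real hardware):
# all 32 threads in a warp share one instruction stream, so when lanes
# disagree on a branch, the warp executes BOTH paths serially, masking
# off the inactive lanes on each path.

WARP_SIZE = 32

def branch_cost(takes_then, then_cost=1, else_cost=1):
    """Return how many instruction slots the warp spends on an if/else.

    takes_then(lane) -> True if that lane's condition selects the 'then' path.
    then_cost/else_cost are the instruction counts of each branch body.
    """
    then_mask = [takes_then(lane) for lane in range(WARP_SIZE)]
    cost = 0
    if any(then_mask):       # at least one lane takes the 'then' path
        cost += then_cost
    if not all(then_mask):   # at least one lane takes the 'else' path
        cost += else_cost
    return cost

# Uniform branch: every lane agrees, so only one path executes.
print(branch_cost(lambda lane: True))           # prints 1
# Divergent branch (e.g. `if (tid % 2 == 0)`): both paths execute serially.
print(branch_cost(lambda lane: lane % 2 == 0))  # prints 2
```

Under this toy model, a warp whose lanes split on a condition pays for both branch bodies, which is the essence of why divergence-heavy kernels lose throughput.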
Another exciting topic I'm working on concerns CPython internals. It's too soon to disclose the details, but many of you have enjoyed the CPython internals articles, and this one is going to be very hardcore.
I hope you are as excited about these as I am! Next week is quite busy for me, but I’m working as hard as I can to get these out for you as soon as possible. Stay tuned!
Resources for the Week
GPU Puzzles: Learn GPU programming by solving these interactive puzzles. This is the perfect next step after reading the GPU computing article: https://github.com/srush/GPU-Puzzles
Do Language Models Really Understand Language: Whether LLMs understand language or not is a hotly debated topic. This article digs deep into the research in this area and breaks it down for you. Highly recommended read.