
AI Tools & Infrastructure

Groq’s Lightning Fast AI Chip (Techopedia) Coverage of Groq’s Language Processing Unit (LPU) — a deterministic, SRAM-based chip architecture designed specifically for LLM inference rather than training. Groq achieves dramatically lower latency than GPU-based inference by eliminating the memory bottleneck through on-chip SRAM and a compiler-driven execution model that avoids runtime scheduling overhead.


Daniel San: Groq + VSCode + Llama3 A tutorial tweet demonstrating how to integrate Groq’s fast inference API with VSCode for real-time Llama 3 code completion — a setup that combines the open weights of Meta’s model with Groq’s sub-100ms token generation latency to create a coding assistant that feels local and actually keeps up with typing speed.
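The plumbing behind such a setup is simple: Groq serves an OpenAI-compatible chat completions endpoint, so an editor integration only needs to POST a JSON body and stream the reply. A minimal sketch of the request construction, assuming the model name `llama3-70b-8192` and the prompt framing below (both illustrative choices, not necessarily what the tweet uses):

```python
import json

def build_completion_request(code_prefix: str, model: str = "llama3-70b-8192") -> dict:
    """Build the JSON body for a fill-in style completion request
    against an OpenAI-compatible chat completions endpoint."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "Complete the user's code. Reply with code only."},
            {"role": "user", "content": code_prefix},
        ],
        "max_tokens": 128,   # keep completions short for low latency
        "temperature": 0.2,  # near-deterministic completions
    }

payload = build_completion_request("def fib(n):\n    ")
# This string would be POSTed to Groq's endpoint with an Authorization header.
body = json.dumps(payload)
print(payload["model"])
```

The editor-side loop then just fires this request on pause-in-typing and splices the streamed tokens into the buffer.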


Vector DB Comparison (superlinked.com) A comparison of vector database options — Pinecone, Weaviate, Qdrant, Chroma, pgvector, and others — across dimensions of throughput, latency, filtering capabilities, hosted vs. self-managed, and cost. An essential reference when selecting infrastructure for RAG or semantic search applications. https://superlinked.com/
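As context for what these engines optimize, the core operation is nearest-neighbor search over embeddings. A brute-force cosine scan in plain Python (toy 3-d vectors, not real embeddings) shows the O(n) baseline that indexes such as HNSW in Qdrant or pgvector's IVFFlat accelerate:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query, corpus, k=2):
    """Return the ids of the k corpus vectors most similar to the query."""
    scored = sorted(corpus.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

corpus = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], corpus))  # → ['doc_a', 'doc_b']
```

The comparison's throughput and latency columns largely measure how well each engine avoids this linear scan at scale while still honoring metadata filters.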


Ilya 30u30 Reading List (arc.net) The widely circulated list of ~30 papers that Ilya Sutskever reportedly considers essential for understanding deep learning — covering attention, ResNets, LSTMs, variational autoencoders, GANs, AlphaGo, and the scaling hypothesis. The list functions as a curriculum for anyone trying to understand the intellectual genealogy of modern AI.


Cursor — The AI Code Editor Cursor is a VS Code fork deeply integrated with LLM code generation — supporting multi-file context, codebase-wide search-aware completions, and a Claude/GPT-4-powered chat interface for architectural discussions. It is arguably the first mainstream IDE built around AI-first workflows rather than one with AI features bolted onto an existing editor. https://www.cursor.com/


Jaynit Makwana: These 4 Guys Completely Revolutionized Coding — Cursor A tweet celebrating the Cursor founding team’s impact on software development workflows, arguing that AI-integrated IDEs have lowered the barrier to programming enough to enable non-programmers to build functional software. A marker of the “anyone can code” discourse that accompanied the widespread adoption of AI coding tools in 2024–25. https://x.com/JaynitMakwana/status/1829390237107200046


After Three Years, Modular’s CUDA Alternative Is Ready (EE Times) An EE Times article on Modular’s GPU computing platform (the Mojo language ecosystem) reaching production readiness as a CUDA alternative — aimed at enabling GPU programming without NVIDIA’s proprietary toolchain. https://www.eetimes.com/after-three-years-modulars-cuda-alternative-is-ready/


AI for Science

AI Discovers New Antibiotics (DeepLearning.AI Batch Issue 231) Coverage of deep learning models — specifically graph neural networks trained on molecular property data — discovering novel antibiotic compounds that kill bacteria resistant to existing drugs. The models identified halicin and subsequent candidates by predicting antimicrobial activity from molecular structure without being constrained to known antibiotic scaffolds.
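The graph-neural-network idea behind this work can be sketched without any framework: atoms are nodes that repeatedly mix their features with averages of their neighbors' features, and a pooled readout feeds a property predictor. The toy graph, scalar features, and fixed 0.5/0.5 mixing below are purely illustrative, not the published model:

```python
def message_pass(features, adjacency, rounds=2):
    """Update scalar node features by mean aggregation over neighbors."""
    for _ in range(rounds):
        new = {}
        for node, neighbors in adjacency.items():
            agg = sum(features[n] for n in neighbors) / len(neighbors)
            new[node] = 0.5 * features[node] + 0.5 * agg  # mix self with neighborhood
        features = new
    return features

def readout(features):
    """Sum-pool node states into a single molecule-level score."""
    return sum(features.values())

# Toy "molecule": a triangle of three atoms with scalar features.
adjacency = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
features = {"a": 1.0, "b": 0.0, "c": 0.0}
score = readout(message_pass(features, adjacency))
print(round(score, 3))
```

Because the update operates on graph structure rather than a fixed molecular fingerprint, the same learned weights apply to any molecule, which is what frees the search from known antibiotic scaffolds.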


Artificial Intelligence to Solve Production Scheduling Problems in Real Industrial Settings — Systematic Literature Review (MDPI) A systematic review of AI methods applied to production scheduling in industrial settings, covering particle swarm optimization, neural networks, reinforcement learning, and hybrid approaches. The paper benchmarks these methods against classical scheduling algorithms and identifies the problem types (job-shop, flow-shop, project scheduling) where AI most consistently outperforms traditional methods. https://www.mdpi.com/2079-9292/12/23/4732
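One of the classical baselines such reviews benchmark AI methods against is Johnson's rule, which yields the optimal makespan for the two-machine flow-shop case. A short sketch with illustrative job times (the job data is made up for the example):

```python
def johnsons_rule(jobs):
    """jobs: {name: (time_on_m1, time_on_m2)} -> makespan-optimal job order."""
    front, back = [], []
    for name, (t1, t2) in sorted(jobs.items(), key=lambda kv: min(kv[1])):
        if t1 <= t2:
            front.append(name)   # fast on machine 1: schedule early
        else:
            back.append(name)    # fast on machine 2: schedule late
    return front + back[::-1]

def makespan(order, jobs):
    """Completion time of the last job on machine 2."""
    m1 = m2 = 0
    for name in order:
        t1, t2 = jobs[name]
        m1 += t1               # machine 1 processes jobs back to back
        m2 = max(m2, m1) + t2  # machine 2 waits for machine 1's output
    return m2

jobs = {"J1": (3, 6), "J2": (5, 2), "J3": (1, 2), "J4": (6, 6)}
order = johnsons_rule(jobs)
print(order, makespan(order, jobs))  # → ['J3', 'J1', 'J4', 'J2'] 18
```

The AI methods surveyed earn their keep on the generalizations (more machines, job-shop routing, stochastic arrivals) where no such polynomial-time optimal rule exists.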


Optimization Methods in Neural Networks (Kaggle Notebook) A Kaggle notebook benchmarking gradient descent optimizers — SGD, Momentum, RMSprop, Adam, Adagrad — on standard datasets, visualizing loss landscapes and convergence trajectories. A practical reference for understanding why Adam remains the default optimizer for most LLM training despite known issues with generalization compared to SGD with momentum. https://www.kaggle.com/code/nadaahassan/optimization-methods-in-nn
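A toy version of what the notebook visualizes: SGD and Adam minimizing the same ill-conditioned quadratic f(x, y) = 0.5·(x² + 100y²). Adam's per-coordinate step scaling handles the mismatched curvatures with one learning rate, while plain SGD must use a rate small enough for the stiff direction. Learning rates and step counts below are illustrative choices:

```python
import math

def loss(p):
    return 0.5 * (p[0] ** 2 + 100.0 * p[1] ** 2)

def grad(p):
    return [p[0], 100.0 * p[1]]

def sgd(p, lr=0.009, steps=500):
    """Plain gradient descent; lr is capped by the stiff y-direction."""
    for _ in range(steps):
        g = grad(p)
        p = [p[i] - lr * g[i] for i in range(2)]
    return p

def adam(p, lr=0.05, b1=0.9, b2=0.999, eps=1e-8, steps=500):
    """Adam with bias correction; one lr serves both curvatures."""
    m, v = [0.0, 0.0], [0.0, 0.0]
    for t in range(1, steps + 1):
        g = grad(p)
        m = [b1 * m[i] + (1 - b1) * g[i] for i in range(2)]
        v = [b2 * v[i] + (1 - b2) * g[i] ** 2 for i in range(2)]
        mhat = [m[i] / (1 - b1 ** t) for i in range(2)]
        vhat = [v[i] / (1 - b2 ** t) for i in range(2)]
        p = [p[i] - lr * mhat[i] / (math.sqrt(vhat[i]) + eps) for i in range(2)]
    return p

start = [5.0, 5.0]
for name, result in [("sgd", sgd(start)), ("adam", adam(start))]:
    print(name, "loss:", loss(result))
```

On real loss landscapes the same mechanics explain Adam's robustness to hyperparameters, and also the generalization gap versus tuned SGD-with-momentum that the notebook discusses.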


Google Personal Health LLM (PH-LLM) — Shrey Jain Tweet A tweet reporting on Google’s release of PH-LLM — a Gemini variant fine-tuned on personal health and wellness data from Fitbit and medical literature. The model can interpret wearable sensor data, answer health questions grounded in the user’s own biometrics, and generate personalized sleep and fitness recommendations. https://x.com/shreyjaineth/status/1800586918117453911


ML Theory & Education

Machine Learning Course — Lecture 1 (Microsoft Research) Microsoft Research’s introductory ML lecture, covering the supervised learning framework, empirical risk minimization, bias-variance trade-off, and the distinction between generalization (test performance) and optimization (training performance). A rigorous starting point that grounds practical ML methods in statistical learning theory. https://www.microsoft.com/en-us/research/video/machine-learning-course-lecture-1/
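The lecture's generalization-vs-optimization distinction can be made concrete with two predictors on synthetic data (y = 2x + noise, an assumption for illustration): a memorizer drives empirical risk to zero but transfers nothing, while ERM over a simple hypothesis class (lines through the origin) optimizes worse on the training set yet generalizes:

```python
import random

def make_data(n, rng):
    """Synthetic regression data: y = 2x + Gaussian noise."""
    return [(x, 2 * x + rng.gauss(0, 0.1)) for x in (rng.random() for _ in range(n))]

def mse(predict, data):
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

rng = random.Random(0)
train = make_data(50, rng)
test = make_data(50, rng)

# Memorizer: zero empirical risk, no generalization (unseen x -> 0.0).
table = dict(train)
memorizer = lambda x: table.get(x, 0.0)

# ERM over lines through the origin: least-squares slope.
slope = sum(x * y for x, y in train) / sum(x * x for x, y in train)
linear = lambda x: slope * x

print("memorizer train/test:", mse(memorizer, train), round(mse(memorizer, test), 3))
print("linear    train/test:", round(mse(linear, train), 4), round(mse(linear, test), 4))
```

The memorizer wins the optimization game (training MSE exactly 0) and loses the generalization game, which is precisely the gap the bias-variance trade-off formalizes.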


Understanding Machine Learning — From Theory to Algorithms (449-page PDF eBook, Kirk Borne tweet) Shai Shalev-Shwartz and Shai Ben-David’s textbook — available free online — covering the PAC learning framework, VC dimension, uniform convergence, regularization theory, and neural network generalization in a mathematically rigorous way. Among the best references for understanding why machine learning works, not just how to use it. https://x.com/KirkDBorne/status/1831074777487827058
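The book's headline quantitative result can be stated in one line. In its notation (true risk L_D, empirical risk L_S over a sample S of size m), the VC generalization bound reads, up to an absolute constant whose exact value varies by derivation:

```latex
% For a hypothesis class \mathcal{H} with VC dimension d, with probability
% at least 1-\delta over an i.i.d. sample S of size m, every h \in \mathcal{H}
% satisfies
L_{\mathcal{D}}(h) \;\le\; L_{S}(h) + C\,\sqrt{\frac{d + \ln(1/\delta)}{m}},
% where C is an absolute constant.
```

The first half of the book builds exactly the machinery (uniform convergence, VC dimension) needed to prove this; the second half uses it to analyze concrete algorithm families.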


Santiago: 7 Baby Steps for ML A tweet-based roadmap by Santiago Valdarrama for beginners entering machine learning: from Python basics through sklearn, neural networks, deep learning frameworks, to production deployment. The simplicity of the 7-step structure makes it useful for advising people at career entry points.


Practical Machine Learning With Rust — San Mateo County Libraries A library catalog entry for a book on implementing machine learning algorithms in Rust — covering linear regression, decision trees, neural networks, and clustering using Rust’s linfa ecosystem. Relevant for performance-critical ML inference use cases where Python’s overhead is unacceptable. https://smcl.bibliocommons.com/v2/record/S76C3263013