
50 ML projects to understand LLMs: Investigate transformer mechanisms through data analysis, visualization, and experimentation

Limited Time Sale

$36.00 cheaper than the price of a new copy!

Free shipping for purchases over $99
Free cash-on-delivery fees for purchases over $99
Please note that the sale price and tax displayed online may differ from those in-store, and the product may be out of stock in-store.
New $60.00

Product details

Management number 219166544
Release Date 2026/05/03
List Price $24.00
Model Number 219166544

Most books about LLMs teach you how to build language models from scratch or deploy them via APIs. This book does something different: it uses guided machine-learning projects to teach you how to understand, visualize, and investigate LLMs, including GPT and BERT.

Through 50 hands-on, guided projects solved in Python, you will investigate the internal mechanisms of large language models by treating their hidden states, attention patterns, and embeddings as data to analyze. Rather than accepting LLMs as black boxes, you will open them up, examine what's inside, and run experiments to understand why they behave the way they do. All projects are based on Python (using libraries such as NumPy, PyTorch, statsmodels, scikit-learn, Matplotlib, Pandas, and Seaborn) and come with full-solution and partial-solution notebook files, so you can practice and improve your skills in data science, deep learning, data visualization, and scientific and statistical coding.

What makes this book unique:
Each project is built around three learning goals: machine-learning techniques, LLM mechanisms, and Python coding with data visualization. This is not a dense theoretical textbook; it's hands-on, practical, and project-oriented. You will learn how to measure, visualize, and manipulate the internal components of LLMs (including embeddings, transformer outputs, hidden states, attention, and MLP layers) directly.
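To give a flavor of what "treating attention patterns as data" means, here is a minimal sketch (toy random matrices, not code from the book): it computes scaled dot-product attention scores for a single head and summarizes each query's attention distribution by its entropy, i.e., how broadly that query attends over the sequence.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_k = 6, 8

# Toy query and key matrices standing in for one attention head.
Q = rng.normal(size=(seq_len, d_k))
K = rng.normal(size=(seq_len, d_k))

# Scaled dot-product attention scores, softmaxed over the keys.
scores = Q @ K.T / np.sqrt(d_k)
scores -= scores.max(axis=1, keepdims=True)  # numerical stability
attn = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)

# Each row of `attn` is a probability distribution over keys; its entropy
# is high when the query attends broadly and low when it focuses on few tokens.
entropy = -(attn * np.log(attn)).sum(axis=1)
print(attn.shape, entropy.round(2))
```

Once the attention matrix is treated as plain data like this, standard statistics and visualization (heatmaps, entropy histograms, head comparisons) apply directly.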
Projects range from analyzing tokenization and embedding geometry to dissecting attention heads, probing MLP neurons, and running causal experiments that reveal how information flows through a model during inference.

Topics covered include:
- Tokenization schemes and their statistical properties
- Embedding spaces: cosine similarity, semantic axes, and analogy vectors
- Output logits, softmax distributions, perplexity, and language biases
- Layer-by-layer transformer dynamics and dimensionality
- Attention mechanisms: QKV weights, attention scores, head ablation, and activation patching
- MLP subblocks: neuron tuning, mutual information, subspace analysis, and statistics-based causal manipulations
- Logit lens, indirect object identification, and causal tracing

Who this book is for:
This book is for data scientists, ML engineers, and researchers who want to go beyond a surface-level understanding of LLMs. Prior Python experience is required. Familiarity with machine learning or deep learning is helpful but not required; techniques are introduced as they arise throughout the projects.

Practical and accessible:
All code runs on Google Colab, so there is nothing to install and no local configuration required. Each of the 50 projects comes with two Jupyter notebooks: one with hints and incomplete code for guided practice, and one with a complete working solution. All code is freely available on GitHub at https://github.com/mikexcohen/ML4LLM_book

Mike X Cohen, PhD, is a former neuroscience professor and full-time educator with 25 years of experience teaching machine learning, mathematics, and data science. His courses are bestsellers on Udemy, and his textbooks are published by O'Reilly, MIT Press, and independently.
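As a small illustration of the "cosine similarity and analogy vectors" topic above, here is a sketch using tiny hand-made vectors (not the book's code or real model embeddings): it computes cosine similarity and runs the classic king - man + woman analogy test in a toy embedding space.

```python
import numpy as np

def cos_sim(a, b):
    """Cosine similarity: dot product of the vectors over the product of their norms."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hand-made 3-D "embeddings" chosen so the geometry is easy to inspect.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.1]),
    "woman": np.array([0.5, 0.1, 0.9]),
}

# Analogy direction: king - man + woman should land near queen.
target = emb["king"] - emb["man"] + emb["woman"]
sims = {w: cos_sim(target, v) for w, v in emb.items() if w != "king"}
best = max(sims, key=sims.get)
print(best, round(sims[best], 3))
```

With real LLM embeddings the vectors are hundreds or thousands of dimensions, but the analysis is the same: cosine similarity and vector arithmetic over rows of an embedding matrix.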

ISBN13 979-8248428694
Language English
Publisher Independently published
Dimensions 7.44 x 1.17 x 9.69 inches
Item Weight 2.51 pounds
Reading age 12 - 18 years
Print length 519 pages
Publication date February 18, 2026

