Artificial Intelligence | Towards Data Science https://towardsdatascience.com/tag/artificial-intelligence/ Publish AI, ML & data-science insights to a global community of data professionals. Mon, 15 Dec 2025 20:49:55 +0000

The Machine Learning “Advent Calendar” Day 15: SVM in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-15-svm-in-excel/ Mon, 15 Dec 2025 19:41:01 +0000 https://towardsdatascience.com/?p=607912 Instead of starting with margins and geometry, this article builds the Support Vector Machine step by step from familiar models. By changing the loss function and reusing regularization, SVM appears naturally as a linear classifier trained by optimization. This perspective unifies logistic regression, SVM, and other linear models into a single, coherent framework.
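The loss-function view described in this excerpt can be sketched in a few lines of numpy: swapping logistic regression's log-loss for the hinge loss, while keeping the L2 penalty and the gradient-descent loop, yields a linear SVM. The data, learning rate, and iteration count below are illustrative, not taken from the article.

```python
import numpy as np

# Illustrative 1-D data; labels must be -1/+1 for the hinge loss
X = np.array([[1.0], [2.0], [3.0], [6.0], [7.0], [8.0]])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])

w, b = np.zeros(1), 0.0
lam, lr = 0.01, 0.05  # L2 penalty strength and learning rate

for _ in range(2000):
    margins = y * (X @ w + b)
    active = margins < 1  # only points inside the margin contribute a subgradient
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / len(X)
    grad_b = -y[active].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

pred = np.sign(X @ w + b)  # the prediction rule is still just a linear score
```

Replacing the hinge loss with the log-loss in the same loop recovers logistic regression, which is exactly the unification the article describes.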

The post The Machine Learning “Advent Calendar” Day 15: SVM in Excel appeared first on Towards Data Science.

Lessons Learned from Upgrading to LangChain 1.0 in Production https://towardsdatascience.com/lessons-learnt-from-upgrading-to-langchain-1-0-in-production/ Mon, 15 Dec 2025 10:30:00 +0000 https://towardsdatascience.com/?p=607893 What worked, what broke, and why I did it

The Machine Learning “Advent Calendar” Day 14: Softmax Regression in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-14-softmax-regression-in-excel/ Sun, 14 Dec 2025 18:12:00 +0000 https://towardsdatascience.com/?p=607910 Softmax Regression is simply Logistic Regression extended to multiple classes.

By computing one linear score per class and normalizing them with Softmax, we obtain multiclass probabilities without changing the core logic.

The loss, the gradients, and the optimization remain the same.
Only the number of parallel scores increases.

Implemented in Excel, the model becomes transparent: you can see the scores, the probabilities, and how the coefficients evolve over time.
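The "one linear score per class, then normalize" recipe translates directly to code; the weights, bias, and input below are made up purely for illustration.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # subtract the max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical 3-class model: one row of weights (one linear score) per class
W = np.array([[1.0, -0.5],
              [0.2,  0.8],
              [-1.0, 0.3]])
b = np.array([0.0, 0.1, -0.1])
x = np.array([1.5, -0.5])

scores = W @ x + b       # one linear score per class
probs = softmax(scores)  # normalized into multiclass probabilities
```

Softmax is monotonic in the scores, so the predicted class is simply the one with the largest linear score, just as in the binary case.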

The Skills That Bridge Technical Work and Business Impact https://towardsdatascience.com/the-skills-that-bridge-technical-work-and-business-impact/ Sun, 14 Dec 2025 14:30:29 +0000 https://towardsdatascience.com/?p=607866 In the Author Spotlight series, TDS Editors chat with members of our community about their career path in data science and AI, their writing, and their sources of inspiration. Today, we’re thrilled to share our conversation with Maria Mouschoutzi.  Maria is a Data Analyst and Project Manager with a strong background in Operations Research, Mechanical […]

The Machine Learning “Advent Calendar” Day 13: LASSO and Ridge Regression in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-13-lasso-and-ridge-regression-in-excel/ Sat, 13 Dec 2025 16:56:00 +0000 https://towardsdatascience.com/?p=607908 Ridge and Lasso regression are often perceived as more complex versions of linear regression. In reality, the prediction model remains exactly the same. What changes is the training objective. By adding a penalty on the coefficients, regularization forces the model to choose more stable solutions, especially when features are correlated. Implementing Ridge and Lasso step by step in Excel makes this idea explicit: regularization does not add complexity, it adds preference.
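The "same model, different training objective" point is easiest to see with Ridge, which has a closed form: adding a penalty term to the normal equations is the entire change. The correlated synthetic data below is illustrative; Lasso, which lacks a closed form, would need an iterative solver instead.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two highly correlated features (synthetic, for illustration)
x1 = rng.normal(size=100)
x2 = x1 + 0.01 * rng.normal(size=100)
X = np.column_stack([x1, x2])
y = x1 + rng.normal(scale=0.1, size=100)

# Ordinary least squares: coefficients can blow up on correlated features
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge: identical prediction model, penalized objective, stabler coefficients
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
```

The ridge solution always has a smaller norm than the OLS solution, which is the "preference for stable solutions" the article describes.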

NeurIPS 2025 Best Paper Review: Qwen’s Systematic Exploration of Attention Gating https://towardsdatascience.com/neurips-2025-best-paper-review-qwens-systematic-exploration-of-attention-gating/ Sat, 13 Dec 2025 10:16:00 +0000 https://towardsdatascience.com/?p=607899 This one little trick can bring enhanced training stability, larger learning rates, and improved scaling properties.
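As a rough illustration of the gating idea (not the paper's exact formulation), a sigmoid gate computed from the layer input can be applied elementwise to an attention head's output, bounding and rescaling it. All shapes and weights below are hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gated_attention(X, Wq, Wk, Wv, Wg):
    """Single-head self-attention with an elementwise sigmoid output gate."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # stabilize the softmax
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)
    out = A @ V
    gate = sigmoid(X @ Wg)  # gate values in (0, 1), one per channel
    return gate * out       # gating bounds the magnitude of the head's output

rng = np.random.default_rng(0)
T, d = 4, 8  # sequence length and model width, chosen arbitrarily
X = rng.normal(size=(T, d))
Ws = [rng.normal(size=(d, d)) * 0.1 for _ in range(4)]
Y = gated_attention(X, *Ws)
```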

The Machine Learning “Advent Calendar” Day 12: Logistic Regression in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-12-logistic-regression-in-excel/ Fri, 12 Dec 2025 17:15:00 +0000 https://towardsdatascience.com/?p=607901 In this article, we rebuild Logistic Regression step by step directly in Excel.
Starting from a binary dataset, we explore why linear regression struggles as a classifier, how the logistic function fixes these issues, and how log-loss naturally appears from the likelihood.
With a transparent gradient-descent table, you can watch the model learn at each iteration—making the whole process intuitive, visual, and surprisingly satisfying.
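The gradient-descent table described in the excerpt maps directly to a short numpy loop; the tiny dataset here is a stand-in for the article's, and the learning rate and iteration count are illustrative.

```python
import numpy as np

# Tiny binary dataset (illustrative): one feature, two classes
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    p = sigmoid(w * X + b)         # predicted probabilities
    grad_w = ((p - y) * X).mean()  # gradient of the log-loss w.r.t. the weight
    grad_b = (p - y).mean()
    w -= lr * grad_w
    b -= lr * grad_b

log_loss = -(y * np.log(p) + (1 - y) * np.log(1 - p)).mean()
pred = (p >= 0.5).astype(int)
```

Each loop iteration corresponds to one row of the spreadsheet's gradient-descent table: probabilities, gradients, then updated coefficients.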

The Machine Learning “Advent Calendar” Day 11: Linear Regression in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-11-linear-regression-in-excel/ Thu, 11 Dec 2025 16:31:00 +0000 https://towardsdatascience.com/?p=607891 Linear Regression looks simple, but it introduces the core ideas of modern machine learning: loss functions, optimization, gradients, scaling, and interpretation.
In this article, we rebuild Linear Regression in Excel, compare the closed-form solution with Gradient Descent, and see how the coefficients evolve step by step.
This foundation naturally leads to regularization, kernels, classification, and the dual view.
Linear Regression is not just a straight line, but the starting point for many models we will explore next in the Advent Calendar.
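The closed-form-versus-gradient-descent comparison mentioned above fits in a few lines; the data is a made-up example of a roughly linear relationship.

```python
import numpy as np

X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])  # roughly y = 2x

# Closed-form solution via the normal equations (with an intercept column)
A = np.column_stack([np.ones_like(X), X])
b_cf, w_cf = np.linalg.lstsq(A, y, rcond=None)[0]

# Gradient descent on the same mean-squared-error loss
w, b, lr = 0.0, 0.0, 0.01
for _ in range(20000):
    err = (w * X + b) - y
    w -= lr * (2 * err * X).mean()  # d(MSE)/dw
    b -= lr * (2 * err).mean()      # d(MSE)/db
```

With enough iterations the two solutions agree to several decimal places, which is the point of putting them side by side in the spreadsheet.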

How Agent Handoffs Work in Multi-Agent Systems https://towardsdatascience.com/how-agent-handoffs-work-in-multi-agent-systems/ Thu, 11 Dec 2025 12:00:00 +0000 https://towardsdatascience.com/?p=607875 Understanding how LLM agents transfer control to each other in a multi-agent system with LangGraph
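Framework aside, the core mechanic of a handoff is small: an agent finishes its step and names the agent that should run next. Below is a framework-free sketch of that control transfer; the agent names and state keys are hypothetical, and LangGraph's actual API differs.

```python
# Each agent returns (updated state, name of the next agent or None to stop).

def triage_agent(state):
    nxt = "math_agent" if "sum" in state["task"] else "writer_agent"
    return state, nxt  # hand off by naming the next agent

def math_agent(state):
    state["answer"] = sum(state["numbers"])
    return state, None  # None ends the run

def writer_agent(state):
    state["answer"] = f"Draft about: {state['task']}"
    return state, None

AGENTS = {"triage_agent": triage_agent,
          "math_agent": math_agent,
          "writer_agent": writer_agent}

def run(state, entry="triage_agent"):
    current = entry
    while current is not None:
        state, current = AGENTS[current](state)  # control transfers here
    return state

result = run({"task": "sum these", "numbers": [1, 2, 3]})
```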

The Machine Learning “Advent Calendar” Day 10: DBSCAN in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-10-dbscan-in-excel/ Wed, 10 Dec 2025 16:30:00 +0000 https://towardsdatascience.com/?p=607882 DBSCAN shows how far we can go with a very simple idea: count how many neighbors live close to each point.
It finds clusters and marks anomalies without any probabilistic model, and it works beautifully in Excel.
But because it relies on a single fixed radius, it struggles when density varies across the data; HDBSCAN extends the idea to make the method robust on real data.
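The "count how many neighbors live close to each point" idea fits in a short function. The 1-D points, eps, and min_pts below are illustrative; min_pts counts the point itself.

```python
import numpy as np

def dbscan_labels(points, eps=1.5, min_pts=3):
    """Toy DBSCAN: cluster ids for core/border points, -1 for outliers."""
    n = len(points)
    dist = np.abs(points[:, None] - points[None, :])
    neighbors = [np.where(dist[i] <= eps)[0] for i in range(n)]
    core = np.array([len(nb) >= min_pts for nb in neighbors])
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or not core[i]:
            continue
        # grow a new cluster outward from this core point
        stack = [i]
        labels[i] = cluster
        while stack:
            j = stack.pop()
            if not core[j]:
                continue  # border points join a cluster but do not expand it
            for k in neighbors[j]:
                if labels[k] == -1:
                    labels[k] = cluster
                    stack.append(k)
        cluster += 1
    return labels

pts = np.array([1.0, 1.5, 2.0, 2.5, 10.0, 10.5, 11.0, 25.0])
labels = dbscan_labels(pts)  # two dense groups plus one isolated point
```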

Don’t Build an ML Portfolio Without These Projects https://towardsdatascience.com/dont-build-an-ml-portfolio-without-these-projects/ Wed, 10 Dec 2025 13:30:00 +0000 https://towardsdatascience.com/?p=607871 What recruiters are looking for in machine learning portfolios

Optimizing PyTorch Model Inference on AWS Graviton https://towardsdatascience.com/optimizing-pytorch-model-inference-on-aws-graviton/ Wed, 10 Dec 2025 12:00:00 +0000 https://towardsdatascience.com/?p=607814 Tips for accelerating AI/ML on CPU — Part 2

The Machine Learning “Advent Calendar” Day 9: LOF in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-9-lof-in-excel/ Tue, 09 Dec 2025 17:45:00 +0000 https://towardsdatascience.com/?p=607869 In this article, we explore LOF through three simple steps: distances and neighbors, reachability distances, and the final LOF score. Using tiny datasets, we see how two anomalies that look equally obvious to us can be judged completely differently by different algorithms. This reveals the key idea of unsupervised learning: there is no single “true” outlier, only definitions. Understanding these definitions is the real skill.
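The three steps (neighbors, reachability distances, LOF score) can be followed in code on a tiny 1-D example; k and the data are chosen for illustration. Scores near 1 mean a point is as dense as its neighborhood, while scores well above 1 flag outliers.

```python
import numpy as np

def lof_scores(points, k=2):
    """Local Outlier Factor for 1-D points."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    dist = np.abs(pts[:, None] - pts[None, :])
    # indices of the k nearest neighbors (column 0 is the point itself)
    order = np.argsort(dist, axis=1)[:, 1:k + 1]
    k_dist = np.array([dist[i, order[i][-1]] for i in range(n)])

    def lrd(i):
        # reachability distance: at least the neighbor's own k-distance
        reach = [max(dist[i, j], k_dist[j]) for j in order[i]]
        return k / sum(reach)  # local reachability density

    lrds = np.array([lrd(i) for i in range(n)])
    # LOF: neighbors' density relative to the point's own density
    return np.array([lrds[order[i]].mean() / lrds[i] for i in range(n)])

scores = lof_scores([1.0, 1.2, 1.4, 1.6, 8.0])  # last point is the outlier
```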

Personal, Agentic Assistants: A Practical Blueprint for a Secure, Multi-User, Self-Hosted Chatbot https://towardsdatascience.com/personal-agentic-assistants-a-practical-blueprint-for-a-secure-multi-user-self-hosted-chatbot/ Tue, 09 Dec 2025 16:30:00 +0000 https://towardsdatascience.com/?p=607863 Build a self-hosted, end-to-end platform that gives each user a personal, agentic chatbot that can autonomously vector-search through files that the user explicitly allows it to access.
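One way to sketch the access-controlled vector search at the heart of such a platform is shown below. Random vectors stand in for a real text encoder's embeddings, and the file names and access-control list are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# File-chunk embeddings (random stand-ins for real encoder output)
FILES = {"notes.txt": rng.normal(size=8),
         "budget.csv": rng.normal(size=8),
         "diary.md": rng.normal(size=8)}

# Files each user has explicitly allowed the assistant to access
ACL = {"alice": {"notes.txt", "budget.csv"}}

def vector_search(user, query_vec, top_k=1):
    """Rank only the user's permitted files by cosine similarity."""
    allowed = [(name, emb) for name, emb in FILES.items()
               if name in ACL.get(user, set())]
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(allowed, key=lambda ne: cosine(query_vec, ne[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_k]]

hit = vector_search("alice", FILES["notes.txt"])
```

Filtering by the ACL before ranking, rather than after, is what keeps one user's agent from ever seeing another user's documents.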

The Machine Learning “Advent Calendar” Day 8: Isolation Forest in Excel https://towardsdatascience.com/the-machine-learning-advent-calendar-day-8-isolation-forest-in-excel/ Mon, 08 Dec 2025 18:26:42 +0000 https://towardsdatascience.com/?p=607851 Isolation Forest may look technical, but its idea is simple: isolate points using random splits. If a point is isolated quickly, it is an anomaly; if it takes many splits, it is normal.

Using the tiny dataset 1, 2, 3, 9, we can see the logic clearly. We build several random trees, measure how many splits each point needs, average the depths, and convert them into anomaly scores. Short depths become scores close to 1, long depths close to 0.

The Excel implementation is painful, but the algorithm itself is elegant. It scales to many features, makes no assumptions about distributions, and even works with categorical data. Above all, Isolation Forest asks a different question: not “What is normal?”, but “How fast can I isolate this point?”
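The random-split logic on the dataset 1, 2, 3, 9 can be reproduced with plain Python; the number of trees is illustrative, and the depth-to-score conversion uses the usual Isolation Forest normalization.

```python
import math
import random

def isolation_depth(data, x, rng, depth=0):
    """Number of uniform random splits needed before x sits alone."""
    if len(data) <= 1:
        return depth
    split = rng.uniform(min(data), max(data))
    side = [v for v in data if (v < split) == (x < split)]  # keep x's side
    return isolation_depth(side, x, rng, depth + 1)

data = [1, 2, 3, 9]
rng = random.Random(42)
trees = 200
avg_depth = {x: sum(isolation_depth(data, x, rng) for _ in range(trees)) / trees
             for x in data}

def c(n):
    # expected path length of an unsuccessful binary-search-tree lookup
    return 2 * (math.log(n - 1) + 0.5772156649) - 2 * (n - 1) / n

# short average depths -> scores near 1 (anomalous); long depths -> near 0
scores = {x: 2 ** (-d / c(len(data))) for x, d in avg_depth.items()}
```

The point 9 is typically isolated by the very first split, so its average depth is the shortest and its score the highest.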

The AI Bubble Will Pop — And Why That Doesn’t Matter https://towardsdatascience.com/the-ai-bubble-will-pop-and-why-that-doesnt-matter/ Mon, 08 Dec 2025 15:00:00 +0000 https://towardsdatascience.com/?p=607844 How history’s biggest tech bubble explains where AI is headed next

Optimizing PyTorch Model Inference on CPU https://towardsdatascience.com/optimizing-pytorch-model-inference-on-cpu/ Mon, 08 Dec 2025 12:00:00 +0000 https://towardsdatascience.com/?p=607812 Flyin’ Like a Lion on Intel Xeon

The Machine Learning “Advent Calendar” Day 7: Decision Tree Classifier https://towardsdatascience.com/the-machine-learning-advent-calendar-day-7-decision-tree-classifier/ Sun, 07 Dec 2025 14:30:00 +0000 https://towardsdatascience.com/?p=607847 In Day 6, we saw how a Decision Tree Regressor finds its optimal split by minimizing the Mean Squared Error.
Today, for Day 7 of the Machine Learning “Advent Calendar”, we switch to classification. With just one numerical feature and two classes, we explore how a Decision Tree Classifier decides where to cut the data, using impurity measures like Gini and Entropy.
Even without doing the math, we can visually guess possible split points. But which one is best? And do impurity measures really make a difference? Let us build the first split step by step in Excel and see what happens.
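The split search walked through in the spreadsheet looks like this in code, using Gini impurity only and a made-up one-feature, two-class dataset.

```python
def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1 - p1 ** 2 - (1 - p1) ** 2

def best_split(xs, ys):
    """Scan midpoints between sorted values; pick the lowest weighted Gini."""
    xs_sorted = sorted(xs)
    candidates = [(a + b) / 2 for a, b in zip(xs_sorted, xs_sorted[1:])]
    best_t, best_score = None, float("inf")
    for t in candidates:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(xs)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

xs = [1, 2, 3, 4, 5, 6]
ys = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(xs, ys)
```

On this cleanly separated example the best cut lands between 3 and 4 and drives the weighted impurity to zero; swapping `gini` for an entropy function changes the scores but, here, not the chosen split.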

Reading Research Papers in the Age of LLMs https://towardsdatascience.com/reading-research-papers-in-the-age-of-llms/ Sat, 06 Dec 2025 16:00:00 +0000 https://towardsdatascience.com/?p=607833 How I keep up with papers with a mix of manual and AI-assisted reading

The Machine Learning “Advent Calendar” Day 6: Decision Tree Regressor https://towardsdatascience.com/the-machine-learning-advent-calendar-day-6-decision-tree-regressor/ Sat, 06 Dec 2025 14:30:00 +0000 https://towardsdatascience.com/?p=607840 During the first days of this Machine Learning Advent Calendar, we explored models based on distances. Today, we switch to a completely different way of learning: Decision Trees.
With a simple one-feature dataset, we can see how a tree chooses its first split. The idea is always the same: if humans can guess the split visually, then we can rebuild the logic step by step in Excel.
By listing all possible split values and computing the MSE for each one, we identify the split that reduces the error the most. This gives us a clear intuition of how a Decision Tree grows, how it makes predictions, and why the first split is such a crucial step.
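The "list all possible split values and compute the error for each one" procedure is a short loop; the one-feature dataset below is illustrative, with two clearly different levels in the target.

```python
def sse(ys):
    """Sum of squared errors around the mean (0 for an empty list)."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split_mse(xs, ys):
    """Try each midpoint between sorted x values; keep the lowest total error."""
    xs_sorted = sorted(set(xs))
    candidates = [(a + b) / 2 for a, b in zip(xs_sorted, xs_sorted[1:])]
    best_t, best_err = None, float("inf")
    for t in candidates:
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        err = sse(left) + sse(right)
        if err < best_err:
            best_t, best_err = t, err
    return best_t, best_err

xs = [1, 2, 3, 4, 5, 6]
ys = [5.0, 5.2, 4.8, 12.0, 11.8, 12.2]
t, err = best_split_mse(xs, ys)
```

After the split, each leaf predicts the mean of its own targets, which is why minimizing the squared error around the leaf means is the natural criterion.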
