LLM, Deep Learning, Machine Learning, Gen AI: How to Better Understand the Language of AI

March, 2025

To avoid confusion, let’s clarify what we mean when we talk about AI, Generative AI, machine learning, and deep learning. Understanding these concepts helps us become more aware of their impact, refine our practices, and discuss this evolving topic with greater accuracy.

Artificial Intelligence (AI) is not a new concept, but the progress over the last two years has been remarkable. With this rapid advancement, AI has become accessible to a broader audience—many of whom may struggle to grasp its terminology and implications. To form a well-informed opinion on AI, including its benefits and challenges—such as its environmental impact—it is essential to clarify key concepts like deep learning, machine learning, and large language models (LLMs).

Andres Algaba, an FWO postdoctoral researcher in AI, is also a guest professor in econometrics and mathematics for business and economics at VUB (Vrije Universiteit Brussel). His research focuses on fairness in machine learning, LLMs, and the science of science. He holds a joint PhD in business economics from VUB and UGent, was a researcher-entrepreneur in the VUB Sentometrics project, and is a member of the Young Academy Belgium. Additionally, he serves as a Generative AI coordinator at the university.

Deep Learning, Machine Learning, and AI

AI as we know it today—particularly in the field of generative AI—is often a combination of machine learning and deep learning. “AI is the broadest term; it includes everything related to artificial intelligence automation, including control theory in robotics,” explains Andres Algaba. “AI is much broader than just machine learning.”

Machine learning, a key component of AI, is primarily used for making predictions. However, AI as a whole is not just about prediction; it also encompasses learning, reasoning, and self-correction—elements that aim to replicate human intelligence. “AI is really the all-encompassing term. But a subset of that is machine learning. In machine learning, I would say it's all about prediction through what we call training, where we have input data and aim to predict an output,” Andres Algaba elaborates.

Machine learning algorithms learn by recognizing patterns. “You provide data to an algorithm and let it train by showing it many examples, indicating whether it was correct or not, and measuring how far off it was.” While machine learning is a powerful tool, it is not inherently groundbreaking.
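The train-and-measure loop Algaba describes can be sketched in a few lines. The example below is a deliberately minimal illustration (fitting a single weight with gradient descent on made-up data), not any specific production system:

```python
# Minimal sketch of the train-and-measure loop: fit y = w * x by
# repeatedly predicting, measuring how far off the prediction was,
# and nudging the weight in the direction that reduces the error.

examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs x with targets y (here y = 2x)
w = 0.0               # the single parameter the algorithm learns
learning_rate = 0.05

for epoch in range(200):
    for x, y in examples:
        prediction = w * x
        error = prediction - y          # "how far off it was"
        w -= learning_rate * error * x  # adjust to shrink the error

print(round(w, 2))  # converges toward 2.0
```

Each pass over the examples slightly improves the weight; after enough repetitions the model has "learned" the pattern in the data.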

A practical example is Spotify’s recommendation system. The platform analyzes listening habits to personalize song suggestions. Each interaction refines the algorithm, improving future recommendations—similar processes occur on Netflix or Amazon. However, to create more advanced AI models like ChatGPT, deep learning is necessary. “Deep learning is a subset of machine learning. Machine learning contains all possible prediction algorithms—of which there are nearly infinite—and many are what we call shallow,” Algaba adds.
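One common family of techniques behind such recommendations is collaborative filtering. The toy sketch below (invented listeners and songs, and certainly not Spotify's actual algorithm) shows the core idea: find the listener whose taste is most similar to yours, then suggest something they play that you haven't heard:

```python
import math

# Toy collaborative-filtering sketch: represent each listener by play
# counts per song, find the most similar listener (cosine similarity),
# and suggest a song they play that the target listener hasn't heard.

plays = {
    "ana":  {"song_a": 5, "song_b": 3, "song_c": 0},
    "ben":  {"song_a": 4, "song_b": 2, "song_c": 4},
    "cleo": {"song_a": 0, "song_b": 0, "song_c": 5},
}
songs = ["song_a", "song_b", "song_c"]

def cosine(u, v):
    dot = sum(plays[u][s] * plays[v][s] for s in songs)
    norm = (math.sqrt(sum(plays[u][s] ** 2 for s in songs))
            * math.sqrt(sum(plays[v][s] ** 2 for s in songs)))
    return dot / norm if norm else 0.0

def recommend(user):
    # Pick the most similar other listener...
    neighbour = max((u for u in plays if u != user),
                    key=lambda u: cosine(user, u))
    # ...and suggest their favourite among the user's unheard songs.
    unheard = [s for s in songs if plays[user][s] == 0]
    return max(unheard, key=lambda s: plays[neighbour][s]) if unheard else None

print(recommend("ana"))  # → song_c (ben is most similar and plays it)
```

Every new play updates the counts, which in turn shifts the similarities, which is how each interaction refines future recommendations.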

Like Three Concentric Circles

Deep learning, unlike most other machine learning, relies on multilayered artificial neural networks to replicate human-like decision-making. While deep learning requires vast amounts of data, it functions with minimal human intervention. Conceptually, AI as it is perceived today can be seen as three concentric circles: deep learning at the core, focusing on brain-like processes; machine learning around it, enabling predictive capabilities; and AI as the broadest category.

The precision and accuracy of modern machine learning is partly due to advances in deep learning. Deep learning mimics the human brain’s learning process, allowing systems to improve independently. Without deep learning, self-improving AI models wouldn’t exist.

Deep learning represents a kind of evolution of machine learning. AI today is often a fusion of deep learning, which enables autonomous learning, and machine learning, which specializes in prediction. But AI is even more than that—particularly in the field of Generative AI, where it can create entirely new data.

Despite these advancements, can we truly say AI today replicates human intelligence? Not quite. It would be more accurate to speak of advanced machine learning.

LLMs: The Superpower of Generative AI

A Large Language Model (LLM), built on deep learning, is a computer program trained on vast datasets, allowing it to recognize and interpret human language or other complex information. However, its functioning differs from human reasoning—it makes predictions about the most likely next word based on the input it receives. It constructs sentences word by word, forming responses that appear coherent.
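This word-by-word prediction can be illustrated with a bigram model, which is vastly simpler than an LLM but works on the same principle: count which word most often follows each word in the training text, then build a sentence one word at a time by always choosing the most likely continuation. The tiny corpus below is invented for illustration:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: learn word-to-next-word counts from a
# training text, then generate a sentence one word at a time by
# repeatedly picking the most frequent continuation.

corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1   # count how often nxt follows prev

def generate(start, length):
    words = [start]
    for _ in range(length - 1):
        candidates = follows[words[-1]]
        if not candidates:
            break             # no known continuation
        words.append(candidates.most_common(1)[0][0])
    return " ".join(words)

print(generate("the", 5))
```

An LLM does something analogous at enormous scale: instead of a lookup table of word counts, a deep neural network with billions of parameters scores every possible next word given the entire preceding text.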

Like all deep learning technologies, training an LLM requires massive amounts of data. The model learns by analyzing text, identifying common word sequences, and refining its predictions accordingly. However, human involvement is crucial—annotators label different types of text to ensure contextual understanding.

The lack of transparency surrounding this process raises concerns. It is unknown how many people are involved and their working conditions. In 2023, reports revealed that OpenAI employed workers in Kenya, paying them mere cents per labeled text—a practice that sparked ethical debates about AI development.

An Undeniable Environmental Footprint

“Now, going to environmental impact, learning has an immense computational cost,” says Andres Algaba. “Depending on how you fuel these computational engines, it plays a big role in the environmental impact.” Deep learning relies on multilayered structures, demanding significant computational power to generate accurate responses.

While Andres Algaba does not claim expertise in AI’s environmental footprint, he acknowledges the issue: “I wouldn’t label myself an expert in that field. But we have very little information, especially on LLMs from big companies, which is a shame. More transparency would help assess their true impact. What we do know is that they consume enormous amounts of energy.”

Too Harsh on New Technology?

While it is crucial to weigh AI’s benefits against its drawbacks, Andres Algaba warns against being overly critical of new technologies while neglecting other environmental concerns. “Sometimes I feel we’re overly severe on new technology compared to older technologies or other choices,” he notes. “Because of transparency issues, we scrutinize AI more closely than other sectors.”

He advocates for a balanced approach. “I’m not an expert on the energy aspect, but when I discuss this at the university level, I encourage awareness without being overly critical,” he explains. “Sustainability efforts should be part of a broader strategy. You can’t, for example, not promote cycling to campus while being excessively strict on Generative AI.”

As a Generative AI Coordinator at VUB, Andres Algaba collaborates with Vincent Ginis, another professor at the Data Lab, and another colleague to guide AI best practices. “We oversee data-driven policies, including generative AI,” he explains.

“Practically, we support VUB in developing Generative AI policies for students, lecturers, and researchers,” Andres Algaba continues. “We also provide guest lectures on how these systems work, how to use them, and what the university’s policies are.”

Raising awareness about AI ensures students will be able to use it effectively in their future careers while also understanding its broader impact, both social and environmental. We can at least take comfort in the fact that, with AI, the environmental footprint of digital technology has never been so much in the spotlight!

About

Andres Algaba is an FWO Postdoctoral Researcher in Artificial Intelligence (AI), Coordinator in Generative AI and Guest Professor at VUB, and a Member of the Young Academy Belgium.

Article written by Rémy Marrone for GreenTech Forum Brussels

GreenTech Forum Brussels is the Tech and Sustainability event.
Co-organised with the Belgian Institute for Sustainable IT, GreenTech Forum Brussels will take place 17-18 June, 2025 at La Maison de la Poste in Brussels, Belgium.