{"id":30969,"date":"2024-02-28T06:55:01","date_gmt":"2024-02-28T11:55:01","guid":{"rendered":"https:\/\/centricconsulting.com\/?p=30969"},"modified":"2024-02-28T07:34:00","modified_gmt":"2024-02-28T12:34:00","slug":"machine-learning-101-part-1-what-are-machine-learning-and-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/centricconsulting.com\/blog\/machine-learning-101-part-1-what-are-machine-learning-and-artificial-intelligence\/","title":{"rendered":"Machine Learning 101: What Are Machine Learning and Artificial Intelligence?"},"content":{"rendered":"

Artificial intelligence and machine learning are hot, but they aren’t the same. Knowing the difference and some AI and ML basics can help you make smarter business decisions.

While artificial intelligence (AI) became a household term in 2022 and 2023 with the launch of OpenAI’s ChatGPT, in the business world it still shares the stage with machine learning (ML). In fact, Microsoft co-founder Bill Gates has even said that ML will be “worth ten Microsofts,” and former Defense Advanced Research Projects Agency (DARPA) Director Tony Tether called it “the next internet.”

However, though these terms are often used interchangeably, they aren’t quite the same thing.

Understanding the differences between ML and AI will help you sort through vendors’ claims when they approach you with new technologies, avoid purchasing services you may not need, and make sure you use the right AI and ML approach.

In this post, we’ll separate the facts from the hype about AI and ML. We’ll start with a brief history of ML and AI, discuss the differences between the two, and give some basics about how they work together. In another post, we’ll go a little deeper into how you can make these tools work for you.

An Overview of Machine Learning and Artificial Intelligence

Though some think ML is a brand-new idea, Arthur Samuel coined the term in 1959. In his words, ML is a “field of study that gives computers the ability to learn without being explicitly programmed.”

With ML, the computer effectively programs itself: the programmer supplies input data along with the desired outputs, and a learning algorithm uses those examples to produce a new program, or model.
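As a minimal illustration of that idea (ours, not from the original post), the sketch below derives a simple decision rule from example inputs paired with their desired outputs, rather than hand-coding the rule:

```python
# A toy "learning" routine: given example inputs and the outputs we want,
# derive a simple rule (a threshold) instead of writing it by hand.
# This is only a conceptual sketch, not a real learning algorithm.

def learn_threshold(examples):
    """examples: list of (value, label) pairs, where label is True or False."""
    positives = [value for value, label in examples if label]
    negatives = [value for value, label in examples if not label]
    # Place the decision boundary halfway between the two groups.
    return (min(positives) + max(negatives)) / 2

# Input data paired with the outputs we want the program to produce.
training_examples = [(2, False), (3, False), (7, True), (9, True)]

threshold = learn_threshold(training_examples)  # the "new program"

def predict(value):
    return value > threshold

print(threshold)    # 5.0
print(predict(6))   # True
print(predict(1))   # False
```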

AI is a more general term that refers to creating computer systems that perform tasks more intelligently or in a more human way. AI works the way people work: it tries things, learns from mistakes, and changes its behavior for the future.

AI also relies on people to function. Human beings must create the datasets and algorithms to make it work correctly. As Gartner Vice President Analyst Alexander Linden writes, “The rule with AI today is that it solves one task exceedingly well, but if the conditions of the task change only a bit, it fails.”

Think of the relationship between ML and AI like this: You can program a computer to demonstrate AI without necessarily using ML, but a computer that uses ML is also using AI.

The most recent AI advances are the large language models (LLMs) that power tools like ChatGPT, Google Bard, Microsoft Bing, and a host of others. They use sophisticated algorithms to recognize patterns in words, numbers, and data in response to queries. They then deliver the responses in the user’s own human language.
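If you are curious what querying an LLM looks like in code, here is a minimal sketch using the OpenAI Python SDK (v1.x). The model name and prompt are placeholders of ours, and the call assumes a valid API key is available in the OPENAI_API_KEY environment variable:

```python
# Minimal sketch of querying a large language model through the OpenAI Python SDK.
# Assumes the `openai` package (v1.x) is installed and OPENAI_API_KEY is set;
# the model name below is only an example.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "In one sentence, how do ML and AI differ?"}
    ],
)

# The model returns its answer in plain, human-readable language.
print(response.choices[0].message.content)
```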

These AI tools, known collectively as “generative AI,” make AI an even more powerful partner to ML. As generative AI creates new content and feeds it into an ML tool, that tool completes its tasks faster, more efficiently, and more accurately.

ML Programming vs. Traditional Programming: A Deeper Dive

In 1998, Tom Mitchell defined ML like this: “A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E.”
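To make the definition concrete, here is a minimal sketch of ours (assuming scikit-learn is installed): the task T is classifying handwritten digits, the performance measure P is accuracy on a held-out test set, and the experience E is the set of labeled examples the model trains on. Accuracy typically improves as the model sees more examples.

```python
# Illustrating Mitchell's definition with scikit-learn:
#   T = classifying handwritten digits
#   P = accuracy on a held-out test set
#   E = the labeled examples the model trains on
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0
)

# Train on progressively more experience (E) and measure performance (P) on task (T).
for n_examples in (50, 200, 800, len(X_train)):
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n_examples], y_train[:n_examples])
    accuracy = model.score(X_test, y_test)
    print(f"{n_examples:>5} examples -> accuracy {accuracy:.2f}")
```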

Let’s unpack this by looking at ML versus traditional programming.

In traditional programming, the programmer uses input data and a set of hand-coded rules to create a program. That program runs on the computer and produces the desired output. Below is how traditional programming works:

\"Centric<\/a><\/p>\n

With ML, in contrast, the programmer creates a predictive model. They identify data samples labeled for a true/false condition, prepare that data, and pass it to a learning algorithm that derives the rules itself. Below is how ML works, with a model built from example inputs to make data-driven predictions:

\"Centric<\/a><\/p>\n

Types of Machine Learning

Broadly speaking, three types of ML algorithms exist:

Supervised Learning

Supervised learning is the most popular ML paradigm. It is also the easiest to understand and the simplest to implement. You can think of supervised learning as “task-oriented” because it typically focuses on a single task. Programmers feed large amounts of labeled data into an algorithm until it can accurately perform the desired task.

Supervised learning comes in two types: regression (also called prediction) and classification.
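To make the distinction concrete, here is a minimal sketch of ours (assuming scikit-learn and made-up toy data): a regression model predicts a continuous number, such as a price, while a classification model predicts a discrete category.

```python
# Regression predicts a continuous value; classification predicts a category.
# Minimal sketch with scikit-learn and made-up toy data (illustrative only).
from sklearn.linear_model import LinearRegression, LogisticRegression

# Regression: predict a house price (in $1,000s) from its size (square feet).
sizes = [[1000], [1500], [2000], [2500]]
prices = [200, 290, 405, 500]
regressor = LinearRegression().fit(sizes, prices)
print(regressor.predict([[1800]]))  # a continuous estimate, roughly 359

# Classification: predict whether a tumor is malignant (1) or benign (0) from its size in cm.
tumor_sizes = [[1.0], [1.5], [4.0], [4.5]]
is_malignant = [0, 0, 1, 1]
classifier = LogisticRegression().fit(tumor_sizes, is_malignant)
print(classifier.predict([[3.8]]))  # a discrete label, expected [1]
```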