Teaching Computers to Learn: ML Explained Simply
The term machine learning was coined in 1959 by Arthur Samuel, an IBM employee and pioneer in the field of computer gaming and artificial intelligence. But what is machine learning? Imagine training a dog: you give it a treat every time it sits when you say “Sit,” and after a while the dog figures out how to earn the treat. That is learning from past experience and data. Machine learning does something similar with a computer: it is a way to teach computers to learn from data rather than follow step-by-step instructions.
In traditional programming, we give the computer explicit instructions; in machine learning, we give it data and let it identify patterns and rules on its own. Put simply: in programming you provide the computer with rules, while in machine learning you provide it with examples.
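The contrast can be sketched in a few lines of Python. Below, a spam check is written twice: once as a hand-coded rule, and once as a rule derived from a handful of labelled examples. The examples and the "learning" step are invented for illustration, far simpler than anything a real system would use.

```python
# Traditional programming: we write the rule ourselves.
def is_spam_rule(subject):
    # Hand-coded rule: flag any subject containing "free".
    return "free" in subject.lower()

# Machine learning (a toy sketch): the computer derives the rule
# from labelled examples instead of being told it.
examples = [
    ("win a free prize now", True),    # True  = spam
    ("free money inside", True),
    ("meeting moved to 3pm", False),   # False = not spam
    ("lunch tomorrow?", False),
]

# "Learn" which words appear only in the spam examples.
spam_words, ham_words = set(), set()
for subject, label in examples:
    for word in subject.split():
        (spam_words if label else ham_words).add(word)
learned_spam_words = spam_words - ham_words

def is_spam_learned(subject):
    # Flag a subject if it contains any learned spam-only word.
    return any(w in learned_spam_words for w in subject.lower().split())
```

Notice that no one told the learned version which words matter; it inferred that from the examples, which is the core shift machine learning makes.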
For example, if you want the computer to recognise handwritten numbers, you need to supply it with thousands of images of digits, each labelled with the correct number. The machine analyses the patterns in the pixels, and over time, it begins to learn how the number 2 typically looks different from a 3 or an 8. And just like a dog that improves at obeying commands with more practice, a machine learning model gets better as it is exposed to more data.
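The digit example can be made concrete with a deliberately tiny sketch: made-up 3×3 "images" of pixels, and a nearest-neighbour rule that labels a new image with the label of the most similar training example. Real digit recognisers work on far larger images and far more data, but the principle of comparing pixel patterns is the same.

```python
# Tiny made-up 3x3 "images" (1 = dark pixel, 0 = light), each labelled.
training_data = [
    ([1, 1, 1,
      1, 0, 1,
      1, 1, 1], "0"),   # a crude zero: a ring of dark pixels
    ([0, 1, 0,
      0, 1, 0,
      0, 1, 0], "1"),   # a crude one: a vertical bar
]

def distance(a, b):
    # How many pixels differ between two images.
    return sum(p != q for p, q in zip(a, b))

def predict(image):
    # Label the image like its most similar training example.
    return min(training_data, key=lambda ex: distance(image, ex[0]))[1]

# A slightly noisy "1" is still closer to the training "1".
print(predict([0, 1, 0,
               1, 1, 0,
               0, 1, 0]))  # -> "1"
```

With thousands of labelled images instead of two, the same idea lets the model tell a 2 from a 3 or an 8.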
The next question is: how does a machine “learn”? The key to machine learning (ML) is something known as a model. You can think of a model like a student. A student learns from textbooks (which represent the training data), practices with homework (examples), and then takes a test (new data that the student hasn’t encountered before). The more effective and comprehensive the training, the better the performance on the test. However, a student can make mistakes, especially if they didn’t have good textbooks or resources. Similarly, machine learning models can also make mistakes.
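The student analogy maps directly onto how models are evaluated: train on one set of examples, then grade on examples the model has never seen. The sketch below uses invented numbers and an extremely simple "model" (a single learned threshold) just to show the split between training and test data.

```python
# A toy model: learn a threshold from training data, then check it
# on test data the model has never seen (all numbers are invented).
train = [(1.2, "small"), (1.9, "small"), (7.5, "big"), (8.3, "big")]
test  = [(2.1, "small"), (7.0, "big")]

# "Training": put the threshold midway between the two groups' averages.
smalls = [x for x, y in train if y == "small"]
bigs   = [x for x, y in train if y == "big"]
threshold = (sum(smalls) / len(smalls) + sum(bigs) / len(bigs)) / 2

def classify(x):
    return "big" if x > threshold else "small"

# The "exam": accuracy on the unseen test examples.
accuracy = sum(classify(x) == y for x, y in test) / len(test)
print(accuracy)  # -> 1.0
```

If the training data were poor, the learned threshold would sit in the wrong place and the model would fail the "exam", just like the student with bad textbooks.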
Machine learning is not magic — it’s about teaching computers to recognise patterns, make decisions, and improve over time, just like we do. It’s one of the most exciting and fast-growing fields in technology today, shaping everything from the apps we use to the way we drive, shop, and even get medical care.

