AI in the 21st Century

What is AI?

A child is born about nine months after the egg is fertilized. It then takes roughly two more years to learn to speak, remember things, and walk. A human child does all of this on its own: it learns from its surroundings, keeps growing, and gets better at making decisions day by day.

When we try to simulate these abilities in a machine or in software, we call it an AI system. In layman's terms, we add intelligence to a device or program using algorithms, such that it starts recognizing and doing the things we have trained it for.

Most probably, you are thinking of sci-fi movies and series such as Terminator, Westworld, Iron Man, and so on.

Real life is not quite there yet. We can classify AI into three categories.

  1. Artificial Narrow Intelligence: An AI is termed narrow when it can perform only a specific task. All current real-world research falls into this category.
  2. Artificial General Intelligence: An AI is called general when its intelligence approaches that of a human being, able to perform any intellectual task a normal human can. For now, it exists only on the screen, and it will take quite some time to become a reality.
  3. Artificial Strong Intelligence: The stage at which AI starts beating humans in terms of intelligence.

You might object that there are already AI systems that beat humans at specific tasks, so why do we say we are still in the narrow-AI stage? Because such an AI is capable of doing only that one specific job. If we throw something at it that it has not seen or been trained for, it collapses. We have AI systems that can detect leucoma just from images of the eye, but the same system will fail on a simple dog image.

Types of AI

In general, AI has three subfields/sub-domains:

  1. Artificial Intelligence
  2. Machine Learning
  3. Deep Learning

Artificial Intelligence

You might have heard about Tesla and its Autopilot mode, where the AI drives the car with little human supervision. It is a good example of Artificial Intelligence. The vehicle drives itself with the help of highly advanced sensors. This involves very complex decisions, such as choosing and changing lanes on a busy road and giving way to a following vehicle based on its driving behavior and real-time circumstances. The promise is that you could sleep while the car takes you to your destination.

It sounds great, but it is still Artificial Narrow Intelligence. It works very well when the vehicle is on a highway, because that is exactly what the algorithm has been trained to drive on. Don't expect it to run everywhere as smoothly as it runs on American roads. Various studies estimate that the AI would need to drive around 11 billion miles to become as good as human drivers, if not better, which is practically infeasible. Hence manufacturers rely on driving simulators instead.

Machine Learning

While watching videos on YouTube, we have a toggle switch that enables auto-play. If the toggle is on, another video starts automatically when the current one ends. This video is not selected at random. Instead, YouTube's algorithms predict which video you are most likely to watch next. The user likes the video and keeps watching; the user views a few more videos, YouTube shows more ads and generates more revenue. It is all done based on the user's data, past viewing behavior, and many other parameters.
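The autoplay idea can be sketched as a toy content-based predictor. This is only an illustration, not YouTube's actual algorithm: it assumes each video is described by hypothetical genre tags and simply scores candidates by how often their tags appear in the viewer's watch history.

```python
from collections import Counter

def next_video(history_tags, candidates):
    """Score each candidate video by how often its tags appear
    in the viewer's watch history, and pick the top one."""
    # Count how many times the viewer has watched each tag.
    tag_counts = Counter(tag for tags in history_tags for tag in tags)
    # A candidate's score is the sum of the counts of its tags.
    def score(candidate):
        _, tags = candidate
        return sum(tag_counts[t] for t in tags)
    return max(candidates, key=score)[0]

history = [["tech", "ai"], ["tech", "python"], ["music"]]
candidates = [
    ("cat video", ["pets"]),
    ("ml tutorial", ["tech", "ai"]),
    ("live concert", ["music"]),
]
print(next_video(history, candidates))  # picks "ml tutorial"
```

A real system would use far richer signals (watch time, session context, collaborative data), but the core loop is the same: score candidates against what you already watch.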

Deep Learning

Recently, the local government started issuing e-challans when anyone rides a two-wheeler without a helmet. The system works by detecting vehicles, faces, helmets, and number plates from camera images. If a rider is not wearing a helmet, the vehicle's registration number is detected and an e-challan is issued. In deep learning, we try to simulate the working of the human brain. Typical problems we solve with deep learning include character recognition and image recognition.
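At the core of this brain simulation is the artificial neuron, a loose software analogue of a brain cell: it takes a weighted sum of its inputs and passes it through an activation function. A minimal sketch, with made-up (not learned) weights standing in for what training would normally produce:

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum of inputs
    passed through a sigmoid activation, yielding a value in (0, 1)."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

# Toy "helmet detector" with two invented image features.
# The weights and bias are illustrative, not learned values.
features = [0.9, 0.2]
weights = [2.0, -1.0]
bias = -0.5
confidence = neuron(features, weights, bias)
print(round(confidence, 3))  # ~0.75
```

A deep network is essentially many such neurons stacked in layers, with training adjusting the weights and biases automatically instead of hand-picking them as above.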

AI in real life

We see some cool robots working on assembly lines in manufacturing plants. Do they use AI? Maybe, maybe not. Most of the time, these assembly-line robots are configured to do one specific task that does not involve AI at all. We issue a fixed set of commands to the machine, and it executes them exactly as given. The robots look smart because of the code and instructions we feed them, but that is not AI.

Tim Shaw and ALS

ALS is a nervous system disease that weakens muscles and impairs physical function. In this disease, nerve cells break down, which reduces functionality in the muscles they supply. The cause is unknown, and there is no cure available.

Tim Shaw is a former American football player. The Tennessee Titans released him in 2013 due to declining performance, and in 2014 Tim announced that he had ALS. With the disease, Tim lost his fluent speech, and it became very hard for people to understand what he wanted to say. A team of experts came together to solve this problem. They built a mobile app with speech recognition tailored to Tim, recording a whole new dataset based on how Tim now spoke.

They trained an AI model on this dataset of Tim's current voice. The final prototype can recognize what Tim is saying despite his trembling voice and speak it aloud to his family members. And since the researchers had voice samples recorded before the disease, the application now narrates in Tim's original voice.

Medical Diagnosis

The brachial plexus nerves, also known as BP nerves, lie near the neck region of the human body. When placing a catheter (a medical device) during surgery, doctors make sure to put it where there are fewer BP nerves. Ultrasound images of that region reveal where the nerves are, but marking that region in a given ultrasound image requires considerable medical experience. There is a public dataset containing such masks that mark the presence of the nerves in ultrasound images.

Hence, after training a model on this dataset, we can predict and annotate the BP nerves in any given ultrasound image. With AI, the detection takes hardly a few seconds.
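Assuming the trained model outputs a binary mask per ultrasound image (as the public dataset's annotations suggest), a small helper could turn that mask into a bounding-box annotation for display. The tiny mask below is invented for illustration:

```python
def mask_to_bbox(mask):
    """Convert a binary segmentation mask (list of rows of 0/1)
    into a bounding box (top, left, bottom, right), or None if empty."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    if not rows:
        return None  # no nerve pixels predicted in this image
    return min(rows), min(cols), max(rows), max(cols)

# A tiny 4x5 "mask" with predicted nerve pixels marked as 1.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(mask_to_bbox(mask))  # (1, 1, 2, 3)
```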


YouTube

YouTube earned $15 billion from ads in the 2019-20 financial year, making it one of the best choices for advertisers. Let's see how YouTube works. YouTubers create most of the content on the platform, spanning almost all genres. Creators keep creating content, and YouTube keeps suggesting similar material to the viewer. Creators who have crossed a minimum viewership threshold can earn money by allowing YouTube to insert ads into their content. The more content users watch on the platform, the more ads YouTube can show, and hence the more revenue it earns.

The backbone of all this is the recommendation system, which suggests videos very close to what you already watch on YouTube.
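One classic way such a recommendation system can work is collaborative filtering: find users whose taste is similar to yours and suggest what they watched. The sketch below uses made-up 0/1 watch vectors and cosine similarity; it illustrates the idea only, not YouTube's actual method.

```python
import math

def cosine(u, v):
    """Cosine similarity between two watch/rating vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(target, others, titles):
    """Recommend an unwatched video liked by the most similar user.
    Vectors: 1 = watched, 0 = not watched."""
    # Find the other user whose taste is closest to the target's.
    nearest = max(others, key=lambda u: cosine(target, u))
    # Suggest something they watched that the target has not.
    for i, title in enumerate(titles):
        if nearest[i] and not target[i]:
            return title
    return None

titles = ["ai talk", "cooking show", "football recap", "python course"]
target = [1, 0, 0, 1]            # likes tech content
others = [[1, 0, 1, 1],          # similar taste, also watched football
          [0, 1, 1, 0]]          # very different taste
print(recommend(target, others, titles))  # "football recap"
```

Production systems combine many such signals with learned models, but "people like you watched this" remains the intuition behind them.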


Netflix

Netflix earns money from subscriptions: every user pays a recurring monthly or yearly charge, and that is how the company generates its revenue. Netflix bets on its recommendation system, which keeps suggesting the best content based on your watch history and other parameters.

There are plenty of opportunities and possibilities for AI in the 21st century, right from medicine, education, logistics, eCommerce, space research, hiring, sales, and marketing onwards. The above are just a few examples of how we are using AI to solve real-life problems.

