Breaking the myths on Artificial Intelligence

Automatic Summary

Busting Popular Myths About Artificial Intelligence

Hello, everyone! It's a pleasure to be here. My name is Priya Gandhi and I'm the global head of QA and automation at Hogarth Worldwide, which is part of the WPP group. Today, my mission is to unravel some of the common misconceptions about artificial intelligence.

Understanding Artificial Intelligence and Its Significance

The tech world brims with buzzwords: artificial intelligence, machine learning, blockchain, augmented reality, and more. While these terms may sometimes seem puzzling and daunting, one must not let this labyrinth of jargon deter curiosity.

"The advancement in technology is rapid and to keep up with it, we need to be familiar with these buzzwords."

Artificial intelligence technology already permeates our day-to-day lives. From predictive analytics to email intelligence to facial recognition, AI touches many dimensions of our lives. It aids medical diagnosis, credit card fraud detection, mortgage loan approval, and more. The possibilities with AI are endless, so it is vital to break down the myths surrounding this ground-breaking technology.

Debunking Common AI Myths

In this session, I will deconstruct the top five enduring myths about artificial intelligence.

Myth 1: Artificial intelligence, machine learning, and deep learning are the same
In fact, these three are interrelated but not identical. Artificial intelligence refers to intelligent behavior enabled by a set of algorithmic rules. Machine learning, a subset of AI, allows machines to learn from data and make predictions. Deep learning, a branch of machine learning, uses neural networks to replicate human-like thinking.

Myth 2: Robots and AI are synonymous
In fact, robotics and AI are distinct fields that both assist with automation. When combined with AI, robots become AI robots, able to perform complex tasks efficiently.

Myth 3: AI and the Internet of Things (IoT) don't blend
In reality, when paired with AI, IoT devices can analyze data, make decisions, and act without human intervention, a combination known as AIoT.

Myth 4: AI won't impact my industry
Gartner, a global research and advisory firm, suggests that about 85% of enterprises will deploy AI automation technology by 2022. AI cuts across industries, equipping them with insights and a competitive edge.

Myth 5: AI will replace humans
This myth assumes AI and humans possess identical abilities, whereas each brings unique capabilities and strengths to the table. The future of work lies in a hybrid intelligence model, known as augmented intelligence, which combines AI and human intelligence.

The Future Course of Action

In this ever-evolving world of technology, we need to decide where to place ourselves. Disruptor, explorer, evangelist, or change agent: what do you resolve to be? As our world pivots towards augmented intelligence, it's crucial to stay ahead of the curve. Take anything that you do routinely, for fun, for work, or in service to others, and study how you can incorporate AI into those activities.

Concluding Note

My journey with AI started four years ago, and I look forward to seeing some of you here over the next four years, mastering AI. Lastly, always keep learning, whether through extensive reading, YouTube videos, or podcasts. Remember, the development of technology is ceaseless, and knowledge is the key to adaptability. Keep learning, and you'll find your ikigai: the pathway to happiness.

Thanks for tuning in today. Stay curious and keep exploring!


Video Transcription

Welcome, everyone, to the conference. I hope that you are all enjoying your time so far with so many inspiring speakers. My name is Priya Gandhi and I'm the global head of QA and automation at Hogarth Worldwide, part of the WPP group. Today, I'm going to help break some of the popular myths around artificial intelligence and the jargon around it. Why do we get up in the morning? What is the meaning of life? These are questions that we answer at some point in our lives, both professionally and personally. According to Japanese culture, everyone has an ikigai. It is the secret to a long and happy professional life. Ikigai is the happiness of always being busy and content in your day-to-day work. It is the passion and talent that gives meaning to our days and drives us to share the best of ourselves with the world. Detecting our strengths is not always easy, and one can use ikigai as a beneficial practice when learning anything new. Thanks to all of you for making some time to spend with me today to learn about emerging technology. By coming to this session, you have taken an important step in your learning journey. The tech world is teeming with trendy buzzwords like artificial intelligence, machine learning, deep learning, blockchain, augmented reality, virtual reality and whatnot. Every year new terms pop up, and to keep up with the fast-evolving world of technology you need to stay on top of them.

And if that weren't enough, sometimes two distinct words come together to create a new buzzword, like robotic process automation or intelligent automation. For the uninitiated, all this jargon can be very confusing and daunting. Please don't let the labyrinth of buzzwords put you off exploring the vast and interesting landscape of artificial intelligence. Artificial intelligence technologies are already widely used by all of us today, more than we realize. The biggest area where we interact with AI is through so-called predictive analytics: when Amazon suggests a book that you might like, when Netflix predicts which movie you may want to watch next, when you upload a photo on Facebook and it automatically detects faces and suggests your friends, or when Google reminds you of this day a year ago. Facebook, Instagram, Snapchat and Google Photos all use artificial intelligence or machine learning to identify faces.

AI can also help you with email intelligence, like mail classification, smart replies and spam filters. Voice recognition is another application where AI is widely used: from Siri to Cortana to Google Assistant, Amazon Alexa and Google Home, these home devices and personal assistants implement AI to follow your commands, including setting a reminder, searching for information online or even controlling the lights. AI is also applied in many areas that we don't usually see, like medical diagnosis and healthcare, credit card fraud detection, credit decisions and mortgage loan approval.

And the list just goes on. AI will continue to transform industries, injecting previously unimagined opportunities into our daily lives. Very few subjects in science and technology are causing as much excitement right now as artificial intelligence. The hype around AI has produced many myths in mainstream media, in board meetings and across organizations: artificial intelligence will automate everything and put people out of work, or it is a science-fiction technology, or robots will take over the world. These are a few of the myths.

But today, for this particular session, I have collated the top five popular artificial intelligence myths and I have tried my best to demystify them for you. Though the topic is technical, I will try my best to explain it in a way everyone can follow, so please bear with me. Myth one: artificial intelligence, machine learning and deep learning are the same. The fact is, these three terms are related, but they are not the same. When a machine solves a problem based on a set of rules, which is an algorithm, such intelligent behavior is called artificial intelligence.

It is just a branch of computer science. The key aspect that differentiates artificial intelligence from ordinary coding is the word intelligence: this is where the computer program mimics some level of human intelligence. AI enables computers or machines to think.

But how does that happen? Machine learning is a subset of artificial intelligence, and it enables machines to learn from data and make accurate predictions. It is a technique for realizing AI: training the computer to learn, to perceive, to solve problems, to understand language and to reason logically. In simple terms, machine learning gets its brains through three kinds of learning: supervised, unsupervised and reinforcement learning.

So what is supervised learning? Supervised learning is carried out with classification or regression methods. For example, say you want to train a machine to help you predict how long it will take you to drive home from your workplace. You start by creating a set of labeled data: the weather conditions, the time of day, holidays, your chosen route. All these details are your input, and the output is the amount of time it took you to drive back home from your workplace on that specific day. As a human, you instinctively know from experience that if it is raining outside, or if it is the peak time of day, it is going to take you longer to drive home. But the machine needs data and statistics, and training the machine to tell us that information is done by teaching it. This is called supervised learning.
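To make the drive-home example concrete, here is a minimal supervised-learning sketch in Python. It uses scikit-learn's linear regression purely as an illustration; the feature encoding and the travel times are invented numbers, not data from the talk.

    # Minimal supervised-learning sketch: predict drive-home time (minutes)
    # from labeled examples. Feature columns: hour of day, raining (0/1),
    # holiday (0/1). All numbers are made up for illustration.
    from sklearn.linear_model import LinearRegression

    # Labeled training data: inputs (features) and outputs (observed minutes).
    X = [
        [17, 1, 0],  # 5 pm, raining, not a holiday
        [17, 0, 0],  # 5 pm, dry
        [22, 0, 0],  # 10 pm, dry
        [13, 0, 1],  # 1 pm on a holiday
        [18, 1, 0],  # 6 pm, raining
    ]
    y = [55, 40, 20, 25, 60]  # minutes it actually took on each day

    model = LinearRegression().fit(X, y)  # the "teaching" phase

    # Predict tonight's commute: 6 pm, dry, not a holiday.
    print(round(model.predict([[18, 0, 0]])[0]), "minutes (estimate)")

Given enough labeled days, the model learns roughly how much rain or rush hour adds to the commute, which is exactly the teaching described above.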

Unsupervised learning, on the other hand, is carried out with clustering and association methods. Take the case of a baby and her family dog: she knows her dog because she grew up with it. A few weeks later, when a family friend brings along another dog and tries to play with the baby, the baby immediately realizes that the friend has brought a dog. This happens because the baby can recognize many features, like two ears, two eyes, walking on four legs and a tail, which are very similar to her pet dog. She identifies the new animal as a dog, and this is unsupervised learning, where you are not taught but you learn from the data. Had this been supervised learning, the family friend would have told the baby that it is a dog; he didn't, and the baby learned by herself.
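As a loose analogy to the baby grouping dog-like things without being told, here is a minimal unsupervised-learning sketch in Python: k-means clustering groups unlabeled points on its own. The points are invented for illustration.

    # Minimal unsupervised-learning sketch: cluster unlabeled 2-D points.
    # No labels are given; KMeans discovers the groups by itself.
    from sklearn.cluster import KMeans

    # Unlabeled data: two loose blobs (values are made up for illustration).
    points = [
        [1.0, 1.2], [0.8, 1.0], [1.1, 0.9],   # one natural group
        [8.0, 8.5], [8.2, 7.9], [7.9, 8.1],   # another natural group
    ]

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
    print(kmeans.labels_)  # e.g. [0 0 0 1 1 1]: group membership, no names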

There is also a third type, called reinforcement learning, in which the machine learns by trial and error, guided by rewards rather than by labeled examples. The next type is deep learning. Deep learning is a subset of machine learning, and it gets its name because it makes use of deep neural networks and mimics the network of neurons in our brain. The word deep means the network joins neurons in more than two layers: there is an input layer, there is an output layer, and the many layers in between are called hidden layers. Compared to classical machine learning, deep learning needs considerably greater amounts of training data to deliver accurate results. To grasp the idea of deep learning, look at a toddler with his mother. The boy points with his little finger and always says the word cat. The mom gets concerned about his learning and keeps telling him: yes, that's a cat; no, that's not a cat; yes, that is definitely a cat; no, no, that's no way a cat. The toddler persists in pointing at objects but becomes more accurate about cats. Deep down, the little kid does not actually know why he can say whether something is a cat or not; he has just learned to build up a hierarchy of complex features, looking at the pet overall and then focusing on details such as the tail, the nose or the eyes to make up his mind. A neural network works in much the same way. So how do you train the computer for deep learning? The training, or learning, occurs in two phases. The first phase consists of applying a nonlinear transformation to the input and creating a statistical model as the output.

The second phase aims at improving the model with a mathematical method known as the derivative. Look at the example where a model is trying to learn how to dance: after just 10 minutes of training, the model does not know how to dance properly and it just looks like a scribble; that is the first phase. After 48 hours of learning, the computer has learned the art of dancing and has become a master, and that is how deep learning works. So the next question is: how do data and data science fit in here? Data is the fuel for artificial intelligence. Data science is the area of study which involves extracting insights from vast amounts of data using various scientific methods, algorithms and processes. It helps you discover hidden patterns in the raw data. These insights are then fed back into artificial intelligence, machine learning and deep learning, and it goes around as a loop. I hope that you now know at least the basics of artificial intelligence, machine learning and deep learning, and how data fits into this picture.
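Before moving on, here is a minimal sketch, using only NumPy, of the two training phases described above for a tiny neural network: the forward pass applies a nonlinear transformation to the input, and derivatives are used to improve the weights step by step. The toy XOR data, the network size and the learning rate are all assumptions made for illustration, not part of the talk.

    # Minimal deep-learning sketch: one hidden layer trained by gradient descent.
    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # toy inputs
    y = np.array([[0], [1], [1], [0]], dtype=float)              # toy targets (XOR)

    W1, b1 = rng.normal(size=(2, 8)), np.zeros((1, 8))  # input  -> hidden layer
    W2, b2 = rng.normal(size=(8, 1)), np.zeros((1, 1))  # hidden -> output layer

    def sigmoid(z):
        # The nonlinear transformation applied at each layer.
        return 1.0 / (1.0 + np.exp(-z))

    for step in range(10000):
        # Phase 1: forward pass, a nonlinear transformation of the input.
        hidden = sigmoid(X @ W1 + b1)
        output = sigmoid(hidden @ W2 + b2)

        # Phase 2: derivatives of the error nudge the weights a little each step.
        grad_out = (output - y) * output * (1 - output)
        grad_hid = (grad_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * hidden.T @ grad_out
        b2 -= 0.5 * grad_out.sum(axis=0, keepdims=True)
        W1 -= 0.5 * X.T @ grad_hid
        b1 -= 0.5 * grad_hid.sum(axis=0, keepdims=True)

    print(np.round(output.ravel(), 2))  # moves toward [0, 1, 1, 0] as training proceeds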

The next myth is that robots and AI are the same. The fact is, robotics and artificial intelligence are two separate things. Before talking about how robots differ from artificial intelligence, it's crucial to understand the difference between robots and bots and how they help with automation, and then look at how artificial intelligence can be applied to make them better. Robots, as you are aware, are programmable physical machines that are usually able to carry out a series of actions automatically and are widely used in manufacturing, surgery, transport or space exploration.

Bots, or internet robots, on the other hand, are a type of computer program that operates automatically to complete a virtual task, and they are usually used as chatbots or in web products. Chatbots can answer your questions based on a predefined set of rules that are embedded into them, and when they can't find a matching rule any more, they simply hand the conversation over to a human customer representative. While both robots and bots help with automation, they are only programmed to do a certain job and usually don't involve any intelligence. However, when you add artificial intelligence to robots and bots, they become intelligent. AI added to a robot makes an AI robot: a typical example is a drone that uses autonomous navigation to return home when it is about to run out of battery, or a self-driving car that uses a combination of AI algorithms to detect and avoid potential hazards on the road. Similarly, an AI-powered bot, or AI bot, is a chatbot that uses natural language processing and can intelligently handle a customer chat without any predefined rules. Another AI bot example is robotic process automation with self-learning ability. When AI is combined with robots and bots, they become very efficient and are able to perform complex tasks.
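To show the gap between a plain bot and an AI bot, here is a minimal sketch in Python of a purely rule-based chatbot like the one described above: it only matches predefined keywords and hands over to a human when no rule applies. The rules and wording are made up for illustration; an AI bot would replace the keyword lookup with natural language processing.

    # Minimal rule-based chatbot sketch: no intelligence, just predefined rules.
    # When no rule matches, the bot hands the chat over to a human agent.
    RULES = {
        "opening hours": "We are open 9am-6pm, Monday to Friday.",
        "delivery": "Standard delivery takes 3-5 working days.",
        "refund": "Refunds are processed within 14 days of return.",
    }

    def reply(message: str) -> str:
        text = message.lower()
        for keyword, answer in RULES.items():
            if keyword in text:          # a predefined rule matches
                return answer
        return "Let me connect you to a customer representative."  # hand-over

    print(reply("What are your opening hours?"))
    print(reply("My parcel arrived damaged!"))   # no rule -> human hand-over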

The third myth is that AI and the internet of things do not blend, but the fact is they really do. The internet of things is the technology that helps us reimagine daily life: wearable devices like a smartwatch or a Fitbit, refrigerators, digital assistants, sensors.

When this equipment is connected to the internet, can be recognized by other devices, and can collect and process data, that is the internet of things. And when artificial intelligence is added on top of the internet of things, it means these devices can now analyze your data, make decisions and act on that data without the involvement of humans. This is called AIoT. Practical examples of AIoT include smart retail environments, smart cameras and smart cities. Smart retail is a camera system fitted into a store and equipped with computer vision, so that when customers walk through the door it can recognize them. For example, if the system detects that the majority of the customers walking into a particular store are millennials, it can push out product advertisements or in-store specials that appeal to that particular demographic, therefore driving up sales.
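As a toy illustration of the analyze-decide-act loop that AIoT enables, here is a minimal Python sketch of a hypothetical connected thermostat; the readings and thresholds are invented and are not from the talk.

    # Minimal AIoT-style sketch: a connected device analyzes sensor data,
    # makes a decision and acts, with no human in the loop.
    # Readings and thresholds are made up for illustration.
    import statistics

    def decide(temperature_readings: list) -> str:
        avg = statistics.mean(temperature_readings)  # analyze the data
        if avg > 25.0:                               # decide...
            return "turn cooling on"                 # ...and act
        if avg < 18.0:
            return "turn heating on"
        return "do nothing"

    print(decide([26.1, 25.8, 26.4]))  # -> turn cooling on, automatically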

Another AIoT example is a smart camera that can identify shoppers and allow them to skip the checkout line, like what happens in the Amazon Go store. AIoT innovation is accelerating, which means the growing partnership between AI and the internet of things is going to bring a more connected and smarter future. The fourth myth is that AI won't affect my industry, but the fact is that at least 85% of enterprises will deploy AI automation technology by 2022, according to Gartner. As you can see from this infographic, AI is and will be used by pretty much every industry, either directly or indirectly, from banking to gaming to sports to politics and government; it seems that no industry is exempt. AI analytics can further identify hidden patterns and provide the competitive edge necessary to outdo the competition. And if machines can assist alongside humans for businesses to prosper, why would anyone miss out? The rate at which any particular industry embraces artificial intelligence, the impact of AI on its specific business, and how early or late it transforms are the only things that will differ. That takes us nicely to the next myth: AI is going to replace humans.

The question of whether AI will replace human workers assumes that AI and humans have the same qualities and abilities, but in reality they don't, because people and artificial intelligence bring different capabilities and strengths to the table. AI is a computer acting and deciding in ways that seem intelligent; it imitates how we act, feel or speak, because AI has the ability to identify patterns and to optimize trends relevant to the job. In addition, contrary to humans, artificial intelligence never gets physically tired, which means that as long as it is fed data it will keep going. This type of intelligence is extremely useful in an organizational setting because of its imitating abilities.

AI can have faster reaction times, a perfect memory, can sometimes be better at calculations than we are, and can help with a number of other things. But it can't mimic how we humans think, how we make decisions, how we perceive or socialize. We humans have the ability to imagine, to anticipate, to feel and to judge in changing situations, and we have the capacity to shift from short-term to long-term concerns and vice versa. These abilities are unique to us, and this is called authentic intelligence, or human intelligence. And we do not require a steady flow of externally provided data to work, which is the case with artificial intelligence. Although artificial intelligence and authentic intelligence seem to contradict each other, they are in fact very complementary. Both types of intelligence offer a range of specific talents, and if artificial intelligence and authentic intelligence are put together in the right way, they will allow organizations to be more efficient and more accurate, and at the same time more creative and proactive.

This hybrid intelligence is called augmented intelligence, and it is the step forward to the future of intelligent work and workplaces. Technology has been threatening and displacing jobs and careers throughout history: telephone switching technology replaced human operators, automatic call directors replaced receptionists, word processing and voicemail replaced secretaries, email replaced interoffice couriers. In each of these cases, technological augmentation enhanced the capabilities of humans; some jobs were replaced, perhaps, but more jobs were created, albeit requiring different skills.

Humans engage, machines simplify; there will always be a need for humans in the loop to interact with humans or machines at some level. If you do a job that requires no thinking, then there are lots of people out there, if not machines, ready to take over your job or try to imitate you. The real question is how ready we are to embrace a relationship with AI that benefits humanity. The secret is to stay ahead by keeping an eye on the future and embracing whatever comes next. Four years ago, I was at a similar conference, not as a speaker but as a member of the audience, taking in all this new and exciting news. It was only here that I was given a taste of emerging technologies like artificial intelligence, machine learning, augmented reality and others like them. Learning and understanding the big concepts behind these emerging technologies is a crucial step in the process, and by coming to this conference you have already taken the first step. From here, it's up to you to decide whether you want to be a disruptor, an explorer, an evangelist or a change agent in the exciting world of artificial intelligence, now that you are aware of the popular AI myths and the real facts behind them.

As a key takeaway, I would like to suggest that you take anything that you always do, for fun, for work or out of service to others, and learn more to see how best you can apply AI in real terms to those activities. Make some time to always keep learning about these technologies: have a read whenever you get some downtime, on LinkedIn, Coursera, Udemy or wherever; watch YouTube videos on the topic; or put on headphones and listen to podcasts while you're cleaning the house or jogging in the park. Ultimately, you will find your ikigai, the pathway to your happiness.

This is the step that I am on right now, trying to apply AI to QA and software development, which is my full-time career, four years after my initial encounter with AI. I hope I'll see some of you here again in another four years. Thanks for your time, and I hope that you found this session useful in some way. Please make sure to continue with the conference and enjoy the other sessions organized by Anna and the Women in Tech team. Take care and speak soon. Bye.