How Artificial Intelligence Is Changing The World

Are we heading toward a solution for making computers think like us?

Nowadays you hear more and more talk about Artificial Intelligence and how it will help us live better lives, but where are we now in terms of creating computers, robots and devices that can think, talk and perform tasks like humans?


Wow, what a topic for the first article!

I’m not a scientist, but I think Artificial Intelligence is, now more than ever, closely tied to the tech world, so I wanted to share what I’ve learned about it and how it impacts our daily life. The first short part of this article will be a little sci-tech, but it’s necessary to understand AI a little better.

After recent advances in drone and interactive robot technology, we are close to a breakthrough in Artificial Intelligence, but we still have a little way to go before engineering a David like the one in Steven Spielberg's A.I. Artificial Intelligence.

David as seen in A.I. Artificial Intelligence by S. Spielberg

There is no doubt Artificial Intelligence will change the way we live, and there are already several examples of AI applied to tech gadgets we use daily, but intelligence is a tough concept to pin down.

Some Basics About Our Brain

Human Brain Areas Related to AI

Is someone, or something, intelligent because they can multiply 27 x 19 in their head? What about emotional intelligence – working out if someone is sad, angry or faking an emotion?

According to Dr. Nando de Freitas, an authority in the fields of machine learning and deep learning and a lead research scientist at Google DeepMind, emotional intelligence isn’t the main problem.

Emotions are among the easiest things to reproduce. You don't need to build something as intelligent as a human to get an emotional response. We know the amygdala is responsible for processing emotions. It's part of the "old brain", which we share with rats, mice, cows, pigs and lots of other animals; that's the part we've been exploring for a long time and know pretty well.

Dr. De Freitas and his team have spent a lot of time studying the brain, trying to work out what goes on without us knowing it.

There's an area in the brain called the hippocampus, which is fascinating. Every time we perform an action, a different neuron fires in the hippocampus, which in turn fires a certain set of neurons in the visual cortex (where we store images) and in the auditory cortex (where we store sequences). Each neuron represents a location in the world and fires when you are there, meaning, for example, that you can imagine your way home. Right now, we are trying to work out how to replicate that. We take intelligence for granted. You aren't aware of a lot of what's going on in your brain, which is why it's so hard to reverse engineer.

Machine Learning and Deep Learning are the Keys for Artificial Intelligence

Artificial Intelligence, Machine and Deep Learning

We've seen how scientists are studying our brain to understand which processes are the most important to replicate in order to achieve a more human-like Artificial Intelligence. All the major scientists working on AI say Machine Learning and Deep Learning are the keys, and Dr. De Freitas gives a pretty clear explanation.

I see intelligence as being able to interact with an environment and do the right thing. Humans are able to plan their actions and engage in counterfactual reasoning. Robots are much smarter than us in some ways: they can perform logic and mathematical tasks much quicker. However, humans can observe sequences in the world and build representations of them in their brains. This is what we are now trying to achieve with AI.
Deep Learning tries to get robots to build representations of the world and operate on those representations to build sequences in their minds. Then we want them to learn to construct different sequences. We want computers to learn abstract representations of their environment, and then reason about that environment and the cause and effect of their actions.
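To make the idea of "learning" a bit more concrete, here is a tiny, self-contained sketch of machine learning in action: a single artificial neuron learns the logical AND function from examples. Everything in it (the data, the learning rate, the number of repetitions) is invented for illustration; real deep learning systems stack millions of such neurons.

```python
# A minimal sketch of machine learning: one artificial neuron learns the
# logical AND function from examples via gradient descent.
# All values here are illustrative, not taken from any real AI system.
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# Training data: inputs and the desired output (the AND truth table).
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w1, w2, bias = 0.0, 0.0, 0.0   # the neuron's adjustable parameters
rate = 1.0                     # how big each adjustment step is

for _ in range(5000):          # repeat: predict, measure error, adjust
    for (x1, x2), target in examples:
        out = sigmoid(w1 * x1 + w2 * x2 + bias)
        grad = (out - target) * out * (1 - out)   # error signal
        w1 -= rate * grad * x1
        w2 -= rate * grad * x2
        bias -= rate * grad

predictions = [round(sigmoid(w1 * a + w2 * b + bias)) for (a, b), _ in examples]
print(predictions)  # -> [0, 0, 0, 1]: the neuron has learned AND
```

The neuron is never told the rule; it only sees examples and adjusts its parameters to reduce its error, which is the core of machine learning.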

Artificial Intelligence in Our Daily Life

I've already talked a lot about the theory behind Artificial Intelligence, but as I said, I think it was necessary to help you understand the difficulties scientists are facing and where they are now.

Did you know there have been a lot of important achievements in AI, and that you're using some of them in your daily life?
Yes, even if you didn't notice, Artificial Intelligence is already part of our daily life, and below I'll highlight some relevant breakthroughs.

Camera Facial Recognition

Camera Facial Recognition

Nearly everyone owns a smartphone or a digital camera, and facial recognition is the most obvious bit of Artificial Intelligence technology in them. It makes focusing and tagging easier, and it's a real leap forward in intelligent image-capturing technology.

Apple Siri and Google Now

Siri in action

Siri first, and then Google Now, opened our minds to the idea that we don't have to be tethered to a laptop to interact seamlessly with information. This has probably been the first real step into Artificial Intelligence. Going forward, AI will move from speech recognition to natural language interaction, then to natural language generation, and eventually to an ability to write as well as receive information.

Apple's A11 Bionic Chip & Face ID Technology

Apple A11 Bionic Chip

The Apple A11 Bionic Chip, found in the latest iPhone 8 and iPhone X, is most probably the biggest step in AI applied to consumer goods. Its Neural Engine has circuits tuned to accelerate certain kinds of artificial-intelligence software, called artificial neural networks, that are good at processing images and speech.

The neural engine powers the algorithms that recognize your face to unlock the phone and transfer your facial expressions onto animated emoji. It also powers augmented reality and image recognition, which rely on machine-learning algorithms. Apple said the new silicon could enable unspecified "other features".
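To give you an idea of what a neural engine actually accelerates, here is a miniature sketch of a neural network's "forward pass": at its core it is mostly multiply-accumulate arithmetic, which is exactly what dedicated silicon is good at. The weights, biases and inputs below are completely made up for illustration.

```python
# What a "neural engine" accelerates, in miniature: a neural network's
# forward pass is mostly multiply-accumulate arithmetic.
# Weights, biases and inputs below are made up purely for illustration.

def relu(x):
    return max(0.0, x)

def dense(inputs, weights, biases):
    """One fully connected layer: weighted sums followed by ReLU."""
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# A toy network: 3 inputs, a hidden layer of 2 neurons, 1 output neuron.
inputs = [0.5, -1.0, 2.0]
hidden = dense(inputs,
               weights=[[0.1, 0.2, 0.3], [-0.4, 0.5, 0.6]],
               biases=[0.0, 0.1])
output = dense(hidden, weights=[[1.0, 1.0]], biases=[0.05])
print(round(output[0], 2))  # -> 1.1
```

A real image-recognition network repeats this kind of layer millions of times per photo, which is why a dedicated chip makes such a difference.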

Apple Face ID

Apple's Face ID technology is something that really comes from the future. It is a clear example of machine learning technology: it recognizes your face even as it changes. To take a practical example, if you decide to let your beard grow, Face ID will understand this and update its data accordingly. Isn't that incredible?
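Apple has never published Face ID's internals, but the beard example can be sketched with a common machine-learning pattern: accept a face if its "embedding" (a list of numbers summarizing the face) is close enough to the stored template, then nudge the template toward what was just seen, so gradual changes are absorbed. All the numbers and thresholds below are invented.

```python
# Hypothetical sketch of an adaptive face template (Apple has not
# published Face ID's internals). Accept a face if its "embedding" is
# close to the stored template, then nudge the template toward it so
# gradual changes, like a growing beard, are absorbed over time.
import math

THRESHOLD = 1.0   # how different a face may be and still unlock (made up)
BLEND = 0.2       # how quickly the template adapts (made up)

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def try_unlock(template, embedding):
    if distance(template, embedding) > THRESHOLD:
        return template, False    # too different: stay locked
    # Unlock, and move the template a little toward the new observation.
    updated = [(1 - BLEND) * t + BLEND * e
               for t, e in zip(template, embedding)]
    return updated, True

template = [0.0, 0.0, 0.0]
# The owner's face drifts slowly (beard growing): every scan still
# unlocks, and the template follows the drift.
for day in range(10):
    face_today = [0.1 * day, 0.0, 0.0]
    template, unlocked = try_unlock(template, face_today)
    assert unlocked

stranger = [5.0, 5.0, 5.0]        # a very different face is rejected
_, unlocked = try_unlock(template, stranger)
print(unlocked)  # -> False
```

The key design choice is that the template only moves after a successful unlock, so a stranger's face never contaminates it.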

Apple claims that its technology can even distinguish between twins, and this has been tested and confirmed by independent external sources too.

According to rumors, Apple is deeply involved in bringing this technology into healthcare, by helping an iPhone analyze data from a user's Apple Watch. Apple also confirmed it is working with Stanford researchers to test an app that detects abnormal heart rhythms.

Apple recently hired John Giannandrea, Google's former head of search and Artificial Intelligence, and this is a clear sign of how hard it is pushing to bring its AI to the next level.

Facebook

Facebook AI

Facebook uses Machine Learning technology to learn about users from all the data they input.

You can learn a lot from the data that's out there, even a user's IQ.

Facebook uses this data to show users the topics that best suit their habits. It could also use this data to start recruiting, or even to become a life coach.

No psychologist has ever had access to such an amount of data, and that's why Mark Zuckerberg is investing more and more in this.
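As a toy illustration of how a system could learn interests from user activity, here is a sketch that counts topic tags across a user's "likes". The data and topics are invented, and real systems use far richer signals than simple counting.

```python
# Toy illustration of learning a user's interests from their activity.
# The "likes" and topic tags are invented; real systems use far richer
# signals than simple counting.
from collections import Counter

# Each like is tagged with the topics of the post (a big simplification).
likes = [
    ["football", "sports"],
    ["cooking", "italy"],
    ["football"],
    ["sports", "fitness"],
    ["football", "sports"],
]

interests = Counter()
for tags in likes:
    interests.update(tags)

# The news feed would then favor the user's strongest interests.
print(interests.most_common(2))  # -> [('football', 3), ('sports', 3)]
```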

Google

Google Artificial Intelligence

All the big search engines already use learning technologies, and as we know Google is by far the biggest.

Google has been working on a way to replicate the human brain’s ability to not only see an image, but to understand it as well.

This sounds a little crazy, but it’s already working on Google’s platform, and it’s called Google Reverse Image Search.

I won’t go further explaining how it works, but you can learn how to use this technology here.
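Google's actual system is far more sophisticated, but a classic, simple technique for finding "similar" images is a perceptual average hash: shrink the image to a tiny grid, mark each cell as brighter or darker than the average, and compare the resulting bit patterns. The pixel grids below are made up for illustration.

```python
# A classic, simple way to find "similar" images: the perceptual
# average hash. Shrink the image to a tiny grid, mark each cell as
# brighter (1) or darker (0) than the average, compare bit patterns.
# The pixel grids below are invented; real images are far larger.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255) of a tiny image."""
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def similarity(hash_a, hash_b):
    """Fraction of matching bits (1.0 means identical pattern)."""
    same = sum(1 for a, b in zip(hash_a, hash_b) if a == b)
    return same / len(hash_a)

photo = [200, 180, 40, 30, 190, 170, 50, 20,
         210, 160, 60, 10, 220, 150, 70, 25]
edited = [p + 5 for p in photo]      # same photo, slightly brightened
unrelated = [10, 240, 15, 230, 20, 220, 25, 210,
             30, 200, 35, 190, 40, 180, 45, 170]

print(similarity(average_hash(photo), average_hash(edited)))     # -> 1.0
print(similarity(average_hash(photo), average_hash(unrelated)))  # -> 0.5
```

Note how brightening the whole photo doesn't change its hash at all, while an unrelated image matches no better than chance.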

What can we expect in the near future?

As you have seen, Artificial Intelligence is already very close to all of us, much more than most people think.

Besides Apple, all the big tech giants like Google, Amazon and Facebook are already deeply involved in AI technologies, and so are others like Microsoft and Samsung, even if they are still a few steps behind.

Google acquired DeepMind, a London-based AI company, and hired notable AI pioneers like Ray Kurzweil and Geoffrey Hinton, so it clearly wants to play a big role in the AI field.

To be honest, it is hard to forecast what we'll see in the near future, because technology can move faster than we expect, but if I had to bet on something, I would pick the following.

Healthcare

Artificial Intelligence has been used in the medical field since 2013, and its implementation is far more advanced than most people would expect.

There are many different applications for AI in the medical field: robots that can assist patients, as well as machine learning systems that take on challenges ranging from diagnosing patients more quickly in the emergency room to streamlining communication between doctors, decreasing the risk of complications so patients can go home sooner – and avoid being readmitted.

In Japan, the RIBA II robot is already operating. It can lift patients comfortably and take instructions from an operator. Elderly patients who require a home nurse could have their homes instrumented in a non-invasive way; a properly trained robot could then, for example, detect when a patient is about to have a lapse and call an ambulance in time.

RIBA II Robot

Another stunning example of applied Artificial Intelligence is the El Camino Hospital in Silicon Valley, which implemented AI almost one year ago to monitor hospitalised patients. Just six months after the system was implemented, the rate at which patients suffered dangerous falls dropped 39%. The system tracks the patients' behavior and treatments and, thanks to Machine Learning, is able to alert nurses about possible complications with each patient.
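El Camino's system is proprietary, so the sketch below only shows the general idea behind this kind of monitoring: learn a patient's normal range from past readings, then raise an alert when a new reading deviates too far from it. All readings and thresholds are invented.

```python
# General idea behind patient-monitoring alerts (the hospital's real
# system is proprietary): learn a patient's normal range from past
# readings, then alert when a new reading deviates too far from it.
# All readings and thresholds below are invented.
from statistics import mean, stdev

def build_baseline(history):
    """Learn what is 'normal' for this patient from past readings."""
    return mean(history), stdev(history)

def needs_alert(reading, baseline, tolerance=3.0):
    """Alert if the reading is over `tolerance` std-devs from normal."""
    avg, spread = baseline
    return abs(reading - avg) > tolerance * spread

heart_rate_history = [72, 75, 71, 74, 73, 76, 72, 74]  # beats per minute
baseline = build_baseline(heart_rate_history)

print(needs_alert(74, baseline))   # normal reading -> False
print(needs_alert(120, baseline))  # sudden spike   -> True
```

Because the baseline is learned per patient, the same system adapts to a marathon runner's heart rate just as well as to anyone else's.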

Robotic surgery is another important field where AI is going to be deeply applied. The Da Vinci robot, launched back in 2000, was the first example. It allows surgeons to perform very delicate surgeries with finer detail, in tighter spaces, and with less tremor than would be possible by the human hand alone.

While not all robotic surgery procedures involve machine learning, some systems use computer vision (aided by machine learning) to identify distances, or a specific body part. In addition, machine learning is in some cases used to steady the motion and movement of robotic limbs when taking directions from human controllers.

These are just a few examples of what has been done so far, but given the importance of the medical field, and considering it is one of the businesses with the biggest turnover worldwide, we're surely going to see a lot more here.

Self-Driving Cars

GM Cruise Self-Driving Car

Google was the first to start testing self-driving cars, back in 2009, and since then many others have invested in this new technology. All the major car-makers are now involved with self-driving cars, and according to the latest report from Navigant Research, GM and Waymo (Google) are ahead of the competition, followed by Ford, Daimler and the Volkswagen Group.

Despite expectations, Tesla and Uber are well behind their competitors.

Before we go any further talking about self-driving cars, I'd like you to understand that not all these cars have the same level of AI. Self-driving cars are classified according to the SAE International Standard for Motor Vehicle Automated Driving Systems (J3016). This standard defines the level of automation of self-driving cars, hence the quantity of AI applied to them, in the following six levels of driving automation.

  • No Automation - Zero autonomy. The driver performs all driving tasks. The automated system may issue warnings and momentarily intervene but has no sustained vehicle control.
  • Driver Assistance (Hands on) - Vehicle is controlled by the driver, but some driving assist features may be included in the vehicle design. An example would be Adaptive Cruise Control (ACC) where the driver controls steering and the automated system controls speed. The driver must be ready to retake full control at any time. 
  • Partial Automation (Hands off) - Vehicle has combined automated functions, like acceleration and steering, but the driver must remain engaged with the driving task and monitor the environment at all times. The driver must monitor the driving and be prepared to immediately intervene at any time if the automated system fails to respond properly. Contact between hands and wheel is often mandatory during SAE 2 driving, to confirm that the driver is ready to intervene.
  • Conditional Automation (Eyes off) - Driver is a necessity, but is not required to monitor the environment. The driver must be ready to take control of the vehicle at all times with notice. The driver can safely turn their attention away from the driving tasks, e.g. the driver can text or watch a movie. The vehicle will handle situations that call for an immediate response, like emergency braking.
  • High Automation (Mind off) - The vehicle is capable of performing all driving functions under certain conditions. The driver may have the option to control the vehicle. As in Level 3, but no driver attention is ever required for safety, i.e. the driver may safely go to sleep or leave the driver's seat. Self-driving is supported only in limited (geo-fenced) areas or under special circumstances, like traffic jams. Outside of these areas or circumstances, the vehicle must be able to safely abort the trip, i.e. park the car, if the driver does not retake control.
  • Full Automation (Optional steering wheel) - The vehicle is capable of performing all driving functions under all conditions. The driver may have the option to control the vehicle. An example would be a robotic taxi.
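The six levels above can be condensed into a small lookup table; the one-line summaries are my own paraphrase of the list, not official SAE wording.

```python
# The six SAE J3016 levels, condensed into a small lookup table.
# The one-line summaries are my own paraphrase, not official wording.

SAE_LEVELS = {
    0: ("No Automation", "driver does everything"),
    1: ("Driver Assistance", "hands on: one assist feature, e.g. ACC"),
    2: ("Partial Automation", "hands off: car steers and accelerates, driver monitors"),
    3: ("Conditional Automation", "eyes off: driver retakes control with notice"),
    4: ("High Automation", "mind off: self-driving, but only in geo-fenced areas"),
    5: ("Full Automation", "self-driving everywhere; steering wheel optional"),
}

def human_driver_needed(level):
    """Must a human be ready to drive at this SAE level?"""
    return level <= 3

name, summary = SAE_LEVELS[3]
print(f"Level 3 ({name}): human driver needed -> {human_driver_needed(3)}")
```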

Most car-makers are developing Level 2 cars, but some are already at Level 3.

The 2018 Audi A8 luxury sedan was the first commercial car to claim Level 3 self-driving capability. It has a so-called Traffic Jam Pilot: when activated by the human driver, the car takes full control of all aspects of driving in slow-moving traffic at up to 60 kilometers per hour. The function works only on highways with a physical barrier separating oncoming traffic.

As we said, GM has been indicated as the leader in this competition, and this is supported by the fact that almost two years ago GM acquired Cruise, a Silicon Valley company and a leader in self-driving technologies. GM plans to release a modified version of the Chevy Bolt with no steering wheel in 2019, for use in the Cruise driverless taxi service.

GM Chevy Bolt with no steering wheel

Conclusion

I think you can now agree that Artificial Intelligence is already far more present than most people would expect.

We have seen how the tech gadgets we love so much become more advanced and useful every day thanks to Artificial Intelligence, and this is just the beginning. Having tech giants like Apple, Google and Facebook so deeply involved in AI certainly means we can expect to see a lot more here.

We'll surely see a lot of new breakthroughs in fields like healthcare and self-driving cars, which are so important for our daily life.

Artificial Intelligence is playing, and will play, a key role in many other fields that I didn't explore here, because they are not as close to our daily life. I'm referring, for example, to fields like aviation, finance and heavy industry, where AI has already been implemented and will be more and more. We'll benefit from all the improvements made in these fields too, even if we won't notice them as clearly as in the other fields.

Henry D'Amico
 

Hi, I'm Henry D'Amico, the founder of Tech Periscope, and I've been passionate about all things digital since I was a young boy. With the help of my team and Kit (my digital assistant), I like to share what I've learned and give you some great advice.
