If you’ve been following technology news in recent months, you’ve probably heard all the buzz about artificial intelligence and terms such as “machine learning” and “neural networks”. The truth is, artificial intelligence has been around for a while; it is simply getting more coverage lately. One of the biggest factors behind its renewed relevance is the public cloud, whose advent has had an enormous impact on the advancement of artificial intelligence in recent years.
The History of Artificial Intelligence
Although the idea of beings with intelligence similar to humans has been around for centuries, it wasn’t until the beginning of the computer era that we started seeing actual development geared towards building such beings.
The term “artificial intelligence” was coined back in 1956 at a conference at Dartmouth College in Hanover, New Hampshire. However, very little progress was made for many years due to a lack of government funding and the absence of the much-needed computing power. It wasn’t until the mid-1990s that artificial intelligence started to pick up steam.
I decided to make artificial intelligence the focus of my master’s degree after reading Ray Kurzweil’s The Age of Spiritual Machines: When Computers Exceed Human Intelligence back in the summer of 2000. In his book, Kurzweil predicts that computers will exceed the memory capacity and computational ability of the human brain by the year 2020. He may not be that far off, as we have definitely made significant progress in the field.
But what exactly is artificial intelligence, and how do we define it in terms of computer science? The dictionary defines artificial intelligence as “the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”
In my opinion, artificial intelligence is more than developing machines to mimic human actions such as visual perception or speech recognition. Artificial intelligence is the ultimate technology goal. Aside from making your daily life more convenient by unlocking your phone with your face or asking a virtual assistant a question about history, artificial intelligence aims to solve more important problems, ranging from diagnosing diseases to predicting financial risk. Artificial intelligence applications are able to process enormous amounts of data and make accurate decisions based on millions, or even billions, of data points.
Machine learning is an application of artificial intelligence that uses learning algorithms, such as decision trees and neural networks, that iteratively learn from data sets without being explicitly programmed to do so.
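To make “iteratively learn from data without being explicitly programmed” concrete, here is a minimal sketch in pure Python of one of the oldest learning algorithms, the perceptron (essentially a one-neuron neural network). The training data, learning rate, and epoch count are illustrative choices for this sketch, not taken from any particular system:

```python
# Minimal perceptron: it learns the logical AND function from examples,
# rather than being explicitly programmed with the AND rule.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights, one per input
b = 0.0          # bias term
lr = 0.1         # learning rate

def predict(x):
    # Fire (output 1) if the weighted sum of inputs exceeds zero
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Iterate over the data set, nudging the weights after each mistake
for epoch in range(20):
    for x, target in data:
        error = target - predict(x)
        w[0] += lr * error * x[0]
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # after training: [0, 0, 0, 1]
```

Nobody ever tells the program what AND means; it simply adjusts its weights whenever its prediction is wrong, and after a few passes over the data the mistakes stop. Scale that same idea up to millions of weights and data points and you have the neural networks behind today’s AI applications.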
My first graduate school project involving neural networks, back in 2001, was to monitor a video feed from a camera mounted on the roof of a car and determine when the car started to drift out of its lane. In 2006, my new car had a then-uncommon feature called “lane departure warning”. Needless to say, I was excited about it because I knew exactly how the technology worked. This is when I first realized that artificial intelligence was starting to make its way to the general public. Nowadays we have autonomous, or self-driving, cars on the roads: a big leap from simply detecting a lane change on a video feed. The technology has been able to evolve much more rapidly due to the availability of more computing power in the cloud.
The proliferation of cloud computing has allowed for more innovation in artificial intelligence and machine learning algorithms. The easily accessible storage and scalable computing power available for processing large amounts of data have also enabled more extensive testing of these algorithms.
The wide availability of Graphics Processing Units (GPUs), which were originally developed for the video game industry, along with advancements in AI algorithms, has facilitated further improvements and testing by shortening the time it takes for some models to learn. Public cloud providers like Amazon and Microsoft have been expanding their support for GPUs in recent years, which has made them more accessible to AI researchers.
In addition, the fact that the public cloud provides auto-scaling capabilities makes it easier for AI algorithms to process enormous amounts of data at a lower cost, something that was not an option until recent years.
Not only are cloud companies providing the computational power needed to advance AI algorithms, but they are also building AI products that the public can use and build on top of. A couple of examples are Amazon’s Alexa and Google Home, which use artificial intelligence algorithms to recognize and respond to voice commands. These tools are also becoming platforms on which people can build applications, further expanding the general use of artificial intelligence.
Another aspect of artificial intelligence making the news lately is image recognition. Back in the day, recognizing a simple shape in an image using computers required a lot of code implementing complex mathematical formulas. You had to code your own machine learning algorithms and implement your own models. It was painful. Nowadays you can use Amazon’s cloud service Rekognition to add image recognition capabilities to your applications. Amazon Rekognition uses neural network models to detect and label thousands of objects and scenes in your images.
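To give a sense of how little code this now takes, here is a hedged sketch of a Rekognition call. The bucket and file names are made-up placeholders, and the actual API call (shown in the comments) assumes boto3 is installed and AWS credentials are configured; the runnable part below simply assembles the request that Rekognition’s DetectLabels API expects:

```python
import json

# With boto3 installed and AWS credentials configured, the call would be:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(**request)
#   for label in response["Labels"]:
#       print(label["Name"], label["Confidence"])

# The request shape DetectLabels expects. The bucket and object
# names here are hypothetical placeholders.
request = {
    "Image": {"S3Object": {"Bucket": "my-photo-bucket", "Name": "beach.jpg"}},
    "MaxLabels": 10,       # return at most 10 labels
    "MinConfidence": 75,   # only labels scored at 75% confidence or higher
}
print(json.dumps(request, indent=2))
```

A few lines of configuration replace what used to be hand-rolled models and mathematics; the neural networks doing the actual recognition run entirely on Amazon’s side.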
AI Services on AWS
Amazon offers several AI services, including image recognition, voice recognition, and machine learning. Making these services available to the public accelerates the research that other companies are conducting.
“AWS cloud is putting AI into the hands of any company or developer looking to add intelligence to their applications. It’s early days, but this technology is moving forward quickly, and already showing success in fields as diverse as healthcare and real estate,” said Dr. Matt Wood, general manager for artificial intelligence at AWS. “There’s a huge opportunity to dive in and continue to accelerate the pace of innovation.”
The cloud has been of great importance to the advancement of artificial intelligence methodologies and algorithms. It has provided the scalable computing power needed to process extremely large sets of data and has significantly reduced the learning time for machine learning algorithms.
From image recognition to big data analytics to medical diagnostics, artificial intelligence is the future of computing and the cloud is what is propelling it forward.
Get in touch with a cloud expert today to discuss how Stratus10 can help!
Call us at 619.780.6100
Send us an email at Sales@Stratus10.com
Send us a message by filling out our Contact Form
Read our Customer Case Studies