21 Dec HOW AMAZON BECAME THE TOP DOG IN ARTIFICIAL INTELLIGENCE
Amazon is a data company. At least, it is if we’re to go on what the leading figures in tech and ecommerce say. However, I beg to differ. In my opinion, Amazon is not just a data company, but an artificial intelligence company, and without that ingredient their data wouldn’t matter nearly as much as it does to their business today.
In just a few years, Amazon has envisioned and realised incredible ambitions with machine learning software that few experts fifteen years ago would have thought possible. They have single-handedly pushed forward the entire AI industry and redefined what we thought modern technology could be. We need look no further than our smart Alexa devices, or our product recommendations on amazon.com, for proof of this. How did they get here, and what does it mean to be responsible for such a large portion of advances in machine learning?
Amazon uses artificial intelligence in three primary areas of its business. Firstly, machine learning programs work in harmony with Amazon’s baseline business model – selling products online. On the client side, massive algorithms serve personalized product recommendations to browsing customers, much in the same way that YouTube’s AI customizes your recommended videos feed to keep you on their website or app for as long as possible.
Data is collected from a huge range of sources. Amazon’s partners connect via API to forward browsing and interest data for any given customer. Take Google, for example. If a Google user hovered their mouse over an ad the search engine served up linking to an Amazon product, Google could (depending on the user’s privacy settings) sell Amazon data saying “hey, we showed this person an ad for this product and they were interested”. Amazon feeds that, along with various other data points, into an algorithm that predicts what the customer is likely to purchase.
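The signal-to-prediction idea above can be sketched in a few lines of Python. To be clear, this is a toy illustration under my own assumptions – the signal sources, weights, and product names are invented, not Amazon’s real pipeline:

```python
# Minimal sketch: scoring purchase likelihood from interest signals.
# All source names, weights, and products here are hypothetical.
from dataclasses import dataclass

@dataclass
class InterestSignal:
    source: str        # e.g. "ad_hover", "product_search" (invented labels)
    product_id: str
    strength: float    # 0.0 (barely glanced) to 1.0 (added to cart)

# Hypothetical weights: some sources predict purchases more strongly.
SOURCE_WEIGHTS = {"ad_hover": 0.2, "product_search": 0.6, "alexa_query": 0.4}

def purchase_scores(signals):
    """Aggregate weighted interest signals into a per-product score."""
    scores = {}
    for s in signals:
        weight = SOURCE_WEIGHTS.get(s.source, 0.1)
        scores[s.product_id] = scores.get(s.product_id, 0.0) + weight * s.strength
    return scores

signals = [
    InterestSignal("ad_hover", "dog-toy", 0.5),
    InterestSignal("product_search", "dog-toy", 1.0),
    InterestSignal("alexa_query", "blender", 0.8),
]
print(purchase_scores(signals))  # dog-toy outranks blender
```

The real systems would be trained models rather than hand-set weights, but the shape is the same: many heterogeneous signals funnelled into one ranking per customer.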
We see this kind of technology on hundreds of ecommerce sites nowadays. A few product searches on a store are enough to gauge what a customer is interested in. What makes Amazon unique is the ecosystem they’ve assembled for collecting data from many different places – a question asked of an Amazon Alexa device, say, or more time spent on one genre of novel on Audible than another. They use this information to recommend products with remarkable precision. It’s why Amazon has been called a “data company, not an ecommerce company”, and their use of machine learning to take advantage of this data has pushed them to the forefront of the artificial intelligence industry.
Not only has Amazon mastered the art of getting customers to spend more – and more often – they’ve also become a leading figure in logistics. How exactly? Take a look inside any fulfilment centre and it becomes exceedingly clear. They’re not just using neural networks and reinforcement learning algorithms to predict purchases, but also to revolutionise the way they get products to their customers – how to organize, package and ship parcels at astounding rates.
In every row of shelves in an Amazon warehouse, cameras stream live footage of packages coming and going, gleaning information about the warehouse’s output rate and locating specific products. When one section of a warehouse reaches maximum capacity, the system works out whether another row can serve as backup, and predicts which rows will be busiest in the coming hours or days to keep packages flowing. You can almost see it as a traffic-jam-preventing AI, except the cars are parcels and the roads are shelves in Amazon’s warehouses.
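That overflow logic can be sketched very simply: pick the backup row with the most expected headroom once the forecast inflow of parcels lands. The rows, capacities, and inflow forecasts below are made up for illustration – a real system would be far more involved:

```python
# Toy sketch of overflow-row selection, in the spirit of the
# "traffic-jam-preventing AI" described above. All numbers are invented.

def pick_overflow_row(rows, predicted_inflow):
    """Pick the row with the most expected free capacity after the
    forecast inflow of parcels arrives. rows: (name, used, capacity)."""
    def expected_free(row):
        name, used, capacity = row
        return capacity - used - predicted_inflow.get(name, 0)
    candidates = [r for r in rows if expected_free(r) > 0]
    return max(candidates, key=expected_free)[0] if candidates else None

rows = [("A", 90, 100), ("B", 40, 100), ("C", 70, 100)]
predicted_inflow = {"A": 20, "B": 10, "C": 50}  # parcels expected per row
print(pick_overflow_row(rows, predicted_inflow))  # "B": most headroom left
```

The key point the article makes survives even in this toy form: the decision depends on a *prediction* of future traffic, not just on which shelves happen to be empty right now.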
There’s even more AI that goes into making Amazon’s warehouses ultra-efficient. As products come and go, deep learning algorithms track buyer trends – once again from a huge range of sources – to predict which listings will be highest in demand, and then allocate storage and fulfilment slots where needed. A simple example of this in action happens every Christmas season. Amazon’s developers aren’t explicitly telling the fulfilment system that customers buy more baubles and lights in December; the algorithm just knows, based on the vast number of inputs it receives from popular trends and buying habits. It then clears out warehouse zones to handle the higher capacity.
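A naive seasonal forecast captures the spirit of the Christmas example: estimate next month’s demand from the same month last year, scaled by recent year-on-year growth. The product and the sales figures here are entirely made up, and Amazon’s actual models are of course deep networks rather than a one-line ratio:

```python
# Naive seasonal forecast: same month last year, scaled by recent growth.
# Illustrative only; all sales figures below are invented.

def forecast_next_month(history):
    """history: at least 15 months of unit sales, oldest first.
    Forecasts the month immediately after the last entry."""
    same_month_last_year = history[-12]
    recent_avg = sum(history[-3:]) / 3        # last three months
    year_ago_avg = sum(history[-15:-12]) / 3  # same three months, a year ago
    return same_month_last_year * (recent_avg / year_ago_avg)

# 23 months of fairy-light sales (Jan year 1 .. Nov year 2), spiking in December.
history = [100] * 11 + [500] + [120] * 11
print(forecast_next_month(history))  # 500 last Dec x 1.2 growth = 600.0
```

No one told the function that December is special; the spike is simply there in the inputs, which is the same point the article makes about the fulfilment system.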
Last year, Amazon began rolling out their Pegasus classification system, which is expected to more accurately and efficiently sort products and organize distribution throughout Amazon’s 175 fulfilment centres. Is it working? A resounding yes: Amazon has gotten better at handling more orders at faster rates year on year – even during the current pandemic.
In fact, Amazon’s developers are shooting for a whopping fifty percent reduction in false positives and negatives in product classification, which would lend itself to more efficient and well-timed management of warehouses. This is itself a huge challenge, especially since thousands of products are shipped out every single day, and implementing an algorithm at such a massive scale without noticeably disrupting Amazon’s operations is an impressive feat.
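For readers unfamiliar with the terms, a false positive is an item wrongly routed *into* a category and a false negative is an item that should have been routed there but wasn’t. A minimal sketch of how you’d count them – with invented categories and labels, not Amazon’s data:

```python
# Counting false positives and false negatives for one product category.
# Categories and example labels are hypothetical.

def misclassification_counts(predicted, actual, target_bin):
    """False positives: routed to target_bin wrongly.
    False negatives: belonged in target_bin but routed elsewhere."""
    fp = sum(1 for p, a in zip(predicted, actual)
             if p == target_bin and a != target_bin)
    fn = sum(1 for p, a in zip(predicted, actual)
             if p != target_bin and a == target_bin)
    return fp, fn

predicted = ["toys", "books", "toys", "books", "toys"]
actual    = ["toys", "toys",  "toys", "books", "books"]
print(misclassification_counts(predicted, actual, "toys"))  # (1, 1)
```

Halving both counts is what the fifty percent target amounts to – and at warehouse scale, every avoided misroute is a parcel that doesn’t need re-handling.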
Such is the advance of Amazon’s machine learning models that their warehouses and fulfilment centres have a genuine shot at being fully automated by robotic devices in the next decade, with minimal human supervision. I cannot stress enough how insane that is. There are between one and four million – million! – product bins in every fulfilment centre. Building an AI that can autonomously manage the entirety of that is bordering on ludicrous, but by now no one doubts that Amazon can pull it off.
Plunge further into the depths of Amazon’s technological genius and you’ll come across their range of smart home devices, dubbed “Alexa”. I don’t know who this Alexa person is, but what is very clear is that it’s a big leap from the rudimentary chatbots and web assistants of the late 2000s and early 2010s. The same sceptics who a decade ago dismissed the idea of an AI-based, cloud-trained, ever-adapting smart device are now among the most avid supporters of Amazon’s innovative brilliance.
Far-field speech recognition is used in all of Amazon’s smart Alexa devices, such as the popular Echo Dot. Their cloud-hosted audio recognition algorithms are tuned with thousands of example utterances every day. Whenever an Alexa user speaks to their device, an anonymised recording is sent to Amazon’s data centres and used to train one of the many algorithms behind Alexa’s smart “brain”. Amazon was among the first to implement this kind of technology in a consumer-level device, with the creative ingenuity of hosting all the machine learning on the cloud, so that huge volumes of data can be processed seamlessly.
Unsurprisingly, this integrates flawlessly with Amazon’s broad ecosystem of interconnected AI programs. If you buy dog toys through your voice-controlled Echo one day, don’t be surprised if the Amazon website shows ads for dog treats or dog food the next day, or throws you some recommendations for dog-related audiobooks on Audible. And every time you interact with any Amazon software, whether through their websites or partner sites, or through a hardware device like an Alexa, you incrementally change the algorithm for millions of other users worldwide – making it more accurate, more targeted and more adaptable.
Ending Amazon’s story on Alexa would leave out their latest and greatest phenomenon, one that has turned heads since the day it was announced: Amazon Go. I feel this is one of the most significant implementations of AI in a physical, real-time environment ever achieved. Go is Amazon’s take on a physical store like any other, but these places are absolutely packed with innovation and technology: no checkouts, automated payment, smart product tracking, the works.
The moment you walk into an Amazon Go store, their AI is watching you. It identifies you and can track your movement throughout the entire store with stunning precision. If we’re going to talk about Apple’s facial recognition, we’d best pay some attention to Amazon Go, too. Using just a few cameras to tag and follow twenty or more customers in a store simultaneously takes a lot more than running a simple OpenCV program.
When you pick up a product, the algorithm appends it to your list of items within milliseconds. It’s entirely possible to walk into an Amazon Go store, pick up a product, leave without making a purchase, and then see recommendations for that exact product on the Amazon app a week later. Okay, maybe a team of experienced developers could build a learning model that recommends products based on what you’ve looked at in a store, but it takes real enterprise to make one as accurate as Amazon Go’s.
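Stripped of the computer vision that makes it hard, the bookkeeping behind the virtual basket is simple to sketch. Assume (purely for illustration – these event names and products are invented) that the camera system emits “pick up” and “put back” events:

```python
# Toy model of an Amazon Go-style virtual basket: camera-derived events
# add and remove items as a shopper picks things up or puts them back.

class VirtualBasket:
    def __init__(self):
        self.items = {}  # product_id -> quantity currently held

    def on_event(self, event, product_id):
        if event == "pick_up":
            self.items[product_id] = self.items.get(product_id, 0) + 1
        elif event == "put_back" and self.items.get(product_id, 0) > 0:
            self.items[product_id] -= 1
            if self.items[product_id] == 0:
                del self.items[product_id]

basket = VirtualBasket()
basket.on_event("pick_up", "sandwich")
basket.on_event("pick_up", "cola")
basket.on_event("put_back", "cola")
print(basket.items)  # {'sandwich': 1} - only kept items are charged at exit
```

The hard part, of course, is generating those events reliably from camera footage of twenty simultaneous shoppers – which is exactly why the article singles Go out.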
Amazon has pushed the boundaries of what is possible with cloud-based artificial intelligence. Given their track record of following through on bold – even audacious – project concepts, I wouldn’t be surprised to see fully automated Amazon warehouses and unstaffed Amazon retail stores within the next ten years. I’d also like to see Alexa improve, evolving into a fully fledged chatbot that can learn about the world from millions of people at once. But in this field, “baby steps” doesn’t do Amazon justice. They’re making astronomical leaps in artificial intelligence.