When Amazon.com Inc. executive Dave Limp announced in August that he would leave by the end of this year, I wrote that he would depart without having satisfactorily answered the question of what Alexa was for and why the gadget-buying public truly needed the voice-assistant technology.
Amazon seemed to be having similar doubts — Limp’s unit, reported to have been burning through $5 billion a year, was hit hard by recent cutbacks at the company. Meanwhile, the explosion of AI, kick-started by the launch of ChatGPT, has drastically redefined what a digital assistant could be expected to do. Alexa looked comparatively stupid: ChatGPT users were writing essays; Alexa users were setting egg timers.
OpenAI’s groundbreaking tool sent shockwaves through Silicon Valley, in no small part because it wrong-footed bigger tech companies that hadn’t yet been able to put generative AI capabilities directly into the hands of consumers.
The shifting sands — ChatGPT, Amazon cutbacks, Limp’s departure — have raised the question of where Amazon’s devices unit goes from here. Limp’s swan-song keynote this week, held at the company’s new “HQ2” campus in Arlington, Virginia, attempted to show Alexa could close the gap on AI.
Crucially, the new phase of Alexa could mean a new business model in which users pay an additional monthly fee for a more sophisticated virtual assistant.
Will Alexa finally be able to make money for Amazon? In a nod to the question-and-answer relationship users have with Alexa, below is a conversation I had with Limp after his keynote. He’s a talker, so the exchange is edited for brevity and clarity. I’ll jump in with my thoughts as we go.
Dave Lee: Last time we spoke, you were talking about ambient computing. Now you’re instead calling it ambient “intelligence.” You mentioned “generative AI” within about 30 seconds of the keynote starting. ChatGPT has obviously had an impact. How has Amazon changed what it is doing?
Dave Limp, Amazon: Well, there’s no way we could have shown you what we showed you [on Wednesday], and what we’re going to be shipping before the end of the year, if we had started it when ChatGPT was announced. It’s just impossible to do what you’ve seen us do in that period of time. For the past couple years in the background, we’ve been using generative AI [for building Alexa features]. But what has become pretty clear, at least to me and the team, is that as you feed these models more data, they get larger, they get better. And they haven’t stopped getting better. The real hard work is fine-tuning that model for your use case. In the home, the use case is very different than on your phone or in your browser. And that’s what we’re spending the most time on now.
His argument is that Alexa offers a real-world application for generative AI that ChatGPT can’t reach, given Alexa’s physical presence in homes, cars and TVs. Amazon puts the number of Alexa-enabled devices at “nearly” a billion, with owners interacting with them tens of millions of times every hour. But the initial idea of why that would be valuable to Amazon — people would buy things using voice commands — has failed to materialize.
Instead, if AI and large language models can make Alexa smarter, then Amazon sees an opportunity to have people pay extra for that capability, in much the same way OpenAI charges a monthly fee to use more advanced features of its bot.
Lee: Is there a time when [these Alexa AI features] won’t be free, and a subscription takes their place? What does that look like?
Limp: Yes, we absolutely think that. [As other companies building AI have found,] when you start using these a lot, the cost to train the model, and the cost for inference of the model in the cloud, is substantial. But before we would start charging customers for this — and I believe we will — it has to be remarkable. It has to prove the utility that you’re coming to expect from the “superhuman” assistant. I can paint a line from where we’re going to be right now, which we’re not going to charge for, to something that will have so much utility for every member of the household. We don’t have an idea of a price yet. We’ll talk to customers and learn from them, what they believe the value is. The Alexa that you know and love today is going to remain free.
Lee: But won’t consumers say, “I’ve spent $150 on a device, I’m spending $140 on Amazon Prime”? Why would this not be something you feel you could fold into what people already pay Amazon?
Limp: Both of those things are unbelievably oversized value already. Prime, you know, is the deal of the century — you get a streaming service, you get music, you get books, you get Audible. And, by the way, we’ll ship your packages [in two days or less]. When I compare it to what people are willing to pay for utility for other things — the average selling price of phones is going in the opposite direction. Our prices are coming down, those prices are going up. I do envision this remarkable Alexa out at some period of time. I’m not going to pull the crystal ball out here and predict exactly when that is, but it’s not years away. It’s not decades away. It’s a tractable problem. If what I think the team is building comes to fruition, I will pay for it every day and Sunday.
I think Limp is right here about the potential. What’s less clear is whether it can win a hugely competitive race, particularly when Amazon seems focused on tightening its budgets across the board. While OpenAI has attracted the most headlines for ChatGPT, the real competitor to Alexa’s AI will be Google, which has the upper hand of knowing a lot more about us thanks to its more detailed picture of our web browsing habits and because of the popularity of its web mail and calendar. And then, of course, there’s Apple, which we know is rapidly hiring to improve its Alexa-equivalent, Siri.
For all of them, the goal is to create an assistant in the traditional sense: AI that can handle things on your behalf rather than always needing to be told what to do — managing your diary, maybe even taking your calls. That’s a pretty compelling use case, and one, yes, that Amazon would likely be able to charge for if it can crack it.
Achieving that will be up to Limp’s successor: former Microsoft executive Panos Panay, according to Bloomberg reporting. It’s a huge poach — Panay was Microsoft’s chief product officer, a nearly 20-year veteran at the company who oversaw important product lines such as Windows and the Surface range of laptops and tablets. Not everything was a hit — I met Panay in 2019 when he was gearing up to launch the ill-fated Surface Neo and Duo, folding devices that never found an audience and have been discontinued. (Amazon hasn’t yet confirmed Panay’s arrival.)
Lee: What advice do you offer to your successor?
Limp: I would tell whoever the successor is, to trust this team. They’re gonna do good by you and produce good outputs. And then I say the same thing to any senior leader that comes into Amazon, because it does have a bit of a quirky culture; it’s a founder’s culture. Jeff [Bezos’] culture is still pervasive, even though Andy [Jassy, the CEO] is putting his imprint on it as well. I would say, be patient, you have to learn this culture. And to thrive here, you have to embody it.
Lee: What were some of your mistakes along the way?
Limp: We’ve had some products — Fire Phone, Echo Loop, most recently Halo — where the team built exactly the product we asked them to build, in some cases exceeded it. But it just didn’t resonate, for very different reasons, with customers.
Lee: Is Astro — the home security robot — about to join that club? We didn’t see that little guy this week.
Limp: There’ll be some news about Astro coming out in about a month or two, around small and medium business. We’re coming out of that trial now. No, customers love Astro. Robots are here to stay, I’m convinced of that. I would argue that, over the last decade, the consumer electronics industry — I will leave this as my parting words — is not taking enough risk. If everything is just iterative and just works and nothing fails, that’s the definition of not putting enough beta into your business plan. Oh, you rounded the corners off. Great. Somebody added a new button.
Lee: Who could you possibly be talking about?
Limp: That could be about any phone manufacturer. By the way, I use Apple products and I love my Apple iPhone. So this is not — I’m not dissing Apple in any way, shape or form. I spent a lot of time with that company. I love that company. I know [Amazon] is going to continue taking risks. They will announce things that you think are crazy, but I hope the rest of the industry copies us.
Limp wouldn’t say what he was moving on to next, only that it absolutely wouldn’t be in consumer electronics. He says he’s had a “personal epiphany” and now wants to “give something back” to the world — a common refrain among departing tech executives, of course. (Amazon declined to comment on whether Limp’s contract contained a noncompete clause that would prevent him from working on consumer electronics elsewhere.)
It was the fifth time I’d interviewed Limp over the past decade. Each conversation had Alexa positioned on the cusp of greatness — but even Limp appears to now admit that it needs to be much smarter if it’s going to be the success first envisioned when development began more than a decade ago. The exciting dream of an all-knowing Star Trekkian computer lives on — but it will be up to someone else to achieve it.
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Dave Lee is Bloomberg Opinion's US technology columnist. Previously, he was a San Francisco-based correspondent at the Financial Times and BBC News.