Amazon shifts some voice assistant, face recognition computing to its own chips


An Amazon Echo Dot displays a video feed of the backyard in a model home in Vallejo, California. Amazon said the shift to its Inferentia chips for some of its Alexa work has resulted in 25% lower latency (the delay before a response) at a 30% lower cost. — Bay Area News Group/TNS

Amazon.com Inc on Nov 12 said it shifted part of the computing for its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia Corp.

When users of devices such as Amazon’s Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon’s datacentres for several steps of processing. When Amazon’s computers spit out an answer, that reply is in a text format that must be translated into audible speech for the voice assistant.
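In outline, the processing the article describes is a three-stage pipeline: transcribe the spoken query, compute a text answer, then synthesise audible speech from that text. The sketch below is only an illustration of that flow; every function name is a hypothetical placeholder and does not correspond to any actual Amazon service or API.

```python
# Illustrative sketch of the request flow described above.
# All names are hypothetical placeholders, not Amazon APIs.

def transcribe(audio_bytes: bytes) -> str:
    """Speech-to-text stage (placeholder)."""
    return "what is the weather today"

def answer_query(text: str) -> str:
    """Query understanding and answering stage (placeholder)."""
    return "Today will be sunny with a high of 24 degrees."

def synthesize_speech(text: str) -> bytes:
    """Text-to-speech stage, the kind of inference work the article says
    Amazon has moved onto its own Inferentia chips (placeholder)."""
    return text.encode("utf-8")  # stand-in for generated audio data

def handle_voice_request(audio_bytes: bytes) -> bytes:
    """End-to-end flow: audio query in, synthesized audio reply out."""
    query_text = transcribe(audio_bytes)
    reply_text = answer_query(query_text)
    return synthesize_speech(reply_text)

if __name__ == "__main__":
    reply_audio = handle_voice_request(b"\x00\x01")  # fake audio payload
    print(len(reply_audio), "bytes of placeholder audio returned")
```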
