Wednesday, October 16, 2019

Amazon Echo Show can now identify foods

We all know that Amazon Alexa is a great tool for blind and visually impaired consumers, and it just got even better! A new feature called Show and Tell lets Alexa tell users what an item of food is. Read on to find out more!

Original post here

Amazon has released a new feature for its Echo Show smart speaker enabling the device to identify grocery items for blind or visually-impaired users.
The update, called Show and Tell, requires customers to hold a product up to the speaker's front-facing camera and say aloud: "Alexa, what am I holding?"
Digital assistant Alexa uses computer vision (technology developed to allow machines to 'see' as humans do) and machine learning (teaching computers to learn without being explicitly programmed to do so) to identify the object and say aloud what it is.
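
For readers curious what "computer vision" looks like in practice, here is a minimal illustrative sketch in Python: classifying a photo of a grocery item with a generic pretrained model from the torchvision library. This is a stand-in under stated assumptions, not Amazon's actual Show and Tell pipeline, and the file name "groceries.jpg" is made up.

    import torch
    from torchvision import models
    from PIL import Image

    # Illustrative only: NOT Amazon's Show and Tell system, just a
    # generic pretrained image classifier applied to a product photo.
    weights = models.ResNet50_Weights.IMAGENET1K_V2
    model = models.resnet50(weights=weights)
    model.eval()

    preprocess = weights.transforms()              # the model's expected preprocessing
    image = Image.open("groceries.jpg").convert("RGB")  # assumed: a local product photo
    batch = preprocess(image).unsqueeze(0)         # add a batch dimension

    with torch.no_grad():
        probs = torch.softmax(model(batch)[0], dim=0)

    top_prob, top_idx = probs.max(dim=0)
    label = weights.meta["categories"][int(top_idx)]  # human-readable class name
    print(f"This looks like: {label} ({top_prob.item():.0%} confident)")

A production system like Amazon's would go well beyond this sketch, for example by training on packaged-goods imagery rather than general photo categories, but the core idea of mapping a camera image to a spoken label is the same.
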
"We heard that product identification can be a challenge [for blind and visually-impaired people] and something customers wanted Alexa's help with," Sarah Caplener, head of Amazon's Alexa for Everyone team, said in a statement.
"Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need."
The update is currently available to owners of first- and second-generation Echo Show devices in the US.
The company introduced another accessibility feature earlier this month giving users greater control over the speed at which Alexa speaks.
Users can now choose from seven speaking speeds: the standard rate, four faster rates, and two slower ones, triggered by saying aloud: "Alexa, speak slower" or "Alexa, speak faster".
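
For context on how speaking rate works under the hood, Alexa skill developers can also slow down or speed up speech with SSML prosody tags in a skill's response; the end-user "speak faster/slower" commands above are a separate built-in setting. Below is a minimal sketch of such a skill response in Python; the reminder text is made up, but the JSON structure follows the Alexa Skills Kit response format.

    import json

    # Sketch of an Alexa skill response that slows the spoken output
    # with an SSML <prosody> tag. The reminder text is a made-up example.
    response = {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "SSML",
                "ssml": "<speak>"
                        "<prosody rate='slow'>"
                        "It is time to take your medication."
                        "</prosody>"
                        "</speak>",
            },
            "shouldEndSession": True,
        },
    }
    print(json.dumps(response, indent=2))
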
Despite Amazon's efforts to make its technology more accessible for all users, Alexa's robotic voice can cause "deep distress" to dementia patients who are unfamiliar with it.
According to a report from Doteveryone, a think tank promoting responsible tech, spoken reminders to take medication can cause distress to people with dementia, and people who have had a stroke or who have learning disabilities may not be able to phrase questions in a way the assistants' AI understands.


