By Saqib Shaikh, Software Engineering Manager and Project Lead for Seeing AI
Seeing AI provides people who are blind or have low vision an easier way to understand the world around them through their smartphone cameras. Whether in a room, on a street, in a mall or an office, people are using the app to independently accomplish daily tasks like never before. Seeing AI helps users read printed text in books, restaurant menus, street signs and handwritten notes, and identify banknotes and products via their barcodes. Leveraging on-device facial-recognition technology, the app can even describe people's physical appearance and predict their mood.
Today, we are announcing new Seeing AI features for the enthusiastic community of users who share their experiences with the app, recommend new capabilities and suggest improvements. Inspired by this rich feedback, here are the updates rolling out to Seeing AI to enhance the user experience:
- Explore photos by touch: Leveraging technology from Azure Cognitive Services, including the Custom Vision Service in tandem with the Computer Vision API, this new feature lets users tap an image on a touch screen to hear a description of the objects within it and the spatial relationships between them. Users can explore photos of their surroundings taken on the Scene channel, family photos stored in their photo browser, and even images shared on social media by summoning the options menu from within other apps.
- Native iPad support: For the first time, we're releasing iPad support to provide a better Seeing AI experience on the larger display. iPad support is particularly important to individuals using Seeing AI in academic or other professional settings where they are unable to use a cellular device.
- Channel improvements: Users can now customize the order in which channels are shown, enabling easier access to favorite features. We've also made the face-recognition function easier to reach on the Person channel by moving it directly onto the main screen. Additionally, when analyzing photos from other apps, the app now plays audio cues to indicate that Seeing AI is processing the image.
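To illustrate the kind of logic behind explore-by-touch, here is a minimal sketch of hit-testing a tap point against object detections. The data shape below mimics the `objects` list with `rectangle` bounding boxes returned by the Computer Vision API's object-detection operation, but the sample detections and the `object_at_point` helper are illustrative assumptions, not Seeing AI's actual implementation.

```python
def object_at_point(objects, x, y):
    """Return the name of the smallest detected object whose bounding
    box contains the tap point (x, y), or None if nothing matches."""
    hits = []
    for obj in objects:
        r = obj["rectangle"]
        if r["x"] <= x <= r["x"] + r["w"] and r["y"] <= y <= r["y"] + r["h"]:
            hits.append(obj)
    if not hits:
        return None
    # Prefer the smallest box, so a tap on a person standing in front
    # of a sofa announces the person rather than the sofa behind them.
    hits.sort(key=lambda o: o["rectangle"]["w"] * o["rectangle"]["h"])
    return hits[0]["object"]

# Illustrative detections in the Computer Vision API's response shape.
detections = [
    {"object": "sofa",   "rectangle": {"x": 0,   "y": 100, "w": 600, "h": 300}},
    {"object": "person", "rectangle": {"x": 200, "y": 50,  "w": 150, "h": 350}},
]

print(object_at_point(detections, 250, 200))  # "person" (smaller box wins)
print(object_at_point(detections, 50, 150))   # "sofa"
print(object_at_point(detections, 700, 10))   # None
```

In an app, the returned name would be passed to the platform's screen reader or speech synthesizer so the user hears it spoken at the tap location.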
Since the app’s launch in 2017, Seeing AI has leveraged AI technology and inclusive design to help people with more than 10 million tasks. If you haven’t tried Seeing AI yet, download it for free on the App Store. If you have, please share your thoughts, feedback or questions with us at email@example.com, or through the Disability Answer Desk and Accessibility User Voice Forum.