If you haven’t heard, AI now has eyes, and Meta has announced some enhancements for its Ray-Ban Meta glasses. In a blog post for Global Accessibility Awareness Day, Meta said you can now customize Meta AI to give detailed responses describing what’s in the environment around your smart glasses.
Artificial intelligence is opening a whole new world for accessibility, with new features coming out in droves. Tech giants like Google, Apple and Meta are putting forth a ton of effort to create a world where people with disabilities, such as low or no vision, can more easily interact with the world around them.
While Live AI has been available on the Meta glasses for a while, these additional enhancements will undoubtedly be welcomed by low-vision users.
Below are some of the other…