Jess Weatherbed writing for The Verge: These Smart Glasses Use AI to Help Low-Vision Users

Envision says the camera-equipped Ally Solos Glasses can read and translate text, describe surroundings, search the web, and recognize people, objects, and signs, feeding information to the user via open-ear speakers built into the ear stems.

This is something I found really nice about the Meta Ray-Bans as well. If you have low vision or need help understanding a sign or anything else in front of you, you just say, "Hey Meta, what does this sign say?" or "Hey Meta, what am I looking at?" and the glasses will tell you. Not a bad feature for a face computer that doesn't make you look silly.