
Over the past seven years, I have been living with a large problem to solve: navigating a vision-centric world with low vision caused by macular degeneration. I'm fortunate, blessed, really, that I also have the agency to work on it. Repairing something so deep and longstanding within me promises meaningful rewards: greater independence, renewed confidence, and richer daily conversations about place and presence.
Smart AI glasses (SGs) have become a central part of that repair.
These glasses use speakers built into the arms, a microphone, a camera, and a Bluetooth connection to my phone. Together, these components allow me to re-enter the reading world, hands-free, despite low vision. The effect has been genuinely life-changing, not in a cosmic or spiritual sense, but in the practical, daily ways that shape how I move through the world.
Before SGs, reading mail was cumbersome. Most of it is junk, of course, but some comes from NGOs we support, government agencies, or friends sending cards. I relied on a handheld magnifier. If the document was long, I often skipped it altogether. If it was important, I went to my office, turned on my optical character recognition machine, and followed a multi-step process just to hear it read aloud. The entire experience was clunky, time-consuming, and exhausting: something to avoid if possible.
Enter SGs.
Now, wearing my glasses and connected via Bluetooth to the SG app, I can simply ask the glasses to read whatever I’m looking at. That shift, from hand tools to hands-free, has completely changed my relationship with reading. For longer documents, I can ask the SGs to summarize a page, working through it page by page. This is possible because the glasses are integrated with AI, and it feels like a quiet revolution, one that unfolds not with fanfare, but with relief.
Restaurants offer another clear example. With SGs, I can have a menu read to me, hands-free. The speakers in each arm of the glasses deliver the information discreetly, without disturbing anyone nearby. I can ask for menu categories, request all dishes containing chicken, or drill down into ingredients. The process is fluid, dignified, and still astonishing.
Beyond reading, SGs allow me to make phone calls, send and receive texts, set reminders, and manage calendar items. I can read signage as I walk through downtown. In grocery stores, the glasses tell me what’s in each aisle, read ingredient lists, and provide nutritional information. At any moment, I can ask the glasses to describe what’s in front of me.
I can also access my apps and have books read aloud. Tasks that once required careful finger work, often slowed by peripheral neuropathy, are now accomplished through spoken commands. What used to feel like friction now feels like flow.
To learn all this, I took a week-long class at the Lighthouse SF Earle Baum Campus, where I practiced using the SGs and wore them continuously. What surprised me most was how much more engaged I became with my surroundings. When I can ask to "see" something, to learn what this says or what that is, I become curious again. I ask questions constantly. I've learned to speak softly to minimize confusion for those around me, but the questions keep coming.
The glasses don’t replace vision. They don’t cure macular degeneration. What they do is restore access to information, to choice, to spontaneity. They allow me to participate more fully in the small, ordinary moments that quietly define a day.
One final note: baking. The SGs read ingredients and instructions, convert cups and spoonfuls into ounces or grams, translate Celsius to Fahrenheit, and even read the oven temperature.
They just can’t tell me when the cookies are done. 😉