NaviSense App Uses AI to Help People With Visual Impairments Find Objects Easily
A new navigation tool is making smartphones far more helpful for people with visual impairments. NaviSense, created by researchers at Penn State, uses AI-driven object recognition to guide users toward everyday items. The app scans the surroundings in real time and gives simple audio and vibration cues. As a result, users receive clear guidance without feeling overwhelmed.
How NaviSense Works
NaviSense connects to an external server that runs advanced vision and language models. These models interpret the user's voice command and search the camera view for matching items. In addition, the app ignores objects that are not relevant to the request, which reduces confusion and keeps the experience smooth.
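To make the filtering idea concrete, here is a minimal sketch. NaviSense itself runs vision-language models on a server; in this toy illustration, simple keyword matching stands in for those models, and the function name and detection format are assumptions, not the app's actual API.

```python
# Illustrative sketch only: keyword matching stands in for the server-side
# vision-language models, to show the idea of keeping only detections that
# are relevant to the spoken request.

def filter_detections(request: str, detections: list[dict]) -> list[dict]:
    """Keep only detections whose label appears in the spoken request.

    Each detection is a dict like {"label": "mug", "box": (x, y, w, h)}.
    """
    words = set(request.lower().split())
    return [d for d in detections if d["label"].lower() in words]

detections = [
    {"label": "mug", "box": (120, 80, 40, 50)},
    {"label": "keyboard", "box": (10, 200, 300, 90)},
]
print(filter_detections("find my mug", detections))  # only the mug survives
```

In the real system, a language model would handle synonyms and descriptions ("my coffee cup"), which plain keyword matching cannot.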
The system can even ask follow-up questions when it needs more details. For example, it may ask about the size or color of an item to narrow the search. This makes the interaction feel natural and supportive.
One standout feature is the hand-guidance system. The app tracks how the phone moves and sends directional cues that help users reach the target object. Therefore, users can move with more confidence and independence.
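A directional cue like this can be sketched very simply. The snippet below is a hypothetical illustration, not NaviSense's implementation: it compares the target's position in the camera frame to the frame centre and emits a spoken-style cue. The real app also fuses phone motion tracking; the threshold and names here are assumptions.

```python
# Hypothetical sketch of hand guidance: compare the target's horizontal
# position to the frame centre and emit a simple directional cue.

def direction_cue(target_x: float, frame_width: float,
                  tolerance: float = 0.1) -> str:
    """Return 'left', 'right', or 'ahead' based on horizontal offset."""
    # Normalised offset from centre, in the range -0.5 .. 0.5.
    offset = (target_x - frame_width / 2) / frame_width
    if offset < -tolerance:
        return "left"
    if offset > tolerance:
        return "right"
    return "ahead"

print(direction_cue(100, 640))  # target well left of centre -> "left"
```

On a phone, a cue like this would typically be paired with haptic feedback so the user can respond without listening continuously.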
Strong Early Results
Researchers tested NaviSense with 12 participants, who found objects faster and more accurately than with standard methods. Most users also said the experience felt more intuitive and less stressful. In addition, they appreciated the real-time feedback and the way the app adapted to their needs.
NaviSense shows how AI can transform accessibility tools. It offers a practical, inclusive, and empowering solution that supports more independent living.

