MUMBAI, India, July 11 -- Intellectual Property India has published a patent application (202521061476 A), filed on June 27 by Dr. Parag Puranik; Dr. Rahul Pethe; Dr. Abhay Kasetwar; Mrs. Sonam Chopade; Dr. Namrata Mahakalkar; Dr. Kapil Jajulwar; Dr. Tirupati Goskula; Dr. Praful Yerkewar; and Dr. Giridharilal Agrawal of Nagpur, Maharashtra, for 'Blind-Audio-Vision: Object Detection Sunglasses with Audio Output Using AI.'

The patent application was published on July 11 in issue no. 28/2025.

According to the abstract released by Intellectual Property India: "The Blind-Audio-Vision system is an innovative, cost-effective, and wearable assistive device specifically developed to enhance the mobility, awareness, and independence of visually impaired individuals. The device is embedded into a lightweight pair of smart sunglasses, seamlessly combining artificial intelligence, real-time object detection, and spatial audio feedback to create an intuitive and immersive environmental sensing system. Designed around the ESP32-CAM module, the system leverages onboard processing capabilities and AI algorithms to detect, classify, and localize surrounding objects without the need for external servers or internet connectivity. This ensures that the user's privacy and data remain secure, with all information processed locally on the device.

"The AI model integrated into the ESP32-CAM is trained to recognize a range of everyday obstacles and objects, including pedestrians, vehicles, and static barriers. Once identified, this information is translated into directional and proximity-based audio cues delivered through inbuilt electroacoustic transducers or bone conduction speakers. These auditory signals provide real-time feedback about the object's nature and position - such as 'person on the right' or 'obstacle ahead at close range' - enabling the user to navigate crowded or unfamiliar environments with greater confidence and safety.

"A key advantage of the system lies in its low-latency operation, optimized for quick decision-making. The feedback loop between detection and audio output is minimal, allowing users to receive timely information and respond rapidly to changing surroundings. Furthermore, the spatial audio interface is carefully designed to be non-intrusive, ensuring that users can remain aware of ambient sounds like traffic or conversations, which are vital for real-world navigation.

"The device is modular, lightweight, and comfortable for prolonged wear, with power-efficient hardware and a rechargeable battery that supports extended operation. It also features customizable audio feedback patterns, allowing users to tailor their experience according to personal preferences and needs.

"The Blind-Audio-Vision innovation addresses the limitations of traditional aids like white canes, guide dogs, and smartphone apps by offering an all-in-one solution that ensures object recognition, spatial awareness, and real-time response, all in a discreet and user-friendly wearable. This technology holds the potential to transform the lives of millions of visually impaired individuals by providing them with enhanced autonomy, improved safety, and the ability to interact confidently with their environment."
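
The abstract does not disclose implementation details, but the mapping it describes - from a detected object's class, on-screen position, and estimated range to a short spoken cue such as "person on the right" - can be sketched in a few lines of code. The Python below is purely illustrative and is not taken from the application; the Detection structure, the frame width, the distance thresholds, and the function names are all assumptions made for the sake of the example.

# Illustrative sketch only: assumes a generic on-device detector that yields
# (label, horizontal center, estimated distance) values and shows how such
# detections might be turned into directional, proximity-based audio cues.
from dataclasses import dataclass

# Hypothetical frame width in pixels (e.g., a QVGA capture).
FRAME_WIDTH = 320

@dataclass
class Detection:
    label: str         # e.g. "person", "vehicle", "obstacle"
    center_x: float    # horizontal center of the bounding box, in pixels
    distance_m: float  # estimated distance to the object, in meters

def direction_of(det: Detection) -> str:
    """Bucket the object's horizontal position into left / ahead / right."""
    ratio = det.center_x / FRAME_WIDTH
    if ratio < 0.33:
        return "on the left"
    if ratio > 0.66:
        return "on the right"
    return "ahead"

def proximity_of(det: Detection) -> str:
    """Translate an estimated distance into a coarse proximity phrase."""
    if det.distance_m < 1.5:
        return "at close range"
    if det.distance_m < 4.0:
        return "nearby"
    return "in the distance"

def audio_cue(det: Detection) -> str:
    """Compose a short spoken cue, e.g. 'person on the right at close range'."""
    return f"{det.label} {direction_of(det)} {proximity_of(det)}"

if __name__ == "__main__":
    # Simulated detections; a real device would receive these from its
    # onboard object-detection model and pass the cues to a speech or
    # tone synthesizer driving the speakers.
    for det in (Detection("person", 280.0, 1.2),
                Detection("obstacle", 160.0, 0.8),
                Detection("vehicle", 40.0, 6.5)):
        print(audio_cue(det))

Run as a plain script, this prints cues such as "person on the right at close range" and "obstacle ahead at close range", mirroring the phrasing quoted in the abstract; an actual device would speak or sonify these strings rather than print them.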

Disclaimer: Curated by HT Syndication.