LIDARVIZ

AN OBJECT-DETECTION APPLICATION FOR THE VISUALLY IMPAIRED

________

The LidarViz project began in Unity but quickly switched to native Swift when the team realized the look and feel of certain accessibility features was lacking. DAS rebuilt the project from the ground up in Apple's native Swift language and refined the design to feature large, high-contrast buttons and text to assist the visually impaired - along with full VoiceOver support.
 
We are proud of the combined effort between LidarViz and our team, which has resulted in a tool that genuinely helps those in need while also showing the incredible potential of cutting-edge technologies.

Turn On Your Volume!

HOW IT WORKS

LidarViz combines LiDAR data with ARKit's advanced world-tracking algorithms to create a 3D polygonal mesh - essentially allowing the device to "see" the area around it much as a person does.
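The mesh-building step described above maps directly onto ARKit's scene-reconstruction API. The sketch below (the function name and setup are illustrative, not LidarViz's actual code) shows roughly how an app would turn it on:

```swift
import ARKit
import RealityKit

// Minimal sketch: enable LiDAR-based scene reconstruction so ARKit
// builds a classified 3D polygonal mesh of the surroundings.
// The function name and structure are illustrative assumptions.
func startMeshing(on arView: ARView) {
    // Mesh reconstruction requires a LiDAR-equipped device.
    guard ARWorldTrackingConfiguration
        .supportsSceneReconstruction(.meshWithClassification) else { return }

    let config = ARWorldTrackingConfiguration()
    config.sceneReconstruction = .meshWithClassification
    arView.session.run(config)

    // Optional: overlay the reconstructed mesh so you can watch
    // the device "see" the room as it scans.
    arView.debugOptions.insert(.showSceneUnderstanding)
}
```

With `.meshWithClassification`, ARKit delivers the mesh as `ARMeshAnchor` objects whose faces carry per-face object labels, which is what makes the object recognition in the next section possible.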
Utilizing built-in machine-learning models, ARKit is able to classify mesh regions as common objects, including tables, seats, windows, walls, floors, and doors. LidarViz announces the name of and distance to each object upon recognition, allowing the visually impaired to navigate the world safely and with confidence.
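The announcement step can be sketched as a small helper that maps ARKit's `ARMeshClassification` cases to spoken phrases. The class name, method names, and phrasing below are hypothetical; only the `ARMeshClassification` cases and the speech API are ARKit/AVFoundation as documented:

```swift
import ARKit
import AVFoundation

// Hypothetical helper: speak the name of a classified mesh region
// and its distance, in the spirit of LidarViz's announcements.
final class ObjectAnnouncer {
    private let synthesizer = AVSpeechSynthesizer()

    // Map ARKit's per-face classification to a spoken label.
    private func label(for classification: ARMeshClassification) -> String? {
        switch classification {
        case .table:  return "table"
        case .seat:   return "seat"
        case .window: return "window"
        case .wall:   return "wall"
        case .floor:  return "floor"
        case .door:   return "door"
        default:      return nil  // .ceiling, .none, future cases
        }
    }

    /// Speak e.g. "Door, 1.5 meters ahead" once a mesh face is classified.
    func announce(_ classification: ARMeshClassification, distanceMeters: Float) {
        guard let name = label(for: classification) else { return }
        let phrase = String(format: "%@, %.1f meters ahead", name, distanceMeters)
        synthesizer.speak(AVSpeechUtterance(string: phrase))
    }
}
```

In practice the distance would come from a raycast against the reconstructed mesh (for example, from the center of the screen), with the hit's classification fed into a helper like this one.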
The video below shows how LiDAR scans your surroundings and identifies objects.
INTERESTED IN AR? TELL US ABOUT YOUR PROJECTS!

CONTACT US