How this robotics student used NVIDIA Jetson to build an autonomous wheelchair


Robotics developer Kabilan KB is bringing autonomous navigation capabilities to wheelchairs, with the goal of improving mobility for people with disabilities.

KB is an undergraduate at the Karunya Institute of Technology and Sciences in Coimbatore, India. For this project, he aimed to create a device that could help his cousin, who has a mobility disorder, and other people with disabilities who might not be able to control a manual or motorized wheelchair.

“Sometimes, people don’t have the money to buy an electric wheelchair,” KB said. “In India, only upper- and middle-class people can afford them, so I decided to use the most basic type of motorized wheelchair available and connect it to the Jetson to make it autonomous.”

KB connected a basic motorized wheelchair’s motor hub to depth and lidar sensors, along with USB cameras, to allow it to perceive the environment, and used the NVIDIA Jetson platform to help it plan an obstacle-free path toward a user’s destination.

He also trained the AI algorithms for the autonomous wheelchair using YOLO object detection on the Jetson Nano, as well as on ROS. The wheelchair uses these algorithms to perceive and map its environment and plan a collision-free path.
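To illustrate the detection step, here is a minimal sketch of how YOLO-style detections might be filtered into obstacle alerts for a planner. The tuple format (class label, confidence, bounding box) mirrors common YOLO output, but the class names and threshold are illustrative assumptions, not details from KB's project.

```python
# Hypothetical sketch: filtering YOLO-style detections into obstacle boxes.
# The obstacle classes and confidence threshold below are invented for
# illustration; a real deployment would tune both.

OBSTACLE_CLASSES = {"person", "chair", "table", "door"}
CONF_THRESHOLD = 0.5

def find_obstacles(detections):
    """Keep confident detections of classes the planner must avoid.

    detections: list of (label, confidence, (x, y, w, h)) tuples.
    Returns the bounding boxes of relevant obstacles.
    """
    return [
        box
        for label, conf, box in detections
        if label in OBSTACLE_CLASSES and conf >= CONF_THRESHOLD
    ]

# Example frame: one confident person, one low-confidence chair.
frame_detections = [
    ("person", 0.91, (120, 80, 60, 160)),
    ("chair", 0.32, (300, 200, 80, 90)),
]
print(find_obstacles(frame_detections))  # only the person's box survives
```

The surviving boxes would then be projected onto the wheelchair's map so the path planner can route around them.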

“A person using the motorized wheelchair could provide the location they need to move to, which would already be programmed into the autonomous navigation system or path-planned with assigned numerical values,” KB said. “For example, they could press ‘one’ for the kitchen or ‘two’ for the bedroom, and the autonomous wheelchair will take them there.”
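The numeric interface KB describes could be as simple as a lookup from key presses to preset goal poses. The destination names follow his example, but the coordinates and function are hypothetical stand-ins:

```python
# Hypothetical sketch of the keypad-to-destination mapping KB describes.
# Coordinates are invented (x, y) positions in meters on the 2D map.

DESTINATIONS = {
    "1": ("kitchen", (2.5, 0.8)),
    "2": ("bedroom", (5.0, 3.2)),
}

def goal_for_key(key):
    """Return the (name, pose) for a pressed key, or None if unmapped."""
    return DESTINATIONS.get(key)

print(goal_for_key("1"))  # ('kitchen', (2.5, 0.8))
```

In a ROS-based setup, the returned pose would typically be handed to the navigation stack as a goal, which then takes over planning and motor control.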


A diagram illustrating how the components of the autonomous wheelchair work together. | Source: NVIDIA

An NVIDIA Jetson Nano Developer Kit processes the data from the cameras and sensors in real time. The system then uses deep learning-based computer vision models to detect obstacles in the environment.

The Developer Kit essentially acts as the brain of the autonomous system. It generates a 2D map of its surroundings to plan a collision-free path to the user’s destination, and sends updated signals to the motorized wheelchair to help ensure safe navigation along the way.
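Planning a collision-free route on a 2D map is commonly done over an occupancy grid. The sketch below uses breadth-first search on a toy grid as a stand-in for the project's actual planner, purely to show the idea of routing around mapped obstacles:

```python
from collections import deque

# Illustrative 2D occupancy grid: 0 = free cell, 1 = obstacle.
# The grid and the breadth-first search are stand-ins for whatever
# planner the wheelchair actually runs.
GRID = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]

def plan_path(grid, start, goal):
    """Shortest collision-free path between grid cells, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (r, c), path = frontier.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                frontier.append(((nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

print(plan_path(GRID, (0, 0), (2, 3)))
```

As the sensors report new obstacles, the grid would be updated and the path re-planned, which matches the "updated signals" behavior described above.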

Looking ahead, KB imagines that the project could be expanded to allow a user to control the wheelchair using brain signals from electroencephalograms, or EEGs, connected to machine-learning algorithms.

The project was funded by the Program in Global Surgery and Social Change, which is based at Boston Children’s Hospital and Harvard Medical School.


