Stanford Researchers Build $400 Self-Navigating Smart Cane | Stanford HAI

Date: October 13, 2021
Topics: Healthcare
The cane, incorporating sensing and way-finding approaches from robotics and self-driving vehicles, could reshape life for people who are blind or sight impaired. 

Most know the white cane as a simple-but-crucial tool that assists people with visual impairments in making their way through the world. Researchers at Stanford University have now introduced an affordable robotic cane that guides people with visual impairments safely and efficiently through their environments.

Using tools from autonomous vehicles, the research team has built the augmented cane, which helps people detect and identify obstacles, move easily around those objects, and follow routes both indoors and out.

The augmented cane is not the first smart cane. Research sensor canes can be heavy and expensive — weighing up to 50 pounds with a cost of around $6,000. Currently available sensor canes are technologically limited, only detecting objects right in front of the user. The augmented cane sports cutting-edge sensors, weighs only 3 pounds, can be built at home from off-the-shelf parts and free, open-source software, and costs $400.

Read the study: "Multimodal Sensing and Intuitive Steering Assistance Improve Navigation and Mobility for People with Impaired Vision".


The researchers hope their device will be an affordable and useful option for the more than 250 million people with impaired vision worldwide.

“We wanted something more user-friendly than just a white cane with sensors,” says Patrick Slade, a graduate research assistant in the Stanford Intelligent Systems Laboratory and first author of a paper published in the journal Science Robotics describing the augmented cane. “Something that can not only tell you there’s an object in your way, but tell you what that object is and then help you navigate around it.” The paper comes with a downloadable parts list and DIY solder-at-home instructions.

The cane uses a LIDAR sensor to measure distance to nearby obstacles and then directs users around those areas. | Andrew Brodhead


Borrowing from Autonomous Vehicle Technology

The augmented cane is equipped with a LIDAR sensor, the laser-based technology used in some self-driving cars and aircraft to measure the distance to nearby obstacles. The cane carries additional sensors, including GPS, accelerometers, magnetometers, and gyroscopes like those in a smartphone, that monitor the user’s position, speed, and direction. The cane makes decisions using AI-based wayfinding and robotics algorithms such as simultaneous localization and mapping (SLAM) and visual servoing, which steers the user toward a target in an image.
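The core sensing-to-steering loop described above can be sketched in a few lines. This is an illustrative simplification, not the team's published code: the scan format, safety distance, and steering gain below are invented for the example.

```python
def steering_nudge(scan, goal_bearing, safe_dist=1.0, gain=0.5):
    """Pick the obstacle-free bearing closest to the goal direction and
    return a left/right steering command for the cane's guide wheel.

    scan: list of (bearing_radians, distance_meters) pairs from the LIDAR.
    goal_bearing: desired heading toward the next waypoint, in radians.
    Returns a nudge command: negative = tug left, positive = tug right.
    """
    # Keep only directions with no obstacle inside the safety radius.
    clear = [bearing for bearing, dist in scan if dist > safe_dist]
    if not clear:
        return 0.0  # boxed in: hold steady rather than steer blindly
    # Steer toward the clear bearing closest to the goal direction.
    best = min(clear, key=lambda b: abs(b - goal_bearing))
    # Proportional command, scaled down so the tug stays gentle.
    return gain * best

# Example: obstacle dead ahead (0.4 m at bearing 0), goal straight on.
scan = [(-0.5, 3.0), (0.0, 0.4), (0.5, 2.5)]
print(steering_nudge(scan, goal_bearing=0.0))  # → -0.25 (tug left around it)
```

A real implementation would run this loop many times per second and smooth the commands, but the idea is the same: fuse the obstacle map with the goal direction, then express the result as a gentle physical cue.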

“Our lab is based out of the Department of Aeronautics and Astronautics, and it has been thrilling to take some of the concepts we have been exploring and apply them to assist people with blindness,” says Mykel Kochenderfer, an associate professor of aeronautics and astronautics and an expert in aircraft collision-avoidance systems, who is senior author on the study.

Mounted at the tip of the cane is the pièce de résistance — a motorized, omnidirectional wheel that maintains contact with the ground. This wheel leads the user with impaired vision by gently tugging and nudging, left and right, around impediments. Equipped with built-in GPS and mapping capabilities, the augmented cane can even guide its user to precise locations — like a favorite store in the mall or a local coffee shop.
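Waypoint guidance of this kind ultimately rests on the standard initial-bearing formula between two GPS fixes, which tells the wheel which way to tug. The sketch below is illustrative and not taken from the cane's codebase.

```python
import math

def bearing_to_waypoint(lat, lon, wp_lat, wp_lon):
    """Compass bearing in degrees (0 = north, 90 = east) from the user's
    GPS fix (lat, lon) to a waypoint, via the great-circle
    initial-bearing formula. All inputs are in decimal degrees."""
    phi1, phi2 = math.radians(lat), math.radians(wp_lat)
    dlon = math.radians(wp_lon - lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

# A waypoint due east of the user should bear roughly 90 degrees.
print(round(bearing_to_waypoint(0.0, 0.0, 0.0, 1.0)))  # → 90
```

Comparing this bearing against the heading from the cane's magnetometer and gyroscope yields the error signal that the motorized wheel translates into left or right nudges.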

In real-world tests with volunteers recruited through the Palo Alto Vista Center for the Blind and Visually Impaired, the researchers put the augmented cane in the hands of people with visual impairments as well as sighted people wearing blindfolds. Participants were then asked to complete everyday navigation challenges: walking hallways, avoiding obstacles, and traversing outdoor waypoints.

“We want the humans to be in control but provide them with the right level of gentle guidance to get them where they want to go as safely and efficiently as possible,” Kochenderfer says.

In that regard, the augmented cane excelled. It increased the walking speed for participants with impaired vision by roughly 20 percent over the white cane alone. For sighted people wearing blindfolds, the results were more impressive, increasing their speed by more than a third. An increased walking speed is related to better quality of life, Slade notes, so the hope is that the device could improve the quality of life of its users.

Opening Up Access

The scholars are open-sourcing every aspect of the project. “We wanted to optimize this project for ease of replication and cost. Anyone can go and download all the code, bill of materials, and electronic schematics, all for free,” Kochenderfer says.

“Solder it up at home. Run our code. It’s pretty cool,” Slade adds.

But Kochenderfer notes the cane is still a research prototype. “A lot of significant engineering and experiments are necessary before it is ready for everyday use,” he says, adding that he and the team would welcome partners in industry who could streamline the design and scale up production to make the augmented cane even more affordable.

Next steps for the team include refinements to their prototype and developing a model that uses an everyday smartphone as the processor, an advance that could improve functionality, broaden access to the technology, and further drive down costs.

Additional authors include Arjun Tambe in the Department of Mechanical Engineering at Stanford.

Funding provided by the National Science Foundation, Stanford Graduate Fellowship, and the Stanford Institute for Human-Centered AI (HAI). Learn more at Mobility Smart Cane. 

Contributor: Andrew Myers
