
Blind Driver Challenge – Next-Gen Vehicle to Appear at Daytona July 2, 2010

Posted by emiliekopp in industry robot spotlight, labview robot projects.

The National Federation of the Blind (NFB) has just announced some ambitious plans inspired by RoMeLa. According to their recent press release, they have partnered with Virginia Tech to demonstrate the first street-legal vehicle that can be driven by the blind.

We saw RoMeLa’s work a while back and featured the technology used in their first prototype. The NFB was immediately convinced of the viability RoMeLa’s prototype demonstrated and, with some grant money, put them to work on a street-legal vehicle. The next-generation vehicle will incorporate their non-visual interfaces into a Ford Escape Hybrid, reusing many of the technologies from the prototype. The drive-by-wire system will be semi-autonomous and use auditory and haptic cues to provide obstacle detection for a blind driver.

The first public demo will be at the Daytona International Speedway during the Rolex 24 racing event. News coverage is popping up all over the place.

If you can’t make it to Daytona, RoMeLa will be showing off their first prototype (red dune buggy) at NIWeek in August, in case anyone wants to take it for a spin.

Robot VIPs: Roboticists you should know about (part 1) March 8, 2010

Posted by emiliekopp in Robot VIPs.

In the years that I’ve worked at National Instruments, I’ve come across several engineers and scientists doing incredibly cool things in the robotics industry using NI technologies. I’ve been lucky to meet some famous, some accomplished, and some just downright geeky people who are definitely worth knowing. I’d like to share my list of Roboticists You Should Know About, starting with one of my favorite VIPs:

Name: Dr. Dennis Hong

Title: Associate Professor and the Director of RoMeLa (Robotics & Mechanisms Laboratory) of the Mechanical Engineering Department, Virginia Tech

Expertise:

  • Novel robot locomotion mechanisms
  • Design and analysis of mechanical systems
  • Kinematics and robot mechanism design
  • Humanoid robots
  • Autonomous systems

Geek cred:

Cool projects:

How to Join:

A quick update to the Blind Driver Challenge vehicle October 14, 2009

Posted by emiliekopp in labview robot projects.

Here’s a technical case study that provides some additional information on how the Virginia Tech team built the blind driver vehicle I mentioned in a previous blog post.

In addition, I wanted to share this video I found of 16-year-old Addie Hagen getting prepped to take a first stab at driving a car. It’s short and simple, with little video editing. But watching it truly demonstrates the impact this project can have on a person who thought something like driving a car would never be possible in her lifetime.

Blind Driver Challenge from Virginia Tech: A Semi Autonomous Automobile for the Visually Impaired September 1, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

How nine mechanical engineering undergrads used commercial off-the-shelf (COTS) technology and design hardware and software donated by National Instruments to create a sophisticated, semi-autonomous vehicle that allows the visually impaired to perform a task previously thought impossible: driving a car.

Greg Jannaman (pictured in passenger seat), ME senior and team lead for the Blind Driver Challenge project at VT, was kind enough to offer some technical details on how they made this happen.

How does it work?

One of the keys to success was leveraging COTS technology whenever possible. Rather than building things from scratch, the team purchased hardware from commercial vendors, which let them focus on the important stuff, like how to translate visual information to a blind driver.

example of the information sent back from a LIDAR sensor

So they started with a dune buggy. They tacked a Hokuyo laser range finder (LIDAR) onto the front, which pulses a laser signal across an area in front of the vehicle and detects obstacles from the signals that bounce back. LIDAR is a lot like radar, only it uses light instead of radio waves.
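To give a rough idea of how a scan like this becomes something actionable, here’s a minimal sketch (not the team’s actual code) that finds the nearest obstacle in a list of range readings. I’m assuming a Hokuyo-style scanner that sweeps roughly 240 degrees and reports distances in meters; the scan values below are made up for illustration.

```python
def nearest_obstacle(ranges, fov_deg=240.0, max_range=5.6):
    """Given range readings (meters) swept evenly across the field of
    view, return (distance, bearing_deg) of the closest return, or None
    if nothing is within max_range. Bearing 0 is straight ahead;
    negative bearings are to the left."""
    step = fov_deg / (len(ranges) - 1)
    best = None
    for i, r in enumerate(ranges):
        if 0 < r < max_range:
            bearing = -fov_deg / 2 + i * step
            if best is None or r < best[0]:
                best = (r, bearing)
    return best

# A toy 5-beam scan: the closest return sits to the right of center.
scan = [4.0, 3.2, 1.5, 1.2, 3.8]
print(nearest_obstacle(scan))  # → (1.2, 60.0)
```

The same idea scales to a real scan with hundreds of beams; the point is just that each reading carries both a distance and an implicit angle.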

For control of the vehicle, they used a CompactRIO embedded processing platform. They interfaced their sensors and vehicle actuators directly to an onboard FPGA and performed the environmental perception on the real-time 400 MHz PowerPC processor. Processing the sensor feedback in real time allowed the team to send immediate cues to the driver. But the team did not have to learn to program the FPGA in VHDL, nor did they have to program the embedded processor in machine-level code. Rather, they did all of their programming on one software development platform: LabVIEW. This enabled nine MEs to become embedded programmers on the spot.
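Conceptually, the software boils down to a fixed-rate loop: read the sensors, extract the nearest obstacle, push cues out to the driver. On the CompactRIO that loop is built graphically in LabVIEW on the real-time target, but a plain-Python sketch of the same structure (with hypothetical read_scan/send_haptic/send_audio callbacks standing in for the FPGA I/O) might look like:

```python
import time

def control_loop(read_scan, send_haptic, send_audio, cycles, period_s=0.05):
    """Run a fixed-rate perception/feedback loop (20 Hz at the default
    period). On a real-time OS the timing is deterministic; here we
    simply sleep out whatever remains of each period."""
    for _ in range(cycles):
        t0 = time.monotonic()
        ranges = read_scan()       # acquire LIDAR ranges (via FPGA I/O)
        nearest = min(ranges)      # crude perception: closest return
        send_haptic(nearest)       # drive the vibration vest
        send_audio(nearest)        # drive the steering clicks
        time.sleep(max(0.0, period_s - (time.monotonic() - t0)))
```

The callbacks are stand-ins; the takeaway is the separation between acquisition, perception, and feedback that the FPGA/real-time split gives you for free.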

But how do you send visual feedback to someone who cannot see? You use their other senses, mainly touch and hearing. The driver wears a vest containing vibrating motors, much like the motors you would find in a PS2 controller (this is called haptic feedback, for anyone interested). The CompactRIO makes the vest vibrate to notify the driver of obstacle proximity and to regulate speed, just like a car racing video game. The driver also wears a set of headphones. By sending a series of clicks to the left and right earphones, the system gives the driver audible feedback for navigating around detected obstacles.
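One plausible way to encode those two cues in software (a hypothetical mapping for illustration, not the team’s published design): scale vibration intensity with proximity, and pick an earphone based on which side the obstacle sits on.

```python
def haptic_intensity(distance_m, warn_dist=4.0, stop_dist=0.5):
    """Map obstacle distance to a 0.0-1.0 vibration level: silent beyond
    warn_dist, full strength at stop_dist or closer, linear in between.
    The thresholds are made-up illustration values."""
    if distance_m >= warn_dist:
        return 0.0
    if distance_m <= stop_dist:
        return 1.0
    return (warn_dist - distance_m) / (warn_dist - stop_dist)

def click_side(obstacle_bearing_deg, dead_zone_deg=5.0):
    """Choose which earphone clicks: an obstacle to the left suggests
    steering right, so click the right ear, and vice versa. Straight
    ahead (within the dead zone), click both."""
    if obstacle_bearing_deg < -dead_zone_deg:
        return "right"
    if obstacle_bearing_deg > dead_zone_deg:
        return "left"
    return "both"
```

Whether the clicks mean “obstacle is here” or “steer this way” is a design choice the article doesn’t spell out; either convention works as long as it’s consistent and the driver is trained on it.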

The team has already hosted numerous blind drivers to test the vehicle. The test runs have been so successful that they’ve had to ask drivers to refrain from doing donuts in the parking lot. And they already have some incredible plans to improve the vehicle even further. Watch the video to find out more about the project and learn about their plans to further incorporate haptic feedback.