
Robots rawk at NIWeek August 17, 2011

Posted by emiliekopp in robot events.

Check out some of the awesome robots featured in the Robotics Pavilion at 2011 NIWeek:

SuperDroid Robots showed off the SD6 robot, a super-rugged, treaded UGV that was developed jointly with NI.

Dr. Hong brought the latest and greatest from RoMeLa, showing off their full-size humanoid robot, CHARLI:

I got a picture with CHARLI and Dr. Hong as well:

You can download many of the presentations from the Robotics Summit from the NIWeek Community:

And that’s not everything. Check out Brian Powell’s recap of all-things-robotic at NIWeek on the LabVIEW Field Journal blog:

LabVIEW Robotics at NIWeek 2011

Blind Driver Challenge – Next-Gen Vehicle to Appear at Daytona July 2, 2010

Posted by emiliekopp in industry robot spotlight, labview robot projects.

The National Federation of the Blind (NFB) has just announced some ambitious plans inspired by RoMeLa. According to their recent press release, they have partnered with Virginia Tech to demonstrate the first street-legal vehicle that can be driven by the blind.

We saw RoMeLa’s work awhile back and featured the technology used in their first prototype. The NFB was immediately convinced of the viability RoMeLa’s prototype demonstrated and funded their work on a street-legal vehicle with grant money. The next-generation vehicle will incorporate their non-visual interfaces into a Ford Escape Hybrid, reusing many of the technologies from their prototype. The drive-by-wire system will be semi-autonomous and use auditory and haptic cues to provide obstacle detection for a blind driver.

The first public demo will be at the Daytona International Speedway during the Rolex 24 racing event. News coverage is popping up all over the place.

If you can’t make it to Daytona, RoMeLa will be showing off their first prototype (red dune buggy) at NIWeek in August, in case anyone wants to take it for a spin.

Move over ASIMO: Meet CHARLI, the first U.S. full-sized humanoid April 29, 2010

Posted by emiliekopp in Uncategorized.

Our friends at RoMeLa just unveiled their latest major project to Popular Science magazine: CHARLI, or Cognitive Humanoid Autonomous Robot with Learning Intelligence.

A few months ago, I had heard quite a bit about this project from Dr. Hong himself. It’s no longer a secret that he and his students have created the nation’s first full-sized humanoid robot, a feat that countries like Japan (ASIMO) and Korea (HUBO) had once kept to themselves.

It all started back in 1997, when RoboCup announced its charter to create a fully autonomous team of soccer-playing robots that would beat the World Cup champions by 2050. Since then, teams of researchers and students from all over the world have competed in RoboCup with variations of robotic soccer players: fully simulated robots in software, four-legged Sony Aibos, and more recently, bipedal humanoid robots in small and medium sizes.

RoMeLa is no stranger to RoboCup, competing in the “kid-size” humanoid league with DARwIn. Now they’re scaling up and planning to compete in the “teen-size” league with CHARLI-L (CHARLI-Lightweight). RoMeLa will use their extensive experience in humanoid locomotion, kinematics and control to help CHARLI-L beat his opponents on the field.

But beyond competing for the coveted trophy at RoboCup, CHARLI is already turning heads in the robotics industry. CHARLI-H, the next-gen version of CHARLI-L, has a unique mechanical design inspired by human anatomy. Instead of being actuated by rotational motors at his joints, like most robots of his kind, CHARLI has linear actuators along his limbs that mimic the way our muscles stretch and contract to control movement. CHARLI’s kinematics are like those of a human, with muscle-like actuation and tendon-like compliance and flexibility at his joints.
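
For the kinematics geeks out there, here's a quick back-of-the-envelope sketch of what a linear actuator spanning a joint implies (in Python, since I can't paste LabVIEW diagrams into a blog post). This is a textbook law-of-cosines model with made-up mounting offsets, not RoMeLa's actual geometry:

```python
import math

def joint_angle_from_actuator(length, r1, r2):
    """Angle (radians) of a revolute joint driven by a linear actuator.

    The actuator spans the joint like a muscle: its ends mount at
    distances r1 and r2 from the joint axis, so actuator length L and
    joint angle theta are related by the law of cosines:
        L^2 = r1^2 + r2^2 - 2*r1*r2*cos(theta)
    """
    cos_theta = (r1**2 + r2**2 - length**2) / (2.0 * r1 * r2)
    return math.acos(max(-1.0, min(1.0, cos_theta)))  # clamp for numerical safety

# Hypothetical example: 5 cm and 20 cm offsets, 22 cm actuator length
print(math.degrees(joint_angle_from_actuator(0.22, 0.05, 0.20)))  # ~107 degrees
```

As the actuator extends or contracts, the joint angle follows, which is exactly the muscle-and-tendon intuition described above.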

I’m looking forward to getting up-close and personal with CHARLI-L and CHARLI-H as RoMeLa continues their cutting-edge research. Maxon Motors and National Instruments have been heavily involved in these projects, donating hardware and software for their designs. Specifically, the current design for CHARLI-H uses an NI sbRIO embedded controller, Maxon motors, and Maxon EPOS2 motor controllers; all of CHARLI-H’s code is written in LabVIEW graphical programming software.

Additionally, it was announced that Dr. Hong will be a keynote speaker at NIWeek 2010. As such, he’ll be bringing some of his posse from RoMeLa, both human and robotic, to the Austin Convention Center this year. So expect some more details on CHARLI and other RoMeLa projects in the weeks to come.

Meet Mini Hubo: part 1 of series March 31, 2010

Posted by emiliekopp in labview robot projects.

Meet Mini Hubo, a small humanoid robot based on an original, full-scale humanoid design by the Korea Advanced Institute of Science and Technology (KAIST). Since not everyone has the funds or resources to have a life-size humanoid walking around the lab, Dr. Hong and his students at RoMeLa (Virginia Tech) replicated the original Hubo at a smaller scale to create a more accessible version. The goal of Mini Hubo is to serve as an affordable, open-ended research platform to expand knowledge in the humanoid robotics field.

Since we’re good friends with the engineers at RoMeLa, we recently got our hands on a Mini Hubo here at NI. One of our interns, RJ Gross from Drexel University, spent some quality time with the robot, which we began to refer to as MiNI Hubo, since all of our robots have some sort of emphasis on “NI” (see NIcholas, DaNI, NIro, NItro, GreeNI, etc.). As a result, RJ will be sharing a lot of the LabVIEW code he developed to control MiNI Hubo (coming soon!).

In the meantime, here are some mechanical specs on our MiNI Hubo:

Height: 46 cm

Weight: 2.9 kg

DOF: 22 (but don’t worry, Mini Hubo comes with documentation that includes his forward and inverse kinematics, whew!)

Motors: Robotis Dynamixel RX-28 (LabVIEW drivers for these particular motors will be published soon, so you can get your hands on them too; see the sketch after this list)

Controller: We chose the FitPC2 to control our MiNI Hubo, although the humanoid platform is flexible, so you could use practically anything: Gumstix, NanoATX, PC104, etc.

OS/SW: Our MiNI Hubo is programmed using LabVIEW Robotics and runs Windows on the FitPC2. RJ will be publishing a white paper on running LabVIEW on the FitPC2 soon as well. But again, depending on what controller is selected, the OS/SW is flexible.

Vision: We used a USB webcam. This is also a flexible option for Mini Hubo.

Power: Lithium-ion polymer batteries
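
Until RJ's LabVIEW drivers go live, here's a rough idea of what commanding an RX-28 looks like in text form, using ROBOTIS's dynamixel_sdk Python package (Protocol 1.0, which the RX series speaks). The port name and servo ID are assumptions; treat this as a minimal sketch, not the drivers we'll be publishing:

```python
from dynamixel_sdk import PortHandler, PacketHandler

# RX-28 control table addresses (Protocol 1.0)
ADDR_TORQUE_ENABLE = 24
ADDR_GOAL_POSITION = 30

PORT = "/dev/ttyUSB0"   # assumption: USB2Dynamixel-style serial adapter
DXL_ID = 1              # assumption: servo ID on the bus

port = PortHandler(PORT)
port.openPort()
port.setBaudRate(57600)
packet = PacketHandler(1.0)  # RX series uses Protocol 1.0

packet.write1ByteTxRx(port, DXL_ID, ADDR_TORQUE_ENABLE, 1)
# Goal position is a 10-bit value: 0-1023 over roughly 300 degrees
packet.write2ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, 512)  # center the joint
port.closePort()
```

Multiply that by 22 DOF and add the kinematics from the documentation, and you have the bones of a walking controller.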

We’ll be getting more up-close and personal with this robot in the coming weeks, so stay tuned. I have some video of MiNI Hubo walking at one of the cubicles in R&D that I look forward to sharing.

For anyone interested in a Mini Hubo of their own, be sure to contact RoMeLa. They sell the Mini Hubo platform to researchers.

Robot VIPs: Roboticists you should know about (part 1) March 8, 2010

Posted by emiliekopp in Robot VIPs.

In the years that I’ve worked at National Instruments, I’ve come across several engineers and scientists who are doing incredibly cool things in the robotics industry using NI technologies. I’ve been lucky to meet some famous, some accomplished, and some just downright geeky people who are definitely worth knowing. I’d like to share my list of Roboticists You Should Know About, starting with one of my favorite VIPs:

Name: Dr. Dennis Hong

Title: Associate Professor and Director of RoMeLa (Robotics & Mechanisms Laboratory), Mechanical Engineering Department, Virginia Tech

Expertise:

  • Novel robot locomotion mechanisms
  • Design and analysis of mechanical systems
  • Kinematics and robot mechanism design
  • Humanoid robots
  • Autonomous systems

Geek cred:

Cool projects:

A quick update to the Blind Driver Challenge vehicle October 14, 2009

Posted by emiliekopp in labview robot projects.

Here’s a technical case study that provides some additional information on how the Virginia Tech team built the blind driver vehicle I mentioned in a previous blog post.

In addition, I wanted to share this video I found of 16-year-old Addie Hagen getting prepped to take a first stab at driving a car. It’s short and simple, with little video editing. But watching it truly demonstrates the impact this project can have on a person who thought something like driving a car would never be possible in her lifetime.

RAPHaEL: Another incredible robot design from RoMeLa September 29, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

A lot of you may have already heard on Engadget about the second-generation air-powered robotic hand from Dr. Hong and his engineering students at RoMeLa. But seeing as how NI and RoMeLa have been longtime friends and have worked on many robotics projects together, we’ve got an inside story on how the new and improved RAPHaEL came to be. The following is a recap from Kyle Cothern, one of the undergrads who worked on RAPHaEL 2. He explains how closed-loop control for the mechanical hand was implemented in less than 6 hours, proof that seamless integration between hardware and software can make a big difference in robotic designs.

RAPHaEL (Robotic Air Powered HAnd with Elastic Ligaments) is a robotic hand that uses a novel corrugated-tubing actuation method to achieve human-like grasping with compliance. It was designed by a team of four undergraduate students working under advisor Dr. Dennis Hong in the Robotics & Mechanisms Laboratory (RoMeLa) at Virginia Tech. The hand was originally designed to be operated with simple on/off switches for each solenoid, providing limited control by a user. The first version was modified to use a simple microcontroller to accept switch input and run short routines for demos.

The second version of the hand was designed to include a microcontroller to allow for more complicated grasping methods that require closed-loop control. These included closed-loop position and closed-loop force control to allow for form grasping and force grasping, the two most commonly used human grasping methods. Each method would require analog input from one or more sensors, analog output to one or more pressure regulators, and digital output to control the solenoids, along with a program to calculate the proper control signal to send to the pressure regulators based on the sensor data. Reusing the microcontroller from the first version of the hand was considered; however, it would have taken about a month for the team to redesign the controller to accept sensor input and provide analog output for the pressure regulator. It would then have taken weeks to program and calibrate the controller properly, and a complete redesign would have been necessary to add more sensors or actuators.

At this point, 3 of the 4 students working on the hand graduated and left the lab. With only one student left, implementing a microcontroller would have taken a considerably long time, and due to the complexity of a custom-designed microcontroller, if that student were to leave the lab it would take a very long time to train a new person to operate and continue research with the hand. The remaining team member decided to search for an easy-to-implement, expandable solution so that future research could continue without an extensive learning curve. The stringent requirements for this new controller led the final team member to consult with a colleague, who recommended an NI CompactDAQ (cDAQ) system for its ease of implementation and expandability, along with its ability to acquire the sensor data, control the solenoids, and control the pressure regulator.

Upon receiving the cDAQ, the solenoids were attached and the control software was written in LabVIEW in about 1 hour. Then the electronic pressure regulator was attached in line with the hand, allowing proportional control of the pressure to the hand within 1 more hour. At that point a force sensor was attached to the fingertip to close the loop. The interpretation code for the sensor was written in about 40 minutes, and PID control of the grasping force was functional in a grand total of about 6 hours.
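
The team's code was all LabVIEW, but for readers who think in text, here's a hedged sketch of the same closed-loop idea using NI's nidaqmx Python package: read the fingertip force sensor on an analog input, run a PID step, and write a command voltage to the pressure regulator on an analog output. The channel names, gains, and setpoint below are placeholders, not the team's values:

```python
import time
import nidaqmx

# Assumed cDAQ channel names and PID gains -- illustrative, not RoMeLa's
FORCE_CH, REGULATOR_CH = "cDAQ1Mod1/ai0", "cDAQ1Mod2/ao0"
KP, KI, KD = 2.0, 0.5, 0.05
SETPOINT_V = 1.5           # desired grasp force, expressed as sensor volts
DT = 0.01                  # 100 Hz control loop

with nidaqmx.Task() as ai, nidaqmx.Task() as ao:
    ai.ai_channels.add_ai_voltage_chan(FORCE_CH)
    ao.ao_channels.add_ao_voltage_chan(REGULATOR_CH, min_val=0.0, max_val=10.0)
    integral, prev_err = 0.0, 0.0
    while True:
        err = SETPOINT_V - ai.read()            # force error from sensor
        integral += err * DT
        derivative = (err - prev_err) / DT
        prev_err = err
        out = KP * err + KI * integral + KD * derivative
        ao.write(min(10.0, max(0.0, out)))      # clamp to regulator's 0-10 V range
        time.sleep(DT)
```

The point of the story stands either way: once the I/O is this easy to reach, the control law itself is the quick part.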

The RoMeLa team plans to upgrade their robotic hand even further by moving to a CompactRIO controller. The CompactRIO would allow control calculations and responses to happen at a much faster rate, since it pairs a dedicated FPGA with a real-time embedded PC processor. With a new, beefed-up controller, they plan to test other control schemes, such as position- or vision-based control. They also plan to incorporate additional degrees of freedom (as if there weren’t already enough?!) by adding control of a wrist or arm mechanism.

Dr. Hong also gave a heads up that Discovery Channel will be featuring some of the robotic innovations from RoMeLa, so keep an eye out for an update to this blog post with a link to that footage.

Blind Driver Challenge from Virginia Tech: A Semi-Autonomous Automobile for the Visually Impaired September 1, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

How 9 mechanical engineering undergrads used commercial off-the-shelf (COTS) technology and design hardware and software donated by National Instruments to create a sophisticated, semi-autonomous vehicle that allows the visually impaired to perform a task previously thought impossible: driving a car.

Greg Jannaman (pictured in passenger seat), ME senior and team lead for the Blind Driver Challenge project at VT, was kind enough to offer some technical details on how they made this happen.

How does it work?

One of the keys to success was leveraging COTS technology whenever possible. Rather than building things from scratch, the team purchased hardware from commercial vendors, freeing them to focus on the important stuff, like how to translate visual information for a blind driver.

Example of the information sent back from a LIDAR sensor

So they started with a dune buggy. They tacked a Hokuyo laser range finder (LIDAR) onto the front, which essentially pulses a laser across an area in front of the vehicle and gathers information about obstacles from the signals that bounce back. LIDAR is a lot like radar, only it uses light instead of radio waves.
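
To make "information about obstacles" concrete: a LIDAR scan is just an array of ranges at known angles, and finding the nearest obstacle in the vehicle's path is a small polar-to-Cartesian exercise. Here's a toy sketch; the field of view and corridor width are invented for illustration, not the Hokuyo's or the team's actual parameters:

```python
import math

def nearest_obstacle(ranges, fov_deg=240.0, corridor_m=1.0):
    """Scan a list of LIDAR ranges (meters, evenly spaced over fov_deg)
    and return the distance to the closest return inside a straight
    corridor extending corridor_m to either side of the vehicle."""
    n = len(ranges)
    closest = math.inf
    for i, r in enumerate(ranges):
        angle = math.radians(-fov_deg / 2 + fov_deg * i / (n - 1))
        x, y = r * math.cos(angle), r * math.sin(angle)  # x: ahead, y: lateral
        if x > 0 and abs(y) < corridor_m:
            closest = min(closest, x)
    return closest

# Toy scan: clear everywhere except an object ~2 m straight ahead
scan = [5.0] * 341
scan[170] = 2.0
print(nearest_obstacle(scan))  # -> 2.0
```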

For control of the vehicle, they used a CompactRIO embedded processing platform. They interfaced their sensors and vehicle actuators directly to an onboard FPGA and performed the environmental perception on the real-time 400 MHz PowerPC processor. Processing sensor feedback in real time allowed the team to send immediate feedback to the driver. But the team did not have to learn how to program the FPGA in VHDL, nor did they have to program the embedded processor with machine-level code. Rather, they performed all programming on one software development platform: LabVIEW. This enabled nine MEs to become embedded programmers on the spot.
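
I can't paste LabVIEW diagrams into a blog post, but the architecture itself translates: a fast, deterministic I/O loop feeding a slower perception loop. Here's a loose Python analogy of that split using two threads and a one-slot queue; on the real vehicle the fast loop lives on the FPGA and the slow loop on the 400 MHz real-time processor, and every name below is invented for illustration:

```python
import queue
import random
import threading
import time

scan_q = queue.Queue(maxsize=1)  # one-slot mailbox between the two loops

def read_sensors():
    # Hypothetical stand-in for FPGA-side sampling of the LIDAR
    return [random.uniform(0.5, 5.0) for _ in range(341)]

def io_loop():
    # Fast, fixed-rate loop: on the real vehicle, this role is played
    # by the FPGA, which talks to sensors and actuators directly.
    while True:
        scan = read_sensors()
        try:
            scan_q.put_nowait(scan)
        except queue.Full:
            pass                      # perception is behind; drop this scan
        time.sleep(0.001)

def perception_loop():
    # Slower loop: on the real vehicle, the RT processor interprets
    # the data and decides what cues to send the driver.
    while True:
        scan = scan_q.get()           # block until fresh data arrives
        print(f"nearest return: {min(scan):.2f} m")
        time.sleep(0.05)

threading.Thread(target=io_loop, daemon=True).start()
perception_loop()
```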

But how do you send visual feedback to someone who cannot see? You use their other senses, namely touch and hearing. The driver wears a vest that contains vibrating motors, much like the motors you’d find in a PS2 controller (this is called haptic feedback, for anyone interested). The CompactRIO makes the vest vibrate to notify the driver of obstacle proximity and to regulate speed, just like in a car racing video game. The driver also wears a set of headphones. By sending a series of clicks to the left and right earphones, the system gives the driver audible feedback for navigating around detected obstacles.
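
As a back-of-the-envelope illustration of those cues (the thresholds, scales, and ranges here are invented, not the team's calibration), the mapping from obstacle distance to vest vibration, and from steering correction to click rate, might look something like this:

```python
def vest_intensity(obstacle_m, max_range_m=10.0):
    """Map obstacle distance to vibration intensity in [0, 1]:
    silent beyond max range, full buzz at contact."""
    return max(0.0, min(1.0, 1.0 - obstacle_m / max_range_m))

def click_rates(steer_correction):
    """Map a steering correction in [-1 (hard left), +1 (hard right)]
    to (left_clicks_per_s, right_clicks_per_s) for the earphones."""
    base, span = 1.0, 9.0   # 1-10 clicks per second, an invented scale
    left = base + span * max(0.0, -steer_correction)
    right = base + span * max(0.0, steer_correction)
    return left, right

print(vest_intensity(2.0))   # 0.8 -> strong buzz, obstacle is close
print(click_rates(0.5))      # faster clicks in the right ear: steer right
```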

The team has already hosted numerous blind drivers to test the vehicle. The test runs have been so successful that they’ve had to ask drivers to refrain from doing donuts in the parking lot. And they already have some incredible plans for improving the vehicle even further. Watch the video to find out more about the project and learn about their plans to further incorporate haptic feedback.