
Robots rawk at NIWeek August 17, 2011

Posted by emiliekopp in robot events.

Check out some of the awesome robots featured in the Robotics Pavilion at NIWeek 2011:

SuperDroid Robots showed off the SD6 robot, a super-rugged, treaded UGV that was developed jointly with NI.

Dr. Hong brought the latest and greatest from RoMeLa, showing off their full-size humanoid robot, CHARLI:

I got a picture with CHARLI and Dr. Hong as well:

You can download many of the presentations from the Robotics Summit from the NIWeek Community:

And that’s not everything. Check out Brian Powell’s recap of all-things-robotic at NIWeek on the LabVIEW Field Journal blog:

LabVIEW Robotics at NIWeek 2011

Vecna BEAR Military UGV: A Jack of All Trades July 14, 2010

Posted by emiliekopp in industry robot spotlight.

I’ve written about Vecna Robotics’ Battlefield Extraction-Assist Robot (BEAR) before and am familiar with its development process. Its design engineers used LabVIEW and NI CompactRIO to rapidly build and test early prototypes and win defense contracts.

BotJunkie recently featured a video that captures the Vecna BEAR in action. Admittedly, the actual “extraction” of military casualties still looks a bit awkward and probably needs more work. I’m sure operating a robot with so many degrees of freedom in a potentially hostile environment is extremely difficult and requires an enormous amount of practice. Bottom line: this is definitely one of the friendlier military robots, and it’s helping save lives.

But once you take handling an injured human out of the equation, the robot can serve several other purposes that may not require as much poise. For instance, the BEAR can help with more logistical tasks, like handling munitions and delivering supplies. Its payload capacity is a whopping 500 lbs, so it could definitely lend an extra hand on the battlefield. And because of its dexterity, it could perform maintenance functions as well, such as inspection, decontamination and refueling. Saving time and effort lets troops focus on the task at hand, which indirectly reduces the risk soldiers are exposed to.

So the BEAR is certainly a robotic jack-of-all-trades that could prove extremely useful when fully deployed. It’s fun to imagine full convoys of these surprisingly cute robots in the future (by the way, the video explains the cuteness factor).

Blind Driver Challenge – Next-Gen Vehicle to Appear at Daytona July 2, 2010

Posted by emiliekopp in industry robot spotlight, labview robot projects.

The National Federation of the Blind (NFB) has just announced some ambitious plans inspired by RoMeLa. According to their recent press release, they have partnered with Virginia Tech to demonstrate the first street-legal vehicle that can be driven by the blind.

We saw RoMeLa’s work a while back and featured the technology used in their first prototype. The NFB was immediately convinced of the approach’s viability by what RoMeLa’s prototype demonstrated and, with some grant money, put them to work on a street-legal vehicle. The next-generation vehicle will incorporate their non-visual interfaces into a Ford Escape Hybrid, reusing many of the technologies from the prototype. The drive-by-wire system will be semi-autonomous and use auditory and haptic cues to provide obstacle detection for a blind driver.

The first public demo will be at Daytona International Speedway during the Rolex 24 racing event. News coverage is popping up all over the place.

If you can’t make it to Daytona, RoMeLa will be showing off their first prototype (red dune buggy) at NIWeek in August, in case anyone wants to take it for a spin.

Athena: An FPGA-based UGV from Olin College May 3, 2010

Posted by emiliekopp in labview robot projects.

Meet Athena, a UGV designed by students at Olin College to compete in the 2010 Intelligent Ground Vehicle Competition (IGVC).

Athena avoids obstacles using an NI Single-Board RIO (sb-RIO), an FPGA-based embedded device. Unlike conventional processors, FPGAs implement their processing logic in dedicated hardware via a matrix of reconfigurable gate-array circuitry and do not run an operating system.

Since FPGAs are simply huge fields of programmable gates, they can be programmed into many parallel hardware paths. This makes them truly parallel in nature so different processing operations do not have to compete for the same resources. Programmers can automatically map their solutions directly to the FPGA fabric, allowing them to create any number of task-specific cores that all run like simultaneous parallel circuits inside one FPGA chip.

This becomes very useful for roboticists. For anyone programming sophisticated algorithms for autonomy, FPGAs can make a netbook look like an Apple II. Granted, FPGAs are not the easiest embedded processing solution to master, especially if you don’t have an extensive background in VHDL programming.

However, the students at Olin College have taken up LabVIEW FPGA, which lets them program the FPGA on their sb-RIO using an intuitive, graphical programming language; no VHDL required.

As a result, they can run their algorithms incredibly fast. And the faster your robot can think, the smarter your robot can become.

Here’s what Nick Hobbs, one of Athena’s builders had to say:

The cool thing about this is we’re processing LIDAR scans at 70 Hz. That means in 1/70 of a second we’re evaluating 180 data points’ effects on 16 possible vehicle paths. This is super fast, super parallel processing of a ton of data that couldn’t happen without NI’s FPGA. Oh, and naturally, all programmed in LabVIEW!
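To give a feel for the kind of computation Nick describes, here’s my own back-of-the-envelope sketch (not the team’s actual code) of scoring 16 candidate headings against a 180-point scan. It runs sequentially in Python; the whole point of the FPGA is that each path’s score gets its own circuit and all 16 are computed simultaneously:

```python
NUM_POINTS = 180  # one LIDAR range reading per degree, from -90 to +89
NUM_PATHS = 16    # candidate headings the vehicle could follow

def score_paths(ranges, path_headings, window=10):
    """Score each candidate heading by its worst-case clearance.

    ranges: 180 distances (meters), indexed by angle + 90.
    path_headings: heading (degrees) of each candidate path.
    On an FPGA, each iteration of this loop would become its own
    hardware path, so all scores appear at once.
    """
    scores = []
    for heading in path_headings:
        lo = max(0, heading + 90 - window)
        hi = min(len(ranges), heading + 90 + window + 1)
        scores.append(min(ranges[lo:hi]))  # closest obstacle near this heading
    return scores

# A scan that is clear everywhere except dead ahead:
scan = [5.0] * NUM_POINTS
for i in range(85, 95):
    scan[i] = 0.2  # obstacle straight in front of the vehicle

headings = [h * 10 - 75 for h in range(NUM_PATHS)]  # -75, -65, ..., +75 degrees
scores = score_paths(scan, headings)
best = headings[scores.index(max(scores))]  # steer toward the clearest heading
```

The window size and scoring rule here are invented for illustration; the real pipeline evaluates richer path models, but the parallel structure is the same.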

Now it makes more sense why they named their robot Athena; she’s one smart robot. I’m looking forward to seeing more from Nick’s team. Check out more videos on their YouTube channel.

For more info on FPGA-level programming and other multicore solutions, check out this white paper.

RoboCup 2009: Robot Rescue League Team Spotlight February 25, 2010

Posted by emiliekopp in labview robot projects, Uncategorized.

Get up close and personal with RoboRescue Team FH-Wels, from the University of Applied Sciences Upper Austria. These students and researchers have an impressive résumé of participating in, and winning, a variety of worldwide robotics competitions.

Their latest success: building an autonomous robot to compete in the 2009 RoboCup Rescue League, a competition where autonomous robots navigate through a small-scale obstacle course of complex, unstructured terrain in search of victims awaiting rescue.

Their white paper is extremely informative, providing a breakdown of the hardware and software design. The team wisely chose commercial, off-the-shelf (COTS) technologies for their robot design, including a notebook PC and haptic joystick for the command station, a D-Link router for communications, an NI sb-RIO for the onboard processing, a Hokuyo 2-D laser range finder for mapping, an Xsens IMU for localization, an NI Compact Vision System for image processing, and lots more. To piece it all together, they used LabVIEW for software programming.

One blog post wouldn’t do them justice, so I figured I’d just embed their white paper. It serves as an excellent reference design for anyone building a UGV for search and rescue applications.



And here’s a video of their robot in action:

It’s a bird, it’s a plane, it’s a UAV February 17, 2010

Posted by emiliekopp in industry robot spotlight.

In a face-off between UAV vs. UGV vs. UMV, i.e. aerial vs. ground vs. maritime robot, who would win? If we go by sheer volume of what’s currently deployed in action, unmanned aerial vehicles take the cake. I recently calculated that DoD spending on research and development for UAVs is 2.5 times its investment in UGV-related R&D, and 15 times its investment in UMVs. (Source: FY2009–2034 Unmanned Systems Integrated Roadmap: President’s Budget for Unmanned Systems.)

With more eyes in the skies, UAV developers have seen some recent success. For instance, the US Marine Corps recently completed the first successful demonstration of its new robocopter, a helicopter that was gutted and retrofitted to become a UAV. What’s particularly impressive is that this isn’t your run-of-the-mill unmanned vehicle; no hands-on control is necessary. A flight operator specifies the flight path, and the helicopter autonomously navigates itself to its destination. And the fact that it’s a helicopter helps address the difficulties of delivering supplies or aid to soldiers in particularly rugged terrain. This Popular Science article has more info on the application.

Another recent success: a UAV in the UK has made the first flying-drone arrest. Suspects in a stolen-vehicle case had been evading police during a chase, thanks to a thick, heavy fog. So police called in the help of a UAV and used its thermal imaging to pick up the suspects’ body heat and locate them hiding in a nearby ditch. More info here.

Kind of creepy but still very cool.

National Instruments Releases New Software for Robot Development: Introducing LabVIEW Robotics December 7, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

Well, I found out what the countdown was for. Today, National Instruments released new software specifically for robot builders, LabVIEW Robotics. One of the many perks of being an NI employee is that I can download software directly from our internal network, free of charge, so I decided to check this out for myself. (Note: This blog post is not a full product review, as I haven’t had much time to critique the product, so this will simply be some high-level feature highlights.)

While the product video states that LabVIEW Robotics software is built on 25 years of LabVIEW development, right off the bat, I notice some big differences between LabVIEW 2009 and LabVIEW Robotics. First off, the Getting Started Window:

For anyone not already familiar with LabVIEW, this won’t sound like much, but the Getting Started Window now features a new, improved experience, starting with an embedded, interactive Getting Started Tutorial video (starring robot-friend Shelley Gretlein, a.k.a. RoboGret). There’s a Robotics Project Wizard in the upper left corner that, when you click on it, helps you set up your system architecture and select various processing schemes for your robot. At first glance, this wizard looks best suited for NI hardware (i.e. sb-RIO, cRIO, and the NI LabVIEW Robotics Starter Kit), but it looks like future software updates might include other third-party processing targets (perhaps ARM?).

The next big change I noticed is the all-new Robotics functions palette. I’ve always felt that LabVIEW is a good programming language for robot development, and now it just got better, with several new robotics-specific programming functions, from Velodyne LIDAR sensor drivers to A* path-planning algorithms. It looks like hundreds of new VIs were created for this product release.
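The A* functions ship as LabVIEW VIs, but the algorithm underneath is the textbook one. Here’s a minimal grid-based sketch of it in Python (my own illustration, not NI’s implementation):

```python
import heapq

def a_star(grid, start, goal):
    """Find a shortest 4-connected path on a grid of 0 (free) / 1 (blocked) cells."""
    rows, cols = len(grid), len(grid[0])

    def h(p):
        # Manhattan distance: admissible heuristic on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    # Priority queue of (f = g + h, g, node, path-so-far)
    open_set = [(h(start), 0, start, [start])]
    best_g = {}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in best_g and best_g[node] <= g:
            continue  # already reached this cell more cheaply
        best_g[node] = g
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = (nr, nc)
                heapq.heappush(open_set, (g + 1 + h(step), g + 1, step, path + [step]))
    return None  # goal unreachable

# A wall forces the planner to detour around the right side:
grid = [
    [0, 0, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 0],
]
path = a_star(grid, (0, 0), (2, 0))
```

The real VIs plan over sensor-built maps rather than hand-typed grids, but the search logic is the same idea.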

Which leads me to the Example Finder. There are several new robotics-specific example VIs to help you get started. Some help you connect to third-party software, like Microsoft Robotics Studio or Cogmation robotSim. There are examples for motion control and steering, including differential drive and Mecanum steering. There are also full-fledged example project files for various types of UGVs for you to study and copy/paste from, including the project files for ViNI and NIcholas, two NI-built demonstration robots. And if that’s not enough, NI has launched a new code exchange specifically for robotics, with hundreds of additional examples to share and download online. (A little birdie told me that NI R&D will be contributing code to this exchange between product releases as well.)

This is just my taste of the new features this product has. For the official product specs and feature list, you’ll have to visit the LabVIEW Robotics product page on ni.com. I also found this webcast, Introduction to NI LabVIEW Robotics, if you care to watch a 9-minute demo.

A more critical product review will be coming soon.

Looks like the robot revolution has begun.

Feedback control at its finest: Innovations from UCSD Coordinated Robotics Lab October 19, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

I found this cool video (below), provided by IEEE Spectrum Online, the other day. Josh Romero, its narrator, must have experienced the robot revolution at this year’s NIWeek, as much of the video footage is taken from the Day 3 keynote. Here’s the full, extended version of Dr. Bewley’s talk about the work being done at the UCSD Coordinated Robotics Lab.

This small treaded robot can climb stairs with ease and balance itself on a point.

Josh brings up a good point in his video: automatic feedback control can be the difference between simple, ordinary robots and incredibly sophisticated dynamic systems. Take Switchblade, for example. The robot performs low-level control on a dedicated embedded target (in this case, a 2M-gate FPGA on a Single-Board RIO) to automatically balance itself on a point. An additional real-time processor performs higher-level tasks like maneuvering up a flight of stairs. Being so small yet having such a wide spectrum of mobility, it puts search-and-rescue robots like the PackBot to shame. See you at the top of the stairs, PackBot!

Ok, I take that back. Let’s avoid “shaming” PackBot. Please don’t shoot me, PackBot.
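That balance-on-a-point trick is classic inverted-pendulum feedback. Here’s a toy sketch of the idea (my own illustration with made-up gains, not Switchblade’s actual controller): a PD loop commands a torque that opposes both the tilt angle and the tilt rate, which is enough to stabilize the otherwise unstable plant.

```python
def pd_balance(angle, rate, kp=40.0, kd=5.0):
    """PD feedback: torque that opposes both the tilt and the tilt rate."""
    return -kp * angle - kd * rate

# Toy linearized inverted pendulum: gravity tips it over, feedback rights it.
angle, rate = 0.3, 0.0     # start tilted 0.3 rad, at rest
dt, g_over_l = 0.001, 9.81
for _ in range(5000):      # simulate 5 seconds
    torque = pd_balance(angle, rate)
    accel = g_over_l * angle + torque  # unstable plant + stabilizing control
    rate += accel * dt
    angle += rate * dt     # semi-implicit Euler integration
```

On the real robot this loop runs in FPGA fabric at a fixed, fast rate, which is exactly why the dedicated hardware matters: the balance loop never gets preempted by the higher-level navigation code.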


Stay tuned for a closer look at how Switchblade works in a future post.

ARCH: A humvee that drives itself September 4, 2009

Posted by emiliekopp in industry robot spotlight.

Speaking of autonomous vehicles, here’s an autonomous humvee from TORC Technologies, a company founded by a couple of Virginia Tech graduates several years ago. Needless to say, VT is a hotbed of UGV experts.

The idea is that an unmanned humvee takes the lead of a convoy. That way, if any IEDs or land mines are encountered on the convoy’s path, the unmanned vehicle is targeted first. The leading humvee can be teleoperated (remotely controlled) by an operator in the chase vehicle, or it can operate semi-autonomously or fully autonomously.

TORC has become an industry expert in unmanned and autonomous vehicle systems. They helped create VictorTango, the vehicle that won 3rd prize in the DARPA Urban Challenge. NI has had the pleasure of working with them quite a bit. They’ve done a lot of their development using LabVIEW and CompactRIO.

Something I thought was cool is that TORC can turn any commercial automobile into an autonomous/teleoperated vehicle in a matter of hours. They do this by interfacing to the vehicle control unit over a Controller Area Network (CAN) bus and then communicating with it using JAUS commands. JAUS is like a universal language in military robotics. You can control one robot with specific JAUS commands and then control a different robot with the same commands, assuming they have the same computing nodes.
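JAUS message formats are defined by the standard, so I won’t pretend to reproduce them here. But purely as a hypothetical illustration of the serialization step (made-up arbitration ID and field layout, not real JAUS or TORC code), here’s how a drive command might get packed into an 8-byte CAN payload:

```python
import struct

CMD_DRIVE = 0x1A2  # made-up CAN arbitration ID, for illustration only

def pack_drive_frame(throttle_pct, steering_deg):
    """Pack a hypothetical drive command into an 8-byte CAN payload.

    Invented layout: throttle as a uint8 percent, steering as an int16
    in centidegrees, little-endian, padded out to 8 bytes.
    """
    return CMD_DRIVE, struct.pack("<Bh5x", int(throttle_pct),
                                  int(steering_deg * 100))

can_id, payload = pack_drive_frame(55, -12.5)  # 55% throttle, 12.5 deg left
```

The point is just that once the vehicle control unit speaks CAN, “driving” the car reduces to writing well-formed frames onto the bus, which is why the retrofit can happen in hours rather than months.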

As if this blog post doesn’t already have enough acronyms, here’s the name of the autonomous humvee that TORC delivered to the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC):

ARCH: Autonomous Remote Control HMMWVs or Autonomous Remote Control for High Mobility Multipurpose Wheeled Vehicles (an acronym that contains yet another long acronym; engineers love this stuff)