
Athena: An FPGA-based UGV from Olin College May 3, 2010

Posted by emiliekopp in labview robot projects.

Meet Athena, a UGV designed by students at Olin College to compete in the 2010 Intelligent Ground Vehicle Competition (IGVC).

Athena avoids obstacles using an NI SingleBoardRIO, an FPGA-based embedded controller. Unlike conventional processors, FPGAs implement processing logic in dedicated hardware, via a matrix of reconfigurable gate-array circuitry, and do not run an operating system.

Since FPGAs are simply huge fields of programmable gates, they can be programmed into many parallel hardware paths. This makes them truly parallel in nature, so different processing operations do not have to compete for the same resources. Programmers can map their solutions directly to the FPGA fabric, creating any number of task-specific cores that all run as simultaneous, parallel circuits inside a single FPGA chip.

This becomes very useful for roboticists. For anyone programming sophisticated autonomy algorithms, an FPGA can make a netbook look like an Apple II. Granted, FPGAs are not the easiest embedded processing solution to master, especially if you don’t have an extensive background in VHDL programming.

However, the students at Olin College have taken up LabVIEW FPGA, which lets them program the FPGA on their sb-RIO using an intuitive, graphical programming language; no VHDL programming necessary.

As a result, they can run their algorithms incredibly fast, and the faster your robot can think, the smarter your robot can become.

Here’s what Nick Hobbs, one of Athena’s builders had to say:

The cool thing about this is we’re processing LIDAR scans at 70Hz. That means in 1/70 of a second we’re evaluating 180 data points’ effects on 16 possible vehicle paths. This is super fast, super parallel processing of a ton of data that couldn’t happen without NI’s FPGA. Oh, and naturally, all programmed in LabVIEW!
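To put that in perspective, 180 points against 16 paths is 2,880 point-path checks every 1/70 of a second, or roughly 200,000 per second. For readers who want a feel for that computation, here is a rough, sequential sketch in Python/NumPy. This is not Athena’s actual code (that is LabVIEW compiled to parallel circuits on the FPGA), and the corridor width and sample values below are made up for illustration:

# Hypothetical sketch, not Athena's LabVIEW/FPGA code: scoring 16 candidate
# steering arcs against a 180-point LIDAR scan, roughly the computation Nick
# describes running at 70 Hz.
import numpy as np

NUM_POINTS = 180          # one range reading per degree (assumed)
NUM_PATHS = 16            # candidate steering arcs

def score_paths(ranges_m, path_headings_rad):
    """Return a clearance score (closest obstacle, in metres) per path."""
    beam_angles = np.deg2rad(np.arange(NUM_POINTS) - 90)        # -90..+89 deg
    # Angular distance of every beam from every candidate heading: (16, 180)
    offsets = np.abs(path_headings_rad[:, None] - beam_angles[None, :])
    # Only beams near a path can block it (assumed 15-degree corridor)
    relevant = offsets < np.deg2rad(15)
    masked = np.where(relevant, ranges_m[None, :], np.inf)
    return masked.min(axis=1)

if __name__ == "__main__":
    scan = np.random.uniform(0.3, 5.0, NUM_POINTS)               # fake scan
    headings = np.linspace(-np.pi / 3, np.pi / 3, NUM_PATHS)
    scores = score_paths(scan, headings)
    best = int(np.argmax(scores))
    print(f"best path: {best}, clearance: {scores[best]:.2f} m")

On the FPGA, each candidate path can get its own dedicated circuit, so all 16 scores come out simultaneously instead of being looped over like this.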

It makes more sense now why they named their robot Athena; she’s one smart robot. I’m looking forward to seeing more from Nick’s team. Check out more videos on their YouTube channel.

For more info on FPGA-level programming and other multicore solutions, check out this white paper.

Move over ASIMO: Meet CHARLI, the first U.S. full-sized humanoid April 29, 2010

Posted by emiliekopp in Uncategorized.

Our friends at RoMeLa just unveiled their latest major project to Popular Science magazine: CHARLI, or Cognitive Humanoid Autonomous Robot with Learning Intelligence.

A few months ago, I had heard quite a bit about this project from Dr. Hong himself. It’s no longer a secret that he and his students have created the nation’s first full-sized humanoid robot, a feat that countries like Japan (ASIMO) and Korea (HUBO) had once kept to themselves.

It all started back in 1997, when RoboCup announced its charter to create a fully autonomous team of soccer-playing robots that would beat the World Cup champions by 2050. Since then, teams of researchers and students from all over the world have competed in RoboCup with variations of robotic soccer players, from fully simulated robots in software, to four-legged Sony Aibos, and more recently, bipedal humanoid robots in small and medium sizes.

RoMeLa is no stranger to RoboCup, competing in the “kid-size” humanoid league with DARwIn. Now they’re scaling up and planning to compete in the “teen-size” league with CHARLI-L (CHARLI-Lightweight). RoMeLa will use their extensive experience in humanoid locomotion, kinematics and control to help CHARLI-L beat his opponents on the field.

But beyond competing for the coveted trophy at RoboCup, CHARLI is already turning heads in the robotics industry. CHARLI-H, the next-gen version of CHARLI-L, has a unique mechanical design inspired by human anatomy. Instead of being actuated by rotational motors in his joints, like most robots of his kind, CHARLI has linear actuators along his limbs that mimic the way our muscles stretch and contract to control movement. CHARLI’s kinematics are like those of a human, with muscle-like actuation and with compliance and flexibility at his joints, like tendons.
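To make the muscle-like actuation a bit more concrete: a linear actuator that spans a joint, attached some distance from the pivot on either side, ties its length to the joint angle through the law of cosines. This is a generic textbook relation, not RoMeLa’s actual model, and the attachment distances below are made up; a quick Python sketch:

# Hypothetical illustration, not RoMeLa's kinematics: a linear actuator
# attached a and b metres from a joint pivot acts like a muscle spanning the
# joint; the law of cosines relates actuator length to joint angle.
import math

def joint_angle_from_actuator(length, a=0.10, b=0.30):
    """Joint angle (rad) produced by an actuator of the given length (m)."""
    cos_theta = (a**2 + b**2 - length**2) / (2 * a * b)
    return math.acos(max(-1.0, min(1.0, cos_theta)))  # clamp for safety

def actuator_length_from_angle(theta, a=0.10, b=0.30):
    """Inverse relation: actuator length (m) needed for a joint angle (rad)."""
    return math.sqrt(a**2 + b**2 - 2 * a * b * math.cos(theta))

if __name__ == "__main__":
    for deg in (30, 60, 90, 120):
        L = actuator_length_from_angle(math.radians(deg))
        print(f"{deg:3d} deg -> actuator length {L * 100:.1f} cm")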

I’m looking forward to getting up close and personal with CHARLI-L and CHARLI-H as RoMeLa continues their cutting-edge research. Maxon Motors and National Instruments have been heavily involved in these projects, donating hardware and software for the designs. Specifically, the current design for CHARLI-H uses an NI sbRIO embedded controller, Maxon motors and Maxon EPOS2 motor controllers; all of CHARLI-H’s code is written in LabVIEW graphical programming software.

Additionally, it was announced that Dr. Hong will be a keynote speaker at NIWeek 2010. He’ll be bringing some of his posse from RoMeLa, both human and robotic, to the Austin Convention Center this year, so expect more details on CHARLI and other RoMeLa projects in the weeks to come.

RoboCup 2009: Robot Rescue League Team Spotlight February 25, 2010

Posted by emiliekopp in labview robot projects, Uncategorized.

Get up close and personal with RoboRescue Team FH-Wels, from the University of Applied Sciences Upper Austria. These students and researchers have an impressive résumé of participating in and winning a variety of worldwide robotics competitions.

Their latest success: building an autonomous robot to compete in the 2009 RoboCup Rescue League, a competition where autonomous robots navigate a small-scale obstacle course of complex, unstructured terrain in search of victims awaiting rescue.

Their white paper is extremely informative, providing a breakdown of the hardware and software design. The team wisely chose commercial, off-the-shelf (COTS) technologies for their robot design, including a notebook PC and haptic joystick for the command station, a D-Link router for communications, an NI sb-RIO for the onboard processing, a Hokuyo 2-D laser range finder for mapping, an Xsens IMU for localization, an NI Compact Vision System for image processing, and lots more. To piece it all together, they used LabVIEW for software programming.
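As a flavor of what that 2-D laser range finder contributes to mapping, here is a minimal Python/NumPy sketch that drops the hits from one planar scan into an occupancy grid. It is a generic illustration, not the team’s LabVIEW code; the grid size, resolution, and fake scan are assumptions:

# Hedged sketch, not Team FH-Wels' code: marking obstacle cells in a 2-D
# occupancy grid from a single planar laser scan.
import numpy as np

GRID_SIZE = 200          # 200 x 200 cells
RESOLUTION = 0.05        # metres per cell -> 10 m x 10 m map (assumed)

def mark_scan(grid, ranges_m, angles_rad, pose):
    """Mark laser hits as occupied cells; the map origin is the grid centre."""
    x, y, heading = pose                                  # robot pose (m, m, rad)
    hit_x = x + ranges_m * np.cos(angles_rad + heading)
    hit_y = y + ranges_m * np.sin(angles_rad + heading)
    col = (hit_x / RESOLUTION + GRID_SIZE / 2).astype(int)
    row = (hit_y / RESOLUTION + GRID_SIZE / 2).astype(int)
    valid = (col >= 0) & (col < GRID_SIZE) & (row >= 0) & (row < GRID_SIZE)
    grid[row[valid], col[valid]] = 1
    return grid

if __name__ == "__main__":
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)
    angles = np.deg2rad(np.linspace(-120, 120, 683))      # wide planar field of view
    ranges = np.full_like(angles, 3.0)                    # fake wall 3 m away
    mark_scan(grid, ranges, angles, pose=(0.0, 0.0, 0.0))
    print("occupied cells:", int(grid.sum()))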

One blog post wouldn’t do them justice, so I figured I’d just embed their white paper. It serves as an excellent reference design for anyone building a UGV for search and rescue applications.



And here’s a video of their robot in action:

National Instruments Releases New Software for Robot Development: Introducing LabVIEW Robotics December 7, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

Well, I found out what the countdown was for. Today, National Instruments released new software specifically for robot builders, LabVIEW Robotics. One of the many perks of being an NI employee is that I can download software directly from our internal network, free of charge, so I decided to check this out for myself. (Note: This blog post is not a full product review, as I haven’t had much time to critique the product, so this will simply be some high-level feature highlights.)

While the product video states that LabVIEW Robotics software is built on 25 years of LabVIEW development, right off the bat I noticed some big differences between LabVIEW 2009 and LabVIEW Robotics. First off, the Getting Started Window:

For anyone not already familiar with LabVIEW, this won’t sound like much, but the Getting Started Window now features a new, improved experience, starting with an embedded, interactive Getting Started tutorial video (starring robot-friend Shelley Gretlein, a.k.a. RoboGret). There’s a Robotics Project Wizard in the upper left corner that helps you set up your system architecture and select processing schemes for your robot. At first glance, this wizard looks best suited for NI hardware (sbRIO, cRIO, or the NI LabVIEW Robotics Starter Kit), but it looks like future software updates might include other, third-party processing targets (perhaps ARM?).

The next big change I noticed is the all-new Robotics functions palette. I’ve always felt that LabVIEW is a good programming language for robot development, and now it just got better, with several new robotics-specific functions, from Velodyne LIDAR sensor drivers to A* path-planning algorithms. There look to be hundreds of new VIs created for this product release.
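For anyone who hasn’t met A* before, here is a minimal, textbook grid version in Python. It is not NI’s VI, just an illustration of the kind of path-planning building block the palette now provides ready-made:

# Minimal textbook A* on a 4-connected grid of 0 (free) / 1 (blocked) cells.
import heapq

def a_star(grid, start, goal):
    """Return the shortest path from start to goal as a list of (row, col)."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    open_set = [(heuristic(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, current, parent = heapq.heappop(open_set)
        if current in came_from:
            continue                                   # already expanded
        came_from[current] = parent
        if current == goal:                            # rebuild path
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_g = g + 1
                if new_g < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = new_g
                    heapq.heappush(open_set,
                                   (new_g + heuristic((nr, nc)), new_g, (nr, nc), current))
    return None  # no path exists

if __name__ == "__main__":
    world = [[0, 0, 0, 0],
             [1, 1, 0, 1],
             [0, 0, 0, 0]]
    print(a_star(world, (0, 0), (2, 0)))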

Which leads me to the Example Finder. There are several new robotics-specific example VIs to choose from to help you get started. Some examples help you connect to third-party software, like Microsoft Robotics Studio or Cogmation robotSim. There are examples for motion control and steering, including differential drive and mecanum steering. There are also full-fledged example project files for various types of UGVs for you to study and copy/paste from, including the project files for ViNI and NIcholas, two NI-built demonstration robots. And if that’s not enough, NI has launched a new code exchange specifically for robotics, with hundreds of additional examples to share and download online. (A little birdie told me that NI R&D will be contributing code to this exchange between product releases as well.)
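As a taste of the math behind the differential drive example (a generic kinematics sketch, not NI’s example code, and the wheel dimensions are made up):

# Differential-drive steering: convert a desired forward speed and turn rate
# into left/right wheel angular velocities.
import math

def differential_drive(v, omega, wheel_base=0.40, wheel_radius=0.10):
    """Return (left, right) wheel angular velocities in rad/s.

    v:     desired forward velocity in m/s
    omega: desired yaw rate in rad/s (positive = counter-clockwise)
    """
    v_left = v - omega * wheel_base / 2.0
    v_right = v + omega * wheel_base / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

if __name__ == "__main__":
    # Drive forward at 0.5 m/s while turning left at 30 deg/s
    left, right = differential_drive(0.5, math.radians(30))
    print(f"left wheel: {left:.2f} rad/s, right wheel: {right:.2f} rad/s")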

This is just a taste of the new features this product has. To get the official product specs and feature list, you’ll have to visit the LabVIEW Robotics product page on ni.com. I also found this webcast, Introduction to NI LabVIEW Robotics, if you care to watch a 9-minute demo.

A more critical product review will be coming soon.

Looks like the robot revolution has begun.

Feedback control at its finest: Innovations from UCSD Coordinated Robotics Lab October 19, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

I found this cool video (below), provided by IEEE Spectrum Online, the other day. Josh Romero, its narrator, must have experienced the robot revolution at this year’s NIWeek, as much of the video footage is taken from the Day 3 keynote. Here’s the full, extended version of Dr. Bewley’s talk about the work being done at the UCSD Coordinated Robotics Lab.


This small treaded robot can climb stairs with ease and balance itself on a point.

Josh brings up a good point in his video: automatic feedback control can be the difference between a simple, ordinary robot and an incredibly sophisticated dynamic system. Take Switchblade, for example. The robot performs low-level control on a dedicated, embedded processor (in this case, a 2M-gate FPGA on a SingleBoardRIO) to automatically balance itself on a point. An additional real-time processor performs higher-level tasks like maneuvering up a flight of stairs. Being so small yet having such a wide spectrum of mobility, it puts search-and-rescue robots like the PackBot to shame. See you at the top of the stairs, PackBot!

Ok, I take that back. Let’s avoid “shaming” PackBot. Please don’t shoot me, PackBot.
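For the curious, that low-level balancing loop boils down to feedback control of a tilt angle. Here is a bare-bones PID sketch in Python with a toy simulation; the gains, loop rate, and dynamics are invented for illustration and have nothing to do with Switchblade’s actual controller:

# Generic PID feedback loop, not Switchblade's code: drive a tilt angle
# back toward zero (upright).
class PID:
    """Textbook PID controller; dt is the fixed loop period in seconds."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    # Toy inverted-pendulum-style simulation: the tilt is destabilizing on its
    # own, and the controller's torque pushes it back toward upright.
    dt = 0.001                                    # 1 kHz loop, the sort of rate an FPGA handles easily
    pid = PID(kp=40.0, ki=20.0, kd=6.0, dt=dt)    # made-up gains
    angle, angular_velocity = 0.2, 0.0            # start 0.2 rad off balance
    for _ in range(5000):                         # simulate 5 seconds
        torque = pid.update(0.0, angle)
        angular_acceleration = 9.81 * angle + torque   # crude linearized dynamics
        angular_velocity += angular_acceleration * dt
        angle += angular_velocity * dt
    print(f"tilt after 5 s: {angle:.4f} rad")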


Stay tuned for a closer look at how Switchblade works in a future post.

How to Build a Quad Rotor UAV October 6, 2009

Posted by emiliekopp in code, labview robot projects.

Blog Spotlight: Dr. Ben Black, a Systems Engineer at National Instruments, is documenting his trials and tribulations on his blog as he builds an autonomous unmanned aerial vehicle (UAV) using a SingleBoardRIO (2M-gate FPGA + 400 MHz PowerPC processor), four brushless motors, some serious controls theory and lots of Gorilla Glue.

I particularly appreciate his attention to detail, stepping through elements of UAV design that are often taken for granted, like choosing reference frames, when you should use PID control, and the genius that is xkcd.
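As a tiny example of why reference frames matter (my own illustration, not Ben’s code): a velocity measured in the quad rotor’s body frame has to be rotated into the world frame before you can navigate with it. In Python/NumPy, with arbitrary sample angles:

# Body-to-world rotation using a Z-Y-X (yaw-pitch-roll) convention.
import numpy as np

def body_to_world(roll, pitch, yaw):
    """Rotation matrix that maps body-frame vectors into the world frame."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll
    return Rz @ Ry @ Rx

if __name__ == "__main__":
    v_body = np.array([1.0, 0.0, 0.0])                 # 1 m/s straight ahead
    R = body_to_world(roll=0.0, pitch=0.0, yaw=np.pi / 2)
    print(np.round(R @ v_body, 3))                     # -> roughly [0, 1, 0]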

Like most roboticists, throughout the design process, he has to wear many hats. I think Ben put it best:

I think that the true interdisciplinary nature of the problems really makes the field interesting. A roboticist has to have at minimum a working knowledge of mechanical engineering, electrical engineering, computer science / engineering and controls engineering. My background is from the world of mechanical engineering (with a little dabbling in bio-mechanics), but I end up building circuits and writing tons of code. I’ve had to pick up / stumble through the electrical and computer science knowledge as I go along, and I know just enough to make me dangerous (I probably don’t always practice safe electrons…sometimes I let the magic smoke out of the circuits…and I definitely couldn’t write a bubble sort algorithm to save my life).

My point in this soap-box rant is that in the world of robotics it’s good to have a specialty, but to really put together a working system you also need to be a bit of a generalist.

For anyone even considering building a UAV (or anyone who just likes to read about cool robotics projects), I suggest you check it out. He shares his .m-files, LabVIEW code, and more. Thanks, Ben.