
U.S. Military Insect Drones (powered by LabVIEW?) July 15, 2011

Posted by emiliekopp in industry robot spotlight, labview robot projects.

Do the screens on any of the monitors look familiar?

Read the news coverage of these robots here:

Micro-machines are go: The U.S. military drones that are so small they even look like insects

Open Source Code: Using XBox Kinect with the LabVIEW Robotics Starter Kit May 25, 2011

Posted by emiliekopp in code, labview robot projects.

This example features sensor fusion, using the Kinect to gather the 3D image of the world and a scanning sonar to help avoid obstacles that get too close for the Kinect to see.
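To give a rough idea of the fusion logic (the actual example is built from LabVIEW VIs; this Python sketch, with made-up names and thresholds, just illustrates the idea):

```python
# Hypothetical sketch of the fusion idea: trust the Kinect depth image at
# range, but fall back to the sonar for obstacles inside the Kinect's
# minimum sensing distance. Names and thresholds are illustrative,
# not from the actual LabVIEW example.

KINECT_MIN_RANGE_M = 0.5   # the Kinect can't see much closer than this

def nearest_obstacle(kinect_depths_m, sonar_range_m):
    """Fuse the two sensors into one 'closest obstacle' estimate (meters)."""
    # Ignore invalid (0.0) Kinect pixels and anything below its min range
    valid = [d for d in kinect_depths_m if d >= KINECT_MIN_RANGE_M]
    kinect_nearest = min(valid) if valid else float("inf")
    # The sonar covers the near field the Kinect misses
    return min(kinect_nearest, sonar_range_m)

def drive_command(obstacle_m, stop_dist_m=0.3):
    """Very simple reactive policy: turn away if something is too close."""
    return "turn" if obstacle_m < stop_dist_m else "forward"
```

The point of the second sensor is the near-field blind spot: once an obstacle gets inside the Kinect's minimum range, only the sonar still sees it.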

Check out the full recipe on the NI Robotics Code Exchange, including hardware lists, software and setup requirements, as well as code descriptions and downloads.

Download: Using the XBox Kinect with LabVIEW Robotics Starter Kit

Kinect 6D Visualization in LabVIEW April 19, 2011

Posted by emiliekopp in code, labview robot projects.

The LabVIEW Kinect code keeps rolling in. I am happy to share yet another example that is available for free download.

This one is very similar to John Wu’s LabVIEW + Kinect example I shared a while back. Karl Muecke, an NI R&D engineer, shares his 6D visualization example on the NI Robotics Code Exchange.

You can view a video screen capture of the demo and download his open source code here:


More LabVIEW Development for the Xbox Kinect April 7, 2011

Posted by emiliekopp in code, labview robot projects.

So remember when I said the Xbox Kinect was going to revolutionize robotics (at least from a sensor-hardware point of view)?

Well, when it rains, it pours: More and more LabVIEW developers are uniting, creating and sharing drivers that allow you to communicate with the Xbox Kinect hardware using LabVIEW software.

An NI Community member, anfredres86, has published his VI driver library, making it easy to download and install the necessary files for you to start developing robotics applications in LabVIEW that utilize the Kinect hardware for robot sensing.

Here is a video of the 2D occupancy grid mapping example he put together using LabVIEW and the Kinect:
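For those curious about what 2D occupancy grid mapping involves under the hood, here is a rough Python sketch of the general technique (not anfredres86’s actual code; the grid size, resolution, and log-odds increment are illustrative):

```python
# Hedged sketch of 2D occupancy grid mapping: each range reading marks one
# grid cell as occupied, and a log-odds counter accumulates the evidence
# over repeated observations.
import math

GRID_SIZE = 100          # cells per side
CELL_M = 0.05            # 5 cm per cell
grid = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]  # log-odds; 0 = unknown

def mark_hit(robot_x_m, robot_y_m, bearing_rad, range_m):
    """Convert one range reading into a grid cell and bump its evidence."""
    x = robot_x_m + range_m * math.cos(bearing_rad)
    y = robot_y_m + range_m * math.sin(bearing_rad)
    col = GRID_SIZE // 2 + round(x / CELL_M)   # robot starts at grid center
    row = GRID_SIZE // 2 + round(y / CELL_M)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] += 0.9                  # log-odds increment for a 'hit'
    return row, col
```

A full implementation would also trace the ray between the robot and the hit cell, decrementing the cells it passes through as evidence of free space.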

I encourage everyone to check out (and download) his code:

Kinect Drivers for LabVIEW: http://decibel.ni.com/content/docs/DOC-15655

And be sure to share your examples on the NI Robotics Code Exchange as well!

Open Source Code: Using LabVIEW to Acquire iPhone Accelerometer Data April 1, 2011

Posted by emiliekopp in code, labview robot projects.

Ok, so this week’s post proves that I simply can’t get enough of John Wu’s blog, RIOBotics. John is cranking out LabVIEW code for robotics applications, left and right.

This time, John helps you use LabVIEW to acquire and plot data directly from your iPhone’s accelerometer through UDP. (I bet this is similar to how Waterloo Labs was able to build an iPhone app that allowed them to steer a car by tilting their iPhone from side to side.)
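For a rough idea of what the receiving side of such a setup looks like, here is a minimal Python sketch (the port number and comma-separated packet format are assumptions for illustration; John’s LabVIEW code defines its own):

```python
# Minimal sketch of a UDP listener for streamed accelerometer samples.
# Packet format assumed here: ASCII "x,y,z" per datagram (illustrative only).
import socket

def parse_accel_packet(payload: bytes):
    """Turn b'0.01,-0.98,0.05' into a tuple of three floats (in g's)."""
    x, y, z = (float(v) for v in payload.decode("ascii").split(","))
    return x, y, z

def listen(port=5555, count=10):
    """Receive `count` accelerometer packets and yield parsed samples."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))            # listen on all interfaces
        for _ in range(count):
            payload, _addr = sock.recvfrom(1024)
            yield parse_accel_packet(payload)
```

UDP suits this kind of streaming well: dropped samples just disappear rather than stalling the plot waiting for retransmission.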

Visit John’s blog to download his code.

Xbox Kinect Hack Using LabVIEW March 7, 2011

Posted by emiliekopp in code, labview robot projects.

If you haven’t seen this already, you need to. The Xbox Kinect is not only revolutionizing gaming, it will revolutionize the way humans interact with machines, including robots (think: robots can now more easily interpret human gestures).

Ryan Gordon, from http://ryangordon.net/, got things started by building and sharing a LabVIEW wrapper for the OpenKinect library. Then John Wu, another LabVIEW programmer, took things one step further building an example VI for 3D scene construction using the Kinect sensor and point clouds.
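The core math behind any Kinect point-cloud demo is back-projecting each depth pixel through the pinhole camera model. Here is a hedged Python sketch (the intrinsics are typical approximate values for the Kinect depth camera, not taken from John’s VI):

```python
# Back-project a depth pixel (u, v, depth) into a camera-frame 3D point.
# These intrinsics are commonly cited approximations for the Kinect's
# 640x480 depth camera, used here only for illustration.
FX = FY = 580.0          # focal length in pixels (approximate)
CX, CY = 320.0, 240.0    # principal point of the 640x480 depth image

def depth_to_point(u, v, depth_m):
    """Back-project one depth pixel into camera-frame (x, y, z) meters."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m
```

Running this over every valid pixel in a depth frame yields the point cloud; the 3D scene reconstruction is then a matter of rendering those points.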

Download John’s example on his blog post: LabVIEW, Xbox Kinect, and 3D point cloud visualization

Thank you John and Ryan! This is the beginning of some incredible and exciting work!

DARPA Arm Robot Controlled via LabVIEW January 25, 2011

Posted by emiliekopp in code, labview robot projects.

By now, you’ve all heard of one of DARPA’s latest robotics projects, but just in case:


DARPA is introducing its Autonomous Robotic Manipulation (ARM) program. The goal of this four-year, multi-track program is to develop software and hardware that allows an operator to control a robot which is able to autonomously manipulate, grasp and perform complicated tasks, given only high-level direction. Over the course of the program in the Software Track, funded performers will be developing algorithms that enable the DARPA robot to execute these numerous tasks. DARPA is also making an identical robot available for public use, allowing anyone the opportunity to write software, test it in simulation, upload it to the actual system, and then watch, in real-time via the internet, as the DARPA robot executes the user’s software. Teams involved in this Outreach Track will be able to compete and collaborate with other teams from around the country.

One of NI’s R&D engineers, Karl, has developed a LabVIEW wrapper for the DARPA arm simulator in his spare time and has graciously shared it on the NI Robotics Code Exchange (ni.com/code/robotics).

Using Karl’s code, you can directly control the arm simulator from LabVIEW. This means you can develop your own control code and easily create UIs in LabVIEW’s graphical programming environment (two of the things LabVIEW does best).

Check out Karl’s blog to request the code:

DARPA Arm Robot Controlled via LabVIEW

HeartLander: A Miniature Mobile Robot That Crawls Around the Heart, Powered by LabVIEW November 16, 2010

Posted by emiliekopp in industry robot spotlight, labview robot projects.

Recall from the NIWeek 2010 Day Three Keynote: Dr. Michio Kaku, theoretical physicist, predicted the next 20 years of technology development, describing tiny robots that would travel throughout the body, taking readings, administering medication and performing tiny microscopic procedures, all while we go about our daily routines.

Researchers at the Carnegie Mellon Robotics Institute are one step ahead of us, delivering the HeartLander, a miniature robotic device that can crawl around the surface of the heart, taking measurements and performing simple surgical tasks, all while the heart continues to pump blood throughout the body. I stumbled upon Nicholas Patronik’s Ph.D. thesis describing this project (and I encourage everyone to check it out). Here is what I found out:

An early prototype of HeartLander, a small robot that can adhere to and traverse a beating heart.

Robot-assisted surgery
NI technologies have been used for robot-assisted surgeries before, but the HeartLander robot addresses two major challenges of cardiac therapy and surgery:
•    Gaining access to the heart without opening the chest
•    Operating on the heart while it is beating

Several options for tackling these challenges exist today. Thoracoscopic techniques use laparoscopic tools inserted through the chest cavity to operate on the beating heart (think DaVinci Robot). While this avoids cracking open the rib cage (ouch!), even less invasive methods exist for accessing the heart and performing simpler procedures, like dye injections. Percutaneous transvenous techniques access inner organs through main arteries and veins. For example, a doctor can guide a heart stent through the veins in your thigh to treat blockages, and this procedure is performed on an outpatient basis. While these transvenous procedures are easier to recover from, thoracoscopic techniques offer much more flexibility in the complexity of surgical operations that can be performed.

Illustration of HeartLander

The HeartLander robot is considered a hybrid of these two approaches: it achieves the fine control of thoracoscopic techniques while remaining an outpatient procedure, like the percutaneous transvenous techniques. It adheres to and traverses the heart’s surface, the epicardium, providing a tool for precise and stable interaction with the beating heart. Even better, it can access difficult-to-reach locations of the heart, such as the posterior wall of the left ventricle (the side of your heart that faces your back).

How it works
The HeartLander is launched onto the surface of the beating heart through a small puncture underneath the bottom of the sternum. From there, the robot steadily traverses the epicardium like an inchworm. Offboard linear motors actuate the robot forward while solenoids regulate vacuum pressure to suction pads. Watch a video of early prototypes inching across an inflated balloon, a synthetic beating heart and a porcine beating heart to see HeartLander’s motion in action (warning: the video clips get progressively graphic in nature).
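To make the inchworm cycle concrete, here is an illustrative state machine in Python. The ordering of the phases is my assumption, not HeartLander’s actual controller; the key invariant is that one suction pad always stays gripped.

```python
# Illustrative inchworm gait cycle: release one pad, move it, re-grip it
# before the other pad lets go, so the robot never fully detaches.
GAIT_CYCLE = [
    ("front", "release"),   # vent the front pad's vacuum
    ("front", "extend"),    # drive wires push the front body forward
    ("front", "grip"),      # re-apply vacuum to the front pad
    ("rear", "release"),
    ("rear", "retract"),    # drive wires pull the rear body up
    ("rear", "grip"),
]

def step(phase):
    """Return the (pad, action) for a given phase index, wrapping around."""
    return GAIT_CYCLE[phase % len(GAIT_CYCLE)]

def always_attached():
    """Sanity check: at every phase, at least one pad is still gripped."""
    released = {"front": False, "rear": False}
    for pad, action in GAIT_CYCLE * 2:     # walk two full cycles
        if action == "release":
            released[pad] = True
        elif action == "grip":
            released[pad] = False
        if released["front"] and released["rear"]:
            return False
    return True
```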

An umbilical tether sends and receives information between the HeartLander and the control station, where the pressure to the suction pads is monitored and controlled to maintain grip at all times. The mobility of the robot is semiautonomous: it uses a pure pursuit tracking algorithm to navigate to predetermined surface targets, and can also be controlled via teleoperation.
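Pure pursuit itself is a simple geometric rule: steer along the arc that passes through a lookahead point on the path. Here is a Python sketch of the general algorithm (not the HeartLander implementation):

```python
# Pure pursuit steering: given a lookahead goal point expressed in the
# robot's own frame (robot at the origin, facing +x), return the curvature
# (1/radius) of the arc that reaches it.

def pure_pursuit_curvature(goal_x, goal_y):
    """Arc curvature toward a lookahead point in the robot frame."""
    lookahead_sq = goal_x**2 + goal_y**2
    if lookahead_sq == 0:
        return 0.0   # already at the goal; no steering needed
    # Standard pure pursuit result: curvature = 2 * lateral_offset / L^2
    return 2.0 * goal_y / lookahead_sq
```

A goal straight ahead gives zero curvature (drive straight); the farther the goal sits to one side, the tighter the commanded arc.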

Two drive wires transmit the actuation from off-board motors for locomotion. A 6-DOF electromagnetic tracking sensor is mounted to the front of the body.

It can navigate to any location on the epicardium at speeds up to 4 mm per second, and acquire localization targets to within 1 mm. But what I think is particularly exciting about this application is that the robot’s motion is controlled entirely with LabVIEW software and NI data acquisition hardware.

So far, the HeartLander has been successfully demonstrated through a series of closed-chest, beating-heart porcine studies. We don’t have tiny robotic heart worms crawling around in us just yet. But we’re certainly excited to see how the HeartLander project progresses and we’re proud that NI technologies are helping pave the way for incredible, futuristic innovations like this.

Learn more about the HeartLander project on CMU’s website

Discover other autonomous robots designed and controlled using LabVIEW software

Cyborg Fly Controls Mobile Robot Through Obstacle Course August 31, 2010

Posted by emiliekopp in labview robot projects.

Erico Guizzo, editor of the IEEE Spectrum Automaton Blog, recently featured a cyborg fly application from Chauncey Graetzel and his colleagues at ETH Zurich’s Institute of Robotics and Intelligent Systems. Essentially, Swiss researchers have devised a closed-loop system that uses a tethered fly to control a robot and perform obstacle avoidance. How is this possible? Check out Erico’s post to get a good breakdown of the system.

I should mention this application was a finalist in the Robotics category for the 2010 Graphical System Design Achievement Awards. This is an annual paper contest that recognizes amazing feats and innovative applications built using National Instruments hardware and software.

Dr. Graetzel and his team needed a control system that was super fast and super flexible. They used NI CompactRIO as an interface to an LED-based visual stimulus arena at temporal and spatial resolutions that allowed them to efficiently stimulate the fly’s visual system. LabVIEW software was used to record these signals and provide the real-time execution for stimulus generation.

Here’s what Dr. Graetzel had to say about using NI tools for his project:

LabVIEW and CompactRIO provided an ideal solution for building a control loop that incorporates a living insect and allows us to perform a variety of experiments. CompactRIO acquires and generates signals for a multitude of industry standards and extends custom-made research tools. In addition, we achieved major efficiency gains with the ability to distribute our application between a PC, the real-time controller, and the FPGA without having to learn several programming and design languages. The range of available add-on products and interfaces also offered great potential for future extensions and adaptations.

Here’s a video of the mobile robot successfully avoiding obstacles using the feedback from the fly (which is tethered inside the LED cylinder array):

Watch more videos and read Dr. Graetzel’s submission to the Robotics category in the Graphical System Design Achievement Awards contest here:

Designing a Robotic Device to Study Flying Insects Using LabVIEW and CompactRIO

Snake-like robot developed by the Army – powered by LabVIEW July 28, 2010

Posted by emiliekopp in industry robot spotlight.
Check out the latest front page article on army.mil, the official page of the U.S. Army:

Army technology expands snake-robotics

The story highlights a snake-robot developed by the U.S. Army Research Laboratory. The robo-snake is biomimetic, meaning it maneuvers just like a real snake would, pushing off ground surfaces to propel itself. It can crawl, swim, climb or shimmy through narrow spaces while transmitting images to the Soldier operator. And it’s scalable, so developers can build a robo-snake however large or small they’d like. It’s expected to help with search-and-rescue and reconnaissance missions.

I’m sure you recognized the software on the command laptop’s screen too. Yep, that’s LabVIEW. Developers are using the graphical programming language to quickly and cost-effectively interface to the robot’s sensors and control its actuators. They can rapidly build and test early prototypes and ultimately deliver this dexterous robot to the field more quickly, saving lives and taxpayer dollars.

Read more about how the robot works here.