U.S. Military Insect Drones (powered by LabVIEW?) July 15, 2011Posted by emiliekopp in industry robot spotlight, labview robot projects.
Tags: drone, insect, labview, mav, micro_aerial_vehicle, military
Do the screens of any of the monitors look familiar?
Read the news coverage of these robots here:
Tags: kinect, labview, obstacle_avoidance, open_source, sensor_fusion, sonar, starter_kit, xbox
This example features sensor fusion, using the Kinect to gather the 3D image of the world and a scanning sonar to help avoid obstacles that get too close for the Kinect to see.
Check out the full recipe on the NI Robotics Code Exchange, including hardware lists, software and setup requirements, as well as code descriptions and downloads.
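The fusion logic in that recipe can be sketched conceptually in Python (the actual example is a LabVIEW VI; the 0.5 m near limit and the min-of-ranges rule below are my assumptions, not details from the recipe):

```python
# Hypothetical sketch of the fusion idea: the Kinect depth camera cannot see
# obstacles closer than roughly 0.5 m, so a sonar reading is substituted
# whenever an obstacle enters that blind zone.

KINECT_MIN_RANGE_M = 0.5  # approximate near limit of the Kinect depth sensor

def fused_range(kinect_range_m, sonar_range_m):
    """Return the range to use for obstacle avoidance.

    kinect_range_m may be None when the obstacle is too close for the
    Kinect to resolve; in that case we fall back to the sonar.
    """
    if kinect_range_m is None or kinect_range_m < KINECT_MIN_RANGE_M:
        return sonar_range_m  # Kinect is blind this close; trust the sonar
    # Both sensors see the obstacle: be conservative, keep the nearer reading
    return min(kinect_range_m, sonar_range_m)

print(fused_range(None, 0.3))  # Kinect blind, sonar takes over -> 0.3
print(fused_range(2.0, 2.4))   # both valid, nearer wins       -> 2.0
```

The real VI does this per direction across the Kinect's field of view rather than for a single scalar range, but the fallback principle is the same.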
Kinect 6D Visualization in LabVIEW April 19, 2011Posted by emiliekopp in code, labview robot projects.
Tags: 3d, 6d, cloud, kinect, labview, obstacle_avoidance, point, robot, robotics, vision, visualization, xbox
You can view a video screen capture of the demo and download his open source code here:
More LabVIEW Development for the Xbox Kinect April 7, 2011Posted by emiliekopp in code, labview robot projects.
Tags: code, driver, kinect, labview, open_source, robotics, vision, xbox
So remember when I said the Xbox Kinect was going to revolutionize robotics (at least from a sensor-hardware point of view)?
An NI Community member, anfredres86, has published his VI driver library, making it easy to download and install the necessary files for you to start developing robotics applications in LabVIEW that utilize the Kinect hardware for robot sensing.
Here is a video of the 2D occupancy grid mapping example he put together using LabVIEW and the Kinect:
I encourage everyone to check out (and download) his code:
And be sure to share your examples on the NI Robotics Code Exchange as well!
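To give a feel for what a 2D occupancy grid mapper like the one in the video does, here is a minimal Python sketch (the original is a LabVIEW VI; the grid size, cell size, and the simple free/occupied beam-tracing scheme here are illustrative assumptions):

```python
import math

# Hypothetical sketch of 2D occupancy-grid mapping: each range reading marks
# the cells along the beam as free and the cell at the hit point as occupied.

GRID_SIZE = 100   # cells per side
CELL_M = 0.05     # 5 cm cells
UNKNOWN, FREE, OCCUPIED = -1, 0, 1

def update_grid(grid, robot_xy, bearing_rad, range_m):
    """Trace one range beam from the robot into the grid."""
    rx, ry = robot_xy
    steps = int(range_m / CELL_M)
    for i in range(steps):
        d = i * CELL_M
        cx = int((rx + d * math.cos(bearing_rad)) / CELL_M)
        cy = int((ry + d * math.sin(bearing_rad)) / CELL_M)
        if 0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE:
            grid[cy][cx] = FREE              # the beam passed through here
    hx = int((rx + range_m * math.cos(bearing_rad)) / CELL_M)
    hy = int((ry + range_m * math.sin(bearing_rad)) / CELL_M)
    if 0 <= hx < GRID_SIZE and 0 <= hy < GRID_SIZE:
        grid[hy][hx] = OCCUPIED              # the beam ended on an obstacle

grid = [[UNKNOWN] * GRID_SIZE for _ in range(GRID_SIZE)]
update_grid(grid, (2.5, 2.5), 0.0, 1.0)      # one beam straight along +x
```

A real mapper would update cell probabilities rather than overwrite hard labels, and would fire one such update per depth column in the Kinect image.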
Tags: accelerometer, code, control, iphone, labview
OK, so this week’s post proves that I simply can’t get enough of John Wu’s blog, RIOBotics. John is cranking out LabVIEW code for robotics applications left and right.
This time, John helps you use LabVIEW to acquire and plot data directly from your iPhone’s accelerometer through UDP. (I bet this is similar to how Waterloo Labs was able to build an iPhone app that allowed them to steer a car by tilting their iPhone from side to side.)
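The receiving side of such a setup can be sketched in Python (John's actual code uses LabVIEW's UDP functions; the port number and the comma-separated "ax,ay,az" packet format below are assumptions, so check what your phone app actually sends):

```python
import socket

# Hypothetical sketch of receiving accelerometer packets over UDP.
# Assumed packet format: b"ax,ay,az" with values in g.

PORT = 5555  # hypothetical port the phone app streams to

def parse_accel(packet: bytes):
    """Parse one 'ax,ay,az' datagram into three floats."""
    ax, ay, az = (float(v) for v in packet.decode().split(","))
    return ax, ay, az

def listen():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", PORT))                 # listen on all interfaces
    while True:
        data, _addr = sock.recvfrom(1024)
        ax, ay, az = parse_accel(data)
        print(f"ax={ax:+.3f} g  ay={ay:+.3f} g  az={az:+.3f} g")
```

Since UDP is connectionless, the phone just fires datagrams at the listener's IP and port; dropped packets simply mean a missed sample, which is usually acceptable for plotting sensor data.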
Xbox Kinect Hack Using LabVIEW March 7, 2011Posted by emiliekopp in code, labview robot projects.
Tags: 3d, code, example, kinect, labview, open_source, visualization
If you haven’t seen this already, you need to. The Xbox Kinect is not only revolutionizing gaming, it will revolutionize the way humans interact with machines, including robots (think: robots can now more easily interpret human gestures).
Ryan Gordon, from http://ryangordon.net/, got things started by building and sharing a LabVIEW wrapper for the OpenKinect library. Then John Wu, another LabVIEW programmer, took things one step further building an example VI for 3D scene construction using the Kinect sensor and point clouds.
Download John’s example on his blog post: LabVIEW, Xbox Kinect, and 3D point cloud visualization
Thank you John and Ryan! This is the beginning of some incredible and exciting work!
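The core step behind a 3D scene like John's is back-projecting each depth pixel into a 3D point with a pinhole camera model. A minimal Python sketch (his actual code is a LabVIEW VI; the intrinsics below are rough community-published values for the Kinect depth camera, not calibrated numbers):

```python
# Hypothetical sketch of depth-image-to-point-cloud back-projection.

FX = FY = 580.0         # approximate Kinect depth focal lengths, in pixels
CX, CY = 320.0, 240.0   # principal point of the 640x480 depth image

def pixel_to_point(u, v, depth_m):
    """Back-project one depth pixel into a 3D point (pinhole camera model)."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

def depth_to_cloud(depth_image):
    """depth_image: rows of per-pixel depths in meters (0 = no reading)."""
    return [pixel_to_point(u, v, d)
            for v, row in enumerate(depth_image)
            for u, d in enumerate(row)
            if d > 0]

# A pixel at the image center maps straight down the optical axis:
print(pixel_to_point(320, 240, 2.0))  # (0.0, 0.0, 2.0)
```

Everything after this step, coloring the points from the RGB camera and rendering them in a 3D picture control, is presentation on top of this cloud.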
DARPA Arm Robot Controlled via LabVIEW January 25, 2011Posted by emiliekopp in code, labview robot projects.
Tags: arm, code, control, darpa, labview, simulator
By now, you’ve all heard of one of DARPA’s latest robotics projects, but just in case:
DARPA is introducing its Autonomous Robotic Manipulation (ARM) program. The goal of this four-year, multi-track program is to develop software and hardware that allows an operator to control a robot that is able to autonomously manipulate, grasp and perform complicated tasks, given only high-level direction. Over the course of the program in the Software Track, funded performers will be developing algorithms that enable the DARPA robot to execute these numerous tasks. DARPA is also making an identical robot available for public use, allowing anyone the opportunity to write software, test it in simulation, upload it to the actual system, and then watch, in real-time via the internet, as the DARPA robot executes the user’s software. Teams involved in this Outreach Track will be able to compete and collaborate with other teams from around the country.
Using Karl’s code, you can directly control the arm simulator using LabVIEW. This means you can develop your own control code and easily create UIs in LabVIEW’s graphical programming environment (two of the things LabVIEW does best).
Check out Karl’s blog to request the code:
HeartLander: A Miniature Mobile Robot That Crawls Around the Heart, Powered by LabVIEW November 16, 2010Posted by emiliekopp in industry robot spotlight, labview robot projects.
Tags: carnegie mellon, heartlander, labview, michio kaku, robotics institute, surgery
Recall from the NIWeek 2010 Day Three Keynote: Dr. Michio Kaku, theoretical physicist, predicted the next 20 years of technology development, describing tiny robots that would travel throughout the body, taking readings, administering medication and performing tiny microscopic procedures, all while we go about our daily routines.
Researchers at the Carnegie Mellon Robotics Institute are one step ahead of us, delivering the HeartLander, a miniature robotic device that can crawl around the surface of the heart, taking measurements and performing simple surgical tasks, all while the heart continues to pump blood throughout the body. I stumbled upon Nicholas Patronik’s Ph.D. thesis describing this project (and I encourage everyone to check it out). Here is what I found out:
NI technologies have been used for robot-assisted surgeries before, but the HeartLander robot addresses two major challenges of cardiac therapy and surgery:
• Gaining access to the heart without opening the chest
• Operating on the heart while it is beating
Several options for tackling these challenges exist today. Thoracoscopic techniques use laparoscopic tools inserted through the chest cavity to operate on the beating heart (think da Vinci robot). While this avoids cracking open the rib cage (ouch!), even less invasive methods exist for accessing the heart and performing simpler procedures, like dye injections. Percutaneous transvenous techniques access inner organs through main arteries and veins. For example, a doctor can guide a heart stent through the veins in your thigh to treat blockages, and this procedure is performed on an outpatient basis. While these transvenous procedures are easier to recover from, thoracoscopic techniques offer much more flexibility in the complexity of surgical operations that can be performed.
The HeartLander robot is considered a hybrid of these two approaches in that it achieves the fine control of thoracoscopic techniques while maintaining the ability to be performed on an outpatient basis, like the percutaneous transvenous techniques. It adheres to and traverses the heart’s surface, the epicardium, providing a tool for precise and stable interaction with the beating heart. Even better, it can access difficult-to-reach locations of the heart, such as the posterior wall of the left ventricle (the side of your heart that faces your back).
How it works
The HeartLander is launched onto the surface of the beating heart through a small puncture underneath the bottom of the sternum. From there, the robot steadily traverses the epicardium like an inchworm. Offboard linear motors actuate the robot forward while solenoids regulate vacuum pressure to suction pads. Watch a video of early prototypes inching across an inflated balloon, a synthetic beating heart and a porcine beating heart to see HeartLander’s motion in action (warning: the video clips get progressively more graphic).
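The inchworm locomotion described above can be pictured as a repeating cycle of grip, extend and retract phases. A toy Python sketch (the real controller is LabVIEW driving the off-board motors and vacuum solenoids; the exact phase ordering and names here are my illustrative guesses from the description, not from the thesis):

```python
# Hypothetical sketch of an inchworm gait cycle: the rear pad grips while
# the front body extends, then the front pad grips while the rear catches up.

GAIT_CYCLE = [
    ("grip_rear",    "vacuum on rear suction pad; front pad released"),
    ("extend_front", "linear motors push the front body forward"),
    ("grip_front",   "vacuum on front suction pad"),
    ("release_rear", "rear suction pad vented"),
    ("retract_rear", "linear motors pull the rear body up"),
]

def step_gait(step_index):
    """Return the current gait phase and the index of the next one."""
    phase, action = GAIT_CYCLE[step_index % len(GAIT_CYCLE)]
    return phase, action, (step_index + 1) % len(GAIT_CYCLE)

idx = 0
for _ in range(len(GAIT_CYCLE)):  # walk through one full cycle
    phase, action, idx = step_gait(idx)
    print(f"{phase}: {action}")
```

In the real system each phase transition is gated on the measured suction pressure, so the robot never releases a pad until the other has a confirmed grip.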
An umbilical tether sends and receives information between the HeartLander and the control station, where the pressure to the suction pads is monitored and controlled to maintain grip at all times. The mobility of the robot is semiautonomous: it uses a pure pursuit tracking algorithm to navigate to predetermined surface targets, and can also be controlled via teleoperation.
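The pure pursuit idea mentioned above reduces to a simple geometric formula: steer along the circular arc that passes through a goal point a fixed lookahead distance ahead of the robot. A minimal Python sketch (the thesis implements this in LabVIEW; the robot-frame convention below is an assumption):

```python
# Hypothetical sketch of the pure pursuit steering law. The goal point is
# expressed in the robot frame: x forward, y to the left.

def pure_pursuit_curvature(goal_xy):
    """Return the arc curvature (1/radius) that carries the robot through
    the goal point; the sign gives the turn direction."""
    gx, gy = goal_xy
    l2 = gx * gx + gy * gy        # squared distance to the goal (lookahead)
    if l2 == 0:
        return 0.0                # already at the goal; go straight
    # Standard pure pursuit result: curvature = 2*y / L^2
    return 2.0 * gy / l2

print(pure_pursuit_curvature((1.0, 0.0)))  # goal dead ahead -> 0.0 (straight)
print(pure_pursuit_curvature((1.0, 1.0)))  # goal ahead-left -> 1.0 (turn left)
```

The controller re-evaluates this curvature every cycle as the robot moves, which is what makes pure pursuit smooth and robust against small tracking errors.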
It can navigate to any location on the epicardium at speeds of up to 4 mm per second and acquire localization targets to within 1 mm. But what I think is particularly exciting about this application is that the robot’s motion is controlled entirely with LabVIEW software and NI data acquisition hardware.
So far, the HeartLander has been successfully demonstrated through a series of closed-chest, beating-heart porcine studies. We don’t have tiny robotic heart worms crawling around in us just yet. But we’re certainly excited to see how the HeartLander project progresses and we’re proud that NI technologies are helping pave the way for incredible, futuristic innovations like this.
Open Source Project: Robot Swarm October 13, 2010Posted by emiliekopp in code, labview robot projects.
Tags: ai, artificial intelligence, collective intelligence, NIWeek, open_source, swarm
We all know programming just one mobile robot with artificial intelligence is hard. So adding more robots and having them exhibit a collective behavior can increase the difficulty level exponentially. This is what makes swarm intelligence such a hot topic in the world of robotics today.
During a National Instruments user conference, I saw a very impressive swarm demo from the NI Robotics R&D team:
Karl Muecke, the project lead, is now lifting the hood and opening up all of the build instructions and control code used to create his robot swarm. He starts with high-level topics like hardware architectures, data communications, localization, driver station UI, obstacle avoidance and path planning, and then delves into the details in each area.
Check out the entire open source project on the NI Robotics Code Exchange and be sure to continue checking in, as he continually adds more pieces to the puzzle.
Cyborg Fly Controls Mobile Robot Through Obstacle Course August 31, 2010Posted by emiliekopp in labview robot projects.
Tags: closed_loop, crio, cyborg, fly, labview, mobile_robot, obstacle_avoidance
Erico Guizzo, editor of the IEEE Spectrum Automaton Blog, recently featured a cyborg fly application from Chauncey Graetzel and his colleagues at ETH Zurich’s Institute of Robotics and Intelligent Systems. Essentially, Swiss researchers have devised a closed-loop system that uses a tethered fly to control a robot and perform obstacle avoidance. How is this possible? Check out Erico’s post to get a good breakdown of the system.
I should mention this application was a finalist in the Robotics category for the 2010 Graphical System Design Achievement Awards. This is an annual paper contest that recognizes amazing feats and innovative applications built using National Instruments hardware and software.
Dr. Graetzel and his team needed a control system that was super fast and super flexible. They used NI CompactRIO as an interface to an LED-based visual stimulus arena at temporal and spatial resolutions that allowed them to efficiently stimulate the fly’s visual system. LabVIEW software was used to record these signals and provide the real-time execution for stimulus generation.
Here’s what Dr. Graetzel had to say about using NI tools for his project:
LabVIEW and CompactRIO provided an ideal solution for building a control loop that incorporates a living insect and allows us to perform a variety of experiments. CompactRIO acquires and generates signals for a multitude of industry standards and extends custom-made research tools. In addition, we achieved major efficiency gains with the ability to distribute our application between a PC, the real-time controller, and the FPGA without having to learn several programming and design languages. The range of available add-on products and interfaces also offered great potential for future extensions and adaptations.
Here’s a video of the mobile robot successfully avoiding obstacles using the feedback from the fly (which is tethered inside the LED cylinder array):
Watch more videos and read Dr. Graetzel’s submission to the Robotics category in the Graphical System Design Achievement Awards contest here: