
Kinect 6D Visualization in LabVIEW April 19, 2011

Posted by emiliekopp in code, labview robot projects.

The LabVIEW Kinect code keeps rolling in. I am happy to share yet another example that is available for free download.

This one is very similar to John Wu’s LabVIEW + Kinect example I shared a while back. Karl Muecke, NI R&D engineer, shares his 6D visualization example on the NI Robotics Code Exchange.

You can view a video screen capture of the demo and download his open source code here:


More LabVIEW Development for the Xbox Kinect April 7, 2011

Posted by emiliekopp in code, labview robot projects.

So remember when I said the Xbox Kinect was going to revolutionize robotics (at least from a sensor-hardware point of view)?

Well, when it rains, it pours: More and more LabVIEW developers are uniting, creating and sharing drivers that allow you to communicate with the Xbox Kinect hardware using LabVIEW software.

An NI Community member, anfredres86, has published his VI driver library, making it easy to download and install the necessary files for you to start developing robotics applications in LabVIEW that utilize the Kinect hardware for robot sensing.

Here is a video of the 2D occupancy grid mapping example he put together using LabVIEW and the Kinect:
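
If you’re curious what an occupancy grid update actually does: each depth reading traces a ray out from the sensor, the cells the ray passes through get nudged toward “free,” and the cell where the ray terminates gets nudged toward “occupied.” Here’s a minimal Python sketch of that update rule (the grid size, cell size, and log-odds increments are my own illustrative assumptions, not taken from anfredres86’s VIs):

```python
import math

# Illustrative parameters -- not from the actual LabVIEW driver library.
GRID_SIZE = 20             # cells per side
CELL = 0.1                 # meters per cell
L_OCC, L_FREE = 0.9, -0.4  # log-odds increments for hit / pass-through

def update_grid(grid, pose, angle, dist):
    """Trace one range reading into a log-odds occupancy grid.

    pose  -- (x, y) sensor position in meters
    angle -- beam direction in radians
    dist  -- measured range in meters
    """
    x0, y0 = pose
    steps = int(dist / CELL)
    for i in range(steps + 1):
        r = i * CELL
        cx = int((x0 + r * math.cos(angle)) / CELL)
        cy = int((y0 + r * math.sin(angle)) / CELL)
        if not (0 <= cx < GRID_SIZE and 0 <= cy < GRID_SIZE):
            return  # ray left the mapped area
        # Last cell on the ray is the hit; every cell before it is free space.
        grid[cy][cx] += L_OCC if i == steps else L_FREE

grid = [[0.0] * GRID_SIZE for _ in range(GRID_SIZE)]
update_grid(grid, (0.0, 0.0), 0.0, 1.0)  # one reading straight down the x-axis
```

Run many readings from many poses and the positive cells trace out the walls while the negative cells mark open floor, which is exactly what you see sweeping across the map in the video.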

I encourage everyone to check out (and download) his code:

Kinect Drivers for LabVIEW: http://decibel.ni.com/content/docs/DOC-15655

And be sure to share your examples on the NI Robotics Code Exchange as well!

Last chance to vote: Help name the new NI robotics product November 23, 2010

Posted by emiliekopp in industry robot spotlight.

Rumor is the NI R&D team is developing a new robotics starter kit. I’ve seen some early prototypes and while I can’t provide any specs (without losing my job, of course), I can divulge that the team is looking for outside help. Specifically, they need outside opinions on what to name the new robot. What do you think it should be?

Visit the NI Robotics Code Exchange to cast your vote.

Hurry because the poll ends soon.

Microsoft ups the ante in the robotics market, makes MSRDS free May 20, 2010

Posted by emiliekopp in industry robot spotlight.

An exciting post on the IEEE Spectrum Automaton Blog from Erico Guizzo, fellow robotics blogger: Microsoft Shifts Robotics Strategy, Makes Robotics Studio Available Free

New MSRDS simulation environment (photo from Automaton Blog)

It seems Microsoft is taking a similar approach to Willow Garage and is now offering its development tools for free, in an effort to gain widespread adoption in the robotics market.

I, for one, am excited to see this play out and welcome Microsoft’s increased investment in the robotics market. I’ve seen first hand that MSRDS plus LabVIEW can be a powerful combination of simulation and graphical programming tools.

But keep in mind, lowering the barrier to entry in robotics doesn’t just mean making your development software available to the masses with free/low-cost, hobbyist options. It also means making sure that technology is put in the hands of tomorrow’s engineers and scientists and leveraging other key industry players like LEGO and FIRST to reach students.

And it means making your tools open to other design platforms, so the roboticist is free to leverage any and every development tool that makes sense. Disruptive technologies like multicore processors, FPGAs and increasingly sophisticated commercial sensors are providing an exciting landscape for robotics developers out there. They need intuitive design tools to piece everything together. Hopefully Willow Garage (ROS), Microsoft (MSRDS) and NI (LabVIEW) will continue to find ways to optimize this.

What do you think?

Robots to take over Silicon Valley April 20, 2010

Posted by emiliekopp in robot events.

On April 26, the largest community of embedded systems designers, technologists, business leaders, and suppliers will convene from all over the world at the 2010 Embedded Systems Conference (ESC) in San Jose, California. Any product that uses a processor requires some aspect of embedded design, so it’s no surprise this conference pulls in thousands of engineers working on the latest, greatest, future-forward solutions.

And it’s guaranteed to bring in some sophisticated robots.

NI is sending out their best humans and best robots to attend. From subsea robotic tuna fish to interplanetary robotic manipulators on Mars, the NI staff will be featuring some of its most impressive robotic applications built with graphical system design technology. If you’re in or around the Silicon Valley area and you have even the slightest interest in robotic design, this conference is a must-see.

Here’s a close-up on the technical sessions the NI humans will be presenting:

If you’re thinking about attending this conference, visit the ESC website for registration info.

To check out NI’s crew, presentations and booth demos (including the Guitar-Hero-playing robot featured in the video below), you can visit ni.com/esc.

National Instruments Congratulates Austin FRC Teams in Atlanta April 13, 2010

Posted by emiliekopp in robot events.

National Robotics Week was officially kicked off this weekend with an awesome robo-gathering of more than 500 humans and 20+ robots at the Austin Children’s Museum.

What better way for National Robotics Week to come to a close than at the FIRST Robotics World Championship, held in the Georgia Dome in Atlanta. Over the past few months, tens of thousands of high-schoolers have been competing in regional events all over the US, hoping to qualify and gain a coveted spot in the world championship event.

National Instruments has been there every step of the way and would like to congratulate all of the FIRST teams that will be joining in on the celebration in Atlanta. And a special shout out to the local, Austin-area teams that will be making the trip to the world championship:

Team 2583 – from Westwood High School, competing in Curie Division

Team 2468 – from Westlake High School, competing in Galileo Division

Team 647 – The Cyberwolves from Killeen Independent School District & Robert M. Shoemaker High School STEM Academy, competing in Newton Division

Good luck to everyone and GO ROBOTS!

PSA: Keanu Loves Robots January 11, 2010

Posted by emiliekopp in robot fun.

There’s a clever joke I can make of this… something about The Matrix, Neo and the machines taking over… this is simply the beginning… bah!

Nice plug for FRC, though. I wonder if there will be any other celebrities at this year’s championship?

How to Build a Quad Rotor UAV October 6, 2009

Posted by emiliekopp in code, labview robot projects.

Blog Spotlight: Dr. Ben Black, a Systems Engineer at National Instruments, is documenting his trials and tribulations in his blog as he builds an autonomous unmanned aerial vehicle (UAV), using an NI Single-Board RIO (2M gate FPGA + 400 MHz PowerPC processor), four brushless motors, some serious controls theory and lots of Gorilla Glue.

I particularly appreciate his attention to detail, stepping through elements of UAV design that are often taken for granted, like choosing reference frames, when you should use PID control, and the genius that is xkcd.
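
The reference-frame point deserves a quick illustration: a quadrotor’s thrust and accelerometer readings live in the body frame, and you have to rotate them into the world frame before integrating velocity or position. Here’s a yaw-only 2D sketch in Python (my own simplification; Ben’s posts cover the full 3D rotation with roll and pitch):

```python
import math

def body_to_world(vx_b, vy_b, yaw):
    """Rotate a body-frame vector into the world frame (2D, yaw only).

    A real quadrotor model uses the full 3D rotation built from roll,
    pitch and yaw; this yaw-only version just shows why frames matter.
    """
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * vx_b - s * vy_b,
            s * vx_b + c * vy_b)

# Yawed 90 degrees left, "forward" in the body frame points along
# the world +y axis.
vx_w, vy_w = body_to_world(1.0, 0.0, math.pi / 2)
```

Skip the rotation and your position estimate drifts off in whatever direction the vehicle happened to be facing at startup, which is one of those details that’s easy to take for granted until it bites you.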

Like most roboticists, throughout the design process, he has to wear many hats. I think Ben put it best:

I think that the true interdisciplinary nature of the problems really makes the field interesting.  A roboticist has to have at minimum a working knowledge of mechanical engineering, electrical engineering, computer science / engineering and controls engineering.  My background is from the world of mechanical engineering (with a little dabbling in bio-mechanics), but I end up building circuits  and writing tons of code.  I’ve had to pick up / stumble through the electrical and computer science knowledge as I go along, and I know just enough to make me dangerous (I probably don’t always practice safe electrons…sometimes I let the magic smoke out of the circuits…and I definitely couldn’t write a bubble sort algorithm to save my life).

My point in this soap-box rant is that in the world of robotics it’s good to have a specialty, but to really put together a working system you also need to be a bit of a generalist.

For anyone even considering building a UAV (or anyone who just likes to read about cool robotics projects), I suggest you check it out. He shares his .m-files, LabVIEW code, and more. Thanks Ben.

RAPHaEL: Another incredible robot design from RoMeLa September 29, 2009

Posted by emiliekopp in industry robot spotlight, labview robot projects.

A lot of you may have already heard about the second-generation air-powered robotic hand on Engadget from Dr. Hong and his engineering students at RoMeLa. But seeing as how NI and RoMeLa have been longtime friends and have worked on many robotics projects together, we’ve got an inside story on how the new and improved RAPHaEL came to be. The following is a recap from Kyle Cothern, one of the undergrads who worked on RAPHaEL 2. He explains how closed-loop control for the mechanical hand was implemented in less than 6 hours, proof that seamless integration between hardware and software can make a big difference in robotic designs.

RAPHaEL (Robotic Air Powered HAnd with Elastic Ligaments) is a robotic hand that uses a novel corrugated tubing actuation method to achieve human-like grasping with compliance. It was designed by a team of four undergraduate students working under advisor Dr. Dennis Hong in the Robotics Mechanisms Lab (RoMeLa) at Virginia Tech. The hand was originally designed to be operated with simple on/off switches for each solenoid, providing limited control by a user. The first version was modified to use a simple microcontroller to accept switch input and run short routines for demos.

The second version of the hand was designed to include a microcontroller to allow for more complicated grasping methods that require closed-loop control. These grasping methods included closed-loop position and closed-loop force control to allow for form grasping and force grasping, the two most commonly used human grasping methods. Each method would require analog input from one or more sensors, analog output to one or more pressure regulators, and digital output to control the solenoids, along with a program to calculate the proper control signal to send to the pressure regulators based on the sensor data. Using the microcontroller from the first version of the hand was considered; however, it would have taken about a month for the team to redesign the controller to accept sensor input and analog output for the pressure regulator. It would have then taken weeks to program and calibrate the controller properly, and a complete redesign would be necessary to add more sensors or actuators.

At this point, 3 of the 4 students working on the hand graduated and left the lab. With only one student left, it would take a considerable amount of time to implement a microcontroller, and due to the complexity of a custom-designed microcontroller, if that student were to leave the lab, it would take a very long time for a new person to be trained to operate and continue research with the hand. The remaining team member decided to search for an easy-to-implement, expandable solution to the problem, to allow future research to continue without an extensive learning curve. The stringent requirements for this new controller led the final team member to consult with a colleague. The colleague recommended an NI CompactDAQ (cDAQ) system for its ease of implementation and expandability, along with its ability to acquire the sensor data, control the solenoids and control the pressure regulator.

Once the cDAQ arrived, the solenoids were attached and the control software was written in LabVIEW in about 1 hour. Then the electronic pressure regulator was attached in line with the hand, allowing for proportional control of the pressure to the hand within 1 more hour. At this point, a force sensor was attached to the fingertip to make a closed-loop system. The interpretation code for the sensor was written in about 40 minutes, and PID control of the grasping force was functional in a grand total of about 6 hours.
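
The loop Kyle describes — read the fingertip force sensor, compute a PID correction, write a new pressure command — is conceptually simple. Here’s a rough Python sketch against a simulated first-order pneumatic response (the gains, the plant model, and every name below are illustrative assumptions on my part; the real implementation is LabVIEW code talking to the cDAQ hardware):

```python
def pid_force_loop(setpoint, steps=500, dt=0.01,
                   kp=0.5, ki=2.0, kd=0.01, tau=0.05):
    """Drive a simulated grasp force toward `setpoint` (N) with PID.

    The 'plant' below stands in for the pressure regulator + finger:
    force lags the commanded pressure with a first-order response of
    time constant tau. All gains are made up for illustration.
    """
    force = 0.0                    # measured fingertip force (N)
    integral = 0.0
    prev_err = setpoint - force    # avoids a derivative kick on step 1
    for _ in range(steps):
        err = setpoint - force
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        pressure_cmd = kp * err + ki * integral + kd * deriv
        # Simulated plant: force relaxes toward the commanded pressure.
        force += (pressure_cmd - force) * dt / tau
    return force

final = pid_force_loop(5.0)  # settles near the 5 N setpoint
```

On the real hand the simulated-plant line is replaced by an analog read of the force sensor and an analog write to the pressure regulator, which is exactly the I/O the cDAQ was chosen to provide.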

The RoMeLa team plans to take their robotic hand even further by upgrading to a CompactRIO controller. The CompactRIO would allow control calculations and responses to happen at a much faster rate, since it pairs a dedicated FPGA with a real-time embedded processor. With a new, beefed-up controller, they plan to test other control schemes such as position- or vision-based control. They also plan to incorporate additional degrees of freedom (as if there weren’t already enough?!) by adding control of a wrist or arm mechanism.

Dr. Hong also gave a heads up that the Discovery Channel will be featuring some of the robotic innovations from RoMeLa, so keep an eye out for an update to this blog post with a link to that footage.

Open Source LabVIEW Code: LIDAR Example featuring Radiohead September 15, 2009

Posted by emiliekopp in code, labview robot projects.

This one comes compliments of Alessandro Ricco, a LabVIEW Champion in Italy who had some free time on his hands. He was inspired by Radiohead’s music video for House of Cards. Some of you may recall that this is the music video that was created without any filming whatsoever. Rather, the producers recorded 3D images of Thom Yorke singing the lyrics with a Velodyne LIDAR sensor and then played back the data, in sync with the song.

I mentioned LIDAR technology before when describing the Blind Driver Car from Virginia Tech. I’ve come across a ton of other robots that use LIDAR for sensing and perception, so I figured you robot builders out there might be interested in getting your hands on some code to do this yourself.

Start by downloading Alessandro’s example here. You’ll need LabVIEW 8.5 or newer. If you don’t have LabVIEW, you can download free evaluation software here (be warned, it might take some time to download).

You’ll also need to find yourself some LIDAR data. If you don’t have a $20,000 LIDAR sensor lying around the lab, you can simply download the LIDAR data from the Radiohead music video from Google Code.
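
If you peek inside the download, the released House of Cards frames are (as best I recall) plain text files, one point per row as x, y, z plus an intensity value. Here’s a small Python parser sketched under that assumption — do verify the column layout against your own copy of the data:

```python
import csv
import io

def load_frame(csv_text):
    """Parse one LIDAR frame: one point per row as x, y, z, intensity.

    Assumes the plain comma-separated layout of the released House of
    Cards data; check the files you downloaded before relying on it.
    """
    points = []
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) < 4:
            continue  # skip blank or short lines
        x, y, z, intensity = (float(v) for v in row[:4])
        points.append((x, y, z, intensity))
    return points

# Tiny synthetic frame standing in for one real data file:
frame = load_frame("0.0,1.0,2.0,140\n0.5,1.1,2.2,90\n")
```

Once each frame is a list of points, plotting it as a 3D scatter (intensity mapped to brightness) and stepping through the frames in time is essentially what Alessandro’s VIs do on the LabVIEW side.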

On the other hand, if you do have a LIDAR lying around and let’s say you want to create your own music video (or perhaps more likely, if you just want to create a video recording of the 3D data your mobile robot just acquired), Alessandro also includes a VI that saves each 3D plot as a .jpeg and then strings them all together to create an .avi. Here’s where you can find the necessary IMAQ driver VIs to do this part (be sure to download NI-IMAQ 4.1, I don’t think you’ll need the other stuff).


Big thanks to Alessandro. His instructions, documented in his VIs, are exceptional.