
LabVIEW Robotics binding to Gostai’s Urbi? February 3, 2010

Posted by emiliekopp in industry robot spotlight.

I was checking the incoming links to my blog the other day and found the following:

http://vote.gostai.com/forums/37683-general

It’s a product suggestion forum for Gostai users. Someone must have read my post featuring the connectivity between MSRDS and LabVIEW Robotics and thought a similar example for LabVIEW and Gostai could be useful. He/she is petitioning Gostai developers to add LabVIEW connectivity to their “liburbi,” which currently interfaces Gostai’s scripting language, UrbiScript, to languages like C++, Java and Matlab. Why not add LabVIEW to the list? 🙂
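For context, here’s roughly what talking to an Urbi server through liburbi looks like from C++ today. This is a sketch from memory of the Urbi SDK docs, so the header and method names may vary by version, and “motor.val” is just a stand-in for whatever device your robot exposes:

    // Sketch: pushing urbiscript to an Urbi server from C++ via liburbi.
    // Header/class names recalled from the Urbi SDK docs; may vary by version.
    #include <urbi/uclient.hh>

    int main()
    {
      // Connect to an Urbi server (robot or simulator) on the default port.
      urbi::UClient client("localhost");
      if (client.error())
        return 1;

      // Send one line of urbiscript; "motor.val" is a placeholder device.
      client.send("motor.val = 0.5;");
      return 0;
    }

A LabVIEW binding would presumably wrap something like this behind a handful of VIs, the same way the Java and Matlab bindings wrap it for their languages.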

I’m still trying to figure out why that might be useful…

Both LabVIEW and Urbi are viable platforms for programming robots. Both provide tools for parallel programming. Both are intended to be open, flexible programming platforms that allow developers to integrate with the enormous number of robotics tools out in the industry. Both have very similar mantras. Case in point: here’s how NI has staked its claim on the robotics industry:

The robotics industry needs a software development platform that is what Microsoft BASIC was to the PC industry… A challenge for many roboticists is finding a modular, reusable software development platform that caters to all of the necessary disciplines of robotics.

Here’s a similar take from Gostai on their website:

Like PCs in the early 80’s, today’s robots are still incompatible in term of software. There is yet no standard way to reuse one component from one robot to the other, which is needed to have a real software industry bootstraping. And most attempts have been failing to provide tools genuinely adapted to the complex need of robot programming.

Both of these statements support what Bill Gates proposed in his technology outlook article in Scientific American.

Needless to say, we’re all on the same page. So where do NI and Gostai differ? From a high level, it looks like Gostai is targeting researchers and hobbyists, while LabVIEW Robotics targets more industrial-grade robotics development. But what is it about Urbi that could be useful for LabVIEW developers, and vice versa? Why would it be useful for these two languages to talk to each other? It seems as though they attempt to accomplish the same things. Without much first-hand experience with Urbi, I’ve hypothesized a scenario:

Urbi has great examples for controlling robot hardware platforms, like the Sony Aibo, LEGO NXT, and Robotis Bioloid. Assuming you’re not designing your own custom hardware platform, these examples should get you up and running quickly.

But let’s say you are building your own hardware platform, where you are selecting the specific motors, sensors and physical model. You are not confined to the physical model of commercial robot platforms like the NXT and iRobot Create. Urbi’s library of specific hardware connectivity may not be as extensive. On the other hand, LabVIEW offers hundreds of drivers for commercial actuators and sensors. Perhaps someone could develop algorithms in Urbi and then use LabVIEW to easily connect to, communicate with and control their hardware.
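To make that scenario concrete, here’s one possible shape for the glue: a tiny relay that asks the Urbi side to stream motor setpoints and forwards them to a TCP listener built with LabVIEW’s TCP VIs. Everything here (the LabVIEW port, the message format, the exact urbiscript line) is invented for illustration:

    // Hypothetical bridge: relay motor setpoints from an Urbi server to a
    // LabVIEW TCP listener that owns the actual motor drivers.
    // Ports and message format are made up for illustration.
    #include <cstring>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>
    #include <unistd.h>

    // Connect to host:port; return the socket descriptor, or -1 on failure.
    static int connect_to(const char* host, int port)
    {
      int fd = socket(AF_INET, SOCK_STREAM, 0);
      sockaddr_in addr = {};
      addr.sin_family = AF_INET;
      addr.sin_port = htons(port);
      inet_pton(AF_INET, host, &addr.sin_addr);
      if (connect(fd, (sockaddr*)&addr, sizeof addr) < 0) { close(fd); return -1; }
      return fd;
    }

    int main()
    {
      int urbi = connect_to("127.0.0.1", 54000);   // Urbi's usual telnet port
      int labview = connect_to("127.0.0.1", 6340); // made-up LabVIEW listener port
      if (urbi < 0 || labview < 0) return 1;

      // Ask Urbi to stream the setpoints computed by its algorithms
      // (illustrative urbiscript), then forward every byte to LabVIEW.
      const char* req =
        "every (50ms) echo(\"L:\" + leftMotor.val + \" R:\" + rightMotor.val);\n";
      send(urbi, req, strlen(req), 0);

      char buf[512];
      for (ssize_t n; (n = recv(urbi, buf, sizeof buf, 0)) > 0; )
        send(labview, buf, n, 0);
      return 0;
    }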

Just a thought. Anyone is free to chime in and offer additional thoughts or scenarios.

Without knowing much about how the two can help each other, I went ahead and voted on the idea and I encourage anyone else to do the same. I mean, why not? Like I said in regards to MSRDS, the more connectivity we have between development tools, the more sharing and reuse developers have with their code. The more the better is what I say.


LabVIEW Robotics Connects to Microsoft Robotics Studio Simulator January 27, 2010

Posted by emiliekopp in labview robot projects.

Several people have pointed out that Microsoft Robotics Developer Studio has some strikingly similar development tools when compared to LabVIEW. Case in point: Microsoft’s “visual programming language” and LabVIEW’s “graphical programming language”; both are based on a “data flow” programming paradigm.

LabVIEW Graphical Programming Language

MSRDS Visual Programming Language

There’s no need to worry, though (at least, this is what I have to keep reminding myself). National Instruments and Microsoft have simply identified a similar need in the robotics industry. With all the hats a roboticist must wear to build a complete robotic system (programmer, mechanical engineer, controls expert, electrical engineer, master solderer, etc.), they need to exploit any development tools that allow them to build and debug robots as quickly and easily as possible. So it’s nice to see that we’re all on the same page. 😉

Now, both LabVIEW Robotics and MSRDS are incredibly useful robot development tools, each in its own right. That’s why I was excited to see that LabVIEW Robotics includes a shipping example that enables users to build their code in LabVIEW and then test a robot’s behavior in the MSRDS simulator. This way, you get the best of both worlds.

Here’s a delicious screenshot of the MSRDS-LabVIEW connectivity example I got to play with:

How it works:

Basically, LabVIEW communicates with the simulated robot in the MSRDS simulation environment as though it were a real robot. As such, it continuously acquires data from the simulated sensors (in this case, a camera, a LIDAR and two bump sensors) and displays it on the front panel. The user can see the simulated robot from a bird’s-eye view in the Main Camera indicator (the large indicator in the middle of the front panel; can you see the tiny red robot?). The user can see what is in front of the robot in the Camera on Robot indicator (the top right indicator on the front panel). And the user can see what the robot sees/interprets as obstacles in the Laser Range Finder indicator (this indicator, right below Camera on Robot, is particularly useful for debugging).

On the LabVIEW block diagram, the simulated LIDAR data obtained from the MSRDS environment is processed and used to perform some simple obstacle avoidance using a Vector Field Histogram approach. LabVIEW then sends command signals back to MSRDS to control the robot’s motors, successfully navigating the robot through the simulated environment.
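For anyone unfamiliar with the technique, here’s a toy C++ rendition of a single vector-field-histogram step. To be clear, this is my own simplification, not the code in the shipping example, and the sector count and thresholds are made up:

    // Toy VFH step: build a polar obstacle-density histogram from one LIDAR
    // scan, then steer toward the clearest sector nearest the goal heading.
    // Not NI's implementation; sizes and thresholds are illustrative.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Pick a steering angle (degrees, 0..180) from a 180-degree scan of
    // range readings in meters.
    double vfh_steer(const std::vector<double>& ranges, double goal_deg)
    {
      const int kSectors = 18;        // 10-degree sectors
      const double kMaxRange = 5.0;   // readings beyond this count as free space
      const double kThreshold = 2.0;  // density above this marks a sector blocked
      std::vector<double> density(kSectors, 0.0);

      // Closer obstacles contribute more weight to their sector.
      for (size_t i = 0; i < ranges.size(); ++i) {
        int s = (int)(i * kSectors / ranges.size());
        double r = std::min(ranges[i], kMaxRange);
        density[s] += (kMaxRange - r) / kMaxRange;
      }

      // Choose the free sector whose center is closest to the goal heading.
      double best = goal_deg, best_cost = 1e9;
      for (int s = 0; s < kSectors; ++s) {
        if (density[s] > kThreshold) continue;          // blocked
        double center = (s + 0.5) * 180.0 / kSectors;   // sector center, degrees
        double cost = std::fabs(center - goal_deg);
        if (cost < best_cost) { best_cost = cost; best = center; }
      }
      return best;  // downstream, map this to differential-drive wheel speeds
    }

The block diagram then maps the chosen heading to wheel commands before sending them back to MSRDS.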

There’s a tutorial on the LabVIEW Robotics Code Exchange that goes into more detail on the example.

Why is this useful?

LabVIEW users can build and modify their robot control code and test it out in the MSRDS simulator. This way, regardless of whether or not you have hardware for your robot prototype, you can start building and debugging the software. But here’s the kicker: once your hardware is ready, you can take the exact same code you developed for the simulated robot and deploy it to an actual physical robot within a matter of minutes. LabVIEW takes care of porting the code to embedded processors like ARMs, RT OS targets and FPGAs so you don’t have to. Reusing proof-of-concept code, tested and fine-tuned in the simulated environment, in the physical prototype will save developers SO MUCH TIME.

Areas of improvement:

As of now, the model used in the LabVIEW example is fixed, meaning you do not have the ability to change the physical configuration of actuators and sensors on the robot; you can only modify the behavior of the robot. Thus, you have a LIDAR, a camera, two bump sensors and two wheels, in a differential-drive configuration, to play with. But it’s at least a good start.

In the future, it would be cool to assign your own model (you pick the sensors, actuators and physical configuration). Perhaps you could do this from LabVIEW too, instead of having to build one from scratch in C#. LabVIEW already has hundreds of drivers available to interface with robot sensors; you could potentially just pick from the long list and LabVIEW builds the model for you…
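Something like the following, purely imaginary, description is what I have in mind; a generator could walk a structure like this and emit the simulation entity for you. None of these types or catalog entries exist; they’re invented to show the idea:

    // Purely hypothetical: a declarative robot description a tool could use
    // to generate an MSRDS simulation model. None of this API exists today.
    #include <string>
    #include <vector>

    struct SensorSpec   { std::string driver; double mount_x, mount_y, heading_deg; };
    struct ActuatorSpec { std::string driver; double wheel_radius_m, track_width_m; };

    struct RobotModel {
      std::vector<SensorSpec>   sensors;
      std::vector<ActuatorSpec> actuators;
    };

    int main()
    {
      RobotModel bot;
      // Pick parts from a (hypothetical) driver catalog; a generator would
      // then emit the corresponding simulation model instead of hand-written C#.
      bot.sensors.push_back({"Hokuyo URG-04LX", 0.10, 0.0, 0.0});  // LIDAR up front
      bot.sensors.push_back({"Axis 206 camera", 0.05, 0.0, 0.0});
      bot.actuators.push_back({"differential drive", 0.065, 0.30});
      return 0;
    }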

Bottom line:

It’s nice to see more development tools out there, like LabVIEW and MSRDS, working together. This allows roboticists to reuse and even share their designs and code. Combining COTS technology and open design platforms is the recipe for the robotics industry to mirror what the PC industry did 30 years ago.