
LabVIEW Robotics Connects to Microsoft Robotics Studio Simulator January 27, 2010

Posted by emiliekopp in labview robot projects.

Several people have pointed out that Microsoft Robotics Developer Studio has some strikingly familiar development tools when compared to LabVIEW. Case in point: Microsoft’s “visual programming language” and LabVIEW’s “graphical programming language” are both based on a dataflow programming paradigm.

LabVIEW Graphical Programming Language

MSRDS Visual Programming Language

There’s no need to worry, though (at least, this is what I have to keep reminding myself). National Instruments and Microsoft have simply identified a similar need in the robotics industry. With all the hats a roboticist must wear to build a complete robotic system (programmer, mechanical engineer, controls expert, electrical engineer, master solderer, etc.), they need to exploit any development tool that lets them build and debug robots as quickly and easily as possible. So it’s nice to see that we’re all on the same page. 😉

Now, both LabVIEW Robotics and MSRDS are incredibly useful robot development tools, each in its own right. That’s why I was excited to see that LabVIEW Robotics includes a shipping example that lets users build their code in LabVIEW and then test a robot’s behavior in the MSRDS simulator. This way, you get the best of both worlds.

Here’s a delicious screenshot of the MSRDS-LabVIEW connectivity example I got to play with:

How it works:

Basically, LabVIEW communicates with the simulated robot in the MSRDS simulation environment as though it were a real robot. As such, it continuously acquires data from the simulated sensors (in this case, a camera, a LIDAR, and two bump sensors) and displays it on the front panel. The user can see the simulated robot from a bird’s-eye view in the Main Camera indicator (the large indicator in the middle of the front panel; can you see the tiny red robot?). The user can see what is in front of the robot in the Camera on Robot indicator (the top-right indicator on the front panel). And the user can see what the robot sees and interprets as obstacles in the Laser Range Finder indicator (this indicator, right below Camera on Robot, is particularly useful for debugging).
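(If you’re more comfortable reading text than wires, here’s roughly what that acquisition loop is doing, sketched in Python. The sim and ui objects and their method names are made up purely for illustration; the actual shipping example does all of this graphically with LabVIEW VIs that talk to MSRDS.)

# Rough sketch of the acquire-and-display loop the LabVIEW example runs.
# The connection object, method names, and sensor keys are hypothetical;
# the real example talks to MSRDS through dedicated LabVIEW VIs.
import time

def monitor_loop(sim, ui, period_s=0.05):
    """Continuously poll the simulated sensors and refresh the front panel."""
    while not ui.stop_requested():
        overhead_image = sim.read_camera("main")     # bird's-eye view of the arena
        robot_image    = sim.read_camera("onboard")  # what the robot "sees"
        lidar_scan     = sim.read_lidar()            # array of ranges, one per angle
        bumpers        = sim.read_bumpers()          # (left_hit, right_hit)

        ui.update("Main Camera", overhead_image)
        ui.update("Camera on Robot", robot_image)
        ui.update("Laser Range Finder", lidar_scan)
        ui.update("Bump Sensors", bumpers)

        time.sleep(period_s)                         # loop timing, like a LabVIEW Wait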

On the LabVIEW block diagram, the simulated LIDAR data obtained from the MSRDS environment is processed and used to perform some simple obstacle avoidance, using a Vector Field Histogram approach. LabVIEW then sends command signals back to MSRDS to control the robot’s motors, successfully navigating the robot through the simulated environment.
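(Again purely for illustration, here’s a bare-bones take in Python on what a Vector Field Histogram-style decision could look like: bin the LIDAR returns into a polar histogram, pick the clearest sector near straight ahead, and turn that heading into differential-drive wheel speeds. The bin count, thresholds, and gains below are arbitrary assumptions, not the tuning used in the actual LabVIEW example.)

import math

def vfh_steering(ranges, angles, safe_dist=1.0, num_bins=36):
    """Pick a heading from a LIDAR scan using a simple polar obstacle histogram.

    ranges: measured distances (m); angles: matching bearings (rad), 0 = straight ahead.
    Returns a steering angle toward the clearest sector.
    """
    bins = [0.0] * num_bins
    for r, a in zip(ranges, angles):
        if r < safe_dist:                                   # only nearby returns count as obstacles
            idx = int((a + math.pi) / (2 * math.pi) * num_bins) % num_bins
            bins[idx] += (safe_dist - r)                    # closer obstacles weigh more

    forward = num_bins // 2                                 # bin containing angle 0
    best = min(range(num_bins), key=lambda i: (bins[i], abs(i - forward)))
    return (best + 0.5) / num_bins * 2 * math.pi - math.pi  # bin center back to an angle

def wheel_commands(steer_angle, base_speed=0.3, gain=0.5):
    """Turn a steering angle into left/right wheel speeds for a differential drive."""
    left = base_speed + gain * steer_angle
    right = base_speed - gain * steer_angle
    return left, right

The left/right values are what would get sent back to MSRDS as the motor command on each pass of the loop.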

There’s a tutorial on the LabVIEW Robotics Code Exchange that goes into more detail on the example. You can check it out here.

Why is this useful?

LabVIEW users can build and modify their robot control code and test it out in the MSRDS simulator. This way, regardless of whether or not you have hardware for your robot prototype, you can start building and debugging the software. But here’s the kicker: once your hardware is ready, you can take the exact same code you developed for the simulated robot and deploy it to an actual physical robot within a matter of minutes. LabVIEW takes care of porting the code to embedded processors like ARMs, real-time OS targets, and FPGAs so you don’t have to. Reusing proof-of-concept code, tested and fine-tuned in the simulated environment, in the physical prototype will save developers SO MUCH TIME.

Areas of improvement:

As of now, the model used in the LabVIEW example is fixed, meaning you do not have the ability to change the physical configuration of actuators and sensors on the robot; you can only modify the robot’s behavior. Thus, you have a LIDAR, a camera, two bump sensors, and two wheels in a differential-drive configuration to play with. But it’s at least a good start.

In the future, it would be cool to assign your own model (you pick the sensors, actuators, and physical configuration). Perhaps you could do this from LabVIEW too, instead of having to build one from scratch in C#. LabVIEW already has hundreds of drivers available to interface with robot sensors; you could potentially just pick from the long list and have LabVIEW build the model for you…

Bottom line:

It’s nice to see more development tools out there, like LabVIEW and MSRDS, working together. This allows roboticists to reuse and even share their designs and code. Combining COTS technology and open design platforms is the recipe for the robotics industry to mirror what the PC industry did 30 years ago.


NXT AlphaRex: Meet Spykee from ERECTOR August 12, 2009

Posted by emiliekopp in industry robot spotlight, robot fun.

I first heard about this from Christian Loew, one of NI’s Systems Engineers for FPGA and CompactRIO, when he tweeted this morning about a new robot kit released by ERECTOR, called Spykee.

Spykee is, fittingly, a spy robot, equipped with treaded rubber tracks, a speaker, a microphone, a webcam, and a WiFi card, making it a pretty interesting robot “toy.” Using the WiFi connection, you can use Spykee to make free calls over the Internet; it’s Skype 3.0 compatible. You can also use that same WiFi connection to broadcast the video from the onboard webcam. Hence, Spykee could make an interesting surveillance robot, perhaps even a useful pet-sitter?

The website claims you can build Spykee yourself, which I take to mean you use ERECTOR set pieces to put him together. So, in effect, ERECTOR has released a robot platform equipped with WiFi, a webcam, a mobile base, and Man-Machine Interface (MMI) software, which is claimed to be available as open source, so do-it-yourselfers can potentially hack in and give Spykee a customized brain.

Sign me up.

However, I will say that ERECTOR’s Spykee is not nearly as flexible a robot prototyping platform as the LEGO MINDSTORMS NXT. Sure, you can use ERECTOR pieces to give Spykee his own unique shape and appearance. But there’s no actuation beyond the motors built into the treaded base platform, so he certainly won’t have any dexterity; i.e., you can’t actuate or control any of the ERECTOR pieces, only the movement of the treads. With the NXT, you have access to individual motors, which can then connect to the mechanical design built with your NXT pieces. Thus, your robot could walk or roll on wheels, whichever you choose.

On the software side, Spykee comes with MMI, which is a fancy UI you can use to connect to Skype, play your MP3 playlist, and spy on your pets while you’re not home. However, it doesn’t look like there’s any built-in programming environment for Spykee. Thus, if you want to give him his own brain, you’re going to have to get tricky and use some other, more traditional programming language to make function calls into the allegedly open-source MMI library.

And so far, ERECTOR hasn’t provided much detail on what they’re opening up for access on Spykee. Will I have access to the IR sensor that is used for auto-parking at the charging station? Will I have access to the processor for on-board control, or will I have to send wireless TCP commands from a PC station? Better yet, what is the actual processor on Spykee? It doesn’t look like anybody knows yet (will we ever know?).

One thing I like about the NXT is that LEGO wants to make it as accessible as possible. You have direct access to all sensors and actuators. You also have a choice between running your customized program, written in either NXT-G or the free LabVIEW NXT Toolkit, directly on the NXT brick (which, I would add, uses an ARM7 processor), or controlling your robot via Bluetooth from a mission-control PC, which makes it possible to use practically any programming language, including NXT-G, LabVIEW, MSRS, Java, C/C++, etc.
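(To make that “any programming language” point concrete, here’s a tiny Python sketch that spins an NXT motor over Bluetooth using pyserial and the brick’s direct-command protocol. The COM port name is an assumption for your own setup, and the byte layout is my reading of LEGO’s direct-commands reference, so double-check it against the official documentation before trusting it.)

# Illustrative only: drive NXT output port B at 60% power over Bluetooth,
# using pyserial on the brick's paired serial port (port name is an assumption).
# Byte layout follows my reading of LEGO's NXT direct-commands reference
# (SETOUTPUTSTATE, opcode 0x04); verify against the official documentation.
import struct
import serial

def set_motor(link, port, power):
    telegram = struct.pack(
        "<BBBbBBbBI",
        0x80,         # direct command, no reply requested
        0x04,         # SETOUTPUTSTATE
        port,         # output port: 0=A, 1=B, 2=C
        power,        # power set point, -100..100
        0x01 | 0x04,  # mode: MOTORON | REGULATED
        0x01,         # regulation mode: motor speed
        0,            # turn ratio (unused here)
        0x20,         # run state: running
        0,            # tacho limit: 0 = run forever
    )
    # Bluetooth framing: 2-byte little-endian length prefix, then the telegram.
    link.write(struct.pack("<H", len(telegram)) + telegram)

with serial.Serial("COM4", 115200, timeout=1) as bt:  # your NXT's paired COM port
    set_motor(bt, port=1, power=60)                   # spin motor B forward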

So as a quick, high-level review: the ERECTOR Spykee looks like it could be a formidable competitor to the LEGO MINDSTORMS NXT; the webcam and WiFi hardware make it particularly desirable. However, it’s still in its early stages of release, and little is known about how the functionality will be opened up for hacking. As such, for now I’m sticking with my NXT as a low-cost robot prototyping platform.