U.S. Military Insect Drones (powered by LabVIEW?) July 15, 2011

Posted by emiliekopp in industry robot spotlight, labview robot projects.

Do the screens of any of the monitors look familiar?

Read the news coverage of these robots here:

Micro-machines are go: The U.S. military drones that are so small they even look like insects

Snake-like robot developed by the Army – powered by LabVIEW July 28, 2010

Posted by emiliekopp in industry robot spotlight.
Check out the latest front page article on army.mil, the official page of the U.S. Army:

Army technology expands snake-robotics

The story highlights a snake-robot developed by the U.S. Army Research Laboratory. The robo-snake is biomimetic, meaning it maneuvers just like a real snake would, pushing off ground surfaces to propel itself. It can crawl, swim, climb or shimmy through narrow spaces while transmitting images to the Soldier operator. And it’s scalable, so developers can build a robo-snake however large or small they’d like it to be. It’s expected to help with search-and-rescue and reconnaissance missions.
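The article doesn’t describe the Army’s actual control software, but serpentine locomotion like this is commonly modeled by driving each joint with a phase-shifted sinusoid, so a traveling wave propagates down the body. Here’s a minimal Python sketch of that idea; all of the parameter names and values are hypothetical, not from the Army project:

```python
import math

def serpentine_joint_angles(t, n_joints=8, amplitude=0.5,
                            phase_step=2.0 * math.pi / 8,
                            temporal_freq=1.0):
    """Return one commanded angle (radians) per joint at time t.

    Each joint follows the same sinusoid, offset in phase from its
    neighbor by phase_step, which produces a wave traveling along
    the snake's body as t advances.
    """
    return [amplitude * math.sin(temporal_freq * t + phase_step * i)
            for i in range(n_joints)]

# Snapshot of the body shape at t = 0: a spatial sine wave.
angles = serpentine_joint_angles(0.0)
```

A real controller would feed angles like these to servo setpoints in a loop, and steering is typically done by adding a constant bias to every joint.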

I’m sure you recognized the software on the command laptop’s screen too. Yep, that’s LabVIEW. Developers are using the graphical programming language to quickly and cost-effectively interface to the robot’s sensors and control its actuators. They can rapidly build and test early prototypes and ultimately deliver this dexterous robot to the field more quickly, saving lives and taxpayer dollars.

Read more about how the robot works here.

Vecna BEAR Military UGV: A Jack of All Trades July 14, 2010

Posted by emiliekopp in industry robot spotlight.

I’ve written about Vecna Robotics’ Battlefield Extraction-Assist Robot (BEAR) before and am familiar with its development process. Its design engineers used LabVIEW and NI CompactRIO to rapidly build and test early prototypes and win defense contracts.

BotJunkie recently featured a video that captures the Vecna BEAR in action. Admittedly, the actual “extraction” of military casualties still looks a bit awkward and probably needs more work. I’m sure operating a robot with so many degrees of freedom in a potentially hostile environment is extremely difficult and requires an enormous amount of practice. Bottom line, this is definitely one of the friendlier military robots, and it’s helping save lives.

But once you take handling an injured human out of the equation, the robot can actually serve several other purposes that may not require as much poise. For instance, the BEAR can help with more logistical tasks, like handling munitions and delivering supplies. Its payload capacity is a whopping 500 lbs, so it could definitely help as an extra hand on the battlefield. And because of its dexterity, it could perform maintenance functions as well, such as inspection, decontamination and refueling. Saving time and effort allows troops to focus on the task at hand, which indirectly reduces the risk soldiers are exposed to.

So the BEAR is certainly a robotic jack-of-all-trades that could prove extremely useful when fully deployed. It’s fun to imagine full convoys of these surprisingly cute robots in the future (by the way, the video explains the cuteness factor).

Ethics of Military Robots February 10, 2010

Posted by emiliekopp in industry robot spotlight.

I recently became familiar with the DoD’s Unmanned Systems Integrated Roadmap, a document that forecasts the evolution and adoption of robot technologies in modern warfare. For a government document, it was actually a pretty interesting read.

Many people are timid when discussing military robots, and justifiably so. While most of the robots are meant to perform tasks that are simply too dull, dirty or dangerous to warrant the risk of human life (for instance, MULE robots or robots that dispose of IEDs), most of the mainstream media attention is geared towards the robots with guns. And that’s when the references to Skynet come rolling in.

What happens when robots have guns? If something goes wrong, who is ultimately held responsible? The robot? The operator? The designer? The supplier of electromechanical parts? The chain of responsible parties could go on and on.

So we haven’t found the answer. But starting the conversation is a good start.

P.W. Singer’s Wired for War has brought the conversation to the mainstream. The main point Singer addresses is that once you begin to move humans away from the battlefield (i.e., give the guns to robots), a nation becomes more willing to use force. So as you reduce the risk to human life on one side, you become more willing to shed human life on the other. Understandably scary.

Another resource I found incredibly interesting is a report prepared for the U.S. Navy’s Office of Naval Research by California Polytechnic State University: Autonomous Military Robotics: Risk, Ethics, and Design. This report takes a more technical approach to understanding the ethics of robots on the battlefield. While it addresses many of the concerns that Singer has brought to light, it also entertains the point that robots are “unaffected by the emotions, adrenaline, and stress that cause soldiers to overreact or deliberately overstep the Rules of Engagement and commit atrocities, that is to say, war crimes.” Of course, this assumes that humans are capable of programming robots to make ethically sound decisions on their own, which in turn warrants a walk down memory lane with Asimov’s Three Laws of Robotics.

Bottom line: technology is a double-edged sword (thank you, Ray Kurzweil). There will be pros and cons to exponentially advancing technologies, especially battlefield robots. Yet we shouldn’t feel like we need to tiptoe around the issue. The more we talk about it, the better equipped we’ll be when decisions must be made.