Robot Soldiers: Hampered By Their Imagers

As anyone who's followed the U.S. military's engagements in countries such as Afghanistan, Iraq, and Pakistan already knows, this country's armed forces are increasingly relying on robotic alternatives to human warriors: tread-equipped Roomba and Scooba siblings to disarm bombs, for example, or propeller- and wing-equipped drone alternatives to fighter jets for both surveillance and bomb-delivery purposes. IEEE Spectrum published an excellent writeup on this phenomenon several weeks ago, in fact; I wholeheartedly commend 'Autonomous Robots in the Fog of War' to your attention.

However, as the Spectrum piece points out, the 'autonomous' part of the title is largely a futures forecast, not a present reality. Part of the problem involves analysis shortcomings (in quality, quantity, and speed alike) with the incoming data. Quoting a few lines' worth of author Lora G. Weiss's prose:

Despite the advances in both their performance and safety, these robots are still far from perfect, and they routinely operate in situations for which they may not have been designed and in which their responses cannot always be anticipated. Some of the DOD's most advanced UAVs carry dozens of sensors, including high-resolution night-vision cameras, 3-D imagers, and acoustic arrays. Yet most cannot distinguish a sleeping dog from a bush, even at high noon. Humans are still needed to operate the vehicles, interpret the data, and coordinate tasks among multiple systems.

She's right, of course. Those bomb-detecting and -dismantling robots are managed, over wired or wireless links, by a nearby human operator. Similarly, the flying drones are controlled by humans sitting at computer screens, often on the other side of the world at a base north of Las Vegas, NV. But information processing isn't the only issue, as Weiss notes in a continuation of that same paragraph: "If we are ever to see fully autonomous robots enter the battlefield—those capable of planning and carrying out missions and learning from their experiences—several key technological advances are needed, including improved sensing, more agile testing, and seamless interoperability."

Note the emphasis on improved sensing. That theme, specifically in the visual domain, was also raised in a Wired magazine writeup published just yesterday, titled 'Bad Eyes Keep Unmanned Infantry Out of the Fight.' Check out this choice excerpt:

Perhaps most importantly, the robots have poor eyesight; the machines still can’t see as far as they can shoot. “If I’ve got a robot with a machine gun that’s got a max range of 800 meters, and a camera that can only see a couple meters, well, that’s a problem,” Lt. Col. Stewart Hatfield, chief of the lethality branch of the U.S. Army Capabilities Integration Center, told an audience at the Association for Unmanned Vehicle Systems International conference in Washington on Tuesday.
