Part 1 looked at trends in robotics. Here I consider some of the challenges, and provide more information on military robots.
What is needed for robots to be valued and respected members of our world? As a non-specialist I see five main requirements:
Discriminating – It is easy for us to distinguish individuals and objects by sight, sound, touch and smell. It is much harder for software to do so, but this is changing rapidly. As more and better sensors are added to robots, they’ll also need to get better at analysing and filtering the information those sensors provide.
Safe – Not quite Asimov’s Three Laws of Robotics at this stage, but as robots become increasingly autonomous in both mobility and decision-making there is a need to ensure that they do what they are supposed to do. As Lora Weiss points out, properly testing such independent robots before letting them out into the real world will be challenging but not insurmountable. One safety complication is ensuring that someone in their garage or bedroom can’t hack into a robot and make it do nasty things (or is at least strongly discouraged from trying). Or turn it into a real walking, talking spam bot. (Oh wait, we have those already — they can’t read the ‘no circulars’ sign on letter boxes!)
Communicative – Weiss also notes the importance of robots being able to communicate appropriately with other robots, as well as with other machines and humans. There will be a diversity of devices of varying degrees of ‘smartness’ in the world, and it is critical that different systems can interact with each other to avoid accidents.
Cheaper – Most robots are expensive, affordable only by well-funded firms or institutions. Simple, modestly priced cleaning robots are really only of novelty value. The price of more sophisticated robots will need to go the way of flat-screen TVs before they really take off as common domestic items.
Ethical – Many research groups are looking at the morals and ethics of robots, and at human interactions with them. See, for example, the publications from the Georgia Tech Mobile Robot Lab. How we respond to robots (especially those that look like us) may be a greater challenge. Under what circumstances is it right or ethical for a robot to take over a human job? Surveillance robots are common in the military, and there is likely to be growing interest in using similar surveillance drones for law and order and traffic management on civvy street. Will additional safeguards (and warnings) be required for such applications so that privacy is adequately protected?
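To make the ‘Discriminating’ point above a little more concrete: one of the simplest ways a robot can filter a noisy sensor stream is an exponential moving average, which smooths jittery readings into a usable estimate. Here is a minimal sketch in Python; the sensor type, the readings and the smoothing factor are all invented for illustration, and a real robot would use far more sophisticated techniques such as Kalman filtering.

```python
class ExponentialFilter:
    """Smooth a stream of noisy sensor readings.

    alpha near 1 trusts the newest reading; alpha near 0 trusts history.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.estimate = None

    def update(self, reading):
        # First reading becomes the initial estimate.
        if self.estimate is None:
            self.estimate = reading
        else:
            # Blend the new reading with the running estimate.
            self.estimate = self.alpha * reading + (1 - self.alpha) * self.estimate
        return self.estimate


# Hypothetical range-sensor readings (metres), jittering around 2.0
readings = [2.1, 1.9, 2.4, 1.7, 2.0, 2.2, 1.8]
f = ExponentialFilter(alpha=0.3)
smoothed = [f.update(r) for r in readings]
```

The smoothed sequence varies far less than the raw one, which is the whole point: downstream decisions (stop, turn, grasp) behave better when they are driven by the filtered estimate rather than the raw jitter.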
Military robots have experienced particularly rapid growth. In the US, nearly 12,000 robots had been drafted by last year (compared with just 50 flying ones in 2000), with many more anticipated over the coming years. At least 50 countries now have military robots.
Some in the military think we are nearing a tipping point [PDF, 1.28MB] for how robots are used in wars. P.W. Singer, a respected commentator on military robots, suggests that since robotic technologies are changing so quickly, the armed forces should experiment more with different types of robots rather than lock on to a few tried and trusted ones. There is already considerable discussion, both within the military and outside it, about the ethics and rules that should govern autonomous robots in warfare. Should robots be able to decide for themselves when to shoot, and if so, how should they make the decision? Will countries be more likely to start wars if it is mostly robots doing the fighting?
I think a useful hunter-killer robot would be one that patrols our forests tracking down and humanely killing possums, rats and other mammalian pests. In a few years it should be straightforward to programme a robot to correctly identify such species. Then we could do away with all that 1080 and other poisons. Carnivorous robots have already been designed (as art, not science), so an HKP (hunter-killer possum) Bot could both power itself from its prey and clean up the mess.
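Identifying a pest species is, in caricature, a classification problem. The toy Python sketch below shows the shape of the decision, using entirely invented features and thresholds; a real HKP Bot would rely on trained vision models, not hand-written rules.

```python
def classify_pest(weight_kg, tail_length_cm, nocturnal):
    """Toy decision rules for a hypothetical HKP Bot.

    All species thresholds here are invented for illustration only.
    """
    if weight_kg > 1.5 and tail_length_cm > 30 and nocturnal:
        return "possum"
    if weight_kg < 0.5 and tail_length_cm > 15 and nocturnal:
        return "rat"
    # Anything uncertain is left alone rather than acted on.
    return "unknown: do not engage"
```

Note the default case: when the robot isn’t sure, it does nothing. That design choice anticipates the point below, that constraining what a robot must not do matters as much as specifying what it can.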
On a serious note, the critical issue with robots may be defining what we don’t want them to do, rather than specifying what they can do.
Interested in videos of a range of robots? Go here.