Robotics is rapidly being transformed by advances in artificial intelligence.
But our ever-growing appetite for intelligent, autonomous machines poses a host of ethical challenges.
As with all technology, the range of future uses for our research is difficult to imagine.

It's even more challenging to forecast given how quickly this field is changing.
Today, though, the best algorithms, as shown in published papers, are at 86 percent accuracy.
That advance alone allows autonomous robots to understand what they see through their camera lenses.

It also shows the rapid pace of progress over the past decade due to developments in AI.
This kind of improvement is a true milestone from a technical perspective.

There are online forums filled with people eager to help anyone learn how to do this.
What's to stop someone from strapping an explosive or another weapon to a drone equipped with this technology?
Autonomous drones built using a variety of techniques are already a threat.

The autonomous systems that are being developed right now could make staging such attacks easier and more devastating.
Reports indicate that the Islamic State is using off-the-shelf drones, some of which are being used for bombings.
Regulation or review boards?
Lethal autonomous weapons are defined as platforms capable of selecting and engaging targets without human intervention.
However, researchers might inadvertently enable others with minimal expertise to create such weapons.
What can we do to address this risk?
Regulation is one option, and it is already in use: aerial drones are banned near airports and around national parks.
Those rules are helpful, but they don't prevent the creation of weaponized drones.
Traditional weapons regulations are not a sufficient template, either: they generally tighten controls on the source material or the manufacturing process, but drones are built from widely available, off-the-shelf consumer components.
Another option would be to follow in the footsteps of biologists.
In 1975, they held a conference on the potential hazards of recombinant DNA at Asilomar in California.
There, experts agreed to voluntary guidelines that would direct the course of future work.
For autonomous systems, such an outcome seems unlikely at this point: many research projects that could be used in the development of weapons also have peaceful and incredibly useful applications.
Research review boards would be a first step toward self-regulation and could flag projects that might be weaponized. These boards consider the benefits to the populations involved in the research and craft ways to mitigate potential harms. But they can regulate only research done within their own institutions, which limits their scope.
I feel that the potential for good is too promising to ignore.