Data61’s nuclear robot challenge


Stuart Kennedy
Contributor

Nuclear facility inspectors wary of getting a bit of a glow on after dark have help on the way, as researchers explore the use of AI-assisted robots to do the radiation dirty work and test their ideas at an Australian co-sponsored competition.

A Robotics Challenge run this week under the auspices of the International Atomic Energy Agency (IAEA) and CSIRO's Data61 has 12 teams from nine countries competing in two challenges: one takes place on land, the other on the water.

This is the first Robotics Challenge to be conducted by the IAEA.

Robot challenge: Data61 is pushing for better use of AI-assisted robots

The challenges simulate some of the checks undertaken manually by IAEA Safeguards Inspectors. While the IAEA is autonomous, it reports to both the United Nations General Assembly and Security Council.

“The Robotics Challenge aims to test the suitability of new robotic designs to help the IAEA fulfil some of its verification tasks more efficiently, freeing up inspectors to concentrate more on examining how facilities are being used. We’re excited to host this Challenge with the team of robotics experts at CSIRO’s Data61,” Andrey Sokolov, Technology Foresight Officer at the IAEA, said in a statement.

There are teams from Hungary, the Republic of Korea, Israel, Switzerland, the USA, the UK, Canada, Germany and Finland. Australia, as co-host, is not fielding a team.

Winners of the challenge rounds will go through a procurement process that could eventually see them securing a contract with the IAEA.

In the water test, teams will mount the Improved Cerenkov Viewing Device, a handheld instrument used by inspectors, on an autonomous floating robot to help confirm the presence of spent fuel stored underwater.

In the second challenge, robots will be tested on their ability to assist an inspector by moving autonomously across a storage area, counting items containing nuclear material, recording their ID tags, and carrying IAEA instrument payloads.

Humans will still be part of the nuclear inspection game for some time, though.

“Full autonomy is probably still on the distant wish list, but what the agency is interested in is robots that are substantially autonomous, so they would drive along or walk along with an inspector and be conducting measurements and so on,” said Professor Alberto Elfes, Chief Research Scientist & Group Leader for Robotics at CSIRO’s Data61.

“The robot can be doing all the boring, repetitive stuff,” Professor Elfes said.

“It’s not going to get lost in terms of did I count barrel 500 or not.

“There are other situations where the best approach is to have a fully autonomous system with supervisory controls, so you might have a human some distance away, peeking over the robot’s shoulder so to speak, and the robot is executing its tasks autonomously.

“Only if absolutely necessary the inspector could intervene and take control of the robot.”

Professor Elfes said in the future, robots could be constantly patrolling, inspecting and surveying nuclear facilities, although this would only be practical with cooperative, friendly nations.

The challenge aims to boost collaboration between participating teams, and Professor Elfes said Australia had already made a major contribution in this area with the CSIRO-developed 3D lidar-based simultaneous localisation and mapping system (3D SLAM).

3D SLAM is being used in everything from recording homicide crime scenes for Queensland Police to mapping dinosaur footprints and nuclear facilities. It is being commercialised worldwide by GeoSLAM, a partner company co-founded by CSIRO.

The nuclear inspections robot challenge comes at a heated time for AI, as debate rages worldwide over the weaponisation of AI in platforms such as drones, tanks and legged robot soldiers.

“This is an area that raises a huge amount of moral and ethical challenges,” said Professor Elfes, who pointed to two issues: developing a legal and ethical framework around the use of weaponised AI, and the technical challenge of building technology that can detect when weaponised AI is being developed or used.

Earlier this month a group of more than one hundred Australian AI researchers sent an open letter to Prime Minister Malcolm Turnbull, imploring the government to be party to a worldwide ban on lethal autonomous weapons.

“This whole area of robot ethics and how robot ethics translates into practice is an emerging area of interest to us and so we are starting to do work in this area,” said Professor Elfes.

Professor Ron Arkin, a renowned roboticist and robo-ethicist from the Georgia Institute of Technology in the US, is currently on sabbatical at QUT and CSIRO.
