Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to “killer robots”.
More than 50 leading academics signed an open letter calling for a boycott of Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to “accelerate the arms race to develop” autonomous weapons.
“There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern,” said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. “This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms.”
The boycott comes ahead of a United Nations meeting in Geneva next week on autonomous weapons, and more than 20 countries have already called for a total ban on killer robots. The use of AI in militaries around the world has sparked fears of a Terminator-like situation and questions have been raised about the accuracy of such weapons and their ability to distinguish friend from foe.
Hanwha is one of South Korea’s largest weapons manufacturers, and makes cluster munitions, which are banned in 120 countries under an international treaty. South Korea, along with the US, Russia and China, is not a signatory to the convention.
Walsh was initially concerned when a Korea Times article described KAIST as “joining the global competition to develop autonomous arms” and promptly wrote to the university asking questions but did not receive a response.
KAIST’s president, Sung-Chul Shin, said he was saddened to hear of the boycott. “I would like to reaffirm that KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots,” Shin said in a statement.
“As an academic institution, we value human rights and ethical standards to a very high degree,” he added. “I reaffirm once again that KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”
Profile: What is Hanwha Systems?
Hanwha Systems is part of a large global South Korean conglomerate spanning finance, construction, chemicals and hotels and is one of the 10 largest companies in the country. It was founded in 1952, during the Korean War, as the Korea Explosives Company.
The company was excluded from Norway’s national oil fund, designed to invest profits for future generations, over ethical concerns related to the production of cluster munitions. The weapons have been banned under a treaty signed by 120 countries, although South Korea is not a party.
Hanwha employees were set to “build a cooperation system through sharing education programs and exchanging research manpower” at an artificial intelligence research centre at a leading Korean university, sparking controversy.
In recent years, Hanwha has singled out the Middle East as “a prime and strategic market” and sought to expand with acquisitions of rival companies. The company partnered with Indian firm Larsen & Toubro to secure a reported $660m weapons contract as part of the beginning of a plan to overhaul the Indian military.
The group’s chairman, Kim Seung-youn, was convicted of embezzlement in 2012, part of a string of high-profile Korean businessmen charged with financial crimes.
KAIST opened the research centre for the convergence of national defence and artificial intelligence on 20 February, with Shin saying at the time it would “provide a strong foundation for developing national defence technology”.
The centre will focus on “AI-based command and decision systems, composite navigation algorithms for mega-scale unmanned undersea vehicles, AI-based smart aircraft training systems, and AI-based smart object tracking and recognition technology”, the since-deleted announcement said.
South Korea’s Dodaam Systems already manufactures a fully autonomous “combat robot”, a stationary turret, capable of detecting targets up to 3km away. Customers include the United Arab Emirates and Qatar and it has been tested on the highly militarised border with North Korea, but company executives told the BBC in 2015 there were “self-imposed restrictions” that required a human to deliver a lethal attack.
The Taranis military drone built by the UK’s BAE Systems can technically operate entirely autonomously, according to Walsh, who said killer robots made everyone less safe, even in a dangerous neighbourhood.
“Developing autonomous weapons would make the security situation on the Korean peninsula worse, not better,” he said. “If these weapons get made anywhere, eventually they would certainly turn up in North Korea and they would have no qualms about using them against the South.”
Comments (739)
I support them and wish them only the best
"Hanwha is one of South Korea’s largest weapons manufacturers, and makes cluster munitions which are banned in 120 countries under an international treaty. South Korea, along with the US, Russia and China, are not signatories to the convention."
LOL. Big players and their mates bitching about NK and Syria, et al., having naughty weapons which can kill indiscriminately, but themselves upping the ante for researching and manufacturing weapons which can kill indiscriminately. Big grain of salt to wash down with a nice cup of hypocritea.
I am sure the Russians will stop their OWN research in kind .....
Excellent news! These things will finally bring a pax humana because, after the initial mass slaughters, no one will ever dare leave home again, even to watch football down the pub. Great idea, white coats!
Great. That's the Armed Forces on the scrapheap along with truck drivers, taxi drivers, assembly plant workers...
If we develop them for use in the UK, I hope the councils have enough money to fix all the potholes in the roads. Imagine being a poor robot trying to negotiate the A30 on a dark night without falling over!
If only common sense were more common.
Not sure we will have autonomous "killer" robots making decisions on who to kill; that is a power no one would give up control of.
I'm mainly concerned about the people controlling the machines. Just imagine 100 of these things (even remotely operated) on the ground, acting like Robocop riot police.
All civil disobedience, or even simple protest, will be a thing of the past (which you might think is a good thing, as long as you believe the established global powers have a conscience and are on your side).
Perhaps not, though, when the thin veil of democracy falls.
A global moral, political and social system where profit is the only aim will in the end regress to nightmare scenarios. As it becomes more 'socially acceptable' to sell sex, your children and your organs, so in the field of AI it will become "mainstream" to move from harmless self-propelling vacuum cleaners to robot police etc. That is what happens when you remove all moral and national constraints on behaviour and replace them with petty materialism and selfishness.
The "scientists" developing these killer robots are not evil people, but they believe that by combining their own 'right' to invent new things with a willing market to buy them, they are creating a 'solution'. And who is to say they are wrong? With what does one hold them back, apart from regulation, which in any case will not work by itself?
What happens is you end up with a society very similar to the one the Tory government has been trying to bring about in the UK.
It was suggested on BBC R4 that killer robots were being developed that could power themselves via the consumption of "organic material". Presumably there would be a lot more concentrated calories in discarded human flesh than in leaves?
Tasty!