UN chief calls for legal regulations on AI weapons by 2026

UN Secretary-General Antonio Guterres has called on member nations to conclude a legally binding document aimed at regulating or banning Lethal Autonomous Weapons Systems, or LAWS, by 2026.

LAWS are defined as artificial intelligence-powered weapons systems that can select and attack targets autonomously, meaning without human intervention.

Some observers say the systems could cause serious civilian casualties and other problems if used on battlefields.

The UN chief compiled a report that includes opinions submitted by member nations and human rights groups following the adoption of a UN resolution aimed at addressing AI weapons.

Guterres expressed a strong sense of urgency. He said that autonomously targeting humans with machines is a “moral line that must not be crossed.”

He also said that time is running out for the international community to take preventive actions on this issue.

The UN chief urged member nations to prohibit lethal autonomous weapons systems that function without human control. He also urged them to restrict other types of autonomous weapons systems.

But some member nations said a clear definition of LAWS has not yet been established.

Others cited the benefits of such weapons systems, saying their use lowers the risk of collateral damage by enhancing precision and eliminates errors caused by a human operator's mental or physical state.

Observers are waiting to see whether member nations will be able to agree to restrict LAWS at the UN General Assembly and other forums.