Two of the researchers widely known as the ‘godfathers’ of AI have joined a group of fellow experts to warn that humanity could lose control of AI systems if action is not taken promptly.
In May 2023, Dr. Geoffrey Hinton made headlines by leaving his position at Google so he could speak freely about the risks of artificial intelligence. Now, a group of 25 senior experts, including Yoshua Bengio, a recipient of the ACM Turing Award, has published a paper arguing that AI safety must be taken seriously to prevent AI systems from spiraling out of control.
The paper emphasizes the need for caution, stating, “Without sufficient precaution, autonomous AI systems could become uncontrollable, making human intervention ineffective. This unchecked advancement in AI could lead to serious consequences such as cybercrime, social manipulation, and potential harm to humanity and the environment.”
The group points out that only a small fraction of AI research focuses on safety, with most resources devoted to making AI systems more capable rather than to ensuring they are safe and their risks are mitigated.
Why is AI safety important?
In addition to advocating for more research in AI safety, the group challenges governments worldwide to establish standards that prevent reckless and improper use of AI technology. They argue that regulatory frameworks of the kind already in place for industries such as pharmaceuticals and nuclear energy should be extended to the AI sector.
While acknowledging efforts by China, the European Union, the United States, and the United Kingdom to develop AI governance policies, the group believes these measures are insufficient given the rapid progress in AI capabilities.
The group calls for governance measures that can adapt to sudden advancements in AI technology and trigger appropriate policies when certain milestones are reached. They stress the importance of acting now to prevent AI from surpassing human control in the near future.
Featured image: Ideogram