Biden Takes First Step to Regulate Artificial Intelligence with Executive Order

By Mukund Kapoor · 3 min read

President Biden's new order aims to secure A.I. technology against potential threats and misuse.

In Short
  • President Biden signs an order to make A.I. safer and control its misuse.
  • The order is a first step; further action from Congress will be needed.
  • Companies must test their A.I. systems for safety and report the results to the federal government.

October 31, 2023: President Joe Biden has signed a sweeping executive order on Artificial Intelligence (A.I.). It requires companies to notify the federal government if their A.I. systems could help hostile actors or nations develop dangerous weapons.

The order also seeks to make it harder for deepfake videos and fabricated news to deceive the public.

Speaking at the White House, Mr. Biden warned that deepfakes use A.I. to smear reputations, spread false stories, and defraud people.

President Joe Biden signed the executive order on artificial intelligence in the East Room of the White House in Washington, as Vice President Kamala Harris looked on.

He worries that even a short clip of a person's voice could be manipulated to put words in their mouth, and he noted that he was startled by a deepfake video of himself.

The action signals that the United States, a leader in A.I. technology, intends to lead in regulating it as well. Vice President Kamala Harris said the administration has a duty to ensure A.I. is safe for everyone, and she will travel to the United Kingdom to discuss the issue at an international A.I. safety summit.

But the order is only a first step. Biden acknowledged that legislation from Congress is still needed.

The order chiefly governs A.I. used by the federal government; its reach into private companies is limited.

The order is wide-ranging. It also aims to strengthen America's A.I. industry and keep the country ahead of rivals such as China.

Major technology companies such as Microsoft and Google have broadly supported the order, wary of the trouble they could face if their A.I. is misused. The order relies on the Defense Production Act, a Korean War-era law, to require companies to test their most powerful A.I. systems and ensure they cannot be used to help produce weapons of mass destruction, such as nuclear arms.

Companies must report the test results to the federal government, though not to the public. The order also calls for watermarking photos, videos, and audio generated by A.I.

Such marks would make it easier to trace where the content came from; President Biden believes this will help people tell what is real from what is fake.

Some experts caution that the order asks for a great deal and will be hard to carry out in full. Sarah Kreps, a professor at Cornell University, said the government struggles to hire A.I. specialists because private companies pay far more.

Note

This article summarizes information drawn from multiple sources.

Credit to original reporting by Cecilia Kang and David E. Sanger, among others.

