White House: Government should tread carefully on AI
The White House early Wednesday urged regulators to tread carefully when addressing the growing prevalence of artificial intelligence (AI) technologies throughout society.
“Government has several roles to play,” the White House said in a report. “It should monitor the safety and fairness of applications as they develop, and adapt regulatory frameworks to encourage innovation while protecting the public.”
“Many areas of public policy, from education and the economic safety net, to defense, environmental preservation, and criminal justice, will see new opportunities and new challenges driven by the continued progress of AI,” the authors added. “Government must continue to build its capacity to understand and adapt to these changes.”
Wednesday’s report represents the first federal guidance on artificial intelligence, a field that is of mounting importance to industry. It was released a day before President Obama hosts a conference on tech issues in Pittsburgh.
Artificial intelligence technologies are becoming increasingly prevalent in modern life. They power the autonomous vehicles that Google and Uber have invested heavily in, for example, and back personal assistant applications that are becoming more commonplace in smartphones.
The report’s authors encouraged regulators to balance the harms and rewards associated with AI while crafting rules for products built with the technologies.
The authors said that, generally speaking, “the approach to regulation of AI-enabled products to protect public safety should be informed by assessment of the aspects of risk that the addition of AI may reduce, alongside the aspects of risk that it may increase.”
Regulators should also use existing regulations as a starting place when possible, they said.
Tech powerhouses lobbied the White House for a light-touch regulatory approach to AI during the process leading up to the report.
Facebook’s James Hairston, for example, told the White House that any regulatory “approach should consider AI’s benefits to consumers.”
The Internet Association’s Michael Beckerman also said that thoughtful “public policy in this space demands a careful weighing of these benefits against perceived risks so that the benefits can be fully realized.”
But the report also pushed federal agencies to evaluate instances in which they are giving money to fund — or directly using — artificial intelligence technologies that help make major decisions in Americans’ lives.
Civil rights advocates have warned that if artificial intelligence relies on data or code informed by bias, the output of the technology could be biased as well. The White House cited applications of AI in the criminal justice system, where software is used to predict the risk an offender poses, and in the hiring process as potentially risky.
“Federal agencies that make grants to state and local governments in support of the use of AI-based systems to make consequential decisions about individuals should review the terms of grants to ensure that AI-based products or services purchased with Federal grant funds produce results in a sufficiently transparent fashion and are supported by evidence of efficacy and fairness,” the report said.
Another key question about artificial intelligence is what effect it will have on the economy and the job market, where automation driven by AI threatens jobs in certain industries. The White House elected not to address those questions in its Wednesday release, but promised to do so before President Obama leaves office in January.
“The economic policy questions raised by AI-driven automation are important but they are best addressed by a separate White House working group,” the report said. “The White House will conduct an additional interagency study on the economic impact of automation on the economy and recommended policy responses, to be published in the coming months.”