AI-derived algorithms are increasingly involved in governments' interactions with their citizens. Activities such as allocating government benefits, matching students to public schools, and even identifying and sentencing criminals are now mediated by opaque software whose decisions cannot be adequately challenged, because even their creators do not fully understand how the algorithms reach their conclusions.

In this report for Fast Company, DJ Pangburn explores the idea that the federal government should oversee algorithms the way the FDA regulates drugs:

One significant obstacle to transparency in the U.S. is the Computer Fraud and Abuse Act, written back in 1986 to protect government agencies' computer systems against hacking. Private-sector companies are able to avoid external audits of their systems, for bias or other flaws, by claiming that auditing is a form of unauthorized access, says Shankar Narayan, Technology and Liberty Project Director at ACLU Washington. Reform of the CFAA could help pave the way for independent audits of algorithmic systems used by various government agencies.

Federal legislation mandating transparency, however unlikely it may be, would be a strong bulwark against hidden data technologies. That could include a new regulatory body to oversee software and algorithms, in the spirit of the U.S. Food and Drug Administration.

“The FDA was established in the early 20th century in response to toxic or mislabeled food products and pharmaceuticals,” media scholar Dan Greene and data scientist Genevieve Patterson recently argued in IEEE Spectrum. “Part of the agency’s mandate is to prevent drug companies from profiting by selling the equivalent of snake oil. In that same vein, AI vendors that sell products and services that affect people’s health, safety, and liberty could be required to share their code with a new regulatory agency.”

In a report published in December by AI Now, a group of researchers echoed that idea, pointing to examples like the Federal Aviation Administration and National Highway Traffic Safety Administration. They also called on vendors and developers who create AI and automated decision systems for government use to waive trade secrecy or other legal claims “that inhibit full auditing and understanding of their software.”