CT Seeks Stricter AI Regulations After Federal Report Suggests Algorithm Bias

David McGuire, chair of the Connecticut Advisory Committee to the U.S. Commission on Civil Rights, during a news conference on April 25, 2023. (Credit: Emilia Otte)


It could decide which school a child will attend, or pick one candidate for a job over another. It might help a landlord determine who gets a certain apartment, guide police to high-crime neighborhoods, or predict who will have trouble paying back a loan. 

All without a human brain. 

Connecticut lawmakers want stricter regulations around the use of artificial intelligence in government after a recent report outlined the ways algorithms can be a source of unintentional bias in decision-making. 

David McGuire, chair of the Connecticut Advisory Committee to the U.S. Commission on Civil Rights, the organization that produced the report, said during a news conference Tuesday that the state has used algorithms in important areas – when hiring for roles within the state Department of Administrative Services and when conducting lotteries to place students in schools, for example. 

“These are areas where civil rights are really being implicated because they’re potentially using algorithms that are either using data sets that are clogged, or the algorithms themselves are set up in ways that are perpetually biased. We have to get ahead of that before they proliferate our state,” McGuire said. 

But McGuire said it's unclear what other algorithms are in use at state agencies. One goal of a bill proposed by legislators is to inventory the algorithms used across Connecticut's state agencies.

“There’s very little public transparency around the government’s use of algorithms here in Connecticut,” he said. “I think that we’re at a place where people are excited to latch onto this technology and use it to deliver services to the citizens of the state, but doing so in ways that are not as well-formulated as they should be.” 

McGuire said the inner workings of these algorithms are currently shielded from public scrutiny on the assumption that they constitute trade secrets of the companies that created them.

The report found that algorithms can produce biased decisions in several ways: by relying on a data set that is itself biased, by interpreting data poorly, or by making decisions based on "proxy variables," criteria that do not explicitly consider race but still lead to racially biased outcomes. For example, the report described an algorithm that landlords used to screen tenants. The algorithm took into account arrest records, which disproportionately affect people of color.
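To make the proxy-variable mechanism concrete, here is a minimal, hypothetical sketch, not drawn from the report or from any Connecticut system, showing how a tenant-screening model that is never given race as an input can still produce racially skewed approval rates when it relies on arrest records:

```python
# Hypothetical sketch: how a "proxy variable" can reproduce racial bias
# even when the protected attribute is never given to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic population: 'group' stands in for a protected class.
group = rng.integers(0, 2, n)
# Arrest records correlate with group membership (as the report notes
# they do in practice), independent of actual tenancy risk.
arrest = rng.random(n) < np.where(group == 1, 0.30, 0.10)
# True tenancy outcome is independent of group in this toy setup.
good_tenant = rng.random(n) < 0.8

# Historical "approved" labels came from humans who penalized arrests.
label = good_tenant & ~arrest

# The screening model never sees 'group', only the arrest-record proxy.
X = arrest.reshape(-1, 1).astype(float)
model = LogisticRegression().fit(X, label)
approved = model.predict(X).astype(bool)

for g in (0, 1):
    print(f"group {g}: approval rate {approved[group == g].mean():.2%}")
# Approval rates diverge by group even though 'group' was never a feature.
```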

“These algorithms are opaque, inscrutable and hide their biased behavior in piles of mathematics and computation,” said Suresh Venkatasubramanian, a professor of computer science and data science at Brown University, who co-authored the Blueprint for an AI Bill of Rights that the White House released last year.

Algorithms used in child welfare agencies have been found to make faulty and racially biased predictions about whether a child is in danger in a given home, according to the report. An algorithm called COMPAS, which some jurisdictions use to help judges decide what sentences to issue, was found to disproportionately flag Black offenders as likely to reoffend. And algorithms used to predict crime in neighborhoods have been found to single out areas more likely to have residents of color. The report said that up to one third of cities in the U.S., including Hartford, are either using or considering using these types of algorithms.

Ken Mysogland, the bureau chief of external affairs for the state Department of Children and Families, said DCF did not use any predictive algorithms in its work.

State Sen. James Maroney, D-Milford, underscored the importance of creating regulations for artificial intelligence before it became ubiquitous. 

“There’s tremendous potential for AI. And when you look at the general trends of an aging population, shrinking workforce, we’re going to need to use those tools to help take care of our population — to help do work,” he said. “We can’t find people in so many different professions that we can make more efficient with AI, but we just want to make sure that we’re doing it safely.” 

He also noted the challenge of defining precisely what constitutes artificial intelligence — a difficulty noted in the report. 

“It could be as simple as a chatbot or it could be something that makes a decision. And that’s what we want to look at … these algorithms that are making critical decisions and we know that they impact us all,” Maroney said. “Machine learning just picks up what we’re doing now and it amplifies and perpetuates that. So we don’t want to see those biases that we’ve known about be continued.”  

State Rep. David Rutigliano, R-Trumbull, said Republicans supported placing the same restrictions on the government’s use of data that private companies are bound by.

“We support transparency … in how the government uses AI and how it affects citizens,” Rutigliano said. 

The report also recommended a series of regulatory changes around AI, including regular audits to ensure algorithms are not biased, an appeals process for people who believe an algorithm discriminated against them, a ban on the sale of data, and changes to the Freedom of Information law to require the state to disclose the data sources used by state agencies.
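The report does not prescribe how such audits would work. As a hypothetical illustration, a basic fairness audit often begins by comparing outcome rates across groups, for example with the "disparate impact" ratio borrowed from employment law (the four-fifths rule):

```python
# Hypothetical sketch of one common audit metric: the disparate impact
# ratio (the "four-fifths rule" from employment law). An illustration,
# not the audit procedure the report recommends.
from collections import Counter

def disparate_impact(decisions: list[tuple[str, bool]]) -> float:
    """decisions: (group_label, favorable_outcome) pairs.
    Returns min selection rate / max selection rate across groups."""
    totals, favorable = Counter(), Counter()
    for group, outcome in decisions:
        totals[group] += 1
        favorable[group] += outcome
    rates = [favorable[g] / totals[g] for g in totals]
    return min(rates) / max(rates)

# Toy data: an algorithm approves group A at 60% and group B at 30%.
sample = [("A", True)] * 60 + [("A", False)] * 40 \
       + [("B", True)] * 30 + [("B", False)] * 70
print(f"disparate impact ratio: {disparate_impact(sample):.2f}")  # 0.50
# A ratio below 0.80 is a conventional red flag that warrants review.
```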

Maroney said changing the Freedom of Information laws went beyond the scope of the bill his committee is trying to pass. But the bill does direct lawmakers to form a task force to study artificial intelligence, and it requires companies that contract with state agencies to comply with consumer data privacy law.

The bill additionally creates offices directly responsible for reviewing and inventorying the algorithms the state uses, and for crafting regulations around those algorithms, although some of these requirements might be scaled back because of cost.

Maroney said lawmakers were also considering a requirement that an independent assessment be performed on an algorithm whenever the program receives a significant update. He said he was not yet sure who would conduct the assessments, but that an outside third party would be the ideal evaluator.

If the bill passes, the state will be required to provide a list of the algorithms it uses for decision-making purposes by December 2023.


Emilia Otte

Emilia Otte covers health and education for the Connecticut Examiner. In 2022, Otte was awarded "Rookie of the Year" by the New England Newspaper & Press Association.

e.otte@ctexaminer.com