Lawyers cautious as state government agencies automate

By Thomas Franz
BridgeTower Media Newswires

DETROIT - As lead counsel on a class-action lawsuit involving more than 40,000 Michigan unemployment claimants, civil rights attorney Jennifer L. Lord has been on the front lines of urging caution about the use of artificial intelligence by law enforcement and government agencies.

Lord's work on Bauserman v. Unemployment Insurance Agency has involved representing plaintiffs who were accused of benefits fraud based on the results of a computer program called MIDAS, which the Unemployment Insurance Agency used to flag claimants for fraudulent activity.

"We had about 40,000 people and 93 percent were falsely accused of fraud," Lord said. "Close to $100 million was seized from these 40,000 people who found they were accused of fraud when their wages were garnished or their tax refunds were seized."

A Michigan Supreme Court ruling on April 5 allowed the case to move forward.

Background

Lord, an attorney with Pitt McGehee Palmer & Rivers in Royal Oak, typically represents clients who have been terminated from their jobs for what they believe are illegal reasons.

In 2013, she began noticing a flurry of individuals with the same story, which led to the Bauserman case.

"We were experiencing this rush of people calling us and saying they were told they committed fraud," Lord said. "We spoke to some of our colleagues who also practice civil rights and employment law and everyone was experiencing this."

Lord and her team then discovered that in 2013 the Michigan Unemployment Insurance Agency had purchased MIDAS, an algorithmic decision-making system, and that when the agency did so, it also laid off its fraud detection unit.

"The computer was left to operate by itself," Lord said. "The Michigan auditor general issued two different reports a year later and found the computer was wrong 93 percent of the time."

Caution

As a result of this case and other issues like facial recognition software, Lord said the legal industry will soon see more cases involving artificial intelligence used by law enforcement and government agencies.

Lord said attorneys should seek out public records and approach these cases as constitutional matters.

"We found out about this through a press release from 2013. It takes creative thinking to ask if a computer was involved in this decision, and then dig in and do some research. It's a lot of work but it's worth it because these systems are affecting a lot of people," Lord said.

"If a computer is making a decision that impacts someone's right to property or their liberty, that is a due process violation. That would be the legal approach. A computer cannot give someone due process."

One of Lord's colleagues, Tony D. Paris of the Maurice & Jane Sugar Law Center for Economic & Social Justice in Detroit, started working in this area of the law around the time the Bauserman case began.

Paris stressed that whenever automation takes place in a business or government agency, there will be casualties.

"It's not just the folks who are laid off and don't have jobs anymore, but automation itself could very well and often has major problems with it that need to be worked out," Paris said. "When a system is put in place that is automated, if it's supposed to be serving any kind of government function, it also needs to be held accountable like government actors are."

If the automated program serves a private function, Paris said, attorneys should treat the issue like a products liability case.

"If it's a private function, it could very well have a products liability aspect to it if it's programmed improperly or without basic requirements of due process that we have a right to," Paris said.

Paris added that when an automated program makes an allegation like fraud, lawyers should know that proving the allegation requires a specific-intent analysis that can turn on many different factors.

"I don't understand how specific intent can be computer-generated without a human investigator meeting the individual," Paris said. "These are situations when you're trying to prove fraud or any specific intent crime that traditionally you need to have a human being reviewing those things."

Published: Mon, Aug 19, 2019