Automated Government Systems Help Millions Access Legal Guidance, but Accuracy Concerns Remain
December 2, 2025

Federal agencies across the government are deploying automated chatbots and guidance systems to help millions of Americans navigate complex regulations. While the guidance is often helpful, it can also oversimplify legal complexities and provide incorrect advice.

Leigh Osofsky, William D. Spry III Distinguished Professor of Law at the University of North Carolina School of Law, co-authored a new book examining how government agencies use automation to interpret legal rules for the public. Speaking at a book release event on Nov. 21, Osofsky detailed findings from “Automated Agencies: The Transformation of Government Guidance,” co-written with Joshua Blank, a professor at the University of California at Irvine School of Law.
The research began approximately 10 years ago when Osofsky and Blank examined resources the IRS uses to help people understand their legal obligations, including publications, tax return forms and frequently asked questions. They noticed that the law described in these resources often differed from the actual underlying tax law: the resources used different terminology, de-emphasized exceptions and simplified standards. Sometimes this leads taxpayers to forgo deductions they’re entitled to take. Other times, it might encourage them to take positions more favorable than the underlying law permits.
“These systems actually just get the answers wrong a fair number of times,” Osofsky said. “They’re giving an answer that might be right for most people, but it’s not right in a particular circumstance.”
With the IRS alone operating more than 10 different automated systems that respond to tens of millions of inquiries annually, even a 10 percent error rate affects substantial numbers of users.
Automated tools can tell members of the public they don’t qualify for benefits they’re legally entitled to receive, or conversely, advise them to take positions they cannot legally take. Unlike traditional agency guidance, users cannot legally rely on these automated responses to protect themselves from penalties if audited.
Federal laws are often too complex for people to understand, yet people must apply them in daily life, from tax returns to student loan relief. Osofsky said automated tools can effectively help people find answers to the basic questions they need to fulfill their legal obligations.
“If you were in the U.S. for part of but not the entire year, do you file a tax return or get an extension?” Osofsky said. “If you were trying to figure out the answer by looking at the actual code and regulation sections, it would be quite difficult. Automated tools are a great way to help people understand the answers to basic questions.”
The pandemic accelerated adoption of these systems. Governments overwhelmed by the volume of public inquiries created automated responses to answer questions about unemployment relief, payment schedules and changing health regulations. Veterans’ offices also automated legal guidance.
However, remarkably few people oversee these widely used systems. Typically, just two or three staff members within an agency build and maintain a tool that millions will consult. The attorneys who normally develop agency legal guidance often know little about automated systems.
This disconnect creates problems when laws change. Updates to statutes or regulations don’t always filter down to the technology teams maintaining automated systems, meaning users sometimes receive guidance based on outdated law.

Agencies evaluate these systems primarily through user satisfaction surveys rather than testing for legal accuracy. The metrics focus on whether users liked their experience and how often the system had to respond, “I don’t know.” This creates incentives to provide definitive answers even when the correct response might be more nuanced.
“People should be wary about thinking that automation can deliver the same sort of relevant and precise answers that humans can about the law,” Osofsky said. “The law has many ambiguities and uncertainties. Automation tends to smooth these out and make it seem like there are clear answers when there are not.”
Budget constraints drive much of the automation trend. Human customer service representatives are expensive, and agencies consistently operate without adequate resources to handle public inquiries through traditional channels. The systems offer 24-hour availability and immediate responses to straightforward questions.
Equity concerns complicate the picture. The people most likely to use automated systems are those who cannot afford lawyers or professional tax preparers. When these tools provide incorrect answers, the users least equipped to recognize or challenge the errors suffer the consequences.
The Administrative Conference of the United States commissioned Osofsky and Blank to expand their research across the government. ACUS subsequently adopted 20 of their recommendations for how agencies should handle automated guidance systems.
The researchers propose that agencies should indicate when laws are unsettled or have recently changed, clearly show effective dates for legal provisions, maintain records of guidance provided to users, and improve communication between legal experts and technology teams. They also recommend substantive accuracy testing beyond user satisfaction metrics.
Osofsky and Blank also recommended that agencies be bound to their automated guidance, at least to the extent of protecting users from penalties when they reasonably rely on incorrect advice. ACUS ultimately did not adopt this recommendation.
Looking ahead, Osofsky expects continued expansion of automated systems. The larger question is whether the government will shift from explaining laws to simply applying them through automated compliance systems.
“The IRS may interpret laws for you using automation and send a bill that you pay in an automated fashion,” Osofsky said. “Automation offers us great benefits but comes with potential costs.”
She noted that as automation becomes embedded in decision-making in ways people cannot see, it becomes more likely that the ways automation glosses over ambiguities in the law will go unnoticed. The book helps identify these benefits and costs as the government expands its use of automation going forward.
Some agencies have implemented changes since the recommendations were issued, including better disclaimers about reliability and improved record-keeping for automated interactions. However, Osofsky believes more work remains to address issues of oversight, accuracy and transparency as these systems continue to proliferate across federal, state and local governments.