How AI Is Helping Government Companies in St Paul Cut Costs and Improve Efficiency
Last Updated: August 28th 2025

Too Long; Didn't Read:
St. Paul agencies use AI (Google Gemini, NotebookLM) to cut task time (four months → four weeks), prevent fraud with predictive analytics, and boost productivity - potentially aiding 500,000 Minnesotans at risk - while pairing pilots with governance, prompt logging, and upskilling.
St. Paul is at the crossroads of Minnesota's AI moment: Saint Paul Public Schools is already integrating tools like Google Gemini and NotebookLM to streamline lesson planning and differentiated instruction (Saint Paul Public Schools AI resources), Minnesota employers are sharing practical pilots that automate data entry and boost productivity (Minnesota employers AI adoption case studies), and local leaders - led by St. Paul CIO Jaime Wascalus - are shaping GovAI templates and vendor guidance so cities can cut costs without sacrificing accountability (GovAI Coalition keynote and guidance by Jaime Wascalus).
The stakes are tangible: North Star Policy estimates 500,000 Minnesota workers - about 17% of the workforce - are at high risk of job alteration from AI, so investing in clear guardrails and practical upskilling is essential.
City departments looking to move from pilot to scale should pair governance with hands-on training - such as Nucamp's 15‑week AI Essentials for Work bootcamp - to ensure staff can write reliable prompts, validate outputs, and realize real savings.
Attribute | Information |
---|---|
Bootcamp | AI Essentials for Work |
Length | 15 Weeks |
Courses included | AI at Work: Foundations; Writing AI Prompts; Job Based Practical AI Skills |
Cost | $3,582 early bird; $3,942 afterwards |
Syllabus | AI Essentials for Work syllabus (Nucamp) |
Registration | Register for AI Essentials for Work (Nucamp) |
“We all do better when we all do better.” - Jaime Wascalus, ICMA (GovAI Coalition)
Table of Contents
- Where Minnesota and St. Paul Are Using AI Today
- Key Drivers Behind AI Adoption in Minnesota
- Governance, Laws, and Risk Management in Minnesota
- Managing Common Risks for St. Paul and Minnesota Agencies
- Practical Steps for St. Paul City Departments to Start or Scale AI
- Real-World Examples and Pilot Results in Minnesota and Beyond
- Costs, Savings, and Measuring Success for St. Paul
- Policy Templates and Resources for St. Paul, Minnesota
- Conclusion: Next Steps for St. Paul and Minnesota Leaders
- Frequently Asked Questions
Where Minnesota and St. Paul Are Using AI Today
(Up)Across Minnesota - and in St. Paul's backyard - AI is moving from promise to practice as agencies race to stop provider fraud and tighten program integrity: state leaders have launched an experimental anti‑fraud pilot and are exploring centralized investigative tools that use predictive analytics to flag anomalous Medicaid billing, while audits and high‑profile cases (including an $18.5 million NUWAY settlement and paused payments to dozens of providers after credible fraud allegations) have sharpened urgency for better tech and real‑time detection (Minnesota AI anti-fraud pilot coverage - StateScoop, KNSI report on Minnesota fraud and AI use).
Clinics and benefit administrators are also looking to data‑driven screening, anomaly detection, and collaborative analytics to recover dollars and speed investigations, but watchdogs warn automation can misfire and harm eligible recipients unless models are transparent and paired with human review (Analysis: benefits of AI in reducing Medicaid fraud).
For St. Paul departments, that means piloting narrowly scoped AI tools, measuring false positives, and designing clear appeal pathways so technology protects taxpayers without locking out the people who need help most.
One of the things that is really important is the fraud that exists kind of globally, isn't restricted to the private sector at all - trillions of dollars in individual fraud where people are being exploited, being asked to pay tolls that don't exist online, as well as organizations. I think 60% of global Fortune 100 companies are investing in anti‑fraud efforts, and so government isn't doing this because there's a unique government problem. Really, the sophistication and tooling and AI that's being used to commit crimes, to commit fraud, we have to make sure that we're on the forefront of investing in tools to protect as well.
- Tarek Tomes (StateScoop)
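The pilot discipline described above - measuring false positives before a fraud-flagging tool informs real decisions - can be sketched in a few lines. This is a minimal illustration, not any agency's actual method; the case records and field names are made up for the example.

```python
# Hypothetical sketch: measuring false positives in a fraud-flagging pilot.
# Each audited case records whether the model flagged it and whether human
# review confirmed fraud; these records are illustrative, not real data.

def false_positive_rate(cases):
    """Share of flagged cases that human review cleared of fraud."""
    flagged = [c for c in cases if c["flagged"]]
    if not flagged:
        return 0.0
    cleared = sum(1 for c in flagged if not c["confirmed_fraud"])
    return cleared / len(flagged)

audited = [
    {"flagged": True,  "confirmed_fraud": True},
    {"flagged": True,  "confirmed_fraud": False},  # eligible provider, wrongly flagged
    {"flagged": True,  "confirmed_fraud": False},
    {"flagged": False, "confirmed_fraud": False},
]

print(false_positive_rate(audited))  # 2 of 3 flags were wrong
```

Tracking this number over each review cycle is one concrete way to decide whether a pilot is ready to scale or is still locking out eligible recipients.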
Key Drivers Behind AI Adoption in Minnesota
(Up)Minnesota's move toward practical AI is driven by a clear trio of needs: speed, inclusion, and accountability - tangible wins that make the technology hard to ignore.
Agencies report dramatic time savings (what once took four months can now take four weeks), while pilots for translation and plain‑language accessibility are helping legislative materials reach non‑English speakers faster, easing everyday access to government services (Route Fifty coverage of Minnesota AI translation pilots and efficiency).
At the same time, expanded anti‑fraud efforts show why predictive analytics and automation are being adopted across the state to protect public dollars (StateScoop coverage of Minnesota's anti‑fraud AI pilot).
Those opportunities come with hard constraints: Minnesota cities must follow the Minnesota Government Data Practices Act, classify data by risk, and avoid feeding moderate or high‑risk records into third‑party models that retain inputs (League of Minnesota Cities guidance on AI and the Minnesota Government Data Practices Act).
The net effect is pragmatic: more teams see AI as a “need‑to‑have,” not a novelty, and are pairing pilots with policies, prompt logging, and human review so automation scales without sidelining fairness or privacy - a tradeoff that turns technical speed into real public service improvement.
“treat AI as a power tool and use safety goggles.”
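The prompt logging and human-review pairing mentioned above can start as something very simple. The sketch below is an assumption about what a minimal audit log might look like - the JSONL file path, field names, and risk labels are all illustrative, not a prescribed Minnesota standard.

```python
# Minimal sketch of a prompt log with a human-review gate. The log path,
# field names, and "low/moderate/high" labels are illustrative assumptions.
import datetime
import json

LOG_PATH = "prompt_log.jsonl"

def log_prompt(user, prompt, output, data_risk="low"):
    """Append one AI interaction to an audit log; block non-low-risk inputs."""
    if data_risk != "low":
        raise ValueError("Only low-risk public data may go to external models")
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "prompt": prompt,
        "output": output,
        "human_reviewed": False,  # flipped only after a named reviewer signs off
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Even a log this small gives auditors what the guidance asks for: who prompted what, when, and whether a human reviewed the output before it was used.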
Governance, Laws, and Risk Management in Minnesota
(Up)Minnesota's legal framework makes governance the backbone of any city AI effort: the Minnesota Government Data Practices Act (MGDPA) governs what government data can be collected, who may see it, how it's stored and released, and it forces explicit accountability through a designated Responsible Authority for each entity - a practical requirement when projects touch sensitive records or use third‑party AI vendors (see the Minnesota MGDPA guidance).
Before collecting private or confidential data, agencies must give a clear Tennessen warning explaining purpose, consequences, and who may receive the information, so an AI training feed can't quietly absorb resident data without consent; contractors handling government data are bound by the same rules as the government itself.
Compliance isn't just bureaucratic: violations can trigger lawsuits, civil penalties and even criminal sanctions, so design choices like logging prompts, keeping human review gates, and treating not‑public inputs differently are risk management, not optional extras - think of the RA as the city's data safety lifeguard who must keep swimmers out of dangerous waters.
For a plain‑English primer and deeper legal context, see the MCIT overview and the Mitchell Hamline chapter on Chapter 13.
Requirement | What it means for city AI |
---|---|
Responsible Authority | Designate an official to oversee collection, classification, access procedures, and annual updates |
Tennessen Warning | Notify individuals at point of collection about purpose, necessity, consequences, and authorized recipients |
Penalties & Remedies | Civil suits, fines, and potential criminal or disciplinary consequences for willful violations |
Managing Common Risks for St. Paul and Minnesota Agencies
(Up)Managing common risks for St. Paul and statewide agencies is about straightforward, practical controls: verify every AI output before it informs benefits, enforcement, or licensing decisions; treat nonpublic records as off‑limits for third‑party models that may retain inputs; and run regular bias checks so automation doesn't amplify historic inequities.
Minnesota guidance makes this concrete - MnDOT's generative AI standards require that “all GenAI output used for decision making must be verified to ensure accuracy, relevance, and factuality” - and the League of Minnesota Cities warns cities to classify data as low, moderate, or high risk and to assume popular chat tools will retain inputs unless contractually prohibited.
The stakes are real: reporting on a Lehigh University experiment in Minnesota Reformer showed chatbots recommending denials more often for Black loan applicants than identical white counterparts, a vivid reminder that accuracy and fairness failures can become life‑changing errors.
For St. Paul departments, the checklist is simple and actionable: use only low‑risk public data with external models, log prompts and human reviews, build appeal paths for flagged residents, and pair any pilot with bias testing and subject‑matter sign‑offs so speed doesn't outpace trust.
“All GenAI output used for decision making must be verified to ensure accuracy, relevance, and factuality and to mitigate risks such as AI ...” - MnDOT Generative Artificial Intelligence Standards
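A regular bias check like the one the Lehigh experiment argues for can be as simple as comparing outcome rates across groups. This sketch assumes a demographic-parity-style gap check with made-up decisions and an illustrative threshold; it is a starting point, not a complete fairness audit.

```python
# Illustrative bias check: compare a model's denial rates across groups.
# The decision records and any review threshold are made-up assumptions.

def denial_rate_gap(decisions):
    """Return (max gap between any two groups' denial rates, per-group rates)."""
    by_group = {}
    for d in decisions:
        by_group.setdefault(d["group"], []).append(d["denied"])
    rates = {g: sum(v) / len(v) for g, v in by_group.items()}
    return max(rates.values()) - min(rates.values()), rates

decisions = [
    {"group": "A", "denied": True},  {"group": "A", "denied": False},
    {"group": "B", "denied": False}, {"group": "B", "denied": False},
]
gap, rates = denial_rate_gap(decisions)
print(gap)  # 0.5 - a gap this large would trigger human review before any rollout
```

Running this on every pilot's logged decisions, and escalating when the gap exceeds a documented threshold, turns "run regular bias checks" into a repeatable step rather than a one-time audit.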
Practical Steps for St. Paul City Departments to Start or Scale AI
(Up)City departments ready to start or scale AI should begin with a clear, low‑risk playbook: create an AI inventory for city systems (311, permitting, property assessment and RPA tools) to spot easy wins and dependencies, then pilot narrowly scoped tools on public or anonymized data so outcomes can be verified before any sensitive records are involved - a practical how‑to is laid out in the Nucamp guide: Nucamp guide to create an AI inventory for city systems in St. Paul.
Pair pilots with role‑based training and vendor agreements that lock down data retention (local colleges show how Microsoft Copilot can be used under a secure data‑sharing arrangement; see Saint Paul College AEI Microsoft Copilot guidance and secure data-sharing resources), and lean on existing public‑sector examples for use case selection - for instance, target measurable public‑safety gains like Minnesota's traffic‑safety RSIC pilot during the “100 Deadliest Days.” Equip teams with hands‑on upskilling and clear verification steps (Saint Paul Public Schools AI best practices and staff training resources demonstrate how to adopt tools like Gemini and NotebookLM while managing risk), log prompts, and require human sign‑off on decisions so speed turns into trustworthy service improvements.
"The RSIC platform opens up a new world of possibilities for improving traffic safety," said Office of Traffic Safety Director Mike Hanson.
Real-World Examples and Pilot Results in Minnesota and Beyond
(Up)Real pilots in Minnesota show how focused experiments can turn bold ideas into measurable returns: MNIT's “Shark‑Tank” modernization process has funneled hundreds of agency pitches into fast, people‑centered projects - everything from better crash‑data routing that can shave life‑saving minutes off emergency response to digitizing birth and death records - while the state's $130M investment in MNIT is explicitly meant to speed cloud, cybersecurity, and user‑experience wins across agencies Minnesota Shark‑Tank IT modernization strategy, MNIT $130M modernization funding announcement.
Those operational pilots sit alongside harder lessons from oversight and legal experience: the McNitt v. MNIT case (Oct. 2024) underscores that HR and hiring practices are legally bounded - documented rehabilitation under CORA can bar unemployment based on conviction findings - so automation in screening or staffing must be paired with careful legal review and appeal paths McNitt v. Minnesota IT Services court ruling.
The throughline is pragmatic: choose narrow pilots with clear metrics and legal checks, expect rapid payback, and log decisions so new tools earn trust, not just speed.
“We expect a return value within 12 months. They're really ideas that are expected to be implemented and return value incredibly quickly.” - Tarek Tomes
Costs, Savings, and Measuring Success for St. Paul
(Up)Balancing costs and savings is a practical exercise for St. Paul leaders: while public employees increasingly lean on AI for productivity - turning tasks like drafting FAQs from 50 minutes into five, according to local coverage - the rising price of models, cloud compute, and even questions about electricity and water usage mean every pilot needs clear financial tracking.
Start by building an AI inventory and a simple cost‑benefit dashboard (Nucamp AI Essentials for Work syllabus and AI inventory guide lays out practical steps), then measure time saved, transaction cost reductions, error and false‑positive rates, and downstream impacts such as reduced backlog or faster permit turnaround.
Pair financial metrics with governance checks from the League of Minnesota Cities - classify data by risk, avoid putting moderate or high‑risk records into third‑party models, and include compliance costs in ROI calculations.
A tight dashboard that ties dollars to verification steps, bias testing, and records‑management obligations turns abstract “efficiency” into defensible budget decisions that protect residents and the city's balance sheet.
“That's in a nutshell what I call augmenting the city worker.” - Melissa Reeder
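The dashboard math above is straightforward to make concrete. The FAQ-drafting example (50 minutes down to 5) comes from the text; the task volume, wage rate, and pilot cost below are placeholder assumptions a department would replace with its own figures.

```python
# Sketch of a simple cost-benefit calculation for one automated task.
# The 50-minute -> 5-minute FAQ example is from the article; tasks_per_month,
# wage_per_hour, and pilot_cost are placeholder assumptions.

def monthly_savings(minutes_before, minutes_after, tasks_per_month, wage_per_hour):
    """Staff-time dollars saved per month on one automated task."""
    saved_minutes = (minutes_before - minutes_after) * tasks_per_month
    return saved_minutes / 60 * wage_per_hour

def roi(savings, pilot_cost):
    """Net return per dollar spent; pilot_cost should include compliance costs."""
    return (savings - pilot_cost) / pilot_cost

faq_savings = monthly_savings(50, 5, tasks_per_month=40, wage_per_hour=35)
print(round(faq_savings, 2))  # 1050.0 dollars of staff time per month
```

Keeping compliance, verification, and bias-testing costs inside `pilot_cost` is what makes the resulting ROI defensible in a budget hearing rather than an optimistic vendor estimate.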
Policy Templates and Resources for St. Paul, Minnesota
(Up)St. Paul leaders can move fast without guessing by leaning on ready-made templates and local guidance: the League of Minnesota Cities guide to cities and artificial intelligence offers a concise primer and a downloadable computer‑use model policy that explicitly calls out AI, recommends classifying data as low/moderate/high risk, and includes sample language (like limiting AI to “low‑risk” public data and routing questions to the city's Responsible Authority) - a practical starting point for city HR and IT policies (League of Minnesota Cities guide to cities and artificial intelligence).
Pair that with the League of Minnesota Cities data practices memo to build Tennessen warnings, designate a Responsible Authority, and document retention rules before any pilot touches resident records (League of Minnesota Cities data practices - Analyze, Classify and Respond).
For a full policy skeleton municipalities can adapt quickly, consider the municipal AI policy template for local governments that lays out principles for transparency, human‑in‑the‑loop decision‑making, bias checks, and an approvals registry so a single lapse can't put sensitive data at risk (municipal AI policy template for local governments).
Think of a Tennessen warning like a road sign - a clear stop before data leaves city control - so pilots scale with safeguards, not surprises, and staff know exactly when to escalate to IT or legal if a model asks for nonpublic inputs.
“Rather than a policy to ban AI, cities should consider a 'yes, and' policy. Yes, you can use it, and here are the best ways to use it. It's already here ...”
Conclusion: Next Steps for St. Paul and Minnesota Leaders
(Up)St. Paul and statewide leaders should move from pilots to durable programs that center people, not just models: adopt worker‑centered governance and transparency, expand AI literacy across city staff and schools using the Minnesota Department of Education's guiding principles for AI in education (MDE Guiding Principles for AI in Education), require impact audits and collective bargaining input per the Department of Labor's employer best practices (DOL AI best practices for employers), and pair every pilot with clear Tennessen warnings, prompt logs, human review gates, and measurable ROI so the hard question - who benefits - gets answered before any data leaves city control.
With roughly 500,000 Minnesota workers at high risk of job alteration, invest in practical upskilling (hands‑on courses that teach promptcraft and verification) and short, role‑based bootcamps so staff can treat AI like a power tool and know when to ask for help; one ready option is Nucamp's AI Essentials for Work bootcamp to build prompt writing and verification skills across departments (AI Essentials for Work syllabus).
Bootcamp | Length | Cost (early bird) | Registration |
---|---|---|---|
AI Essentials for Work | 15 Weeks | $3,582 | Register for AI Essentials for Work |
“Whether AI in the workplace creates harm for workers and deepens inequality or supports workers and unleashes expansive opportunity depends (in large part) on the decisions we make.” - DOL Acting Secretary Julie Su
Frequently Asked Questions
(Up)How is AI being used by government agencies in St. Paul to cut costs and improve efficiency?
St. Paul agencies and Minnesota state programs are piloting AI for tasks like lesson planning (using tools such as Google Gemini and NotebookLM), automating data entry, predictive analytics for fraud detection, translation and plain‑language drafting, crash‑data routing, and digitizing records. These pilots shorten timelines (examples cited include processes dropping from four months to four weeks), reduce transaction costs, and speed case investigations when paired with human verification and governance.
What governance, legal, and risk management requirements must St. Paul follow when deploying AI?
St. Paul must follow the Minnesota Government Data Practices Act (MGDPA), designate a Responsible Authority to oversee data practices, provide Tennessen warnings before collecting private or confidential data, classify data by risk (low/moderate/high), and avoid sending moderate/high‑risk records to third‑party models that retain inputs. Practical controls include prompt logging, human‑in‑the‑loop verification of outputs, bias testing, contractual limits on vendor data retention, and tracking compliance to avoid civil or criminal penalties.
How should city departments measure AI costs, savings, and success?
Departments should build an AI inventory and a cost‑benefit dashboard that tracks time saved, transaction cost reductions, error and false‑positive rates, backlog changes, permit turnaround times, and compliance costs. Include verification and bias‑testing steps in ROI calculations, and measure both financial returns (many pilots expect payback within 12 months) and governance outcomes to ensure savings do not compromise fairness or legal obligations.
What practical steps can St. Paul take to start or scale AI pilots safely?
Begin with a low‑risk playbook: create an AI inventory (311, permitting, assessments), pilot narrowly scoped tools on public or anonymized data, log prompts and human reviews, require human sign‑off for decisions, run bias checks, and draft Tennessen warnings before collecting private data. Use vendor agreements to lock down data retention and rely on local templates (League of Minnesota Cities, municipal AI policy templates) for policy language. Pair pilots with role‑based training and verification workflows so speed yields trustworthy service improvements.
How can St. Paul upskill staff so they can use AI effectively while managing risk?
Invest in hands‑on, role‑based training focused on prompt writing, validation, and verification. Short bootcamps - such as Nucamp's 15‑week AI Essentials for Work program (courses include AI at Work: Foundations; Writing AI Prompts; Job‑Based Practical AI Skills) - teach promptcraft, output validation, and practical controls. Pair training with on‑the‑job pilots and governance practices so staff learn to treat AI as an augmented tool and escalate issues to IT or legal when models request nonpublic inputs.
Ludo Fourrage
Founder and CEO
Ludovic (Ludo) Fourrage is an education industry veteran, named in 2017 as a Learning Technology Leader by Training Magazine. Before founding Nucamp, Ludo spent 18 years at Microsoft, where he led innovation in the learning space. As Senior Director of Digital Learning there, he led the development of a first-of-its-kind 'YouTube for the Enterprise'. More recently, he delivered one of the most successful corporate MOOC programs in partnership with top business schools and consulting organizations, including INSEAD, Wharton, London Business School, and Accenture. With the belief that the right education for everyone is an achievable goal, Ludo leads the Nucamp team in the quest to make quality education accessible.