The Complete Guide to Using AI in the Government Industry in Charlotte in 2025

By Ludo Fourrage

Last Updated: August 16th 2025


Too Long; Didn't Read:

Charlotte agencies in 2025 should run 6–12 week AI pilots, pair NIST RMF risk assessments with mandatory staff training, follow NCDIT Responsible Use rules, and enforce procurement guardrails. Done well, the payoff is real - Duke's Sepsis Watch reported a 31% mortality reduction - and AI can cut backlogs and speed constituent responses.

Charlotte's public leaders face a 2025 reality: generative AI can sharply reduce back‑office burden and improve constituent services, but only with local safeguards - North Carolina health systems have been early adopters of ambient documentation, diagnostic, and workflow tools and state leaders are already calling for oversight (North Carolina state leaders call for AI oversight in health care), while national analyses expect gen AI to shift from pilots to wide government use this year to automate workflows and elevate the constituent experience (Generative AI in government 2025: analysis and expectations).

The practical takeaway for Charlotte agencies: pair selective pilots that demonstrate impact (Duke's Sepsis Watch reported a 31% mortality reduction) with staff training and clear procurement rules - and start by upskilling teams with targeted programs like Nucamp's AI Essentials for Work syllabus to build prompt‑writing, governance, and operational skills quickly.

Program | Details
AI Essentials for Work | 15 weeks; early bird $3,582; syllabus: AI Essentials for Work syllabus (Nucamp); registration: Register for AI Essentials for Work (Nucamp)

“Not only do I truly believe that AI can really improve health care and health, I also believe we need AI to improve health care and improve health.” - Christina Silcox, Duke‑Margolis Institute for Health Policy

Table of Contents

  • Understanding AI Basics for Charlotte Public Sector Leaders
  • What Is the AI Regulation Landscape in the US and North Carolina (2025)?
  • How to Start With AI in Charlotte Government in 2025: A Step-by-Step Roadmap
  • Practical Use Cases of AI in Charlotte Government in 2025
  • Data Privacy, Ethics, and Risk Management for Charlotte Agencies
  • Building Local Capacity: Training, Partnerships, and Education in Charlotte, North Carolina
  • Procurement, Vendors, and Guardrails: Choosing AI Tools for Charlotte Government
  • Looking Ahead: Where Will AI Go in Charlotte Government in 2025 and Beyond?
  • Conclusion: Next Steps for Charlotte Government Leaders in 2025
  • Frequently Asked Questions


Understanding AI Basics for Charlotte Public Sector Leaders


Understanding AI for Charlotte public leaders starts with a clear, operational definition: publicly available generative AI tools are ones anyone can use, but their ease of access masks real legal and security hazards - inputs can become part of a model's training data and may be subject to public records requests - so never submit PII, health, or personnel data to public tools, and always use state email accounts for agency work.

Institute simple technical and governance controls: require independent fact‑checking of AI outputs, disable chat history for high‑risk tasks, run security assessments (use the NIST AI Risk Management Framework), and document agency use and annual reassessments to preserve accountability.

Use regional templates and policy language to speed adoption while managing risk (Centralina generative AI policy guidance for local governments), and follow North Carolina's operational rules for publicly available generative AI for concrete dos and don'ts - examples include prohibited PII uploads, required use of state accounts, and disclosure/citation requirements (NCDIT guidance on publicly available generative AI).

The practical takeaway: pair short, well‑scoped pilots with mandatory staff training and a documented approval path so measurable wins (reduced backlog, faster constituent responses) do not come at the cost of privacy or legal exposure.

Action | Why it matters
Never enter PII into public generative AI | Prevents data leakage and public records exposure
Use state email/accounts for agency work | Ensures retention, oversight, and compliance
Document use and perform annual risk assessments (NIST AI RMF) | Maintains accountability and updates approvals
Fact‑check and human‑review all outputs | Reduces hallucinations, legal and reputational risk
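The "never enter PII" rule above can be backed by an automated pre‑submission screen. The sketch below is a minimal illustration, not an NCDIT tool: the regex patterns and function names are assumptions, and a real deployment would use a vetted PII/PHI detection library plus agency‑specific rules.

```python
import re

# Illustrative patterns only - a production screen would use a vetted
# PII/PHI detection library, not this short hand-rolled list.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def screen_prompt(text: str) -> list[str]:
    """Return the PII categories detected in a draft prompt."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(text)]

# Hypothetical draft prompt an employee might paste into a public tool.
draft = "Resident John Doe (SSN 123-45-6789) asked about his permit."
flags = screen_prompt(draft)
if flags:
    print(f"Blocked: remove {', '.join(flags)} before using a public AI tool.")
```

A screen like this sits in front of any agency‑approved chat interface, so the block happens before data leaves the network rather than relying on each employee's judgment alone.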

“The rapid evolution of GenAI presents tremendous opportunities for public sector organizations. DHS is at the forefront of federal efforts to responsibly harness the potential of AI technology... Safely harnessing the potential of GenAI requires collaboration across government, industry, academia, and civil society.”


What Is the AI Regulation Landscape in the US and North Carolina (2025)?


Federal activity in 2025 emphasizes standards, risk management, and nondiscrimination - think NIST's AI Risk Management Framework and playbooks, GAO's generative‑AI deployment guidance, the Bipartisan House Task Force review, HHS guidance on preventing discrimination, and the U.S. Copyright Office's AI report - all referenced and collected by state IT leaders so local agencies aren't building policy from scratch; North Carolina's N.C. Department of Information Technology curates these materials (including an “AI Governance in Practice Report 2024” infographic, NIST profiles, bias‑management guidance, and a long list of foundation‑model acceptable‑use policies) as a practical toolkit for agencies, while the N.C. Department of Public Instruction supplies PK‑13 generative AI recommendations and on‑demand webinars to align schools with state expectations.

The so‑what: Charlotte leaders can accelerate safe adoption by using NCDIT's curated checklist and NIST RMF playbooks to vet vendors, codify rules (for example, prohibiting PII uploads and requiring state accounts), and document use cases, and by tapping NCDPI training to lower operational risk in educational deployments; start with the state's consolidated resources to turn federal guidance into local policy and procurement guardrails.

Read the NCDIT Other AI Resources collection and NC DPI AI Resources for PK‑13 schools and webinars.

Resource | Why it matters
NCDIT - Other AI Resources (North Carolina Department of Information Technology) | Curated federal frameworks, governance infographic, and operational guidance for state/local agencies
NC DPI - AI Resources for PK‑13 Schools (North Carolina Department of Public Instruction) | PK‑13 generative AI recommendations, webinars, and educator training to align school implementations

For more information, consult the state resources linked above.

How to Start With AI in Charlotte Government in 2025: A Step-by-Step Roadmap


Start with a tightly scoped, time‑boxed pilot that proves value and limits risk: inventory candidate workflows (constituent intake, unclaimed property screening, FOIA triage), set clear success metrics, and run a 6–12 week pilot that mirrors North Carolina's 12‑week Treasurer/OpenAI project to surface measurable efficiency gains before wider rollout. Pair that pilot with the state's operational playbook: use the NCDIT Responsible Use of AI framework and guidance, plus the practical rules in the NCDIT “Use of Publicly Available Generative AI” guidance (never upload PII, use state accounts, disable chat history for high‑risk tasks, and disclose AI use). Then lock procurement and vendor checks into the process via the North Carolina Statewide IT Procurement Office resources and OneForm procurement guidance (OneForm, statewide contracts, and vendor readiness templates) so contracts and security reviews happen before paid pilots.

Require mandatory staff training, NIST AI RMF‑based risk assessments, documented review/acceptance criteria, and an annual reassessment cadence; the payoff: a single, disciplined pilot can deliver a repeatable playbook and measurable time savings without exposing sensitive data, making expansion a governance‑driven choice rather than a compliance gamble.

Step | Resource / Why
Design time‑boxed pilot (6–12 weeks) | Replicate measurable goals from NC Treasurer/OpenAI pilot to prove ROI
Apply NCDIT AI framework & guidance | Operational rules, privacy limits, and training requirements
Use Statewide IT Procurement | Vendor vetting, OneForm, and contract templates for compliant buys
Mandate NIST RMF assessments & human review | Reduce hallucination, legal, and security risk; enable annual reassessment
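"Set clear success metrics" is easiest when the pilot's before/after measurements are captured in one structure from day one. The sketch below is a hypothetical illustration - the workflow name and all numbers are invented for the example, not figures from the NC Treasurer pilot.

```python
from dataclasses import dataclass

@dataclass
class PilotMetrics:
    """Before/after measurements for one time-boxed pilot workflow."""
    name: str
    baseline_backlog: int          # open cases at pilot start
    final_backlog: int             # open cases at pilot end
    baseline_response_days: float  # avg. days to respond before the pilot
    final_response_days: float     # avg. days to respond at pilot end

    def backlog_reduction_pct(self) -> float:
        return 100 * (self.baseline_backlog - self.final_backlog) / self.baseline_backlog

    def response_time_savings_pct(self) -> float:
        return 100 * (self.baseline_response_days - self.final_response_days) / self.baseline_response_days

# Hypothetical FOIA-triage pilot numbers, for illustration only.
pilot = PilotMetrics("FOIA triage", baseline_backlog=400, final_backlog=280,
                     baseline_response_days=12.0, final_response_days=9.0)
print(f"Backlog reduced {pilot.backlog_reduction_pct():.0f}%, "
      f"response time down {pilot.response_time_savings_pct():.0f}%")
```

Reporting percentages against an explicit baseline keeps the "prove ROI before scale" step honest: the go/no‑go decision at week 12 reads straight off the same record the pilot started with.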

“Innovation, particularly around data and technology, will allow our department to deliver better results for North Carolina. I am grateful to our friends at OpenAI for partnering with us on this new endeavor, and I am excited to explore the possibilities ahead.” - Brad Briner


Practical Use Cases of AI in Charlotte Government in 2025


Concrete, near‑term AI wins for Charlotte government fall into three practical buckets: operational automation (chatbots and voice bots for intake and 311), safety and situational awareness (license‑plate readers and sensor analytics), and clinical and infrastructure efficiency (medical imaging triage and pipeline monitoring).

Deploy small, measurable pilots that mirror local examples: Bojangles' voice bot Bo‑Linda reduced front‑line strain by an estimated 4–5 hours of employee labor per location per day, showing how conversational AI can stabilize service levels during staffing shortages; Novant Health's Viz.ai imaging platform has shaved up to 20 minutes off stroke diagnosis and treatment time, a clear patient‑outcome and cost payoff; and Northlake Mall's AI‑enabled license‑plate readers - tied to CMPD - illustrate how targeted sensor deployments aid public safety while requiring strict data governance.

Tie these use cases to the City's Smart Charlotte agenda and I&T governance so pilots include digital‑equity considerations and repeatable procurement steps (City of Charlotte Innovation and Technology - Smart Charlotte initiative), learn from local reporting on sector deployments (In-depth Charlotte Magazine coverage of AI deployments in Charlotte), and model vendor pilots on recent state projects like the NC Treasurer's OpenAI trial to document measurable efficiency gains before scale (WBTV report on the NC Treasurer OpenAI pilot program).

“We refer to her as a person and consider her a member of the team.”

Data Privacy, Ethics, and Risk Management for Charlotte Agencies


Data privacy, ethics, and risk management must be baked into every Charlotte AI project: follow the North Carolina State Government Responsible Use of AI Framework as the operational baseline, adopt the Fair Information Practice Principles across the AI lifecycle, and make privacy the default in architecture and procurement so sensitive records never become training fodder for public tools (North Carolina Responsible Use of AI Framework).

Practical controls include access gating and data‑quality checks, vendor due diligence and contract language for data handling, and using the Office of Privacy and Data Protection's AI/GenAI questionnaire during the Privacy Threshold Analysis to flag risks before pilots or buys; NCDIT is already building standard RFP language and vendor questionnaires to speed compliant procurement.

The payoff is clear: early, documented privacy review plus mandatory staff training prevents costly exposure or public‑records surprises and keeps constituent trust intact - backed by a new state AI governance lead to drive consistent policy across agencies (Privacy's Role in AI Governance).

“The public has to put a certain amount of trust in the government,” NCDIT Chief Privacy Officer Cherie Givens said.


Building Local Capacity: Training, Partnerships, and Education in Charlotte, North Carolina


Build local AI capacity by linking UNC Charlotte's teaching and research pipelines with hands‑on, public‑sector training. UNC Charlotte's 2025 AI Summit for Smarter Learning (May 14, 2025) drew nearly 300 attendees - double the prior turnout - for workshops and lesson simulations that produced classroom‑to‑career use cases. The Center for Teaching and Learning offers short faculty workshops plus a 10‑hour “Next‑Generation Learning with Generative AI Tools” microcredential covering prompt engineering and ethics toolkits, and municipal staff can get immediate, task‑focused practice in OneIT's hands‑on “Using Generative AI in Your Daily Work” sessions (live Copilot and Gemini exercises). Supplement those with free, self‑paced public‑sector courses from InnovateUS on responsible GenAI adoption and procurement to close policy and skills gaps quickly.

The so‑what: this stacked approach - summits for momentum, short certificates for skill, campus research from the Charlotte AI Institute, and free public‑sector modules - creates a fast, low‑risk pipeline Charlotte agencies can use to upskill cohorts, validate vendor pilots, and deliver measurable service gains without compromising privacy or governance (UNC Charlotte 2025 AI Summit for Smarter Learning details, OneIT “Using Generative AI in Your Daily Work” training details, InnovateUS Artificial Intelligence for the Public Sector workshop series).

Program | Format / Benefit
2025 AI Summit for Smarter Learning | Full‑day in‑person summit (May 14, 2025); nearly 300 attendees; hands‑on workshops and lightning talks
CTL Workshops & Microcredential | Short faculty workshops on Gen‑AI tools, ethics, prompt writing; 10‑hour professional microcredential
OneIT AI Training | In‑person/virtual hands‑on sessions for staff (Copilot, Google Gemini); practical daily‑work use cases
InnovateUS Public‑Sector Courses | Free, self‑paced courses on responsible GenAI use, procurement, and implementation for government

“The faculty and staff participating in the AI Summit displayed a commitment to thoughtful, responsible and ethical AI integration that supports teaching and learning in meaningful ways.” - Jennifer Troyer, provost and vice chancellor for academic affairs

Procurement, Vendors, and Guardrails: Choosing AI Tools for Charlotte Government


Charlotte agencies must treat AI purchases like any mission‑critical IT buy: start with the Statewide IT Procurement Office's templates and OneForm pathways to ensure vendors meet security, contract, and Best Value standards; require NIST AI Risk Management Framework assessments and NCDIT's privacy‑aligned controls before any pilot; and insist on contract clauses that prohibit uploading PII to public models and require audit logs, data‑handling commitments, and annual reassessments. Doing so shortens procurement cycles and reduces legal exposure - NCDIT's office now targets a 90‑day IT procurement cadence while publishing vendor resources to speed compliant buys (NCDIT Statewide IT Procurement resources and OneForm pathways).

For use of public generative tools, adopt the concrete dos and don'ts in the NCDIT guidance - state accounts for agency work, disabled chat history for high‑risk tasks, independent fact‑checking of outputs, and documented agency approvals - to prevent data leakage and public‑records surprises (NCDIT guidance on using publicly available generative AI tools).

Finally, require vendors to complete standardized questionnaires and model‑transparency disclosures, document successful pilots for reusable RFP language, and learn from NCDIT's recent procurement innovation award that pairs automation with strict vetting to accelerate secure adoption (NCDIT procurement innovation award press release (Aug 2025)).

The so‑what: clear, contractual guardrails let Charlotte capture productivity gains from AI while keeping constituent data off public models and procurement defensible in audits.

Guardrail | Action
Risk Assessment | Mandatory NIST AI RMF assessment before procurement
Data Protections | Contract clause prohibiting PII uploads; state accounts required
Vendor Due Diligence | Standardized vendor questionnaires and model transparency
Procurement Timeline | Use OneForm/state contracts to target faster, auditable procurement (90‑day goal)
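The guardrail table above is effectively a pre‑award checklist, which can be enforced mechanically so no vendor reaches a paid pilot with gaps. This sketch is a hypothetical illustration - the field names mirror the table but are not an official NCDIT procurement schema.

```python
# Hypothetical pre-award checklist mirroring the guardrail table;
# field names are illustrative, not an official procurement schema.
REQUIRED_GUARDRAILS = [
    "nist_rmf_assessment_complete",
    "pii_upload_prohibition_in_contract",
    "audit_logging_required",
    "vendor_questionnaire_returned",
    "annual_reassessment_scheduled",
]

def missing_guardrails(vendor_record: dict) -> list[str]:
    """List required guardrails a vendor record has not yet satisfied."""
    return [g for g in REQUIRED_GUARDRAILS if not vendor_record.get(g)]

# Example vendor record with one outstanding item.
vendor = {
    "nist_rmf_assessment_complete": True,
    "pii_upload_prohibition_in_contract": True,
    "audit_logging_required": True,
    "vendor_questionnaire_returned": False,
    "annual_reassessment_scheduled": True,
}
gaps = missing_guardrails(vendor)
print("Cleared for pilot" if not gaps else f"Hold procurement: {gaps}")
```

Encoding the checklist this way also produces the audit trail the article calls for: the record of which guardrail blocked (or cleared) each buy is documentation, not memory.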

“We would never want to use [AI] to tell us who to award to, but even if we're using it for research, we want to make sure we document the research and not just the tool as the final order.”

Looking Ahead: Where Will AI Go in Charlotte Government in 2025 and Beyond?


Looking ahead, Charlotte's path to scaling useful, ethical AI is clear: adopt the state's operational baseline, watch the law, and use a dedicated governance lane to turn pilots into repeatable services - North Carolina's Responsible Use of AI Framework already supplies the concrete principles and controls local agencies need to keep sensitive records off public models and require human review (NCDIT Responsible Use of AI Framework: operational principles and controls for government AI), while the appointment of I‑Sah Hsieh as NCDIT's first artificial intelligence governance and policy executive (Apr 25, 2025) gives Charlotte a single statewide touchpoint for procurement templates, vendor questionnaires, and training partnerships that shorten procurement cycles and limit legal exposure (NCDIT press release announcing AI governance and policy leader).

At the same time, the 2025 state legislative wave - “38 states adopted or enacted around 100 measures” - means Charlotte must monitor shifting disclosure and human‑review rules and bake compliance into contract language from day one (NCSL 2025 legislation summary for state AI laws).

The so‑what: with a state framework, a named governance lead, and an eye on legislation, Charlotte can safely scale pilots that cut processing times and improve constituent service without sacrificing privacy or auditability.

Near‑term Milestone | Why it matters
Adopt NCDIT Responsible Use Framework | Provides operational rules and privacy controls for agency AI projects
I‑Sah Hsieh, NCDIT AI governance lead (Apr 25, 2025) | Centralizes policy, vendor language, and statewide training coordination
Track 2025 state AI legislation | Ensures procurement and disclosure practices remain compliant as laws evolve

“The public has to put a certain amount of trust in the government.”

Conclusion: Next Steps for Charlotte Government Leaders in 2025


Charlotte leaders' next steps are concrete: adopt the N.C. Department of Information Technology's Responsible Use of AI framework as the operational baseline, start with a tightly scoped 6–12 week pilot that measures specific time or cost savings, and require staff training and NIST‑based risk assessments before any procurement - this sequence turns experimentation into auditable services that protect constituent data while delivering measurable efficiency gains.

Use the NCDIT framework to lock in privacy, human review, and procurement guardrails (NCDIT AI Framework for Responsible Use), and upskill cohorts with practical training so pilots aren't one‑off experiments - consider Nucamp's hands‑on AI Essentials for Work syllabus and registration (15 weeks, early bird $3,582) to build prompt‑writing, governance, and operational skills.

The clear payoff: one disciplined pilot plus mandatory training makes expansion a governance‑driven decision, not a compliance gamble, preserving public trust while cutting back‑office burden and shortening procurement cycles.

Next Step | Resource
Adopt state AI framework | NCDIT AI Framework for Responsible Use
Run a 6–12 week pilot with NIST RMF assessment | Time‑boxed pilot + NIST risk assessment
Upskill staff for operational use | Nucamp AI Essentials for Work syllabus and registration (15 weeks)

“The public has to put a certain amount of trust in the government.”

Frequently Asked Questions


What immediate benefits can Charlotte government agencies expect from using generative AI in 2025?

Near‑term benefits include reduced back‑office burden (faster constituent responses, FOIA triage, intake automation), operational automation (chatbots/voice bots for 311 and intake), improved situational awareness (sensor and license‑plate analytics), and clinical/infrastructure efficiency (faster imaging triage). These gains are most reliably achieved via tightly scoped 6–12 week pilots with clear success metrics and mandatory human review.

What governance, privacy, and procurement safeguards should Charlotte agencies put in place before deploying AI?

Adopt North Carolina's Responsible Use of AI framework and NIST AI Risk Management Framework for risk assessments; never upload PII, health, or personnel data to public models; require state email/accounts for agency work; disable chat history for high‑risk tasks; mandate independent fact‑checking and human review of outputs; document use cases and perform annual reassessments; and use OneForm/state procurement templates, vendor questionnaires, and contract clauses that prohibit PII uploads, require audit logs, and specify data‑handling commitments.

How should Charlotte agencies start a safe, effective AI pilot?

Begin with a tightly scoped, time‑boxed pilot (6–12 weeks) focused on a specific workflow (e.g., constituent intake, FOIA triage, unclaimed property screening). Use NCDIT operational guidance and NIST RMF for vendor vetting and risk assessment, set measurable success metrics (reduced backlog, response time savings), require staff training beforehand, lock procurement and security reviews into the pilot plan, and document outcomes to create a repeatable playbook for scaling.

What local training and capacity‑building resources are recommended for Charlotte public sector staff?

Use a stacked approach: attend UNC Charlotte events (e.g., AI Summit for Smarter Learning), take short microcredentials and workshops (prompt engineering, ethics), enroll staff in hands‑on OneIT sessions (Copilot/Gemini exercises), and supplement with free public‑sector courses (InnovateUS) and targeted upskilling like Nucamp's AI Essentials for Work to build prompt‑writing, governance, and operational skills quickly.

How should Charlotte monitor legal and policy changes affecting AI use in 2025?

Track federal standards (NIST RMF, GAO guidance, HHS nondiscrimination guidance) and state updates curated by NCDIT (toolkits, operational checklists, PK‑13 guidance from NCDPI). Monitor 2025 state legislation trends and integrate required disclosure, human‑review, and procurement changes into contract language and agency policies. Leverage the newly named NCDIT AI governance lead and statewide resources to keep procurement templates, vendor questionnaires, and training aligned with evolving rules.


Ludo Fourrage

Founder and CEO

Ludovic (Ludo) Fourrage is an education industry veteran, named in 2017 as a Learning Technology Leader by Training Magazine. Before founding Nucamp, Ludo spent 18 years at Microsoft, where he led innovation in the learning space. As Senior Director of Digital Learning there, he led the development of the first‑of‑its‑kind 'YouTube for the Enterprise'. More recently, he delivered one of the most successful corporate MOOC programs in partnership with top business schools and consulting organizations - INSEAD, Wharton, London Business School, and Accenture, to name a few. With the belief that the right education for everyone is an achievable goal, Ludo leads the Nucamp team in the quest to make quality education accessible.