The Complete Guide to Using AI in the Government Industry in Los Angeles in 2025

By Ludo Fourrage

Last Updated: August 22nd 2025

Los Angeles, California city hall with AI overlay illustrating government AI guidance in 2025

Too Long; Didn't Read:

Los Angeles must pair AI adoption with upskilling, transparency, and procurement reform in 2025: Archistar e‑check pilots cut permit reviews from weeks/months to hours/days; California hosts 33 of the top 50 private AI firms; and roughly 70,000 tech jobs have been lost since early 2023.

Los Angeles needs a focused AI guide in 2025 because wildfire recovery, staff shortages, and legacy systems make faster, transparent permitting and practical workforce training essential. The state partnered with Archistar to provide an AI e‑check - free to local governments - that speeds building‑permit reviews and, in LA City and County pilots, trims timelines “from weeks/months to hours/days,” directly accelerating Palisades and Eaton fire rebuilding (California Governor's Office press release on the AI e‑check for building permits, Los Angeles County eCheck AI pilot for faster rebuilding).

As California scales training partnerships with Google, Adobe, IBM, and Microsoft, city leaders must pair tool adoption with upskilling; for public servants and managers looking for practical, workplace-focused AI skills, Nucamp's 15‑week AI Essentials for Work teaches prompts, tool use, and job‑based applications to make AI a safe, productive part of government workflows (Nucamp AI Essentials for Work bootcamp registration).

Attribute | Details
Description | Gain practical AI skills for any workplace: prompts, tools, and applied use cases
Length | 15 Weeks
Cost | $3,582 early bird; $3,942 after
Registration | AI Essentials for Work registration page

“AI is the future - and we must stay ahead of the game by ensuring our students and workforce are prepared to lead the way. We are preparing tomorrow's innovators, today.” - Governor Gavin Newsom

Table of Contents

  • What is the AI regulation in the US and California in 2025?
  • AI industry outlook for 2025 and what it means for Los Angeles, California government
  • What is AI used for in Los Angeles government in 2025?
  • Transparency, disclosure and public trust in Los Angeles, California
  • Procurement best practices for Los Angeles, California government
  • Audits, impact assessments and civil-rights protections in Los Angeles, California
  • Workforce, training and capacity building for Los Angeles, California public servants
  • Security, privacy and sector-specific rules affecting Los Angeles, California
  • Conclusion: The future of AI in Los Angeles, California government
  • Frequently Asked Questions

What is the AI regulation in the US and California in 2025?

In 2025 the U.S. regulatory landscape for AI is defined by an active federal push and an accelerating state patchwork: the White House's “Winning the AI Race: America's AI Action Plan” lays out over 90 federal policy actions to speed infrastructure, exports, and adoption, while Congress and agencies still rely largely on existing statutes rather than a single federal AI law. At the same time, California and dozens of other states moved quickly - 38 states enacted roughly 100 AI measures this year - to address transparency, deepfakes, digital‑likeness rights, and automated decision disclosures, leaving local governments to navigate overlapping rules (White House America's AI Action Plan summary and policy actions, National Conference of State Legislatures 2025 state AI legislation summary).

The practical takeaway for Los Angeles: expect strong federal incentives for AI infrastructure and workforce programs, simultaneous state requirements for tool inventories and disclosure, and new federal funding rules that tie eligibility to domestic sourcing and supply‑chain certifications - meaning procurement, transparency, and vendor due diligence must all be updated now to both access funds and limit legal risk.

Level | 2025 status | Practical implication for Los Angeles
Federal | No comprehensive AI law; White House Action Plan with 90+ actions and executive orders; emphasis on deregulation, infrastructure, exports | Opportunities for funding and data‑center permitting, but compliance teams must track shifting federal priorities and export/supply‑chain rules
California / State | Multiple bills on transparency, deepfakes, digital likeness, and automated decision inventories; active state policy work on frontier models | Local agencies must publish inventories, update procurement clauses, and enforce disclosure to meet state mandates

“America's AI Action Plan charts a decisive course to cement U.S. dominance in artificial intelligence. President Trump has prioritized AI as a cornerstone of American innovation, powering a new age of American leadership in science, technology, and global influence. This plan galvanizes Federal efforts to turbocharge our innovation capacity, build cutting‑edge infrastructure, and lead globally, ensuring that American workers and families thrive in the AI era. We are moving with urgency to make this vision a reality,” said White House Office of Science and Technology Policy Director Michael Kratsios.

AI industry outlook for 2025 and what it means for Los Angeles, California government

The AI industry outlook for 2025 is a study in contrasts that Los Angeles government must plan around: California remains a global AI engine - hosting 33 of the top 50 privately held AI companies and signing partnerships to bring industry training to over two million students - but the same forces driving productivity gains are already compressing labor markets, with roughly 70,000 tech jobs lost since early 2023 and major firms announcing large layoffs (Microsoft, IBM, and Intel among them). The practical consequence for LA is clear: seize federal and state funding and the Newsom‑led upskilling pipeline while redesigning procurement, disclosure, and workforce‑transition plans so automation improves city services without hollowing out institutional knowledge.

Local leaders should treat vendor partnerships as contingent on training, audited transparency, and contractual worker‑protections to capture AI's efficiency gains without deepening unemployment or service gaps; for a primer on the state's workforce deals see Governor Newsom AI workforce partnerships (Aug 7, 2025), for labor‑market context see the California job‑market analysis (July 2025), and for the evolving regulatory patchwork that will shape city obligations consult the state AI legislation summary.

Metric | Value | Source
Students targeted by industry training | Over 2,000,000 | Governor Newsom AI workforce partnerships press release (Aug 7, 2025)
Top private AI firms based in California | 33 of the top 50 | Governor Newsom announcement on California AI companies (Aug 7, 2025)
Tech job losses since early 2023 | ~70,000 | California job‑market analysis report (July 2025)

What is AI used for in Los Angeles government in 2025?

By 2025 Los Angeles city and county agencies are using AI across core public‑service functions - most visibly to accelerate licensing and building permits but also to analyze zoning and land‑use data, power 24/7 applicant chatbots, and surface compliance or safety issues earlier in review workflows; the state's Archistar e‑check, donated and made free to local governments through a Newsom partnership, uses computer vision, machine learning, and automated rulesets to read floor plans, extract measurements, and pre‑check designs against local codes so human plan‑checkers can focus on nuanced approvals rather than routine measurements (California Governor's Office announcement of Archistar AI e‑check for building permits, National League of Cities overview on using AI to transform city operations and permitting).

The practical payoff is tangible: pilots target cutting preliminary reviews to 2–3 business days and removing review cycles that can add 30–50% to timelines - critical when some rebuild applicants have waited as long as 58 days - so survivors see faster, clearer routes home while planners retain final oversight (LA Times report on AI-assisted permitting after wildfires).

Tool | Techniques | Pilots / Eligibility | Expected turnaround
Archistar e‑check | Computer vision, machine learning, automated rulesets | LA City & County rebuild applicants (Eaton, Palisades); sign up at Archistar e‑check Los Angeles County pilot sign-up | Pilot target: 2–3 business days for initial review; current waits reported up to 58 days

“What Archistar AI does is it reads those floor plans, it extracts out the right information, then it checks it against the rules to tell you if you comply or you don't comply.”
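
To make the automated ruleset step concrete, here is a minimal sketch of how extracted plan measurements could be checked against code thresholds. This is not Archistar's implementation; the rule names, field names, and threshold values are illustrative assumptions only.

```python
from dataclasses import dataclass

# Hypothetical thresholds; real values come from local building and zoning codes.
RULES = {
    "min_egress_width_in": 32.0,
    "max_building_height_ft": 45.0,
    "min_side_setback_ft": 5.0,
}

@dataclass
class ExtractedPlan:
    """Measurements a vision/ML pipeline might extract from a submitted floor plan."""
    egress_width_in: float
    building_height_ft: float
    side_setback_ft: float

def precheck(plan: ExtractedPlan) -> list[str]:
    """Return human-readable findings for a plan-checker to review; empty means no flags."""
    findings = []
    if plan.egress_width_in < RULES["min_egress_width_in"]:
        findings.append(f"Egress width {plan.egress_width_in} in is below the "
                        f"{RULES['min_egress_width_in']} in minimum")
    if plan.building_height_ft > RULES["max_building_height_ft"]:
        findings.append(f"Building height {plan.building_height_ft} ft exceeds the "
                        f"{RULES['max_building_height_ft']} ft limit")
    if plan.side_setback_ft < RULES["min_side_setback_ft"]:
        findings.append(f"Side setback {plan.side_setback_ft} ft is below the "
                        f"{RULES['min_side_setback_ft']} ft minimum")
    return findings

# One flagged issue; the human plan-checker still makes the final call.
print(precheck(ExtractedPlan(egress_width_in=30, building_height_ft=28, side_setback_ft=6)))
```

The design point is that the output is a list of findings for a human plan‑checker, not an automated approval or denial, which is consistent with planners retaining final oversight.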

Fill this form to download the Bootcamp Syllabus

And learn about Nucamp's Bootcamps and why aspiring developers choose us.

Transparency, disclosure and public trust in Los Angeles, California

Maintaining public trust in Los Angeles' AI deployments starts with clear, proactive disclosure: as state legislatures move fast - many 2025 bills require inventories and public posting of automated decision tools - local agencies should publish a searchable, machine‑readable AI inventory that explains purpose, data sources, and risk level so residents can see where AI is used and why (2025 state artificial intelligence legislation summary).

Pair that inventory with routine impact assessments and human‑review rules: federal reporting already counted more than 1,700 government AI use cases, 227 of which officials flagged as rights‑ or safety‑impacting, a practical reminder that hidden systems erode trust and invite legal scrutiny.

For public‑facing services and local advertising, follow emerging FTC expectations that require clear disclosures when automated decisions materially affect users and when content is substantially AI‑generated - treat disclosure as part of consumer protection, not an afterthought (FTC truth-in-advertising guidance for AI-enhanced local ads).

Operational governance starts with an inventory and risk‑scoring workflow that ties each entry to monitoring, retention of audit logs, and a documented escalation path so inspectors, advocates, and auditors can verify compliance quickly (best practices for creating an AI inventory) - a single, public inventory plus automated audit trails is the practical step that turns transparency from a slogan into measurable trust‑building.
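
As an illustration of what “machine‑readable” can mean in practice, the sketch below builds a single inventory entry and publishes it as JSON; the field names and values are assumptions for illustration, not a mandated state schema.

```python
import json
from datetime import date

# One entry in a machine-readable AI inventory; fields are illustrative, not a state schema.
inventory_entry = {
    "tool_name": "Archistar e-check",
    "department": "Building and Safety",
    "purpose": "Pre-check residential rebuild plans against local code rulesets",
    "data_sources": ["Applicant-submitted floor plans", "Local zoning and building codes"],
    "risk_level": "moderate",          # e.g., low / moderate / high
    "rights_or_safety_impacting": False,
    "human_review": "Plan-checker makes the final approval decision",
    "vendor": "Archistar",
    "last_impact_assessment": str(date(2025, 6, 1)),
    "audit_log_retention_years": 4,
}

# Publishing entries as JSON keeps the inventory searchable and machine-readable.
print(json.dumps(inventory_entry, indent=2))
```

Publishing entries like this in bulk, for example as a JSON file on the city's open‑data portal, is what lets residents, auditors, and other agencies query where AI is used and at what risk level.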

Procurement best practices for Los Angeles, California government

Procurement best practices for Los Angeles government center on clear requirements, continuous oversight, and contract‑level protections: require vendors to submit a GenAI Disclosure and Fact Sheet and include the State's mandated GenAI solicitation and contract language so bidders identify any generative‑AI components up front, and do not alter the IT General Provisions or GenAI Special Provisions without State approval (California GenAI disclosure and contract language - California Department of Technology guidance).

Treat intentional GenAI buys as high‑risk procurements - run pre‑procurement risk and impact assessments, consult the California Department of Technology for moderate‑ or high‑risk systems, and require contractual commitments that protect agency data and IP, ban vendor use of agency data for training without consent, and mandate prompt notice of significant model changes or new features (California generative AI procurement guidelines and vendor reporting best practices).

Operationalize oversight with post‑award monitoring, audit logs, and clear escalation paths in the contract so procurement becomes a continuous lifecycle. The practical payoff: when a vendor must report material model changes to the CDT and submit documentation, LA preserves service continuity and avoids the hidden liability that can shut down mission‑critical systems.

Procurement practice | Required action | Source
Disclosure & contract clauses | Include State GenAI solicitation/contract language and GenAI Special Provisions when required | California GenAI disclosure and contract language - CDT
Vendor transparency | Vendors submit GenAI Disclosure/Fact Sheet; report significant model modifications | California generative AI procurement guidelines - vendor transparency and reporting
Risk review & oversight | Conduct risk assessments; consult CDT for moderate/high risk; require monitoring and audit logs | GenAI guidance & procurement contract language - California Department of Technology
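
A minimal sketch of how post‑award oversight could be tracked in code follows; the record fields and the 30‑day review window are assumptions for illustration, not CDT requirements.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Illustrative record for post-award GenAI vendor oversight.
@dataclass
class ModelChangeNotice:
    vendor: str
    contract_id: str
    description: str
    received: date
    reviewed: bool = False

@dataclass
class VendorOversightLog:
    notices: list[ModelChangeNotice] = field(default_factory=list)

    def overdue_reviews(self, today: date, window_days: int = 30) -> list[ModelChangeNotice]:
        """Notices that have sat unreviewed past the agreed review window."""
        return [n for n in self.notices
                if not n.reviewed and today - n.received > timedelta(days=window_days)]

log = VendorOversightLog()
log.notices.append(ModelChangeNotice(
    vendor="ExampleVendor", contract_id="C-12345",
    description="Upgraded underlying foundation model", received=date(2025, 7, 1)))
print(log.overdue_reviews(today=date(2025, 9, 1)))
```

The useful property is that overdue, unreviewed model‑change notices surface automatically instead of sitting in an inbox, which is what turns contract language into continuous oversight.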

Audits, impact assessments and civil-rights protections in Los Angeles, California

Audits and impact assessments are the backbone of civil‑rights protection when Los Angeles deploys automated decision systems: California's own inventory exercise - where roughly 200 state entities reported no “high‑risk” automation despite well‑documented uses such as recidivism scoring and the Employment Development Department's 2020 fraud‑scoring pause that delayed benefits for more than a million people (about 600,000 later found legitimate) - shows why self‑reporting alone cannot be trusted (California state report on AI risks).

LA should require vendor‑independent bias audits and pre‑deployment impact assessments modeled on NYC and Colorado practices, mandate post‑deployment monitoring and remediation, and lock auditor independence and disclosure into contracts so findings drive fixes rather than disappear into procurement silos; California's new employment rules also push this direction, requiring anti‑bias testing and multi‑year records retention to support enforcement and defense (California ADS rules - employment standards effective Oct 1, 2025).

Practically, LA can prevent large‑scale harms by insisting on annual or event‑triggered impact assessments, independent audits for tools that substantially affect rights, and a four‑year retention policy for ADS data so auditors, advocates, and courts can reconstruct decisions and hold systems accountable (bias audits and impact‑assessment practices).

Required action | Why it matters | Source
Independent bias audits (annual) | Detect disparate impact and publish results for public scrutiny | Proskauer: AI bias audits
Impact assessments (annual + post‑modification) | Identify foreseeable harms and require remediation before scaling | White & Case: ADM regulation overview
Records retention (minimum 4 years) | Preserve audit trails to enable enforcement and legal defenses | Nixon Peabody: CA employment ADS rules
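
For the bias‑audit row above, one commonly published metric is the selection‑rate (“four‑fifths”) impact ratio. The sketch below shows that calculation with illustrative group labels, counts, and the conventional 0.8 review threshold; audit scope and thresholds for any real LA deployment would be set by the independent auditor and the applicable rules.

```python
# Minimal disparate-impact check of the kind an independent bias audit might report.
# The 0.8 ("four-fifths") threshold and the example counts are illustrative only.
def selection_rate(selected: int, total: int) -> float:
    return selected / total if total else 0.0

def impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (selected, total); ratios are computed relative to the
    highest-selected group, per the common four-fifths convention."""
    rates = {g: selection_rate(s, t) for g, (s, t) in outcomes.items()}
    benchmark = max(rates.values())
    return {g: (r / benchmark if benchmark else 0.0) for g, r in rates.items()}

example = {"group_a": (180, 400), "group_b": (120, 400)}
for group, ratio in impact_ratios(example).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({flag})")
```

A ratio below 0.8 does not prove discrimination on its own; it is the trigger for the deeper review and remediation steps the table describes.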

“I only know what they report back up to us, because even if they have the contract… we don't know how or if they're using it, so we rely on those departments to accurately report that information up.” - Jonathan Porat, Chief Technology Officer, California Department of Technology

Workforce, training and capacity building for Los Angeles, California public servants

Los Angeles can close the AI skills gap by pairing city training systems with proven, public‑sector–focused offerings. InnovateUS provides free, self‑paced courses and weekly workshops that teach prompt engineering, data‑privacy safeguards, procurement use cases (including an upcoming “AI for Public Sector Procurement” module), and a two‑part Responsible AI series designed to move teams from experimentation to scaled adoption - critical when agencies must both upskill staff and meet new state disclosure and procurement requirements. Integrating these modules into the City of Los Angeles Cornerstone training pipeline lets supervisors, IT teams, and program leads complete role‑based certifications and practical exercises before vendor rollouts (see the InnovateUS Responsible AI courses and workshops and the City of Los Angeles engagement & training portal for next steps).

InnovateUS has served 90,000+ learners across 150+ agencies and recently received a $5 million boost from Google.org to expand live coaching and cooperative placements, making it an inexpensive, scalable option for mandatory upskilling that preserves institutional knowledge while accelerating safe AI use in frontline services.

Resource | Format / Reach | Key detail
InnovateUS Responsible AI courses & workshops | Free, self‑paced + live workshops | Served 90,000+ learners across 150+ agencies; Google.org $5M support
Responsible AI for Public Sector Legal Professionals | Two‑part course (~1 hour per part) | Designed for government attorneys and support staff; certificate on completion
City of Los Angeles Cornerstone | City training portal | Existing platform to integrate role‑based AI upskilling

“For the government to work better and be more accessible to the people it serves, our workers must have the opportunity to take advantage of the latest tools and technologies. By continuing to invest in upskilling programs for public sector professionals offered through InnovateUS, we can improve the effectiveness of how we solve problems while restoring much‑needed trust in our government.” - Beth Simone Noveck

Security, privacy and sector-specific rules affecting Los Angeles, California

California's 2025 privacy landscape turns security and sector rules into operational imperatives for Los Angeles: the California Privacy Protection Agency (CPPA) has pushed formal rulemaking on automated decision‑making (ADMT), risk assessments, and cybersecurity audits, so tool inventories, DPIAs, and audit trails are no longer optional and enforcement is expanding across state and federal agencies. At the same time the state tightened data‑broker oversight - including a 1550% increase in registration fees and a mandated Delete Request and Opt‑Out Platform (DROP) that opens to residents on January 1, 2026, with brokers required to begin processing deletions by August 1, 2026 - meaning municipal systems that share or receive third‑party data must be ready for public deletion and reporting requests (California state privacy law updates and data-broker rules (2025) - Calawyers).

Practical takeaway: Los Angeles agencies should budget for annual cybersecurity audits and documented ADMT risk assessments now (ADMT compliance timelines extend into 2027 and audit filing dates phase in by revenue tier), or face faster, coordinated investigations from the CPPA, state attorneys general, and federal enforcers that are already prioritizing sensitive‑data and tracking‑tech violations (CPPA ADMT regulations and implementation timelines for audits and risk assessments - PrivacyWorld).

Rule / Requirement | Key deadline or detail | Source
DROP portal (data‑broker deletion/opt‑out) | Open to residents Jan 1, 2026; brokers process deletions by Aug 1, 2026 | Calawyers article on California privacy law 2025 (DROP portal details)
Data‑broker registration fee | Increased by 1550% | Calawyers overview of data-broker rule changes
Automated decision‑making (ADMT) rules | Compliance window extends (profiling/ADMT compliance by Jan 1, 2027 for covered businesses) | PrivacyWorld report on CPPA ADMT regulations and compliance timelines
Cybersecurity audits & risk assessment filings | First audit filing phased by revenue: earliest due Apr 1, 2028 for $100M+ entities (later dates for smaller entities) | PrivacyWorld analysis of audit and assessment filing timelines
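
Because the DROP data format is not yet published, the sketch below only illustrates the kind of internal bookkeeping a municipal data team might prepare - mapping third‑party brokers to the datasets they feed so deletion requests can be routed. All names and fields are hypothetical, not the actual DROP interface.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative tracker for deletion requests once the DROP platform opens.
@dataclass
class DeletionRequest:
    resident_id_hash: str      # hashed identifier, never raw PII, in the log
    received: date
    source: str                # e.g., "DROP" or a direct resident request

# Hypothetical map of municipal datasets to the third-party brokers that feed them.
DATASETS_BY_BROKER = {
    "broker_x": ["parking_enforcement_contacts", "utility_outreach_list"],
    "broker_y": ["emergency_alert_optins"],
}

def datasets_to_scrub(request: DeletionRequest, broker: str) -> list[str]:
    """Return the internal datasets that must be searched for this resident's records."""
    return DATASETS_BY_BROKER.get(broker, [])

req = DeletionRequest(resident_id_hash="ab12...", received=date(2026, 1, 2), source="DROP")
print(datasets_to_scrub(req, "broker_x"))
```

Keeping only hashed identifiers in the tracking log prevents the deletion workflow itself from becoming another store of personal data.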

Conclusion: The future of AI in Los Angeles, California government

The future of AI in Los Angeles government will be defined by a tightrope walk: seize fast‑moving federal incentives for data‑center permitting and workforce funding while meeting California's tougher disclosure, audit, and procurement rules so public trust and civil‑rights protections scale with capability.

Federal action plans in mid‑2025 prioritize infrastructure and aim to accelerate adoption - often tying incentives to state regulatory posture - so Los Angeles must pair any build‑out with mandatory AI inventories, pre‑deployment impact assessments, vendor clauses that ban using city data for model training, and independent bias audits to avoid losing eligibility or creating downstream harms (America's AI Action Plan - federal incentives and regulatory signals, July 2025 AI developments and procurement guidance).

Practical immediate steps: publish a public, machine‑readable AI inventory; harden contracts and post‑award monitoring; and require role‑based upskilling - practical training like Nucamp's 15‑week AI Essentials for Work helps supervisors and frontline staff learn prompts, tool use, and job‑based workflows so automation boosts service delivery without eroding institutional knowledge (Nucamp AI Essentials for Work - 15‑week bootcamp registration).

Doing these three things preserves access to funding, speeds rebuilding and permitting, and keeps transparency from being an afterthought.

Action | Why it matters
Publish AI inventory & conduct DPIAs | Builds public trust and meets state disclosure requirements
Update procurement: GenAI clauses & audit rights | Protects data, ensures vendor accountability, preserves funding eligibility
Mandate role‑based training (15‑week options) | Preserves institutional knowledge and enables safe, productive adoption

“has the obligation not to procure models that sacrifice truthfulness and accuracy to ideological agendas.” - Executive Order 14319 on “Preventing Woke AI in the Federal Government”

Frequently Asked Questions

Why does Los Angeles need a focused AI guide in 2025 and what immediate steps should city leaders take?

Los Angeles faces wildfire recovery, staff shortages, and legacy systems that slow permitting and rebuilding. Immediate steps include publishing a public, machine-readable AI inventory and conducting pre-deployment impact assessments (DPIAs); updating procurement to include State GenAI solicitation language, GenAI Disclosure/Fact Sheets, and audit/logging rights; and mandating role-based upskilling (e.g., 15-week practical courses) so automation speeds service delivery without eroding institutional knowledge.

What are the key regulatory and compliance considerations for AI in the U.S. and California in 2025?

In 2025 there is no single federal AI law; the White House Action Plan provides 90+ federal actions emphasizing infrastructure and incentives, while California and many states have enacted transparency, deepfake, digital‑likeness and automated decision inventories. Practically, LA must track shifting federal funding and export/supply-chain rules, publish tool inventories to meet state disclosure mandates, update procurement clauses to access funds, and prepare for CPPA automated decision‑making (ADMT) rules and increased enforcement timelines through 2027–2028.

How is AI already being used in Los Angeles government and what are demonstrated benefits?

Agencies are using AI for building-permit pre-checks, zoning and land-use analysis, 24/7 applicant chatbots, and safety/compliance surfacing. The Archistar e-check pilot (donated to local governments) uses computer vision and rulesets to extract plan measurements and pre-validate code compliance, trimming preliminary review targets to 2–3 business days in pilots and removing review cycles that can add 30–50% to timelines - critical for faster wildfire rebuilds.

What procurement, audit, and civil-rights protections should Los Angeles require from vendors?

Require GenAI Disclosure and Fact Sheets, include State GenAI solicitation and special provisions, ban vendor use of agency data for model training without consent, mandate prompt notice of material model changes, and retain post-award monitoring and audit logs. Insist on independent bias audits (annual or event-triggered), pre- and post-deployment impact assessments for high‑risk systems, and a minimum four‑year retention policy for ADS data so auditors, advocates, and courts can reconstruct decisions and enforce remedies.

How should Los Angeles build workforce capacity for safe, productive AI adoption and what training options exist?

Pair city training platforms (e.g., City of Los Angeles Cornerstone) with public‑sector–focused programs to deliver role-based certifications and practical exercises. Free or low-cost options like InnovateUS (Responsible AI courses and workshops) scale broadly; for deeper, job-focused training Nucamp's 15-week AI Essentials for Work teaches prompts, tool use, and applied workflows to help supervisors and frontline staff safely integrate AI into government processes.

Ludo Fourrage

Founder and CEO

Ludovic (Ludo) Fourrage is an education industry veteran, named in 2017 as a Learning Technology Leader by Training Magazine. Before founding Nucamp, Ludo spent 18 years at Microsoft, where he led innovation in the learning space. As Senior Director of Digital Learning there, he led the development of a first-of-its-kind 'YouTube for the Enterprise'. More recently, he delivered one of the most successful corporate MOOC programs in partnership with top business schools and consulting organizations, including INSEAD, Wharton, London Business School, and Accenture. With the belief that the right education for everyone is an achievable goal, Ludo leads the Nucamp team in the quest to make quality education accessible.