Top 10 AI Prompts and Use Cases in the Education Industry in Memphis
Last Updated: August 22nd, 2025

Too Long; Didn't Read:
Memphis education pilots (Granville T. Woods, 2024) and the University of Memphis AI minor show that hands‑on AI can boost outcomes: examples include a ~50% reduction in grading time, GPA gains (1.75→3.5), predictive models with up to 90% early‑dropout accuracy, and 300+ local AI job pathways.
Memphis schools stand at a practical inflection point. Local pilots like the Granville T. Woods Academy project-based AI program (launched in 2024) show how hands-on AI teaching can open tech pathways for students from underserved neighborhoods, while the University of Memphis is formalizing campus-wide AI literacy with an upcoming AI for All minor that embeds ethics and prompt skills into degree options. Both demonstrate the training and career ladders that matter for Tennessee's workforce.
At the same time, high-profile deals and controversies - most recently the Memphis school board's approval of xAI's offer to fund repairs at four neighborhood schools - underscore how infrastructure, community trust, and digital-equity gaps (including a Shelby County risk of losing millions in device reimbursements) will shape whether AI investments actually benefit the region.
Practical pilot programs, aligned policy, and workforce-focused training like Nucamp's AI Essentials for Work can help Memphis turn AI hype into measurable student opportunity and local jobs.
Program | Length | Early-bird Cost | Registration |
---|---|---|---|
AI Essentials for Work (courses: AI at Work: Foundations; Writing AI Prompts; Job-Based Practical AI Skills) | 15 Weeks | $3,582 | Register for the AI Essentials for Work bootcamp |
“There is a mountain of resistance to this project. If the community doesn't want it, and the school system says, ‘We'll support you doing these things to our school,' then the school district is doing a disservice to the citizens that they serve in that community.”
Table of Contents
- Methodology - How We Compiled These Prompts and Use Cases
- Personalized Learning Pathways - Example Prompt for Shelby County Schools
- AI Teaching Assistant - Example Prompt for Granville T. Woods Academy of Innovation
- Automated Grading and Feedback - Example Prompt for University of Memphis
- Teacher Planning and Observation Support - Example Prompt for Shelby County Schools
- Predictive Analytics for Early Intervention - Example Prompt for Ivy Tech-style Pilots in Memphis
- Career Counseling and Workforce Alignment - Example Prompt for Memphis City Schools and University of Memphis
- Accessibility and Inclusive Learning Supports - Example Prompt for University of Memphis Libraries
- Virtual Labs and Simulation-based STEM - Example Prompt for Technological Institute of Monterrey-style Virtual Labs in Memphis Schools
- Mental Health and Student Wellness Support - Example Prompt for Memphis College Campuses
- Curriculum Adaptation and Multilingual Support - Example Prompt for Shelby County English Learner Programs
- Conclusion - Next Steps for Memphis Schools: Pilots, Policy, and Equity
- Frequently Asked Questions
Check out next:
Discover how AI's impact on Memphis classrooms in 2025 is reshaping teaching and learning across K–12 and higher education.
Methodology - How We Compiled These Prompts and Use Cases
Our methodology combined policy, pilots, district context, and regional guidance to shape prompts that work in Tennessee classrooms. The legislative timeline in Tennessee Senate Bill 1711 (boards submit AI-use policies by July 1) set district-level compliance windows, while institutional rules such as the University of Memphis Generative AI Policy (July 2025), with its instructor-discretion and student-disclosure guidance, determined prompt features for syllabus-aligned instructions and citation steps. Local pilots like Granville T. Woods' project-based AI program (2024) supplied classroom-ready project formats, and SCORE's Tennessee Opportunity for AI in Education memo - covering professional development, equity, and workforce alignment - prioritized teacher training and equitable access.
Sources were cross-checked for dates, instructor permissions, and district capacity so each prompt includes explicit disclosure language, teacher PD scaffolds, and realistic resourcing assumptions tied to Memphis-Shelby County's operational context.
Source | Relevance | Key date |
---|---|---|
PedagogyFutures (SB1711) | District policy deadline | July 1, 2024 |
University of Memphis AI Policy | Instructor discretion & student disclosure rules | July 2025 |
STEMI / Granville T. Woods | Project-based AI classroom models | 2024 |
SCORE memo | PD, equity, pilot recommendations | June 6, 2025 |
Martini.ai (MSCS report) | District capacity & fiscal context | July 28, 2025 |
Personalized Learning Pathways - Example Prompt for Shelby County Schools
Design an AI prompt that generates a co‑constructed Personalized Learning Plan (PLP) for each Shelby County student by converting recent assessment results into explicit competencies, suggested pacing, and evidence-of-mastery options - mirroring the Search Institute's PLP framework that ties learning goals to how students will demonstrate them (Search Institute personalized learning plan framework).
The prompt should flag students for synchronous, small‑group intervention and recommend hybrid schedules or extended learning blocks that local leaders say are essential to “reimagine” pandemic recovery and broaden school options (Shelby County reimagine learning opinion (Commercial Appeal)).
Pairing AI PLPs with live virtual co‑teaching (the Proximity Learning model Shelby County already uses to lower student‑to‑teacher ratios) creates actionable rosters and hourly schedules so certified instructors can rotate between in‑person and virtual pods (Proximity Learning co‑teaching model).
The concrete payoff: personalized pathways have lifted students' outcomes in Shelby County pilots - one learner's GPA climbed from 1.75 to 3.5 - showing why an AI prompt must center student voice, pacing, and clear next steps for teachers and families.
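For illustration, here is a minimal Python sketch of how a district data team might assemble such a PLP prompt from assessment records; the field names and the `build_plp_prompt` helper are hypothetical, not part of any Shelby County system.

```python
# Minimal sketch: assemble a Personalized Learning Plan (PLP) prompt from
# assessment data. Field names here are illustrative, not a district schema.

def build_plp_prompt(student: dict) -> str:
    scores = "\n".join(
        f"- {subject}: {score}" for subject, score in student["assessments"].items()
    )
    return f"""You are an instructional planner for Shelby County Schools.
Using the assessment results below, draft a co-constructed Personalized
Learning Plan (PLP) for {student['name']} (grade {student['grade']}).

Assessment results:
{scores}

For each subject, return:
1. Explicit competencies to master this quarter.
2. Suggested pacing (weekly milestones).
3. Two evidence-of-mastery options the student may choose between.
4. A flag (YES/NO) for synchronous small-group intervention, with rationale.

Write in plain language a student and family can read, and end with three
questions the student should answer to co-construct the plan."""

# Example usage with made-up data:
student = {
    "name": "J. Doe",
    "grade": 8,
    "assessments": {"ELA": "approaching (level 2)", "Math": "below (level 1)"},
}
print(build_plp_prompt(student))
```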
Component | Source |
---|---|
PLP definition & co-construction | Search Institute |
Hybrid/personalized recovery models | Commercial Appeal (Hyde) |
Synchronous co-teaching & lower ratios | Proximity Learning |
“It's not just a school. It's not just an academy. It's a family.” - Kerrigan Aldridge, Big Picture Learning Academy student
AI Teaching Assistant - Example Prompt for Granville T. Woods Academy of Innovation
At Granville T. Woods Academy of Innovation, an AI Teaching Assistant can function as a curriculum‑aligned co‑teacher that shrinks busywork and amplifies instruction. Code.org's AI Teaching Assistant provides rubric‑aligned automated assessment, instant curriculum‑matched feedback, and teacher-facing lesson scaffolds, so classroom educators can ask the tool to “grade this Interactive Animations project against Code.org's rubric, list three concrete improvement steps, suggest one scaffolded practice task per skill gap, and flag any students needing small‑group support.” That workflow matches local project‑based learning and - per coverage cited by Code.org - has cut some teachers' grading time by roughly half, freeing time for targeted interventions.
Pairing the Code.org tool with district-scale assistants like SchoolAI - which adds real-time progress dashboards and language/access supports - helps Granville T. Woods turn AI feedback into actionable rosters, parent communications, and PD prompts for Tennessee teachers.
Verified teacher access and unit‑specific rubrics (available via Code.org AI Teaching Assistant rubric page) make pilot prompts practical and compliant for the 2024–25 classroom cycle.
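As a sketch, the quoted grading instruction above can be wrapped into a reusable template; the rubric and project placeholders are illustrative, and a real pilot would paste in Code.org's unit-specific rubric text.

```python
# Minimal sketch: wrap the grading instruction from the paragraph above into a
# reusable prompt template. The rubric text would come from Code.org's
# unit-specific rubric page; the placeholders below are illustrative only.

GRADING_PROMPT = """Grade this Interactive Animations project against the
rubric below. Then:
1. List three concrete improvement steps.
2. Suggest one scaffolded practice task per skill gap.
3. Flag the student for small-group support (YES/NO) with a one-line reason.

Rubric:
{rubric}

Student project (exported code and description):
{project}
"""

def build_grading_prompt(rubric: str, project: str) -> str:
    return GRADING_PROMPT.format(rubric=rubric, project=project)

print(build_grading_prompt("(paste unit rubric here)", "(paste project here)"))
```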
Feature | Why it matters for Granville T. Woods |
---|---|
Automated rubric grading | Frees teacher time for coaching and project mentoring (reported ~50% grading reduction) |
Curriculum alignment | Keeps feedback tied to Code.org CSD lessons and local project goals |
Differentiation suggestions | Generates scaffolded tasks so students at different levels can progress in one classroom |
"It's like having a TA!" - Flint teacher testimonial
Automated Grading and Feedback - Example Prompt for University of Memphis
For the University of Memphis, an effective automated grading prompt balances speed with safeguards: ask an AI to apply a course rubric to student essays, return granular feedback (organization, evidence, conventions), flag low‑confidence scores for human review, and produce a teacher-facing summary of common class errors with suggested mini-lessons. This hybrid workflow leverages auto‑grading for objective checks while reserving AI‑assisted review for nuance, mirroring the distinctions in the OSU synthesis of “Auto‑Grading vs. AI‑Assisted Grading” and the long record of systems like PEG that showed measurable gains in student writing quality (Ohio State University analysis of AI and auto-grading in higher education, PEG automated scoring research and outcomes).
Include explicit disclosure language in syllabi and require rubric alignment to reduce bias; as reviewers have noted, well‑crafted rubrics are the single biggest determinant of reliable automated feedback (Inside Higher Ed column on auto-graders and rubric impact).
The concrete payoff: use of rubric‑aligned automated feedback plus targeted human audits scales timely, formative comments across large sections while preserving instructor judgment for creativity and high‑stakes decisions.
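A minimal sketch of the human-review gate described above, assuming each AI score arrives with a model confidence value; the threshold and record shapes are illustrative assumptions, not a University of Memphis system.

```python
# Minimal sketch of a human-review gate: route low-confidence AI rubric scores
# to instructors and pass high-confidence ones through. Thresholds and record
# shapes are illustrative assumptions.

CONFIDENCE_THRESHOLD = 0.85  # tune per course after an initial audit

def triage_scores(graded_essays: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split AI-graded essays into auto-release and human-review queues."""
    auto_release, human_review = [], []
    for essay in graded_essays:
        if essay["confidence"] >= CONFIDENCE_THRESHOLD:
            auto_release.append(essay)
        else:
            human_review.append(essay)
    return auto_release, human_review

graded = [
    {"student": "A", "rubric_score": 42, "confidence": 0.93},
    {"student": "B", "rubric_score": 35, "confidence": 0.61},  # needs a human
]
released, review_queue = triage_scores(graded)
print(f"{len(released)} released, {len(review_queue)} flagged for instructor review")
```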
Component | Purpose |
---|---|
Automated rubric checks | Fast objective scoring (grammar, structure, rubric items) |
AI‑generated student feedback | Actionable revision steps and mini‑lesson suggestions |
Human review gate | Audit biased/low‑confidence cases and assess creativity |
“Rubrics are also especially helpful for three groups of students: first‑generation college students, students who didn't come from elite high schools and students who aren't majoring in your field. In other words, the majority of your classroom can benefit from a clear, comprehensible statement of what makes for an A, C or F paper as far as sources, arguments, mechanics and the like.”
Teacher Planning and Observation Support - Example Prompt for Shelby County Schools
For Shelby County Schools, a high‑value teacher‑planning and observation prompt asks an AI to produce three deliverables: a standards‑aligned weekly unit (learning objectives, three tiered activity pathways, formative checks, and exit tickets); a short observation rubric tied to the district's new Strategic Plan commitments (notably Professional Learning and Continuous Improvement); and teacher‑facing coaching notes that recommend focused PD modules and sample language for family communications. This workflow mirrors how AI lesson‑plan generators can produce differentiated materials and save teacher time while keeping human review central (Shelby County Schools Strategic Plan and commitments, NCCE review of new AI lesson plan generators, Edutopia guide to using AI for differentiated instruction).
The concrete payoff: one prompt converts unit priorities into classroom-ready, leveled tasks plus a three‑item observation checklist that links coaching conversations directly to district goals - turning scattered notes into evidence for PD investment and faster, targeted support for teachers and students.
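For illustration, a minimal Python sketch of a prompt builder that packages those three deliverables; the standards codes and the `build_planning_prompt` helper are hypothetical.

```python
# Minimal sketch: one prompt that yields a weekly unit, an observation rubric,
# and coaching notes. The standards list and unit topic are placeholders.

def build_planning_prompt(standards: list[str], unit_topic: str) -> str:
    standards_block = "\n".join(f"- {s}" for s in standards)
    return f"""Create a standards-aligned weekly unit on "{unit_topic}" for a
Shelby County classroom. Address these standards:
{standards_block}

Return three sections:
1. UNIT PLAN: learning objectives, three tiered activity pathways,
   formative checks, and daily exit tickets.
2. OBSERVATION RUBRIC: a three-item checklist tied to the district's
   Professional Learning and Continuous Improvement commitments.
3. COACHING NOTES: recommended PD modules and sample family-communication
   language, written for an instructional coach."""

print(build_planning_prompt(["8.EE.B.5", "8.EE.B.6"], "proportional relationships"))
```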
Tool | Feature (as reported) |
---|---|
ClickUp | AI-powered content creation, task management and collaboration for planning |
Magic School AI | Personalized curriculum development and standards alignment |
Eduaide.AI | Real-time lesson adaptation and student response analysis |
Auto Classmate | Automated lesson plans, content differentiation and an instructional coach chatbot |
School AI | Secure, customizable K–12 platform for differentiation and progress tracking |
“I am pleased to present the new Strategic Plan for the Shelby County School District.” - Superintendent Lewis Brooks
Predictive Analytics for Early Intervention - Example Prompt for Ivy Tech-style Pilots in Memphis
Ivy Tech–style community college pilots in Memphis should begin with a single, operational prompt that ingests grades, LMS activity, attendance, and financial‑aid or engagement flags, then produces an ordered risk roster, a confidence score, and a ranked set of actionable interventions (automated texts, advisor outreach, tutoring referrals) so staff can act within the semester's critical early window. Studies show predictive models can spot potential dropouts with up to 90% accuracy in the first 12 weeks, making early outreach measurably effective (predictive analytics dropout accuracy studies).
Community colleges implementing this workflow can improve retention by directing scarce coaching resources to the students most likely to benefit (community college retention using predictive analytics), but pilots must bake in privacy, audit trails, and explainability because recent reviews call for human‑in‑the‑loop governance to avoid bias and opaque decisions (systematic review of predictive AI in education).
The concrete payoff for Memphis: a tested prompt + advisor playbook can convert early flags into targeted contacts before midterm attrition becomes irreversible.
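A minimal sketch of an ordered risk roster follows; the weights loosely mirror the indicator strengths in the table below and are illustrative, not fitted to Memphis data, and a real pilot would add the privacy and audit controls discussed above.

```python
# Minimal sketch of an ordered risk roster. Weights loosely mirror the
# indicator strengths in the table below; a real pilot would fit them to
# historical data and add privacy, explainability, and audit controls.

WEIGHTS = {  # higher weight = stronger predictor (illustrative values)
    "missed_assignments_rate": 0.30,
    "low_grades_rate": 0.30,
    "absence_rate": 0.25,
    "lms_inactivity_rate": 0.10,
    "no_instructor_contact": 0.05,
}

def risk_score(student: dict) -> float:
    """Weighted sum of normalized (0-1) risk indicators."""
    return sum(WEIGHTS[k] * student[k] for k in WEIGHTS)

def risk_roster(students: list[dict]) -> list[tuple[str, float]]:
    """Return (student_id, score) pairs, highest risk first."""
    scored = [(s["id"], round(risk_score(s), 2)) for s in students]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

students = [
    {"id": "S1", "missed_assignments_rate": 0.6, "low_grades_rate": 0.5,
     "absence_rate": 0.4, "lms_inactivity_rate": 0.7, "no_instructor_contact": 1.0},
    {"id": "S2", "missed_assignments_rate": 0.1, "low_grades_rate": 0.2,
     "absence_rate": 0.0, "lms_inactivity_rate": 0.2, "no_instructor_contact": 0.0},
]
print(risk_roster(students))  # highest-risk students first, for advisor outreach
```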
Indicator | Predictive Strength |
---|---|
Assignment completion rate | High |
Grades | High |
Attendance records | High |
Online LMS activity | Medium |
Communication with instructors | Medium |
Library/resource usage | Low |
“Our research has shown that when PLAs are coupled with targeted motivational interventions, we see a significant improvement in student retention rates. By reaching out to students who may need extra support, we can help them overcome challenges and achieve their academic goals.” - Dr. Emily Johnson
Career Counseling and Workforce Alignment - Example Prompt for Memphis City Schools and University of Memphis
Design a single AI prompt that maps a Memphis student's coursework, certifications, soft skills, and stated interests to real, local labor‑market signals - including Memphis's recent AI infrastructure and manufacturing investments - then returns a prioritized pathway: short-term stackable credentials, suggested Nucamp or university courses, internship or apprenticeship matches, employer contact scripts for counselors, and an individualized four‑semester timeline for credential completion and college enrollment.
Feed the prompt with regional labor data so recommendations align with Tennessee's fast-growing tech ecosystem (Oracle, Amazon, xAI's Memphis supercomputer, and other projects) documented in the Memphis tech and AI investments overview (Memphis Moves: Tennessee - The Emerging “It” State in IT & AI) and in the University of Memphis AI research and workforce investment announcement; include ROI and employer‑partnership templates so districts can justify pilots to boards and partners (case study: Measuring AI ROI for Memphis Education).
So what: tying counselor workflows to local demand turns exploratory college conversations into concrete employer pipelines - Memphis's xAI project alone projects 300+ high‑paying roles that graduating students could fill with short, certificate‑first strategies.
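As a sketch, the seeded prompt described above might be assembled like this; the student-record fields and labor-market signal list are illustrative placeholders, not a district data model.

```python
# Minimal sketch: seed a career-counseling prompt with a student record and
# regional labor-market signals. The record fields and signal list are
# illustrative placeholders.

def build_pathway_prompt(record: dict, labor_signals: list[str]) -> str:
    signals = "\n".join(f"- {s}" for s in labor_signals)
    return f"""Act as a Memphis career counselor. Student profile:
Coursework: {', '.join(record['coursework'])}
Certifications: {', '.join(record['certifications']) or 'none'}
Interests: {', '.join(record['interests'])}

Local labor-market signals:
{signals}

Return, in priority order:
1. Short-term stackable credentials matched to the signals above.
2. Suggested Nucamp or University of Memphis courses.
3. Internship or apprenticeship matches with employer contact scripts.
4. A four-semester timeline for credential completion and enrollment."""

record = {"coursework": ["AP CS Principles"], "certifications": [],
          "interests": ["data centers", "networking"]}
print(build_pathway_prompt(record, ["xAI data center buildout", "advanced manufacturing"]))
```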
Accessibility and Inclusive Learning Supports - Example Prompt for University of Memphis Libraries
University of Memphis Libraries pair campus accessibility services with assistive tech, so AI-driven prompts can produce usable, equitable learning materials for Tennessee students. The Libraries subscribe to SensusAccess (anyone with an @memphis.edu email can upload files) to convert image‑based PDFs into tagged, readable documents, MP3 text‑to‑speech files, or ready‑to‑emboss Braille formats delivered by email, while Scan & Deliver and a “place a hold” workflow turn physical items into scanned chapters or front‑desk pickups to avoid travel barriers; see the Libraries' accessibility overview for details and forms (University Libraries accessibility of collections and conversion services).
For Deaf and hard‑of‑hearing students, Disability Resources for Students offers ASL interpreters, CART live transcription, and the Genio Notes app to capture lectures and live captions - services that should be referenced in any prompt that requests captioning, alternate media, or human‑in‑the‑loop accommodations (Disability Resources for Students communication access services).
So what: a well‑crafted prompt that automates SensusAccess conversions, flags uncaptioned media for CART, and routes ILL/Scan requests to library staff can cut wait times and deliver accessible readings the same day, not weeks.
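A minimal sketch of the routing logic such a prompt could drive, dispatching each request to SensusAccess, CART, or Scan & Deliver; the request fields and rules are illustrative assumptions, not Libraries policy.

```python
# Minimal sketch of accessibility-request routing. Service names come from the
# section above; the request fields and rules are illustrative assumptions.

def route_request(request: dict) -> str:
    """Return the service that should handle an accessibility request."""
    if request["format"] == "image_pdf":
        return "SensusAccess: convert to tagged document, MP3, or Braille"
    if request["format"] == "video" and not request.get("captioned", False):
        return "DRS: flag for CART live transcription / captioning"
    if request["format"] == "physical_item":
        return "Scan & Deliver: scan chapter or pull item for front-desk pickup"
    return "Route to library staff for manual review"

requests = [
    {"format": "image_pdf"},
    {"format": "video", "captioned": False},
    {"format": "physical_item"},
]
for r in requests:
    print(route_request(r))
```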
Service | What it provides | Contact / Note |
---|---|---|
SensusAccess | Converts image PDFs → tagged docs, MP3s, digital Braille | Available to @memphis.edu users; conversion delivered by email |
Scan & Deliver / Placing a Hold | Scanned chapters/articles; staff‑pulled items for pickup | ill_borrowing@memphis.edu / lib_circ@memphis.edu; phone numbers on Libraries page |
DRS Communication Access | ASL interpreters, CART, Genio Notes, assistive listening | Request through Disability Resources for Students |
Virtual Labs and Simulation-based STEM - Example Prompt for Technological Institute of Monterrey-style Virtual Labs in Memphis Schools
Create a single classroom prompt that turns a Monterrey‑style virtual lab into a Memphis-ready STEM lesson: ask the AI to launch PhET's free Projectile Motion simulation (set angle, speed, and mass), generate a step‑by‑step student worksheet that records launch parameters and predicted range, and produce a teacher report that compares student data to expected trajectories and flags outliers for discussion. Pair that digital workflow with PASCO's two‑dimensional motion protocols (photogate timings, mini‑launcher repeatability up to 2.0 m) for classes that can run hybrid hands‑on checks, and include alternative links to other vetted virtual labs so teachers can choose age‑appropriate modules.
This prompt keeps inquiry central - students experiment with cannon angle and speed, collect real numeric outputs, and practice claims‑evidence‑reasoning - so schools without full lab budgets can still run AP‑level investigations and produce shareable CSVs for grading or advisor review.
For Memphis districts, the concrete payoff is simple: reliable virtual data plus an audit trail of student predictions and trials that makes remote, blended, or COVID‑recovery units measurable and teachable at scale (PhET Projectile Motion simulation, Beakers & Ink virtual labs roundup, PASCO Two‑Dimensional Motion protocols).
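The outlier check in the teacher report reduces to textbook projectile physics; here is a minimal sketch, assuming level ground and air resistance disabled in the simulation, with made-up trial data.

```python
# Minimal sketch of the teacher-report check: compare each student's measured
# range to the ideal no-drag prediction R = v^2 * sin(2*theta) / g and flag
# outliers. Assumes level ground and drag disabled in the PhET simulation.

import math

G = 9.81  # m/s^2

def predicted_range(speed: float, angle_deg: float) -> float:
    """Ideal projectile range on level ground (no air resistance)."""
    return speed ** 2 * math.sin(2 * math.radians(angle_deg)) / G

def flag_outliers(trials: list[dict], tolerance: float = 0.10) -> list[dict]:
    """Flag trials whose measured range deviates more than 10% from prediction."""
    flagged = []
    for t in trials:
        expected = predicted_range(t["speed"], t["angle_deg"])
        if abs(t["measured_range"] - expected) / expected > tolerance:
            flagged.append({**t, "expected_range": round(expected, 2)})
    return flagged

trials = [
    {"student": "A", "speed": 15.0, "angle_deg": 45, "measured_range": 22.8},
    {"student": "B", "speed": 15.0, "angle_deg": 45, "measured_range": 12.0},
]
print(flag_outliers(trials))  # student B deviates from the ~22.9 m prediction
```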
Tool | Source | Key feature |
---|---|---|
PhET Projectile Motion | PhET | Free simulation; adjustable angle/speed for trajectory experiments |
Virtual Labs roundup | Beakers & Ink | Curated list of free and paid virtual lab platforms for grade differentiation |
PASCO Two‑Dimensional Motion | PASCO | AP‑level photogate & mini‑launcher protocols with repeatable, high‑precision data |
Mental Health and Student Wellness Support - Example Prompt for Memphis College Campuses
Memphis college campuses can pilot a single, safety‑first AI prompt that delivers scalable, day‑to‑day wellness support while preserving clinician oversight. The prompt should offer evidence‑based stress‑management exercises and links drawn from vetted sources, triage conversational cues with a confidence score, immediately route high‑risk or low‑confidence interactions to on‑call counselors, and write an auditable log tied to FERPA‑compliant consent so human teams can follow up - mirroring the University of Memphis research team's GenAI chatbot approach, which trains on trusted resources like NIH and the APA to support academic‑stress management for a student body where nearly half report above‑average stress levels (University of Memphis GenAI chatbot project supporting student mental health).
Backstops must reflect evidence and limits from the literature - AI can flag early warning signs and scale routine coaching, but reviewers note both promise (predictive signals and preventative outreach) and the need for human‑in‑the‑loop governance to avoid harmful outcomes (Higher Ed Today article on AI and student mental health risks, Stanford HAI report on dangers of AI in mental health care).
The concrete payoff for Memphis: a prompt that shortens waitlists by steering low‑risk students to self‑management tools and immediately elevates true crises to clinicians, freeing counselors to focus on complex cases without surrendering responsibility or student privacy.
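A minimal sketch of the safety gate and audit log described above; the risk labels, confidence floor, and log format are illustrative assumptions, and clinical escalation rules would be set by counseling staff.

```python
# Minimal sketch of safety-first triage: route low-risk, high-confidence chats
# to self-management resources and escalate everything else to on-call
# counselors, writing an auditable log entry either way. Labels, thresholds,
# and the log format are illustrative assumptions.

from datetime import datetime, timezone

CONFIDENCE_FLOOR = 0.90

def triage(message_id: str, risk: str, confidence: float, log: list[dict]) -> str:
    if risk == "high" or confidence < CONFIDENCE_FLOOR:
        action = "escalate_to_on_call_counselor"
    else:
        action = "offer_self_management_resources"
    log.append({  # auditable, consent-scoped record for human follow-up
        "message_id": message_id,
        "risk": risk,
        "confidence": confidence,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return action

audit_log: list[dict] = []
print(triage("m1", "low", 0.97, audit_log))   # self-management path
print(triage("m2", "low", 0.55, audit_log))   # low confidence -> counselor
print(triage("m3", "high", 0.99, audit_log))  # high risk -> counselor
```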
“LLM-based systems are being used as companions, confidants, and therapists, and some people see real benefits. But we find significant risks, and I think it's important to lay out the more safety-critical aspects of therapy and to talk about some of these fundamental differences.” - Nick Haber
Curriculum Adaptation and Multilingual Support - Example Prompt for Shelby County English Learner Programs
For Shelby County English Learner programs, craft an AI prompt that adapts curriculum by auto‑generating clear, standards‑aligned language objectives, student‑friendly sentence frames, and bilingual glossaries tied to each lesson's content. Draw on proven practices like explicit instruction, use of native language, and planned vocabulary exposures (research recommends about 12–14 meaningful encounters per target word) so every scaffold is teachable and trackable against WIDA proficiency bands; see Reading Rockets' summary of 10 key ELL practices for concrete scaffolds (Reading Rockets: 10 key practices for teaching English language learners).
Include automated supports such as closed captions and voice‑to‑text for writing drafts, paired with structured peer discussion prompts and visual organizers so technology reduces barriers rather than replaces instruction (Edutopia: 10 strategies to support English language learners across all subjects).
Embed a curriculum‑design kernel (Graves' systems approach) so the prompt returns teacher‑ready lesson plans, differentiated tasks, and assessment checkpoints that map to content and language goals (Pepperdine: TESOL curriculum design best practices).
The concrete payoff: an automated prompt that posts student‑friendly language objectives, schedules vocabulary repetitions, and outputs bilingual exit tickets can cut teacher prep time while making language demands explicit for students, families, and coaches.
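A minimal sketch of the vocabulary-repetition scheduling follows, spreading the research-recommended 12–14 exposures per target word across classroom activities; the activity names and round-robin scheme are illustrative, not a WIDA-mandated sequence.

```python
# Minimal sketch: spread 12-14 planned exposures for each target word across
# an activity cycle, per the research cited above. Activity names and the
# round-robin scheme are illustrative assumptions.

from itertools import cycle

ACTIVITIES = ["warm-up", "read-aloud", "sentence frames", "peer discussion",
              "exit ticket", "bilingual glossary review"]

def schedule_exposures(words: list[str], exposures: int = 12) -> dict[str, list[str]]:
    """Assign each target word `exposures` slots, cycling through activities."""
    plan = {}
    for word in words:
        rotation = cycle(ACTIVITIES)
        plan[word] = [next(rotation) for _ in range(exposures)]
    return plan

plan = schedule_exposures(["photosynthesis", "evaporate"], exposures=13)
for word, slots in plan.items():
    print(word, "->", slots[:4], f"... ({len(slots)} total exposures)")
```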
Strategy | Function for EL Prompts | Source |
---|---|---|
Language objectives | Clarify academic language students must use to meet content goals | Colorín Colorado / Pepperdine |
Vocabulary (12–14 exposures) | Planned repetition across activities to build retention | Reading Rockets |
Native‑language supports & visuals | Cognates, translations, and graphic organizers to scaffold comprehension | Reading Rockets / Edutopia |
Tech accessibility (captions, voice typing) | Reduce barriers to listening, speaking, and writing tasks | Edutopia |
“If you want to see language development, language objectives are a great first step.”
Conclusion - Next Steps for Memphis Schools: Pilots, Policy, and Equity
Next steps for Memphis schools should pair short, accountable pilots with explicit governance and staff training so equity isn't an afterthought: launch school‑level pilots modeled on the Granville T. Woods Academy chatbot/STEM pilot to test classroom workflows, require district review against an AI preparedness rubric like the 1EdTech AI Preparedness Checklist for policy, procurement, and literacy safeguards, and adopt faculty-facing guidance such as the UT Health Science Center AI resources to standardize syllabus statements and human‑in‑the‑loop limits.
Build measurable success criteria into each pilot (example: aim to reproduce Code.org's ~50% grading‑time reduction for automated feedback while auditing 100% of low‑confidence items), require vendor DPAs that forbid model‑training on student data, and fund a cohort of teacher leaders by enrolling district coaches in a practical staff bootcamp (a 15‑week option like the Nucamp AI Essentials for Work 15-week bootcamp accelerates prompt literacy for non‑technical staff).
The payoff is concrete: short pilots plus clear policy create auditable outcomes - faster feedback, narrower achievement gaps, and documented privacy protections - that boards and families can evaluate before scale.
Program | Length | Early‑bird Cost | Registration |
---|---|---|---|
AI Essentials for Work (AI at Work: Foundations; Writing AI Prompts; Job‑Based Practical AI Skills) | 15 Weeks | $3,582 | Register for Nucamp AI Essentials for Work 15-week bootcamp |
Generative AI (Gen AI) “should be used as a supplement to rather than a replacement for human expertise and judgment” (Bhattacharya et al., 2023).
Frequently Asked Questions
What are the top AI use cases and example prompts for Memphis education institutions?
Key use cases include: 1) Personalized Learning Plans (prompts that convert assessment data into competency-based PLPs, pacing, and intervention flags for Shelby County students); 2) AI Teaching Assistants (rubric-aligned grading, feedback, and scaffolds used in Granville T. Woods project-based classes); 3) Automated grading and feedback (rubric-driven essay scoring with human review at the University of Memphis); 4) Teacher planning and observation support (standards-aligned weekly units, observation checklists, and coaching notes for Shelby County); 5) Predictive analytics for early intervention (risk rosters and ranked interventions for community college pilots); 6) Career counseling and workforce alignment (mapping student records to local labor-market signals and stackable credential pathways); 7) Accessibility and inclusive learning supports (automating SensusAccess conversions, captioning, and alternative media); 8) Virtual labs and simulations (PhET/PASCO workflows for inquiry-based STEM); 9) Mental health and wellness triage (safety-first chatbots with clinician escalation); and 10) Curriculum adaptation and multilingual supports (language objectives, bilingual glossaries, and planned vocabulary exposures). Each prompt must include disclosure language, teacher PD scaffolds, and resourcing assumptions tailored to Memphis-Shelby County context.
How were the prompts and use cases compiled and vetted for Memphis schools?
Methodology combined district and state policy timelines (e.g., board AI-use policy deadlines), local pilots (Granville T. Woods project-based program), higher-education plans (University of Memphis AI literacy initiatives), and regional reports (SCORE, Martini.ai/MSCS). Sources were cross-checked for dates, instructor permissions, and district capacity. Prompts were designed with explicit disclosure language, human-in-the-loop safeguards, teacher PD scaffolds, and realistic resourcing assumptions that map to Memphis-Shelby County operational conditions and compliance windows.
What governance, privacy, and equity safeguards should Memphis districts require when piloting these AI prompts?
Districts should require: vendor Data Protection Agreements that forbid model-training on student data; syllabus disclosure statements and explicit consent language; human-in-the-loop review for low-confidence or high-stakes outputs; audit trails and explainability for predictive models; targeted PD for staff (e.g., prompt literacy bootcamps like Nucamp's AI Essentials for Work); measurable pilot success criteria (e.g., audit 100% of low-confidence automated grades); and equity checks to ensure device access and digital reimbursement continuity. Local governance rubrics and vendor procurement reviews should be used before scale-up.
What measurable outcomes and pilot designs should Memphis schools use to evaluate AI projects?
Design short, accountable pilots with clear metrics such as grading-time reduction targets (example: reproduce Code.org's ~50% reduction), improvement in targeted student outcomes (e.g., GPA increases from pilots), retention/early-intervention impact (percent reduction in midterm attrition), timeliness of feedback, accessibility turnaround times (e.g., same-day SensusAccess deliveries), and audit coverage for low-confidence cases. Each pilot should pair an operational prompt with staff training, human review gates, vendor DPAs, and an evaluation window tied to board reporting requirements.
How can Memphis educators and counselors use AI to connect students to local workforce and training opportunities?
Use a single, seeded prompt to map student coursework, certifications, skills, and interests to local labor-market data (including Memphis AI and manufacturing investments). The prompt should return prioritized pathways: stackable credentials, recommended Nucamp or university courses, internship/apprenticeship matches, employer contact scripts for counselors, and a four-semester timeline for credential completion. Include ROI templates and employer-partnership artifacts to justify pilots to boards and track placement outcomes. This ties classroom and advising workflows directly to regional hiring needs and short-term credential strategies.
You may be interested in the following topics as well:
Many Administrative Assistants and School Office Clerks face automation of routine tasks like attendance and scheduling unless they upskill.
See how project-based AI programs at Memphis schools are preparing students with workforce-ready skills and saving districts money long-term.
Ludo Fourrage
Founder and CEO
Ludovic (Ludo) Fourrage is an education industry veteran, named in 2017 as a Learning Technology Leader by Training Magazine. Before founding Nucamp, Ludo spent 18 years at Microsoft, where he led innovation in the learning space; as Senior Director of Digital Learning, he led the development of the first-of-its-kind 'YouTube for the Enterprise'. More recently, he delivered one of the most successful corporate MOOC programs in partnership with top business schools and consulting organizations, including INSEAD, Wharton, London Business School, and Accenture. With the belief that the right education for everyone is an achievable goal, Ludo leads the Nucamp team in the quest to make quality education accessible.