AI Governance · Spoke 9 · Literacy & Training

AI Literacy and Training for Employees: The 2026 CEO Playbook

DataCamp's 2026 State of Data and AI Literacy report finds 82 percent of enterprise leaders say their organization offers some AI training; 59 percent still report an AI skills gap. EU AI Act Article 4 became applicable February 2, 2025. National-authority enforcement begins August 3, 2026. Here is the structured CEO playbook that closes the shortfall.

By Harrison Painter · May 10, 2026 · Updated May 12, 2026 · 9 min read

DataCamp's 2026 State of Data and AI Literacy report (500-plus US and UK enterprise leaders, fielded with YouGov) finds 82 percent of leaders say their organization offers some kind of AI training, and 59 percent still report an AI skills gap. That is the diagnostic signal: most current AI training is fluency-tier work sold as proficiency-tier work. The EU AI Act Article 4 mandate became applicable February 2, 2025. National market surveillance authorities begin enforcement August 3, 2026. The CEO playbook that closes the shortfall has five steps: baseline every employee, map roles to required levels, pick a structured program, measure pre and post, sustain across 12 months. The 7 Levels of AI Proficiency provides the measurement spine.

82 percent offer AI training; 59 percent still report a skills gap

DataCamp's 2026 State of Data and AI Literacy report, fielded with YouGov across 500-plus US and UK enterprise leaders, finds 82 percent of leaders say their organization offers some form of AI training. In the same report's accompanying definitions and statistics blog, 59 percent still report an AI skills gap inside their organization. The pattern DataCamp characterizes is structural: training exists, resources exist, urgency exists, but capability outcomes lag.

The pattern shows up in adjacent enterprise data as well. Deloitte's State of AI in the Enterprise 2026 (3,235 senior leaders across 24 countries, fielded August through September 2025) reports 84 percent of organizations have not redesigned jobs or workflows around AI capabilities. The two top talent-strategy adjustments organizations are making: 53 percent educating the broader workforce to raise AI fluency, and 48 percent designing upskilling and reskilling strategies. Deloitte names insufficient worker skills as the biggest barrier to integrating AI into existing workflows.

Read the DataCamp finding plainly. Eight in ten enterprise companies are paying for training. Six in ten still cannot find the skill they paid to develop. The training is happening. The capability is not.

This is the structural diagnostic for the entire training market in 2026. Fluency-tier training is being sold as proficiency-tier work. CEOs who buy literacy and call it readiness arrive at the same shortfall position the next quarter, only with a slightly larger budget line.

Two adjacent datasets corroborate the pattern. The EDUCAUSE 2026 study by Jenay Robert (N=1,960 staff, administrators, and faculty across 1,800-plus public and private US institutions) reports 94 percent of higher education workers use AI tools while only 54 percent are aware of their institution's AI use policies and guidelines. That is a 40-point split between use and policy awareness. McKinsey's State of AI 2025 (1,993 participants across roughly 105 countries) reports 88 percent of organizations now use AI in at least one business function while only 28 percent of respondents whose organizations use AI report CEO-level oversight of AI governance. The shape repeats: deployment ahead of governance, training ahead of measurement.

EU AI Act Article 4: what it requires and who is on the hook

Article 4 of the EU AI Act became applicable February 2, 2025. The European Commission AI Office FAQ states that "the supervision and enforcement rules apply from 3 August 2026 onwards," and that supervision of Article 4 sits with national market surveillance authorities rather than the AI Office. The text of Article 4 is short and binding:

Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on whom the AI systems are to be used.

The obligation reaches non-EU companies whose AI outputs are used in the EU. A US software vendor with EU customers is in scope. A US manufacturer whose AI-driven output flows to an EU buyer is in scope. A US professional services firm whose AI-assisted deliverables are consumed by EU clients is in scope. The extraterritorial reach mirrors GDPR's effective scope, and EU buyers have already started adding AI Act language to procurement reviews and master service agreements.

On penalties: Article 4 is not assigned a specific penalty tier in Article 99 of the EU AI Act. The AI Office FAQ states that national market surveillance authorities "could impose penalties and other enforcement measures to sanction infringements of Article 4," with penalties being "proportionate, based on the individual case." For comparison: Article 99 caps general non-compliance with other key obligations (Articles 16, 22, 23, 24, 26, 31, 33, 34, and 50) at €15 million or 3 percent of global turnover, and prohibited-practice violations under Article 5 at €35 million or 7 percent.

Indiana exported $60.2 billion in goods in 2024, with Germany ($4.2 billion), the United Kingdom ($1.7 billion), and France ($1.4 billion) among the top European trading partners for the state's pharmaceutical, automotive, and aerospace manufacturers. Many Indiana mid-market companies have direct EU exposure today. National-authority enforcement of Article 4 starts August 3, 2026, roughly three months from this article's publish date. This article is informational; consult counsel before making compliance decisions on the EU AI Act or any other regulation.

AI literacy vs AI fluency vs AI proficiency

Three layers, not synonyms. Treating them as the same word is the structural reason most AI training programs underdeliver.

AI literacy is the recognition layer. The employee understands what AI is, what it can and cannot do, and how it shows up in the tools they already use. They can recognize a generated email when one arrives in their inbox. They can explain at a high level what a large language model is. This is the layer EU AI Act Article 4 addresses at minimum.

AI fluency is the interaction layer. The employee can use AI tools for their personal work: drafting, summarizing, brainstorming, editing, coding assistance. They have working comfort with at least one tool. This is what most AI training programs sold today actually deliver.

AI proficiency is the performance layer. The employee can deploy AI inside the workflows that produce company outcomes: pricing decisions, customer routing, financial analysis, hiring screens, content production, customer service operations. The output is measurable in dollars, hours, or quality. This is where The 7 Levels of AI Proficiency lives.

For the canonical breakdown of all three layers and how to choose between them, see the dedicated read at AI Proficiency vs AI Literacy vs AI Fluency.

A 5-step CEO framework that actually moves the needle

Each step names the instrument that operationalizes it, because "how do I train employees on AI" has to be answered with real instruments built for the job. The alternative is the DataCamp pattern: 82 percent training prevalence alongside a 59 percent skills gap.

Step 1: Establish the baseline. Before designing any program, measure where every employee actually sits on a defensible scale. The free assessment at assess.launchready.ai produces a 7 Levels score per individual and a team distribution in roughly 10 minutes per employee. Without a baseline, every training dollar is spent in the dark and the after-program ROI conversation has nothing to anchor on.

Step 2: Map levels to roles. Not every role needs Level 7. A receptionist needs Level 1 to Level 2 reliably. A marketing manager needs Level 3 to Level 4. A revenue operations leader needs Level 4 to Level 5. A CTO or COO needs Level 5 to Level 6. The CEO needs to operate at Level 6 to Level 7 for AI strategy decisions to be defensible at the board level. The 7 Levels of AI Proficiency provides the level-by-role mapping language that holds up in those discussions.
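Read together, Steps 1 and 2 reduce to a gap calculation per employee: baseline level against role target. A minimal Python sketch of that calculation; the role names, target pairs, and baseline scores below are illustrative assumptions drawn from the examples above, not part of the framework itself:

```python
# Hypothetical role-to-level map based on the examples above:
# (minimum workable level, target level) per role.
ROLE_TARGETS = {
    "receptionist": (1, 2),
    "marketing_manager": (3, 4),
    "revops_leader": (4, 5),
    "cto": (5, 6),
    "ceo": (6, 7),
}

def training_gap(role: str, baseline_level: int) -> int:
    """Levels of lift needed to hit the role's target; 0 means on target."""
    _, target = ROLE_TARGETS[role]
    return max(0, target - baseline_level)

# A three-person baseline from Step 1 (scores are made up):
team = [("receptionist", 1), ("marketing_manager", 2), ("cto", 5)]
for role, level in team:
    print(f"{role}: needs {training_gap(role, level)} level(s) of lift")
```

The point of the sketch is the shape of the decision: training budget flows to the largest role-weighted gaps first, not uniformly across headcount.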

Step 3: Choose the training path. Training has to follow the progression from literacy to fluency to proficiency. Buying fluency-tier training when the workforce needs proficiency-tier work is the most common spend mistake in 2026. The 7 Levels Engagement is built to move L1-L2 employees to L3 over six weeks with structured workshops, capability audits, and Day-90 measurement built into the price.

Step 4: Measure the outcome. A program without a post-measurement is a workshop rather than a training program. The 7 Levels Engagement delivers a written capability audit at close. The DataCamp finding of 82 percent of organizations offering AI training alongside 59 percent still reporting an AI skills gap is what happens at scale when training is bought without measurement. CEOs who require a pre and post score on every program close that shortfall in two cycles.

Step 5: Sustain the gain. The Ebbinghaus forgetting curve, replicated in PLOS One in 2015 by Murre and Dros, documents substantial loss of unreinforced material within days of initial learning. The 7 Levels Mastery Track is a 12-month annual program built on monthly cadence, role-specific cohort design, and quarterly re-measurement. Structure is what defeats the forgetting curve. Spaced reinforcement is the established intervention. A single high-density workshop without follow-on cadence cannot do the job the math requires.
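The math Step 5 leans on can be sketched with the standard exponential forgetting model, R = e^(-t/S), where t is days since last exposure and S is memory stability. The stability values and the doubling rule below are illustrative assumptions for the sketch, not parameters from the Murre and Dros replication:

```python
import math

def retention(days_since_review: float, stability: float) -> float:
    """Exponential forgetting model: R = e^(-t/S)."""
    return math.exp(-days_since_review / stability)

# One-and-done workshop: no reinforcement between day 0 and day 90.
single = retention(90, stability=10)

# Spaced program: reviews at day 30 and day 60, with stability doubling
# after each review (an assumed simplification common to spaced-repetition
# models, not a measured parameter).
stability = 10
for review_day in (30, 60):
    stability *= 2                        # each review strengthens the memory
measured = retention(90 - 60, stability)  # Day-90 check, 30 days after last review

print(f"single workshop: {single:.2%}")   # fractions of a percent
print(f"spaced program:  {measured:.2%}") # roughly 47%
```

Whatever the exact parameters, the curve's shape is the argument: one dense exposure decays to noise by Day 90, while a monthly cadence keeps retention in a defensible range. That is the structural case for the 12-month program over the single workshop.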

Indiana operators: where this applies today

Indiana mid-market companies sit on three vectors of exposure that compound. The first is direct EU exposure. Indiana exported $60.2 billion in goods in 2024, with Germany ($4.2 billion), the United Kingdom ($1.7 billion), and France ($1.4 billion) among the top European trading partners for the state's pharmaceutical, automotive, and aerospace manufacturers. EU buyers are now adding AI Act language to RFPs, security reviews, and master service agreements. An Indiana mid-market company without documented AI literacy training is at a measurable disadvantage in those procurement conversations starting today.

The second is the IN AI initiative announced by Governor Mike Braun on April 28, 2026. The program is positioned to reach 1 million Hoosiers and is executed by the Central Indiana Corporate Partnership (CICP) through workshops, virtual demonstrations, and direct outreach via regional partners and industry networks, per the Governor's announcement and Indiana Capital Chronicle coverage. The state's vocabulary is "human-centered AI," which sits at the L1-L2 literacy floor. CEOs who treat the IN AI initiative as a substitute for internal proficiency development will arrive at the same shortfall position the EDUCAUSE 2026 study documented for higher education itself. Awareness is not capability.

The third is local higher education. Ivy Tech, Indiana University, and Purdue all run AI training programs of varying depth. The Indiana University GenAI 101 course is free and well-built (covered in our piece on the program), but a free 6-hour course is a literacy artifact rather than a proficiency program. The companies in Indiana that compound a real workforce advantage over the next 24 months will be the ones treating the public programs as the floor and building structured internal proficiency programs on top.

The 7 Levels of AI Proficiency: the framework that makes literacy-to-proficiency progression measurable

Most of the AI training market's vocabulary collapses literacy, fluency, and proficiency into one word. The 7 Levels of AI Proficiency separates them into seven measurable individual stages, each with role examples and progression criteria. Article 4 only requires roughly Level 1 across the workforce. CEOs who treat L1 compliance as the goal will satisfy the regulator and lose the market.

  • Level 1: Cadet. The literacy floor. The employee recognizes AI in their workflow. Adequate for roles with no AI-touch surface (some operational, some compliance). Article 4 minimum.
  • Level 2: Ensign. Basic interaction. Can use one AI tool for personal productivity. Most knowledge workers should be at L2 minimum by end of 2026.
  • Level 3: Lieutenant. Fluency. Can structure prompts, choose between models for task type, use AI across multiple tools. The L1-to-L3 movement is the standard 7 Levels Engagement outcome.
  • Level 4: Commander. Early proficiency. Can deploy AI inside specific business workflows and measure the output. This is the inflection point where AI moves from personal productivity to company KPI.
  • Level 5: Captain. Design proficiency. Can architect AI-native workflows from scratch, choose between agentic and prompt-based approaches, govern model selection across teams.
  • Level 6: Admiral. Organizational proficiency. Can build and oversee a multi-team AI deployment, design organizational governance, balance compute spend against output quality.
  • Level 7: Mission Director. Strategic AI orchestration. Can set company-wide AI strategy, evaluate AI investments at the board level, and lead through AI-driven business model shifts.

The 7 Levels of AI Proficiency is the individual-level scale. It pairs with the org-level AI Governance Maturity Model for a complete read on a company's position. Org maturity Level 2-3 needs individual L4-L5 on the team building governance. Org maturity Level 4 needs individual L5-L7. Both are required.

Six ways AI training programs fail

These are the structural failure modes the DataCamp 82-vs-59 finding is measuring at scale, with corroborating signal from Deloitte on the workflow-redesign side. CEOs evaluating training vendors should treat each as a disqualifying question.

One: the one-and-done workshop. A two-day intensive with no Day-30 reinforcement, no Day-90 re-measurement, no manager-side accountability. The Ebbinghaus forgetting curve documents substantial loss of unreinforced material within days. The cost is wasted, the capability is unchanged, and the CEO believes the team was trained because the LMS marked the box complete.

Two: training without measurement. No pre-score. No post-score. No way to defend the spend to the board. This is the structural cause of the pattern DataCamp documents: 82 percent of organizations offering training alongside 59 percent still reporting an AI skills gap. Any vendor pitching training without a measurement layer is selling fluency at a proficiency price.

Three: same training for every role. A receptionist and a CFO get the same 4-hour module. The receptionist is over-trained, the CFO is under-trained, and the company spends 5 to 10 times what role-mapped training would have cost. Role mapping is what makes training programs efficient at scale.

Four: tool training instead of skill training. "Here is how to use ChatGPT Enterprise" is fluency training. It does not teach the employee how to evaluate when to use a different model, when to escalate to human review, or how to integrate AI output into a downstream workflow. Tool training expires the moment the tool changes, and tools change every quarter now.

Five: training without governance integration. AI literacy training that never references the company's acceptable use policy, data classification rules, or escalation paths produces employees who use AI confidently and dangerously. Article 4 is explicit on this point: literacy must be tied to the context the AI systems are used in, including the persons on whom they are used.

Six: training as a one-time line item rather than an annual program. Companies that compound advantage are running 12-month structured programs with quarterly cohort re-measurement, rather than signing up for a single Q3 workshop and treating the box as checked. Deloitte's 84-percent-have-not-redesigned-jobs finding suggests the workflow side of the equation is even less well-served than the training side. The annual structure is the only one the math supports.

Frequently asked questions

How do I train employees on AI?

Start with a baseline assessment of every employee's current AI level. Map roles to required levels, since not every role needs Level 7. Choose a structured program that includes pre-measurement, post-measurement, and Day-90 retention checks. Run it as a 12-month annual program rather than a one-time workshop. The Ebbinghaus forgetting curve, replicated in PLOS One in 2015, shows substantial loss of unreinforced material within days; spaced reinforcement is the established intervention.

What is the EU AI Act Article 4 AI literacy requirement?

Article 4 requires providers and deployers of AI systems to ensure a sufficient level of AI literacy among staff and other persons dealing with AI on their behalf. It became applicable February 2, 2025. Supervision and enforcement by national market surveillance authorities apply from August 3, 2026 onwards, per the European Commission AI Office FAQ. Article 4 is not assigned a specific penalty tier in Article 99; national authorities can impose proportionate penalties based on the individual case. The obligation reaches non-EU companies whose AI outputs are used in the EU.

How much should I budget for AI training?

Full programs benchmark $1,000 to $1,500 per employee per year, including training delivery, tool licenses, and platform support. Small intensive cohorts run to $3,500 per employee. At-scale rollouts of 1,000-plus seats can drop to $800 per employee. Plan AI training as 10 to 25 percent of L&D budget in the first 1-2 years, stabilizing to 5 to 15 percent. See the full breakdown at AI Training Cost for a Leadership Team in 2026.
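The arithmetic behind those benchmarks is worth running against your own headcount. A sketch using the figures above; the function names and the 200-employee, $1.5M-L&D-budget example are illustrative assumptions:

```python
# Budget sketch using the article's benchmark figures.
# Function names and the example company below are hypothetical.

def program_budget(headcount: int, per_employee: float) -> float:
    """Total annual AI training spend for a cohort."""
    return headcount * per_employee

def share_of_ld_budget(ai_spend: float, ld_budget: float) -> float:
    """AI training as a fraction of the total L&D budget."""
    return ai_spend / ld_budget

# 200-employee mid-market company at the $1,000-$1,500 benchmark:
low = program_budget(200, 1_000)    # $200,000
high = program_budget(200, 1_500)   # $300,000

# Against a $1.5M L&D budget, that lands inside the 10-25% first-year band:
print(f"{share_of_ld_budget(low, 1_500_000):.0%}")   # 13%
print(f"{share_of_ld_budget(high, 1_500_000):.0%}")  # 20%
```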

What is the difference between AI literacy and AI proficiency?

AI literacy is the recognition layer (knowing what AI is and what it can do). AI fluency is the interaction layer (using AI tools effectively for personal work). AI proficiency is the performance layer (deploying AI inside workflows that move company KPIs). A large share of AI training sold today is fluency-tier work priced as proficiency-tier work. Full breakdown at AI Proficiency vs AI Literacy vs AI Fluency.

How long does AI training take to show results?

Pure literacy training: 1 to 3 days. Fluency training: 2 to 4 weeks of structured practice. Proficiency: 6 weeks for L1-L2 to L3 movement under a structured program; 12 months for measurable cross-team capability lift. Day-90 retention is the variable that distinguishes a training program from a workshop. Without spaced reinforcement, the Ebbinghaus forgetting curve documents substantial loss of unreinforced material within days.

Should AI training be the same for every employee?

No. Role-mapped training is the difference between effective programs and wasted budget. A receptionist needs Level 1 to Level 2. A revenue operations leader needs Level 4 to Level 5. A CTO needs Level 5 to Level 6. Same training for everyone over-trains some employees, under-trains others, and costs 5 to 10 times what role-mapped training costs.

How do I measure AI training effectiveness?

Three measurements are required. Pre-program assessment of each employee's AI level. Post-program assessment on the same scale. Day-90 re-measurement to confirm retention. Without all three, the program is undefendable to a CFO or board. The 7 Levels Engagement includes all three plus a written capability audit at close.
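The three measurements reduce to one record per employee on the same scale. A minimal sketch of that record; the field and method names are illustrative, not from any particular assessment platform:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TrainingRecord:
    """Pre, post, and Day-90 scores for one employee on the same 1-7 scale."""
    employee: str
    pre: int
    post: Optional[int] = None
    day90: Optional[int] = None

    def lift(self) -> Optional[int]:
        """Post-program level gain; None until the post-measurement exists."""
        return None if self.post is None else self.post - self.pre

    def retained(self) -> Optional[bool]:
        """Did the Day-90 check hold the post-program level?"""
        if self.post is None or self.day90 is None:
            return None
        return self.day90 >= self.post

rec = TrainingRecord("A. Example", pre=2, post=3, day90=3)
print(rec.lift(), rec.retained())  # 1 True
```

A record with a `None` in either `post` or `day90` is the tell: that program is a workshop, not a training program, and it cannot be defended to a CFO.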

Does my Indiana company have to comply with EU AI Act Article 4?

If your company sells products or services into the EU, or if your AI outputs are used in the EU (think SaaS with EU customers, content distributed through EU platforms, B2B services to EU companies), then yes. The Article 4 obligation reaches non-EU providers and deployers. Indiana exported $60.2 billion in goods in 2024, with Germany ($4.2 billion), the United Kingdom ($1.7 billion), and France ($1.4 billion) among the top European trading partners. Indiana mid-market companies in pharma, auto parts, aerospace, and software have direct EU exposure today, and EU buyers are starting to add AI Act language to procurement contracts.


This article is informational only. It is not legal advice. Consult counsel before making compliance decisions on the EU AI Act or any other regulation.

Harrison Painter
AI Business Strategist. Founder, LaunchReady.ai and AI Law Tracker.

Harrison helps teams build AI systems that cut cost and grow revenue. Nearly twenty years of business experience. 2.8M YouTube views. Founder of LaunchReady.ai and the creator of The 7 Levels of AI Proficiency framework. Author of You Have Already Been Replaced by AI and The White-Collar Factory is Closing.

Connect on LinkedIn

Find your AI Proficiency level

The free 7 Levels of AI Proficiency assessment places you on a seven-stage scale of AI capability. Under ten minutes. Research-backed scoring. Your team baseline starts here.

Get the weekly briefing

LaunchReady Indiana delivers AI news, compliance updates, and case studies for Indiana leaders. Every Tuesday. Five minutes.

Subscribe free