In many talks with IT teams and L&D teams, I kept hearing the same sentence: we train a lot, but the gap still grows.
- Budgets exist, platforms exist, and course lists exist, yet key roles stay hard to fill.
- At the same time, the ground keeps moving: the World Economic Forum expects around 39% of core skills to change by 2030.
- Generative AI speeds this up: Gartner expects that by 2026, more than 80% of companies will use it productively.
- In Germany, the pressure is even higher: Bitkom reports a shortage of around 109,000 IT specialists for 2025.
So I wanted to stop guessing and map what is really happening across planning, roles, time, and daily work.
The research shows a pattern: we often measure training activity, but we do not link learning to real work and performance. OECD findings point to the same blocker again and again: lack of time, not lack of courses.
That is why we created this case study and a step-by-step game plan up to 2030, so teams can move from more training to real skill impact.
Our new case study, "IT Training 2030", shows what will change by 2030 and what really matters: skills, roles, learning formats, and governance. The report gives a clear overview of trends, myths, and effective levers, and includes a practical game plan for companies.
Why tecnovy did this research
We noticed a recurring pattern: companies invest heavily in training, but once employees return to their daily work, they often struggle to apply what they learned.
We wanted to understand what separates teams that build real, measurable skills from teams that mainly complete programs and courses.
Our guiding questions
- Why do training budgets remain unused or get spent only late in the year?
- Why do certificates help at the beginning, but later lose their value as proof of real skills?
- Why does the skills gap often remain, even after extensive training programs?
A quick look back: which phase shaped you most?
Most IT teams moved through a few learning waves. Each wave changed how you learn.
Phase 1: the classroom years
Travel, hotel, printed slides, one trainer, one room.
Learning felt slow, but time stayed protected.
Question: Do you still block learning time today, or does learning happen after work?
Phase 2: the e-learning years
LMS rollouts, video libraries, completion rates as proof.
Question: Did learning get easier, or only easier to track?
Phase 3: the agile and cloud years
More change, more tools, and skills had to move faster than yearly plans.
Question: When did training start to feel too slow for your day-to-day?
Phase 4: microlearning and platforms
More content, less clarity, lots of choice, weak focus.
Question: Did choice help your team, or add noise?
Phase 5: AI at work
AI shows up in daily tasks, with new risks and new rules.
Question: Is AI use supported in your org, or happening on the side?
What we found: the few truths that kept showing up
1. Training volume is not the bottleneck
More courses won’t solve the problem. Learning needs to be designed as a system within the team, with clear roles, defined goals, and a smart sequence.
2. Certificates still matter, but in a new way
Use them as tools for orientation and hiring, but assess impact through real work outcomes, not formal credentials.
3. The budget problem is often a visibility problem
When skill goals stay vague, impact stays vague. Budgets get delayed or spent late without a plan. Set clear goals and link spend to results.
4. Time beats motivation
People want to learn, but time limits them. Block learning time in your calendar and protect it like a meeting.
5. Input metrics hide the real story
Hours trained and completion rates look fine on paper. Track delivery quality, speed, and risk to see real progress.
Common beliefs, tested against reality
Belief: Offering more courses will automatically raise skill levels.
Reality: More courses often reduce focus and hide results.
Quick test: Name the 5 skills your team must build this quarter.
What helps instead: Set role-based skill goals and link them to project outcomes.
Belief: Certificates prove skill.
Reality: They guide hiring and orientation, but impact shows up in delivery and decisions.
Quick test: After a certification, what changed in real work within 4 weeks?
What helps instead: Add practice tasks, reviews, and proof in live projects.
Belief: Motivation is the main problem.
Reality: Time is the main problem. Without protected time, learning moves to after work.
Quick test: Does your team have fixed learning time on the calendar?
What helps instead: Block learning time and treat it like delivery time.
Belief: We measure learning, so we manage learning.
Reality: Many metrics track activity, not capability.
Quick test: Do your metrics show better quality, speed, or lower risk?
What helps instead: Shift from input KPIs to outcome KPIs tied to work.
What is changing by 2030, and what stays constant
Skills change faster
Core skills will shift heavily by 2030. More roles need cross-skill thinking, and people must validate outputs, not just produce them.
Learning moves into the workflow
Training events alone don’t build skills. Real work, mentoring, and feedback do.
AI adds a new layer
AI is not only a tool topic. It adds risks, quality requirements, accountability, and governance rules that teams must learn and follow.
Step-by-step game plan: where to start when you feel stuck
Phase 1 (months 0 to 3): create clarity
Goal: Stop doing “a bit of everything.”
Moves: Pick priority roles, set skill targets per role, choose a small set of skill clusters.
Proof: One or two outcome metrics per cluster, so progress shows up in real work.
Phase 2 (months 3 to 12): build the system
Goal: Make learning part of delivery.
Moves: Link learning to projects, reviews, and team habits; add protected learning time.
Proof: A simple skill view that leaders and teams actually use, not a hidden LMS report.
Phase 3 (12 months and beyond): scale what works
Goal: Turn learning into long term capability.
Moves: Build role paths, refresh often, support internal mobility instead of only hiring.
Proof: Fewer random trainings, more repeatable growth across teams.
Example path
Cloud Engineer. Target skills include cloud basics, security by design, cost control, and incident response. Train with short inputs plus practice on real cloud projects.
It’s not only an HR job. Ownership must be clear.
Real skill impact needs shared ownership. If one part of the system is missing, learning stays activity, not capability.
- HR and L&D: build the learning system, choose partners, measure outcomes.
- IT leadership: set priorities, define roles, protect time, back the plan with funding.
- Team leads: run feedback loops, connect learning to real tasks, coach growth.
- Employees: practice, show proof in work, learn with peers.
- Procurement and finance: buy for outcomes, not course volume.
- Compliance and security: set rules for AI and risk, support safe tool use.
Self-check: What is missing most in your org: leadership, time, or measurement?
What tecnovy changed based on the findings
This study also challenged us. We used the findings to adjust how we design learning, how we measure it, and how we connect it to real delivery.
What we stopped doing
We stopped treating learning like a long course menu, where more choice equals more progress.
What we started doing
We focus more on role-based paths, with clear skill targets, protected time, and practice that shows up in real work.
We also put more weight on outcome signals, not only attendance, completion, or certificates.
How this maps to the game plan
Phase 1 is clarity: roles, skill clusters, and measurable outcomes. Phase 2 is learning in projects and team routines. Phase 3 is scaling paths and mobility. Our offer follows that logic, so training does not stay an isolated event.
What this looks like in practice
- Role-based learning paths, not just course lists.
- More practice formats: reviews, labs, and coaching.
- AI courses that include safe use and governance, not only tools.
- Clear learning outcomes teams can show in real delivery.
If you want the full framework, the visuals, and the detailed game plan, you can read the full report here.