Sunday, February 8, 2026

The Future of Work Has Changed—Is Your Education Ready?

Imagine telling your computer, "Prepare my presentation using last week's sales data," and then walking away to make coffee. When you return, a complete 24-slide presentation is waiting for you. The AI found the data, organized it, and built the whole thing—without any step-by-step instructions.

This isn't science fiction. It's happening right now (Schram, 2026b).

Welcome to the "Jarvis moment"—named after the AI assistant from Iron Man. For years, AI was like a smart librarian: you asked questions, it gave answers. Now, AI can think ahead, remember past conversations, and complete complex tasks on its own. It doesn't just respond anymore. It acts (Schram, 2026b).

This is exciting. But it's also creating some serious challenges for anyone starting their career.


The Disappearing First Job

Here's the problem: beginner jobs are vanishing.

Think about how careers used to work. You graduate, get an entry-level job, and do simple tasks—collecting data, writing basic reports, scheduling meetings. It's not glamorous, but you learn how things work. After a few years, you move up.

That ladder is breaking.

One consulting firm used to hire 12 fresh graduates every year. This year? Just 3. The reason is simple: tasks that took a junior employee two days now take a senior employee 45 minutes—with AI help (Schram, 2026a).

Law firms tell the same story. Young lawyers used to spend days researching old court cases. Now AI does it in 20 minutes—and often catches things humans miss (Schram, 2026a).

So here's the uncomfortable question: if AI handles all the beginner work, how do people gain the experience needed for senior roles?


What AI Can't Do (Yet)

The good news? AI isn't good at everything. Five skills will matter more than ever (Schram, 2026a):

Working with AI, not against it. The winners won't fight AI—they'll use it as a powerful partner, knowing when to trust it and when to question it.

Handling messy problems. AI loves clear rules. Real life is messy. Humans are still better at figuring out what the actual problem is before solving it.

Connecting different fields. Someone who understands both technology and psychology can solve problems that specialists can't. As AI handles narrow tasks, broad thinkers become more valuable.

Building real relationships. Teamwork, trust, and understanding emotions—these remain deeply human skills.

Never stopping learning. What you learn today might be outdated in five years. The ability to keep learning is your most durable advantage.


The Risks Nobody Talks About

AI in schools isn't all positive. There are real dangers (Schram, 2025).

When students let AI write their homework, they skip the thinking process—and learning happens in the struggle. Research shows that 32% of students are ready to let AI do their assignments for them. That's a problem.

AI can also be unfair. Some exam-monitoring tools work less accurately for students with darker skin. And schools collecting student data—grades, behavior, even mental health information—create targets for hackers. In 2025, a data breach in Vancouver exposed thousands of private student documents.

The new "agentic" AI systems create even bigger security risks. When AI can access your files, emails, and browsing history, there are more ways for things to go wrong (Schram, 2026b).


Rules Are Coming—For Everyone

Governments are paying attention. The European Union's AI Act (2025) now bans certain AI uses in schools—like systems that try to read students' emotions. Other AI tools require careful checking before schools can use them (Schram, 2025).

Here's the interesting part: even if you don't live in Europe, these rules will probably affect you. It's called the "Brussels Effect." Companies want to sell products in Europe, so they follow EU rules everywhere. European standards often become global standards.

Schools using AI for admissions or grading are now classified as "high-risk" and must prove their systems are fair (Schram, 2025).


What Needs to Change

Schools can't keep teaching the same way. Here's what experts recommend (Schram, 2025; 2026a):

Embrace AI in the classroom. Instead of banning it, teach students to use it properly. Focus exams on judgment and decision-making—not just finding information.

Create new paths to experience. If entry-level jobs disappear, schools should build alternatives: real-world placements where students work alongside AI, learning skills companies actually need.

Invest in human development. Emotional intelligence, ethics, communication—these belong at the center of education, not the edges.


What This Means for You

If you're a student today, the message is clear:

Don't fear AI—learn to work with it. Develop your uniquely human abilities. Stay curious and keep learning. And always think critically, because AI makes mistakes too.

The entry-level job as we knew it may be disappearing. But the need for capable, thoughtful, adaptable people isn't going anywhere (Schram, 2026a).

The future belongs to those who prepare for it.


References

Schram, A. (2025, May 20). Future-proofing education: Navigating AI integration through the Brussels Effect. LinkedIn. https://www.linkedin.com/pulse/future-proofing-education-navigating-ai-integration-through-schram-2ucze/

Schram, A. (2026a, February 7). The entry-level job is disappearing. Here's what universities should do now. LinkedIn. https://www.linkedin.com/pulse/entry-level-job-disappearing-heres-what-universities-should-schram-f2nqf/

Schram, A. (2026b, February 7). The Jarvis moment has arrived. Is your organization ready? LinkedIn. https://www.linkedin.com/pulse/jarvis-moment-vibe-orchestration-radical-work-education-schram-nvwxf/

Wednesday, February 4, 2026

The Missing Middle: Why AI Training Fails and How to Fix It


A Wake-Up Call from Redmond, Microsoft's HQ

The numbers are in, and they are both startling and unsurprising. At the end of 2025, Microsoft conducted a study that most people overlooked. They tracked 300,000 employees using their AI assistant, Copilot. For the first three weeks, excitement was palpable. People were experimenting, sharing discoveries, marvelling at what the technology could do. Then came the cliff. Enthusiasm dropped sharply, and most people quietly stopped using AI altogether.

Let that sink in. Microsoft, one of the world's largest technology companies, with presumably some of the most tech-savvy employees on the planet, watched 80% of their workforce abandon their own AI tool after the initial honeymoon period.




The employees who continued using AI discovered something important: AI is not just a tool you learn to operate. It is something you learn to manage. This insight applies to all AI tools—not just Copilot—and it fundamentally changes how we should approach AI training. The challenge is not technical. It is, as I have argued before, psychological and institutional (Schram, 2025).

Sunday, January 25, 2026

From Telegraph to AI: Why Learning the Language of Innovation Still Matters



Introduction: Finding Echoes in History

As a trained economic historian specializing in the large technical systems of the 19th century, such as railways and telegraphs, I am always tempted to find historical parallels with today's emerging technologies—particularly what has come to be called artificial intelligence.

This impulse is not mere academic nostalgia. Understanding how past technological revolutions unfolded, who benefited from them, and why some innovations endured while others faded can offer crucial guidance for leaders, educators, and innovators navigating the current AI landscape. The question I keep returning to is simple but profound: Is AI genuinely transformative, or is it another overhyped technology destined to disappoint? How will we know?



Europe's Private R&D Innovation Divide: Which Companies Lead in R&D Investment?

The 2025 EU Industrial R&D Investment Scoreboard | IRI

Recently the European Commission published its Industrial R&D Investment Scoreboard. It is remarkable that so many countries are (far) below the EU average on this measure. In fact, all countries south of Scandinavia, except Germany, spend below the EU average per employee on research and development.




Here is the top-20 ranking for individual companies:


These numbers are hard to interpret without looking at the same indicators in the world's other industrial powerhouses for which reliable data are available, which leaves out China.

Key Observations from the EU Data:

  • Germany dominates the list with the highest number of companies (227), the highest total sales (€1.88 trillion), and the highest total R&D spending (€118.6 billion).
  • Denmark has the highest R&D spending per employee (~€44,792), driven largely by high-intensity pharmaceutical companies like Novo Nordisk.
  • Romania shows a very high R&D per employee figure, but this is based on a single data point (Bitdefender Holding B.V.), which is a software security company with high R&D intensity relative to its size.
  • France ranks second in total sales and R&D spending, maintaining a strong R&D per employee ratio of €18,740.
  • Sweden and Finland also show strong innovation metrics, with R&D per employee figures exceeding €23,000 and €25,000 respectively.

Note: The "Grand Total" row represents the sum/average of the EU member states listed above. Companies with missing employee data were excluded from the denominator of the "per employee" calculation to ensure accuracy.
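The aggregation rule in the note above can be sketched in a few lines of Python. This is an illustrative sketch, not the Scoreboard's actual methodology: the `Company` record, the function name, and all figures in the example are my own, and I read "excluded from the denominator" as dropping companies with missing headcounts from the calculation entirely (one reasonable interpretation).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Company:
    country: str
    rd_spend_eur: float        # annual R&D spending in EUR
    employees: Optional[int]   # None when employee data is missing

def rd_per_employee(companies: list) -> dict:
    """Aggregate R&D spending per employee by country.

    Companies with missing employee data are skipped, so they count
    toward neither the numerator nor the denominator.
    """
    totals = {}  # country -> (total_spend, total_employees)
    for c in companies:
        if c.employees is None:
            continue  # missing headcount: leave out of the calculation
        spend, heads = totals.get(c.country, (0.0, 0))
        totals[c.country] = (spend + c.rd_spend_eur, heads + c.employees)
    return {k: s / h for k, (s, h) in totals.items() if h > 0}
```

With illustrative inputs, a country whose only complete record shows €4.5 billion of R&D across 100,000 employees would come out at €45,000 per employee, while a record without a headcount is simply ignored.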



Thursday, January 22, 2026

The Educational Shield: Navigating Truth in a Post-Fact World

Introduction

In the modern era, we are often told we live in a "post-fact" world—a landscape where emotion, repetition, and tribalism frequently override empirical evidence. What to do? The words of the philosopher Bertrand Russell come to mind, from his message to future generations (1959): "When you are studying any matter, or considering any philosophy, ask yourself only: what are the facts, and what is the truth that the facts bear out? Never let yourself be diverted, either by what you wish to believe, or by what you think could have beneficial social effects, if it were believed."

In the same message he insisted on a second point: "The moral thing I should wish to say to them is very simple. I should say: love is wise, hatred is foolish. In this world, which is getting more and more closely interconnected, we have to learn to tolerate each other. We have to learn to put up with the fact that some people say things that we don't like. We can only live together in that way. And if we are to live together and not die together, we must learn a kind of charity and a kind of tolerance, which is absolutely vital to the continuation of human life on this planet." More about this second point in another article.

Such is the reputation of the current (and maybe last) President of the USA, Donald Trump, for hate speech, lying, and misrepresenting facts that you wonder why he would bother with facts at all.


The Economist cover 23 Jan: deserved ridicule

Due to my training as an economic historian, what I found most upsetting in his speech at Davos on the 21st of January was the grains of truth in some of the economic statistics he presented, not his preposterous misrepresentation of the history of Greenland, nor his mental decline.

The Nazi Minister of Propaganda, Joseph Goebbels, called this the "principle of plausibility," or selective truth-telling: constructing arguments from credible snippets of verifiable fact drawn from diverse sources, then embedding them within broader narratives of deception. By anchoring lies to isolated truths—like accurate economic data or historical events—propagandists create an "illusion of veracity," exploiting people's tendency to generalize trust from partial accuracy. Even 10% or 20% of true statements is enough to create that illusion, especially in combination with repetition: lies repeated long enough become accepted as facts (e.g., the claim that the 2020 election was stolen). The share of true statements, however, need never exceed 50% for the illusion to hold (Tella et al., 2011).

Here we used Gemini 3.0's Deep Research feature on a transcript of his speech to identify the facts in it. On the whole, the speech was a bombastic misrepresentation of his own achievements.

Sunday, December 14, 2025

The Untapped Dividend: Professionalizing the "Shadow Use" of AI in Education


Introduction

In the discourse surrounding Educational Technology (EdTech) in Low-Income and Lower-Middle-Income Countries, there is often a reliance on technological determinism—waiting for a future breakthrough to save the system. However, the reality is that the technology is already present. A significant number of educators are already utilizing Generative AI (GenAI) tools to reduce their workload, often described as "taking back" their time (World Economic Forum, 2023). The challenge is that this usage often occurs without institutional strategy or ethical guardrails.

We propose a shift toward a voluntary training program designed to professionalize the usage that teachers in those countries have already adopted. This approach moves from haphazard experimentation to strategic mastery, focusing specifically on planning, material generation, and ethical oversight.


Monday, November 24, 2025

The Teacher and the Tool: A Story of Transformation


Introduction

Sarah, a ten-year veteran of high school English and History, felt the familiar weight of a Sunday evening. The glow of her laptop screen illuminated two things: a half-written lesson plan for Monday’s class on the Federalist Papers, and a digital mountain of 120 student essays waiting for feedback. She was a passionate teacher, but the passion was being slowly eroded by an avalanche of routine work. Her dream of facilitating deep, Socratic debates and providing one-on-one mentorship was constantly being sacrificed for the urgent reality of grading, planning, and paperwork.



The whispers about AI in education had, until now, felt like a threat. To Sarah, they represented three daunting hurdles:

  1. The Hurdle of Time and Training: The idea of learning a complex new technology felt like being handed a shovel while already buried in a landslide. She simply didn't have the time to become a tech expert.
