Unlearning & Relearning in the Age of AI

March 2025 

It was MWC 2018, and like every year, I was in Barcelona to meet clients, investors, and companies. The city was shaking off winter—sunlight poured onto the Gothic streets, and tech energy buzzed in the air.

Barcelona in March is something else. Rooftop dinners, late-night parties, endless cava, and a thousand conversations about deals and ideas. It’s thrilling, chaotic, and utterly exhausting.

One evening, just as I wrapped up a meeting, my phone buzzed. It was my boss. He had a conflict: he'd been invited to a private night tour of the Picasso Museum with its curator, but he was heading to an FC Barcelona match with clients. "Can you take my place?" he asked.

I paused, but barely. I knew the name Picasso and a bit about Cubism, and we'd had a replica of Guernica at home growing up. That was about it. But something about this felt rare. An intimate look at Picasso's world, after hours, guided by someone who knew every detail? I canceled my plans and said yes.

What I walked into was more than just a tour. The museum, tucked into connected Gothic mansions, holds over 4,000 original works. We moved through Picasso’s life—from shockingly realistic sketches as a teenager to the wild shapes of Cubism. It was like watching someone break down everything they’d learned and rebuild from scratch.

What stood out wasn’t one masterpiece—it was the volume. Thousands of works, many unfinished, messy, or experimental. Picasso wasn’t chasing perfection. He just kept showing up. That night, it hit me: creativity isn’t about waiting for brilliance. It’s about putting in the reps. Inspiration often shows up after you start, not before.

And then there’s the myth of originality. It’s tempting to think of Picasso as this lone genius who invented Cubism out of thin air. But the truth? He started by copying the greats. He learned through imitation. Only after absorbing those influences did he begin bending the rules. Originality, I realized, usually begins with honoring what came before. You have to learn the game before you can change it.

But what moved me most was how Picasso never stopped evolving. From lifelike portraits to the Blue Period’s melancholy tones to fractured Cubism—he kept reinventing. That wasn’t just growth. It was unlearning. Even though he had classical training, he spent his life trying to paint with the rawness and honesty of a child.

Unlearning isn’t passive. It’s not forgetting. It’s intentional. You let go of what used to work so you can make room for something better. That takes guts. It’s uncomfortable. But it’s also where growth lives.

And it’s not just an artist’s thing. It applies to work and life too—especially in a world that’s constantly shifting.

Paul Klee, another artist and Bauhaus teacher, put it well. He believed creativity wasn’t about stacking on more knowledge—it was about subtraction. Shedding assumptions. Seeing with fresh eyes. It’s the same idea behind the Zen concept of shoshin, or “beginner’s mind”—an open, curious way of seeing the world.

This tension between mastery and starting over keeps showing up. Steve Jobs did it at Apple when he came back, cut products, and reshaped the company. The U.S. Navy did it after Pearl Harbor, when it shifted from battleships to aircraft carriers. In both cases, the signs had been there—but change only came when the pain of staying the same outweighed the risk of doing something new.

The problem? Most institutions aren’t wired to unlearn. They double down on what worked. They recycle ideas instead of reinventing them. Leadership narrows, dissent quiets, innovation slows. The organization stiffens. And by the time they notice, it’s often too late.

But some companies do it right. Microsoft let go of its old obsession with proprietary tech and went all in on open source and the cloud. Nike shifted from wholesale distribution to a direct-to-consumer digital strategy. These weren't just business decisions; they were acts of unlearning: letting go of past success to build something new.

The ones that thrive don’t treat the past as a blueprint. They treat it as a stepping stone. They create cultures where questioning is normal. Where it’s okay to say, “Maybe this doesn’t work anymore.” They learn, unlearn, relearn. And that cycle becomes their edge.

On a personal level, unlearning is just as hard—and just as necessary. A marketer raised on print starts to trust algorithms. A mid-career pro realizes their once-useful habits are now holding them back. Neuroscience backs this up: to learn something new, we have to interrupt our old patterns. We have to make space.

That takes humility. The courage to say, “I was wrong,” or “The world’s changed.” And it requires psychological safety—a culture where people can speak up without fear. The best leaders don’t just allow this—they build for it. They reward reflection. They know that letting go isn’t weakness. It’s wisdom.

Unlearning in the Age of AI

Imagine being a master artisan at the dawn of the Industrial Revolution. Your hands, once prized for their skill at weaving textiles, are suddenly made obsolete by a mechanized loom. For generations, mastery meant patience, precision, and pride in handcrafted work. Then, almost overnight, machines redefined what productivity looked like. This wasn't merely a technical shift; it was a profoundly human one. Those who adapted weren't simply the people who learned to operate the new tools; they were the ones who could unlearn long-held beliefs about work, value, and identity.

The Luddites who resisted weren't anti-technology at their core—they were overwhelmed by a future they couldn't yet see. They extrapolated from the past and found themselves unable to imagine the possibilities of an industrialized world. Their story is a cautionary tale: when the world transforms, survival doesn't depend on clinging to what you know—it depends on the willingness to let go of it.

Today, we stand at the cusp of another seismic shift, this time powered by artificial intelligence. Generative AI is transforming the way we learn, work, create, and lead. Much like the mechanized looms of the Industrial Revolution, today's AI systems are disrupting fields once thought immune to automation.

For much of the modern era, success in education and work has been built on left-brain mastery: logic, analysis, structured reasoning, and methodical planning. These traits, prized in professions from engineering to law to medicine, defined what counted as intelligence and professionalism. Now, however, machines can generate reports, summarize research, write code, and even simulate judgment with surprising accuracy. What once took years of learning can now be synthesized in seconds.

This shift is not just technological—it’s cognitive, even existential. As AI becomes increasingly capable of performing the tasks we once equated with human expertise, our value is shifting. The competitive edge is no longer found in out-analyzing the machine, but in out-imagining it. This requires a profound reevaluation of the assumptions we’ve long held about intelligence and success. We are being nudged—sometimes uncomfortably—toward the right brain: the realm of creativity, emotion, empathy, intuition, aesthetic judgment, and storytelling. These are qualities that remain uniquely human, and increasingly essential in a world of intelligent machines.

To navigate this transformation, we must let go of the belief that certainty is strength and relearn the art of being comfortable with ambiguity. We must unlearn the idea that expertise is about having the correct answers and instead embrace the power of asking better questions. In a world where AI offers instant outputs, our role is to frame problems with nuance and originality. We need to move beyond linear thinking and adopt systems-level, nonlinear approaches that connect patterns across disciplines and time. We must also stop waiting for perfection and instead embrace fast iteration, rapid feedback, and the creative messiness that innovation demands.

This mindset shift is already visible. Product managers are moving away from rigid roadmaps toward more adaptive visions. Data scientists are learning to work with messy, real-world inputs. Engineers are becoming orchestrators of AI-generated systems. Founders are building with long-term AI moats in mind, and investors are looking beyond predictability toward disruption.

Unlearning is not easy. It can feel like a loss—like letting go of a playbook that once worked. But in that letting go, there’s also liberation. Like pruning a tree, unlearning clears space for new growth. In my own experience, every time I’ve released an outdated assumption, something new and unexpected has taken root. In many ways, this moment invites us back to a beginner’s mindset—open, curious, and ready to experiment.

The beauty of unlearning is that it doesn’t diminish us—it expands us. It deepens creativity, invites reflection, and rekindles the joy of discovery. Teams that shed old norms become more experimental. Professionals who let go of fixed identities often grow in fulfilling, surprising directions.

As we step into this next chapter, the most valuable skill may not be mastery of any single tool, but the ability to let go. To stay curious. To begin again. Just as the Industrial Revolution redefined work, this AI revolution will redefine what it means to be human in the workplace. The future won’t belong to those who know the most—but to those most willing to unlearn and relearn.