AI in Higher Education: Students need playgrounds, not playpens

Artificial intelligence capabilities are moving so fast, and the implications are so profound, that we restrict the ability of students to learn through curiosity, experimentation, and hands-on experience at our peril. From the Future of Being Human Substack.

This is not the Substack I set out to write this morning. I was intending to post about a conversation I recently had about the future of higher education with Brian Piper on the AI for U podcast. But as I started writing, the article morphed into a piece about playgrounds and playpens — and AI.

Playgrounds and playpens, it turns out, are a powerful metaphor for thinking about AI and education. And they’re a metaphor I’ve been thinking and talking about for some time now.

Inspired by Mitch Resnick at MIT and the work of Marina Umaschi Bers which, in turn, inspired him, I wrote about the metaphor last year in the broader context of undergraduate education. Since then I’ve become increasingly interested in how it opens up ways of thinking about approaches to AI and learning/education where the technology is challenging nearly every aspect of not only how we teach and what we teach, but why we teach.

I’ll come back to the broader conversation with Brian in a follow-up post. But for now I’ll follow the story and dive deeper into the metaphor of playgrounds and playpens in education, and how it’s potentially useful in thinking about artificial intelligence.

As a starting point, it’s worth taking a moment to think about why AI presents such a unique challenge and opportunity to learning and education — especially in universities.

Since ChatGPT hit the public scene in 2022, generative AI has impacted nearly every part of higher education. In some cases this has led to new AI tools being embraced by educators and administrators. In others there’s been active resistance to AI in any form being used in teaching or by students. And of course there are the continuing fears that AI makes it easier for students to cheat, to become lazy learners, or simply to fail to retain understanding when leaning on AI as a learning aid.

But whether you’re an AI optimist, an AI pessimist, or simply in denial, it’s nearly impossible to ignore the reality that AI is having a substantial and growing impact on learning and education.

At the same time, we’re seeing a large gap in understanding between where the leading edge of AI capabilities is, and where educators think it is. As a result, there’s still a tendency to think of AI as a tool that can complete assignments or write essays, or create personalized learning environments, or simply act as a form of Google on steroids.

Yet the reality is much more complex — and much more transformative. Because advanced AI models are becoming increasingly capable of simulating aspects of ourselves that define us at a fundamental level — such as the ability to think, to reason, and to solve problems with agency — they stand apart from pretty much any previous technology or tool that we’ve created. And because of this, they cannot be approached as just another technology to teach students about, or another tool to enhance traditional approaches to education.

Rather, we’re seeing a growing need for completely new ways of thinking about the intersection between learning, education, and AI — especially where educators are sometimes (perhaps often) further behind the curve than the students they’re trying to educate.

And this is where the perspective shift inherent in moving from a playpen to a playground mentality becomes useful — and important …

Can AI write your PhD dissertation for you?

I spent four days trying to get OpenAI’s new tool Deep Research to research and write a complete dissertation. This is what I discovered.

Rethinking Higher Education in the Age of AI

How embracing playful experimentation can redefine universities in an era of limitless intelligence.

Are educators falling behind the AI curve?

As tech companies release a slew of generative AI updates, there’s a growing risk that educational practices and policies are struggling to keep up with new capabilities.


Andrew Maynard

Director, ASU Future of Being Human initiative