Parents Discuss: What Should AI in Schools Look Like?
A Tech-Savvy Mom and a Policy Wonk Dad Talk AI in the Classroom
Hi everyone - we’ve got a treat for you today: our first conversation piece, where we have some great back and forth on AI in the classroom. But before we jump in, I want to give a quick heads-up that I will be away for a couple of weeks, beginning this Friday. Fortunately, I’ve had the pleasure of working with some really great writers who will be helping me fill the void with pieces that will go out while I’m gone - I’m really excited for you to get a chance to read them!
Okay, announcements are over and class is in session - let’s talk AI and schools!
Hey y’all welcome in! Artificial intelligence is top of mind for leaders across the country and around the world - there isn’t an industry or sector that isn’t grappling with AI’s impact and how best to incorporate it into their work. And while AI is often the subject of corporate board meetings and online chatrooms, it’s beginning to find its way into an area closer to home: our schools.
Just this July, the American Federation of Teachers, the second largest teachers’ union in the country, announced the launch of a ‘National Academy for AI Instruction’, a $23 million teacher training initiative funded by Anthropic, Microsoft and OpenAI, three of the biggest names in the AI space. It’s no longer a question of ‘if’ AI will impact education, but a question of ‘how and when.’
I find the subject wildly interesting, not to mention critically important - I’d argue that, in terms of long-term impacts on the country and the world, how AI will be used in the classroom will be one of the most consequential decisions we make this decade. I also find the subject immensely complex, on top of the already complicated and controversial policy realm that is American education.
That’s why I invited Dhani Ramadhani, writer of Parenting in AI and Tech, for a conversation on AI, student learning, and where they meet. As her Substack title suggests, Dhani is a mother working in GovTech, where she sees the impact of technology on how our students learn and interact with the world on a daily basis. Her Substack is a great resource, and if this subject interests you as much as it interests me, I definitely recommend you give it a look.
If you haven’t subscribed already, and are interested in conversations like this one, parent debates, commentary and policy analysis, all through the lens of being a parent, be sure to check us out below. Now, on to the good stuff!
Dylan:
Hey there, Dhani, welcome in! Thanks so much for joining us today - excited to get your perspective and expertise on AI and where you see it going in the classroom. But before we do, can you tell us a little bit about yourself, your background, and since we’re talking AI in the classroom, what your educational experience was like growing up?
Dhani:
Hi Dylan! I’m Dhani. My educational journey started at age 2 in Jakarta, Indonesia, with preschool through college in a typical developing country system. Picture rigid classrooms where teachers are never wrong, students don’t really speak up, and your worth is measured by test scores. Oh, and school regularly closed for floods.
But two experiences outside the classroom really shaped me: flag ceremony & drill team (Indonesia’s version of color guard) and English debate. These taught me discipline and critical thinking in ways no textbook ever could. Without them, I never would have landed my first job at McKinsey Indonesia or earned my Harvard master’s. Sometimes the most important education happens beyond the classroom walls.
Looking back, I realize the skills that actually mattered - critical thinking, adaptability, collaboration - were barely part of the official curriculum. Now with AI doing more of the ‘knowing’ for us, shouldn’t we be redesigning education around what humans uniquely need and bring to the table?
Dylan:
Hey Dhani, and hey to everyone reading! For Dhani’s readers, I’m Dylan. I’m a dad living in Austin, Texas, with more than 10 years of experience in government and politics, including five years as a policy staffer in the U.S. House of Representatives. I’m really passionate about both parenting and public policy, so naturally, education is something I think about a lot. I think it's safe to say, AI has officially entered the chat. We're just starting to see the broader public wrap their heads around what AI means for different parts of life, and for parents especially, it’s becoming clear that this could seriously shape our kids’ futures. It’s an exciting (and maybe a little overwhelming) moment.
A bit about my own education: I bounced between public and private schools growing up, wherever my parents thought the opportunities were best. I definitely had my share of tough, old-school teachers. My high school and college were both Catholic, so yes, nuns with rulers were real. But I also had some great teachers who really encouraged discussion and made it okay to question things. That said, test scores still mattered - especially when it came time for college. And while I never got a day off for flooding, growing up in Kansas meant plenty of snow days!
I think you're right, education is changing fast, and a lot of that started with the COVID-19 pandemic. Before COVID, tech in the classroom felt more like a bonus than a necessity. But after all the remote learning, it’s become the norm, at least here in the U.S., where most classrooms now rely heavily on laptops or tablets. That shift has definitely opened the door for more conversations about what AI might bring to the table, not just for parents and teachers, but for students too.
Since you’re the expert on AI, I’ll flip it back to you. How do you see AI currently being used in classrooms? And where do you see it heading?

Dhani:
Dylan, you are spot on. COVID was a major disruptor that marked a point of no return for tech in classrooms. I see AI as the next educational disruptor, and here’s what concerns me: instead of a thoughtful rethink of what education should look like post-COVID, we’ve been coasting through a hybrid approach without deeper consideration. Then ChatGPT entered the chat.
The reality? Teens are way ahead of us. In 2024, 54% had used genAI and 26% were using ChatGPT for schoolwork - double the 2023 figure. I imagine it’s actually higher now.
But teachers and parents are still playing catch-up. Most schools haven’t made AI an intentional classroom tool yet. So AI use has mostly been individual experimentation, not systematic integration.
But everything just changed, again. It started when the current administration signed federal guidance pushing AI in schools in April 2025. Now 26 states have AI guidance for K-12, up from just 13 in March, so expect that number to keep climbing. And Google just announced that Gemini for Education is free for all schools - we’re talking 30+ new AI tools for educators and students.
What’s coming this fall? Real implementation at scale, with lots of unknowns. Think Google’s Project Astra tutoring students through chemistry problems or Khanmigo providing 24/7 personalized support. More AI-powered EdTech tools will be entering schools like never before. The AI-in-education market is projected to hit $20B by 2027.
All of us will need to get smart quickly and stay informed, because these trends will rapidly reshape our kids’ learning landscape.
Dylan:
Totally agree, Dhani. The pace of change right now is staggering. One minute we’re still debating ChatGPT’s classroom role; the next minute Google and Khan Academy are flooding the zone with new tools, and half the country has AI guidelines for K-12. It’s hard enough for policymakers to keep up - so imagine being a parent or teacher just trying to figure out how to prep kids for a future that looks wildly different from the one we grew up in.
And that’s exactly why I think the most important shift we can make isn’t about tech itself. It’s about goals. If we’re honest, no one really knows what the future workforce is going to look like. But what we do know is that it’s coming fast. That means our education system can’t just teach what’s currently useful; it has to teach students how to adapt when everything changes again. Flexibility, resilience, and problem-solving should be core goals, not electives. If AI is going to take over repetitive tasks and information recall, then students need to lean into the skills machines can’t easily replicate: human connection, emotional intelligence, creativity, leadership, and moral reasoning.
I think we should also be asking: What are the jobs people will still want humans to do? I’m talking about roles that require trust, empathy, judgment, or a sense of presence. Think nurses, teachers, therapists, community leaders - roles where people don’t just want a smart answer. They want a real person behind it. If we’re serious about preparing kids for the future, we need to be just as serious about helping them become the kind of people others choose to rely on.
But that’s just my take. Dhani, I’d love to hear yours. What do you think the goals of education should be in this AI age? And how do you see that shaping the design of an AI-infused classroom?
Dhani:
Dylan, you’ve nailed the human skills piece - emotional intelligence, creativity, moral reasoning - and how those nurses, teachers, and community leaders you mentioned embody what will always matter.
Here’s where I want to build on your insight, with something I think many parents will relate to.
When I think about education’s goals in the AI age, I picture a tree.
Education should help kids grow strong roots and reach wide branches.
The basics, like literacy, numeracy, digital fluency, are the soil that nourishes those roots. Essential, but just the starting point.
The roots are timeless human capacities. Think: curiosity, critical thinking, resilience, empathy, moral clarity. These keep kids grounded no matter how fast the world shifts.
The branches are how kids amplify their impact. Using AI not to cut corners, but to remove obstacles, test ideas faster, and scale creativity and connection.
This isn’t about handing over thinking to machines. It’s about knowing when to stay rooted in deep thinking and when to stretch branches with technology.
In practice, that might look like:
Starting with deep brainstorming to generate ideas for a problem they care about. Then using AI to test those ideas or build an MVP.
Leveraging translation tools to share work beyond their community.
Partnering with AI to simulate high-risk scenarios before acting.
The key is discernment.
Kids rooted in who they are, but able to choose the right tools to grow, connect, and lead. Sometimes unplugging to go deep, other times plugging in to reach further.
That’s the kind of education that prepares kids not just to keep up with change but to shape it.
Dylan, does this roots-and-branches approach resonate with you? From your policy perspective, what do you think it would take to bring this vision into real classrooms?

Dylan:
I really like the roots-and-branches metaphor. It’s a good way to frame what education should aim for: grounded in timeless values and skills, while reaching wide towards the future. While we can't predict what the branches might look like, we can identify the roots; that includes not just character traits like curiosity, empathy, and resilience, but also the fundamentals - reading, writing, math, and the ability to make sense of the world around us. Those core skills are non-negotiable. No matter how advanced our technology gets, students will still need a strong foundation to think clearly, ask good questions, and adapt to whatever comes next. And as a parent, that’s exactly the kind of base I want my own son to build on.
From a policy standpoint, I think your framework gives us a North Star to reach for. But the challenge now is turning that vision into reality, especially at the local level, where most education decisions in the U.S. are actually made. It’s encouraging to see some of the big players in AI, like Microsoft and OpenAI, beginning to work directly with educators. Their recent partnership with the American Federation of Teachers to launch an AI training academy shows that this is starting to get real attention. But meaningful change won’t come from a few pilot programs or press releases. It’s going to take time, leadership, and a willingness to rethink some of the assumptions we’ve long held about what school is and how it works.
That starts with investing in teachers. If we want classrooms to reflect this roots-and-branches model, we need to give teachers the space and support to learn, experiment, and adapt. We can’t just drop a new tool in their lap and hope for the best. We also need local leaders who understand that smart integration of AI is going to look different in different communities. There’s no one-size-fits-all model here, but there is a shared opportunity: to help students get better at what makes them human, while giving them new ways to grow, create, and solve problems.
Parents have a critical role to play in that conversation, if they choose to step into it. Education in America isn’t controlled by a federal agency or a tech company; it’s shaped by local school boards, superintendents, and communities. That means parents aren’t powerless. In fact, they’re often the ones with the most influence. The key is knowing what to ask: Are our schools treating AI as a supplement to student thinking, not a replacement for it? Are we teaching kids to use these tools with intention and judgment? Are we still prioritizing deep thinking, character, and collaboration, even as we adopt new technology?
No one can fully predict what the future will look like. But we can prepare our kids to meet it with clarity, confidence, and purpose. If we get that right, they won’t just keep up with change. They’ll be ready to lead through it.
Dhani:
Dylan, you’ve hit on something crucial! It’s going to be a challenge to turn the vision into reality at the local level. The Microsoft-OpenAI partnership with the American Federation of Teachers and other initiatives are a start, but as you said, real change happens in individual classrooms, with individual teachers, in individual communities.
This brings me to what I think we both want to explore next: What would these reimagined educational experiences actually look like in practice?
Because here’s the thing: we can have the best framework in the world, but if we can’t paint a picture of what Monday morning looks like in a roots-and-branches classroom, we’re not giving parents and educators the roadmap they need.
While there shouldn’t be one size for all, there should be some basic elements that apply in each classroom. For example, I’m thinking about elements like:
How can we leverage AI to create truly student-centered, active learning experiences? I see great potential in personalization that adapts to each kid’s learning style - visual, auditory, or kinesthetic - for a more inclusive classroom. As a parent of a neurodivergent child, I hope AI can help lesson plans respond dynamically to sensory preferences, for example, without simply adding more screen time. The idea is to create meaningful, engaging experiences that center on our child.
How do we design learning experiences that intentionally toggle between deep, analog thinking and AI-amplified exploration? I think it’s crucial that we move beyond AI literacy to AI agency. Where do we build in room to empower students and teachers to confidently opt in and opt out of AI tools based on what the learning moment actually requires?
How do we ensure peer-to-peer collaboration and positive social skill development remain even when AI is at play? Could we design AI tools that support group collaboration without flattening diverse perspectives or homogenizing student voices?
Of course, there are guardrails and safety elements here. How do we, at the classroom level, embody ethical values in using AI and transparently help kids and parents understand how their data is being used?
Lastly, and probably most contentious for some people, what does assessment look like when students “partner” with AI? Do we measure comprehension without AI? Or look at their learning process, how they question or iterate, when they do engage with tech? Maybe we move away from measuring “output” altogether and focus on process and growth?
What other elements would make a parent walk into a classroom and think, “Yes, this is preparing my child for the future”?
And, to your point on questions, how can we, as parents, ask better questions of our schools and collaborate with teachers? Because I see a lot of “homework” being thrust onto our educators, and they shouldn’t be navigating this blindly or alone.
If we can start to map out what this looks like, together, we’re not just shaping tools for the classroom. We’re shaping a more human future for learning.

Dylan:
I really appreciate how you brought this back to the classroom, Dhani. You’re right - if we can’t describe what Monday morning looks like in a “roots-and-branches” classroom, we’re not giving families or teachers a roadmap they can actually use. The questions you’re asking - about personalization, collaboration, ethics, and assessment - are exactly the right ones. And I think the most important part is what you said last: this can’t just fall on teachers to figure out alone.
Parents have to be more than just observers in this process. We need to be active partners: asking school leaders how AI is being used, what’s guiding those choices, and whether students are learning to use these tools with judgment, not just convenience. We should be looking for classrooms where technology supports deeper thinking, not shortcuts around it, and acknowledging that sometimes the smartest use of AI is knowing when to turn it off.
The future of education is being decided right now - not in some Silicon Valley board room, but in local classrooms, school board decisions, and hallway conversations between parents and teachers. If we want our kids to be prepared for a world with AI - not just to keep up, but to lead - we can’t afford to sit this moment out.
So, let’s stay curious. Let’s stay involved. And let’s help build the kind of learning future our kids actually deserve.
Dhani Ramadhani is a tech growth strategist, writer, and mom of two. Drawing from her multicultural, neurodivergent family life and years in GovTech, she explores the evolving realities of raising thoughtful, resilient kids in a world shaped by AI. Through aiPTO, she shares practical insights and research-backed reflections to help families navigate tech with intention - not hype or fear.
Dylan Macinerney is a father, husband, and political professional with over a decade of experience in policy and communications, including five years as a Congressional staffer. He writes The Fatherhood Framework, a Substack exploring modern parenting, politics, and culture with clarity, conviction, and maybe a few memes. Whether unpacking what’s driving the rising costs of daycare or writing about bedtime with his toddler, Dylan works to bring heart and strategy to the biggest questions facing today’s families.