It’s been two years since I released a two-part series on generative AI, specifically ChatGPT: ChatGPT: A Practical Guide for Curious Parents and 5 Thinking Skills To Help Your Child Thrive in the Era of ChatGPT.
It’s time for an update.

This post answers the following questions:
- What has changed in genAI since 2023?
- What does the research (or lack thereof) say?
- What does the relationship between AI & education mean for parents?
1. What has changed in generative AI (genAI) since March 2023?
The initial news craze over genAI trended around three themes: general excitement about genAI’s capabilities, grave concern over its potential pitfalls, and predictions of AI’s market disruption across many disciplines (including education).
Some things haven’t changed: there are still substantial concerns about accuracy, bias, plagiarism, ethical use and regulation, and data privacy, as well as students getting less practice in human-to-human communication and passing up opportunities for critical thinking.
There are new developments including:

- New capabilities
- As genAI tools have gained popularity, they have much more data to train on. As a result, they have grown more intuitive and more personalized, and their output and formatting have improved. They can provide dynamic feedback tailored to the user. New “explainability” functions on some platforms address concerns about misinformation and bias by displaying reasoning and sources so users can verify accuracy. Filtering and customization options can set guidelines on what genAI tools can and cannot generate.
- New products
- We are seeing a boom in genAI educational products, from elementary school through higher-ed and graduate programs. These range from improved adaptive standardized-testing software to chat “tutors” that tailor feedback and resources to individual students. Basic genAI functions are also marketed to educators to relieve administrative burden: grading papers, crafting emails and newsletters, reading aloud to students, or drafting lesson plans.
- New concerns
- Transparency, large-scale misinformation and deepfakes, and manipulation of individuals are new concerns as genAI transforms the education sector. It is important to note that not all genAI is equally transparent or customizable, and explainability functions are often found only in the paid tiers of popular AI tools. The ability to generate highly convincing fake content (images, videos, voices) has grown, making it harder to differentiate real from AI-generated content. This raises concerns for education, where critical thinking skills are crucial. Finally, genAI is now highly personalized, adapting responses based on user behavior. While this can enhance learning, it also raises concerns about content filtering—are students getting a balanced education, or just AI-curated viewpoints?
- New questions
- The biggest question of all is no longer what genAI CAN do, but what it SHOULD do.

- Should AI be shaping kids’ opinions and thinking patterns?
- This is no longer a question of IF. AI’s responses can absolutely shape the knowledge presented, but is that appropriate?
- What should we define as “cheating” versus “AI-assisted learning?”
- Which subjects should belong to AI and which should belong to humans?
- Should AI be used to teach soft skills like emotional intelligence?
- Can AI role-play social scenarios to help kids navigate relationships? And even if it CAN, SHOULD we rely on this as a training tool?
- How do we handle AI’s role in creativity?
- If a student creates something with AI, who gets the credit?
- How do we teach children the difference between AI-generated creativity and their own unique voice?
- This is especially relevant as AI tools allow young students to produce professional-level creative work—raising questions about originality, authorship, and skill development.
- What new skills will children need to thrive in a world where AI is everywhere?
- Should schools be teaching AI literacy the way they teach reading, writing, and math? Is it of the same level of importance as these core subjects?
2. What does the research (or lack thereof) say?
High-quality research takes time. At the time of this post’s release, there is still very little high-level evidence (controlled experimental trials) on genAI’s educational utility. Many of the academic papers published in the last two years are qualitative collections of opinions on AI use or descriptive survey studies; others are narrative discussion papers and research recommendations. These can provide a foundation for future experimental research on AI in education, but they do not establish a causal link between AI use and educational outcomes.
3. What does the relationship between AI and education mean for parents?
Two things for parents to consider: regulation and preparation. Parents need to know how AI is integrated into their child’s curriculum (if at all) and how their school regulates its use, and they need to be prepared to train their own child in responsible AI use. Here’s why…
- Regulation (Policies)
While some schools have adopted AI wholeheartedly, outright bans are also common. The EdWeek Research Center surveyed 924 educators: 79% reported their districts had no clear policies on AI use, yet more than half expected their districts to use more AI over the next year (Klein, 2024). The lack of regulation is exacerbated by how fast AI technology is changing—“…any policy a district or state crafts could be outdated the moment it is released” (Klein, 2024). There are currently no federal or state statutes regulating the use of AI in classrooms (Linderman, 2024).
Speaking of regulation…there’s been some recent litigation! In October 2024, the parents of a high school student filed a federal lawsuit against their son’s school, claiming he was unfairly punished for using AI during a history project. The student says AI wasn’t used to write the content, only to help research and outline the paper. He received a failing grade and was denied admission to the National Honor Society. This case is an excellent example of the ambiguity of AI policies and the critical need for alignment among parents, students, and teachers on AI use in education.

What can parents do about AI regulation? Use the following list of questions to openly discuss genAI use with your student’s school:
- What is this school’s policy on genAI use in the classroom or at home for assignments?
- What type of assignments or parts of classroom teaching will incorporate genAI?
- How will we ensure that AI supports rather than replaces the thinking process? (Spencer, 2024)
- How will my child’s collected data be used?
- “How can we work with the school to help our children learn about AI, ensuring its safe and meaningful use?” (Fitzpatrick, 2024)
- “How are you equipping students with the skills to use AI safely, ethically, and responsibly?” (Fitzpatrick, 2024)
- Preparation
Parents CANNOT simply assume that school will prepare their child to use AI well. Moms, Dads, and caregivers, it is our responsibility to prepare our children to navigate AI with wisdom. We need to be informed, curious, and cautious.
“…78% of educators surveyed said they don’t have the time or bandwidth to teach students how to think about or use AI because they are tied up with academic challenges, social-emotional learning, safety considerations, and other higher priorities” (Klein, 2024). Not only do most schools have ambiguous policies or no policies at all; teachers are also tied up with other priorities and have limited AI-literacy resources. Despite this lack of time and training, I predict that students will interact with AI more frequently, especially in larger class settings where AI can potentially ease teachers’ administrative workloads.

If it’s our job to train our children to use AI well, how do we do it? Here are ways to teach our children to engage with AI critically:
- Focus on AI literacy. Assess your student’s current knowledge of ChatGPT and other genAI tools.
- Use this FREE printable discussion guide to talk with your child(ren)!
- Keep AI in its proper place. Teach them to think critically and then use AI to help, rather than letting AI do the mental lifting for them.
- Ask questions like, “Is AI the right tool for this job?”
- Encourage fact-checking of AI-generated content.
- Ask questions like, “How do you know this is true?” or “Why did it give me this answer?”
- Use this FREE printable workbook to walk you through it!
- Discuss ethical AI use in learning and everyday life.
Conclusion
The world of genAI is becoming more entwined with the world of education. New capabilities are leading to new products and generating more questions. However, serious concerns still linger. Create an open dialogue with your child(ren) AND their school to prepare them for genAI’s safe and meaningful use.
References
ChatGPT: A Practical Guide for Curious Parents
5 Thinking Skills To Help Your Child Thrive in the Era of ChatGPT.
Spencer, 2024. Seven Questions to Ask Before Having Students Use AI Tools
Klein, 2024. Schools Are Taking Too Long to Craft AI Policy. Why That’s A Problem
Linderman, 2024. Parents Sue Massachusetts School Over Student’s Use of AI
Fitzpatrick, 2024. 5 Questions Every Parent Should Ask Their Child’s School About AI