AI and your child: what every parent should know in 2026
James
Co-founder of Smash Your AI, middle leader at a large UK ed tech company, 18 years in education, parent.
This month, Boston Public Schools became the first big-city district in the world to make AI a graduation requirement. From September 2026, students in Boston cannot leave school without proving they are AI literate.
They will not be the last. Ohio has told every public school to have an AI policy in place by July. New York City's Department of Education has just released guidance for its 1.1 million students. In the UK, the Department for Education published its first AI framework for schools last year and is updating it again this summer.
Meanwhile, your child is probably already using ChatGPT. Maybe to help with homework. Maybe to chat to. Maybe in ways you do not know about.
I am a parent. I have spent 18 years in education and the last four years using AI every day at work. I have talked to plenty of schools, teachers, and parents about this. And I want to walk you through what is actually going on, what you need to worry about (and what you do not), and six things you can do this week to help your child use AI the right way.
First, what is "AI literacy"?
AI literacy is the ability to understand what AI is, how it works, what it can do well, what it gets wrong, and how to use it responsibly. It is the 2026 version of internet literacy.
For a child, being AI literate means:
- Knowing what AI actually is and is not.
- Being able to use AI tools to help with work (not do the work for them).
- Spotting when AI is wrong or biased.
- Understanding the privacy and safety issues.
- Knowing the difference between a human and an AI online.
This is the minimum. Within a few years, I genuinely believe this will be as fundamental as reading and maths. That is not a scare story. It is just where the world is going.
Your child is almost certainly already using AI
A recent survey from RAND Corporation (a respected non-profit research organisation based in the US) found that nearly 70% of secondary school students are worried AI is eroding their critical thinking. Not their parents. The students themselves. Which tells you two things.
First, they are clearly using it. A lot. Second, they can feel that something is off but do not know how to think about it.
Here are the most common ways children are using AI right now:
- Homework help. ChatGPT, Gemini, or specialist revision tools like Smash Your Revision.
- Writing essays. Sometimes properly, sometimes copied straight into their work.
- Coursework and revision. Summarising notes, generating practice questions, explaining topics.
- Chatting socially. AI "friend" apps like Character.AI and Replika, where children have long conversations with AI characters.
- Making images, videos, and memes. Often harmless, occasionally not.
- Social media. Some TikTok and Snapchat features are now AI-powered.
If you have not had a conversation with your child about AI yet, assume they are using it and you just do not know about it. That is not a judgement. It is just where we are.
What you actually need to worry about
I do not like scare stories. Most of what you read about AI and children is wildly overblown. But there are three things that are genuinely worth paying attention to.
1. Losing the skill of struggling with a problem
This is the RAND survey finding, and it matters more than any of the others. When you can ask AI any question and get a perfect-sounding answer in three seconds, you stop doing the mental work that makes you clever.
A student who asks AI to write their English essay has missed the point. The essay was never the point. The point was learning to think, structure an argument, and write. AI should be an aid, not a replacement.
2. Emotional chatbots and loneliness
Some children are having long, frequent conversations with AI "companions". These apps are designed to feel like a real friend who is always available, always supportive, and never pushes back. That sounds nice. It is not.
Children need real relationships with real friction. An AI that agrees with everything you say, at any time of day, is not a friend. It is a product designed to keep you engaged. This is particularly concerning for teenagers who are struggling socially or with their mental health.
3. Deepfakes, nudification apps, and AI bullying
This is the darkest one. A deepfake is a fake image, video, or voice clip made by AI to impersonate a real person. "Nudification" apps use AI to generate fake nude images of real people from a normal photo. In the UK, creating or sharing these is a criminal offence under the Online Safety Act 2023, but the apps are still easy to find. Some children have been victims. Some have used the apps on classmates without understanding how serious it is.
If your child is in secondary school, please have a conversation with them about this. Not a lecture. Just a calm, clear conversation so they know it is happening, why it is wrong, and what to do if it happens to them or someone they know.
What you do not need to worry about
To keep things balanced, here are the AI fears I hear all the time from parents that are genuinely overblown:
- "AI is making them dumb." Used wrongly, yes. Used well, it makes them smarter and more productive. The tool is not the problem. How you use it is.
- "ChatGPT is going to replace teachers." It is not. It is going to make good teachers more effective and give students extra support outside the classroom.
- "My child is going to get into trouble for using AI." Most schools now have policies that allow AI in some contexts. The trouble comes from submitting AI work as your own. The solution is to talk to your child about the difference between "AI helped me think" and "AI wrote it for me".
- "All AI is dangerous." No more than all of the internet is dangerous. The answer is education, not a ban.
Six things every parent should do this week
Here is the practical part. Pick two to start with. You do not need to do all six at once.
1. Have "the AI chat" with your child
Not a big formal sit-down. Just a conversation over dinner. Ask:
- Do you ever use ChatGPT or anything like it?
- What do you use it for?
- What do your friends use it for?
- Has a teacher ever said anything about AI at school?
Listen. Do not lecture. You will learn a huge amount in ten minutes.
2. Try ChatGPT together
Sit down with your child and play with a free AI tool for 15 minutes. Ask it something silly. Ask it to plan a family day out. Ask it something it will almost certainly get wrong (a specific historical fact, a piece of local knowledge) and then look up the real answer together.
This does two things. It shows you care. And it shows your child that AI is not magic, that it makes mistakes, and that the human in front of it still has to do the thinking.
3. Agree the "AI rules" at home
You do not need a legal contract. Just a simple shared understanding. Here are the rules that work in my house:
- You can use AI to help you understand something. You cannot use it to do your homework for you.
- You never put your full name, address, or school into an AI chat.
- You never share photos or videos of real people with an AI app without asking them.
- If anything an AI says upsets you, scares you, or feels weird, you tell a grown-up.
4. Ask the school what their AI policy is
You are entirely within your rights to email the headteacher or head of year and ask:
- Does the school have an AI policy?
- What are students allowed to use AI for?
- How do teachers check whether work is AI-generated?
- What happens if a child is caught using AI inappropriately?
If the school does not have a policy yet (and many UK schools are still working on theirs), the fact that parents are asking helps move it up the priority list.
5. Pick AI tools that were built for students
General chatbots like ChatGPT are powerful but not always accurate, especially on exam board content. For school work, specialist tools do a much better job.
I will be honest, I am biased here because I built one. Smash Your Revision is a free AI tutor that only uses the actual AQA, OCR, and Edexcel specifications. So when your child asks it a GCSE Biology question, the answer is tied to what they will actually be tested on. Generic ChatGPT cannot do that reliably because it does not know which exam board your child is on.
Whether you use ours or someone else's, the principle is the same. For school work, pick a specialist tool.
6. Learn a bit of AI yourself
The single most effective thing you can do for your child is understand this stuff yourself. You do not need to become an expert. You just need to know enough to have the conversation.
Spend 30 minutes playing with ChatGPT. Read one blog post a week. If you want something more structured, we built a short online course specifically for people who have never used AI before. It will not make you a tech expert, but it will make you confident enough to guide your child.
Red flags that something is wrong
Keep an eye out for these. They do not always mean something bad, but they are worth a calm conversation:
- Schoolwork that suddenly reads very differently to how your child normally writes.
- Long hours on the phone late at night, in their room, with no social contact.
- Mentions of an AI "friend" or "companion" app by name.
- Any unexplained changes in mood, social life, or sleep that coincide with screen time.
- An unwillingness to discuss what they are doing online. This is normal for teenagers in general, but worth watching if combined with other signs.
How this all looks in the UK
A quick note on the UK specifically, because most of the big AI news comes out of the United States.
The UK government's AI in Schools framework, published by the Department for Education, sets out how schools should think about AI. Most UK exam boards (AQA, OCR, Edexcel, WJEC) have now issued guidance on AI in coursework, with tough penalties for submitting AI-generated work as your own.
The Online Safety Act 2023 now covers AI-generated content, including deepfakes and child sexual abuse material generated by AI. Creating or sharing this is a criminal offence.
The practical upshot for UK parents:
- Schools are increasingly expected to teach AI literacy. Ask what yours is doing.
- Coursework faked with AI can cost your child their grade, even their qualification.
- Seriously abusive AI use (deepfakes, nudification) is a crime.
- Free or low-cost AI literacy training for parents is starting to pop up. Worth looking into.
A final thought
The students who will do best over the next ten years are not the ones who avoid AI. They are the ones who learn to use it well. Your job is not to keep AI away from your child. Your job is to help them learn to use it the way an adult uses a car: something powerful, useful, and worth taking seriously.
Have the conversation. Set a few rules. Play with the tools together. Ask the school the right questions. That is most of the job.
If you want to build your own AI confidence first so you can guide your child, our online course is a good place to start. And if your child is preparing for GCSEs or A Levels, send them to Smash Your Revision. Both are built by educators who are also parents, which means we obsess about getting this right.