Can AI improve how we care for children and young people?

Apr 2025

Written by Kelly Royds

Anyone working in child protection, youth justice, disability services, or out-of-home care knows the weight of documentation. Whether you’re a case manager, youth worker, behaviour support practitioner, or clinical lead, the reports, assessments, and plans you write don’t just fill a file—they shape decisions about a young person’s life. And let’s be real: documentation can feel relentless. Writing case notes, behaviour support plans, risk assessments, and meeting minutes takes up a huge amount of time—time that many professionals wish they could spend more directly supporting young people. 

With AI-powered tools like ChatGPT becoming more widely available, there’s growing curiosity about whether they could help with the workload: not for handling confidential information about children, young people and families, but for supporting other aspects of the job, like structuring reports, drafting general templates, or highlighting key points in policy updates. However, the risks of using AI in documentation about children, young people and families are serious, and it’s critical that professionals know where the boundaries are. Recent cases, like the fabricated court report information in Victoria, have highlighted just how easily AI can produce misleading or invented information. Just as concerning, the biggest red flag for privacy regulators is the possibility of workers unknowingly entering sensitive information about children, young people and families into AI tools, a major confidentiality breach.

So, where does that leave us? Could AI be useful in non-client-facing tasks, helping professionals think, plan, and organise more effectively, without the risks of handling personal data?

What is AI and how is it being used? 

AI (Artificial Intelligence) refers to computer programs that can generate text, summarise information, or automate tasks based on patterns in data. While AI has existed for years in predictive analytics and automation, the most recent wave of AI tools—called generative AI—can produce human-like text, answer questions, and assist with writing and research. 

Some of the most well-known AI tools include: 

  • ChatGPT (by OpenAI) – A chatbot that generates text-based responses, summarises information, and helps structure documents. 
  • Microsoft Copilot – An AI-powered writing assistant integrated into Microsoft 365 (Word, Outlook, Teams) to help summarise emails, draft reports, and improve writing clarity. 
  • Google Gemini (formerly Bard) – A conversational AI tool similar to ChatGPT that integrates with Google tools.
  • AI-driven analytics tools – Some organisations are exploring AI to analyse patterns in case data, flag risk factors, or identify trends across services. 

These tools are not designed for handling confidential or personal client information—and most sector policies explicitly prohibit inputting private data into them. However, they can be used in ways that support teams and professionals without breaching confidentiality. 
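
For organisations with technical or digital teams, that boundary can even be built into tooling. The sketch below is illustrative only: it assumes the OpenAI Python SDK and an API key, and the file name and model are placeholders rather than recommendations. It shows the kind of non-client task described above, summarising a publicly available policy update, where the safeguard sits in what goes in (published text only), never client information.

    # Illustrative sketch only: summarising a *public* policy update.
    # Assumes the OpenAI Python SDK (pip install openai) and an API key in the
    # OPENAI_API_KEY environment variable. File name and model are placeholders.
    from openai import OpenAI

    client = OpenAI()

    # The safeguard lives here: only publicly available text is read in.
    # No names, case notes, or other client information ever enters the tool.
    with open("public_policy_update.txt") as f:
        policy_text = f.read()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": "Summarise the key points of this policy for frontline staff."},
            {"role": "user", "content": policy_text},
        ],
    )

    print(response.choices[0].message.content)

The point is not the particular vendor or library; it is that "safe use" can be made concrete: the script only ever touches published material, never client records.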

AI in documentation: where it gets risky 

AI’s use in client-facing documentation—like case notes, behaviour support plans, and court reports—raises major ethical concerns. 

Recently in Victoria, a child protection worker used ChatGPT to draft court submissions, only to find that the AI had fabricated information. The inaccuracies were so serious that the Victorian government banned the use of generative AI in child protection work, citing concerns about privacy, errors, and bias.

The Victorian Privacy Commissioner has made it clear: the biggest risk is workers inputting personal or sensitive client information into AI tools. Many of these tools store, learn from, and sometimes retain data, which creates serious privacy risks—especially in child protection, youth justice, and disability settings. 

Even without direct privacy breaches, AI isn’t neutral. It can: 

  • Reinforce bias – AI pulls from existing data, meaning it might unintentionally repeat harmful stereotypes about young people’s behaviours. 
  • Misinterpret trauma – AI can summarise an event, but it can’t understand the emotions, context, or trauma responses behind a young person’s behaviour.
  • Create false objectivity – AI-generated text often sounds polished and neutral, but if the underlying data is flawed, it can reinforce harmful narratives in a more subtle way. 

So, what’s the right approach? 

AI isn’t going away—but neither should human judgement, professional experience, and ethical oversight. The challenge now is finding a balanced approach that makes AI useful without putting young people at risk. 

Instead of banning AI outright, the focus should be on:

  • AI-assisted, not AI-reliant work – Using AI to generate discussion prompts, summarise policies, or structure internal documents is very different from letting it draft case notes or reports on young people. 
  • Clear ethical guidelines – Organisations need strong policies on what AI can and can’t be used for, especially when it comes to sensitive or legal documentation. 
  • Training on AI literacy – Many workers don’t know how AI stores or processes data. Training is essential to ensure privacy isn’t compromised. 
  • Transparency in AI use – If AI is used in internal documents, training materials, or research summaries, organisations should be upfront about it—but never in ways that involve private client information. 
  • Applying our existing risk lens – In child protection, out-of-home care, and youth justice, professionals already make constant risk assessments—both formal and informal—across every aspect of their work. That same lens can be applied to AI. Before using a tool, assess the potential risks, implement appropriate safeguards, and monitor use over time. The sector is well-practised in evolving alongside new risks; AI is no different.

Final thoughts 

AI has the potential to support professionals in non-client-facing work, making teams more efficient, structured, and informed. But its use in client documentation or decision-making carries serious risks that can’t be ignored. 

Whether it’s structuring a team meeting or summarising research, AI can be a useful tool for the sector—but it should never replace the professional judgement, critical thinking, and ethical responsibility that come with working with young people.

For professionals in child protection, out-of-home care, and youth justice, the key question is: 

Are we using AI to strengthen our practice—or are we letting it shape the way we see young people? 

What do you think? Is AI helping or harming the sector? 

Keen to learn more about how AI is impacting our sector? Check out the following links: 

ICMEC Australia: A discussion paper on AI and child protection in Australia 

Voluntary AI Safety Standard – Helping businesses use AI safely and responsibly.

Australian Government: Guidance on privacy and the use of commercially available AI products
