Stronger parental controls only one piece of phone safety puzzle


By Tim Biggs

As part of its re-election promises, the Morrison government has unveiled a policy to ensure all smartphones have strong parental controls, through a binding industry code.

But while the Prime Minister’s announcement and the government’s press release put the focus on industry developing stronger tools (“if industry doesn’t act within 12 months, we will regulate to force them”), in line with the Liberal Party’s broader promise to hold big tech to account, a bigger part of improving safety may be a cultural and educational shift.

Scott Morrison says his government will force tech giants to do more to protect children’s safety. Credit: James Brickwood

Part of the government’s announcement was $23 million in funding to build on existing education programs for parents, teachers and students, as well as $10 million for regulation and law enforcement.

“There is a role for everyone to play in keeping kids safe online: government, industry, schools and parents. There is no silver bullet, and different families will have different needs,” said Communications Minister Paul Fletcher.

“That is why our plan comprehensively covers regulation, education, consumer information, support for victims, the needs of CALD communities and international engagement.”

CALD refers to people of culturally and linguistically diverse backgrounds, who may experience barriers such as limited English-language skills.

Digital wellness and digital literacy specialist Dr Joanne Orlando says that parental controls can always be strengthened, given that the scope and complexity of online interactions keeps growing. But just asking for stronger parental controls is no guarantee that our children will stay safe online, and risks putting the emphasis in the wrong place.

“[Existing tools] allow parents to oversee to a large degree, depending on the controls they put on. But this is only one way of keeping a child safe, and I think we’re kind of stuck on this as the only solution,” she says.

“From the time a child begins using a device and going online, we should slowly over time be able to take them to a point where they can independently make decisions about their own safety.”


In general, children are allowed to open social media accounts and control their device profiles from the age of 13. Relying too heavily on surveillance tools before that point risks dropping teens in the deep end without properly teaching them how to swim, Orlando says.


“A child can’t go from being 100 per cent surveilled by a parent to the next week or the next month being able to make the decisions. Surveillance isn’t education or guidance.

“There has to be [education] to help them have those kinds of digital literacy skills to understand what’s going on there, to know the kinds of red flags.”

One way to foster those skills and help children understand risks is to encourage parents to encounter new technology and potential dangers alongside their children, rather than blocking them entirely with parental controls. That could mean setting time to look through TikTok together rather than banning the app, or even going through and discussing the parental controls themselves.

Another issue is that, even though Apple and Google already provide robust parental-control tools, many parents don’t know they’re there or don’t feel confident using them.

Parental controls only work properly if children have accounts set up with their actual age and other information, and if those accounts are linked to a parent’s. Yet for convenience, many parents let children use devices signed in through the parent’s account, or allow children to lie about their age to expedite set-up, which can put devices and accounts out of parental reach.

“Parents have been giving their kids their mobile phones, normally since the kids were around one. So I think we do get into a little bit of complacency [when we have to change our behaviour to properly set up parental controls],” Orlando says.

“Thinking ‘this is going to take me a long time to do this’ is a big part of it.”

Google’s official parental-control software is called Family Link, and is used to monitor and set boundaries for Android devices. Parents can use their own Android device, or a PC or Apple device, to set app limits, approve downloads and payments, see the location of a device, schedule downtime and more.

If a child is under 13, they’ll be prompted to use Family Link when they set up a device or their account. Family Link can be used with older kids as well, but since over-13s are eligible to manage their own accounts, both the child and parent need to agree before supervision is put in place.


Lucinda Longcroft, Google’s director of government affairs for Australia, said the company welcomed the government’s focus on online safety.

“Google has a long-standing commitment to ensuring that our products provide a safe experience, and parents have the controls to set healthy and positive digital habits for their children,” she said.

Apple’s controls work similarly, including being mandatory for users under 13, although they’re baked into the company’s Family Sharing tools and require everyone involved to be using an Apple device.

In addition to setting app limits, viewing screen time, tracking the device’s location and setting content rules, Apple has announced that its “Communication Safety” technology for Messages will soon roll out in Australia. These tools were announced for the US last year and, if parents turn them on, scan every image sent to their children in Messages. If nudity is detected, the image is blurred and the child is warned of potential danger. They then have the option to view it or notify their parents.

Apple declined to provide a spokesperson for this story.

The office of the eSafety commissioner was unable to comment on election promises due to caretaker conventions, a spokesperson said. But eSafety does distribute information about smartphone controls, where it points parents to Family Sharing and Family Link.
