TRANSCRIPT:
"There was a statistic that came out that said by the time a child is thirteen, on average, there's about 72 million data points on them all over the net, and that's going to keep growing, so as we go into the metaverse that could be on hyperdrive."
That's Kirra Pendergast, a cyber security expert who has spoken to thousands of children, parents and teachers about how to stay safe online.
She does that through the organisation she founded called Safe on Socials.
"My team usually works with between three and five thousand students a week, so we see clear patterns. We've been doing this for a long time, almost 12 years, and the patterns are always -- it's like, literally like a train smash from about eleven to 14, when a child gets their first device, and starts using apps and the parents aren't up to speed with the apps and then they can't keep up with the gaming that's going on, and then they don't have a concept of what the metaverse is."
The metaverse is often called the 'next version' of the internet.
Immersive, interconnected, and often involving augmented or virtual reality technologies, it's a place where the physical and digital worlds blend together.
Standards Australia's Chief Operations Officer, Kareen Riley-Takos, says people don't realise what it consists of.
"Eight out of 10 parents and seven in 10 teachers don't actually know what the metaverse is; 30 per cent of parents don't know who their children are interacting with; and only 44 per cent are aware of the risks like bullying, grooming and identity theft."
And there are lots of kids in the metaverse.
Research released this morning (December 3) by Standards Australia suggests two-thirds of metaverse users are aged under 16.
Some of the most common metaverse platforms they use include Roblox, Minecraft, and Fortnite.
Kareen Riley-Takos says there is a worrying gap between the popularity of these kinds of platforms with children and adults' awareness of what they involve.
"While social media is under scrutiny right now, many of the same kids that are on social media are also engaging in the metaverse. While the metaverse has many positive and exciting benefits, like anything online, there are also risks."
The Standards Australia survey findings come as experts like Pendergast push for guidelines for the metaverse, essentially a framework for protecting children before the new technologies become entrenched.
"This is a different version of the internet. Let's not forget that. And so I think where we've missed the boat in the past, we've got an opportunity to fix for the future through getting this standard right, because it's still coming, if that makes sense. It's not commonly used, like social media. Trying to retrofit things into social media is very, very difficult, but as we're just starting to look at moving into these metaversal worlds through games like Roblox and Fortnite and spaces like Decentraland and things like that. We have a small window of opportunity to get it right where we missed it with social media."
A white paper has been released exploring what the new standards should look like.
It's open for public consultation until the 24th of January 2025.
The document highlights how technology in the metaverse will not only track what users click on and buy, but also where they go, who they're with, and what they look at and do.
It suggests eyewear, headsets, and body-worn sensors could be used to track facial expressions, vocal inflections, even vital signs like blood pressure, heart rate, respiration and pupil dilation, enabling new levels of behavioural tracking and emotional profiling that could supercharge marketing efforts.
Monash University Professor Mark Andrejevic says this is exactly why big tech companies are investing so heavily in the metaverse.
"I think the reality of what's taking place is that we live in a world now where the dominant modes of social interaction that are popular and available to us take place on really super commercialised platforms that structure the nature of our interaction, the type of content we see, the way that we're invited to frame and portray ourselves. All of those are dictated not so much by the social character of those platforms but by their commercial character and their economic model."
In addition to age verification, parental controls, and content moderation, Standards Australia is suggesting a range of protections to curb the worst of the potential effects of intensified data harvesting.
These include standards to maintain a clear line between marketing materials and authentic encounters in the metaverse; a ban on storing behavioural data over time; and a right to emotional privacy, which would limit the use of biometric data to figure out how users are feeling.
Professor Andrejevic says that several years ago Facebook inadvertently revealed internal research that showed the company could identify when teenagers were feeling inadequate or insecure, and was trying to sell that data to advertisers.
He says that shows why greater protections are important.
"Why would you want to share that fact that you can identify that to advertisers. It's precisely because advertising capitalises on insecurity. And you know, if you can find teens who are feeling insecure, that looks like a vulnerable audience for manipulation. And so the very fact that platforms were thinking this way tells you a lot, I think, about the extent to which they care about the well being of the kids who are on those platforms, which is to say, not very much at all. So the appetite for data is bottomless. The uses of that data are precisely to be able to make more money off of kids. That's the only thing these platforms are really interested in."
Kareen Riley-Takos says she hopes that new standards will provide the basis for safer online spaces moving forward.
"We have a range of standards in the physical world that intend to ensure the safety of children this. This particular draft standard, and this process for safety of children in the metaverse is our first in the virtual world, and it is a really important step, we see the world around us is changing and we need to ensure that, you know, we will support our kids safety, whether it's in the physical world, or whether it's on the internet or in an environment such as the metaverse."