Attempts to protect children’s safety in the two-dimensional realm of online social media could adversely impact the 3D world of augmented and virtual reality, according to a report released Tuesday by a Washington, D.C., technology think tank.

Legislative efforts, like the Kids Online Safety and Privacy Act (KOPSA), which has passed the U.S. Senate and is now before the House of Representatives, could lead to harmful censorship of AR/VR content, maintained the report by the Information Technology & Innovation Foundation.

If KOPSA becomes law, AR/VR platforms may be forced to ramp up enforcement in the same manner as traditional social media platforms, the report explained.

Because the law gives the FTC authority to deem content on these platforms harmful, it continued, the agency may over-censor content on AR/VR platforms, or the platforms themselves may censor content to avoid liability, potentially including content pertinent to children’s education, entertainment, and identity.

“One of our fears that we have with KOPSA is that it opens the door for potential over-censorship by giving the FTC [Federal Trade Commission] power to decide what qualifies as harmful,” said the report’s author, Policy Analyst Alex Ambrose.

“It’s another way for a political party to decide what’s harmful,” she told TechNewsWorld. “The FTC could say content like environmental protection, global warming, and climate change is anxiety-inducing. So we need to completely get rid of anything related to climate change because it could lead to anxiety in children.”

Over-Censorship Can Be Avoided

Andy Lulham, COO of VerifyMy, an age and content verification provider based in London, acknowledged that the specter of over-censorship looms large in discussions about online regulation. “But I firmly believe this fear, while understandable, is largely misplaced,” he told TechNewsWorld. “Well-crafted government regulations are not the enemy of free expression, but rather its guardian in the digital age.”

Lulham maintained that the key to regulation lies in the approach. “Blanket, heavy-handed regulations risk tipping the scales towards over-censorship,” he said. “However, I envision a more nuanced, principle-based regulatory framework that can enhance online freedom while protecting vulnerable users. We’ve seen examples of such balanced approaches in privacy regulations like GDPR.”

The GDPR — General Data Protection Regulation — which has been in effect since 2018, is a comprehensive data protection law in the European Union that regulates how companies collect, store, and use the personal data of EU residents.

“I strongly believe that regulations should focus on mandating robust safety systems and processes rather than dictating specific content decisions,” Lulham continued. “This approach shifts the responsibility to platforms to develop comprehensive trust and safety strategies, fostering innovation rather than creating a culture of fear and over-removal.”

He asserted that transparency will be the linchpin of effective regulation. “Mandating detailed transparency reports can hold platforms accountable without resorting to heavy-handed content policing,” he explained. “This not only helps prevent overreach but also builds public trust in both the platforms and the regulatory framework.”

“Furthermore,” he added, “I advocate for regulations requiring clear, accessible appeal processes for content removal decisions. This safety valve can help correct inevitable mistakes and prevent unwarranted censorship.”

“Critics might argue that any regulation will inevitably lead to some censorship,” Lulham conceded. “However, I contend that the greater threat to free expression comes from unregulated spaces where vulnerable users are silenced by abuse and harassment. Well-designed regulations can create a more level playing field, amplifying diverse voices that might otherwise be drowned out.”

Good, Bad, and Ugly of AR/VR

The ITIF report noted that conversations about online safety often overlook AR/VR technologies. Immersive technologies foster social connection and stimulate creativity and imagination, it explained, and play, imagination, and creativity are all essential to children’s development.

The report acknowledged, however, that properly addressing the risks children face with immersive technologies is a challenge. Most existing immersive technologies are not made for children under 13, it continued. As a result, children explore spaces designed for adults, which can expose them to age-inappropriate content and foster habits and behaviors harmful to their mental and social development.

Addressing these risks will require a combination of market innovation and thoughtful policymaking, it added. Companies’ design decisions, content moderation practices, parental control tools, and trust and safety strategies will largely shape the safety environment in the metaverse.

It conceded, however, that public policy interventions are necessary to tackle certain safety threats. Policymakers are already addressing children’s safety on “2D” platforms such as social media, leading to regulations that may affect AR/VR tech, ITIF noted.

Before enacting those regulations, the report recommended policymakers consider AR/VR developers’ ongoing safety efforts and ensure that these tools maintain their effectiveness. When safety tools are insufficient, it continued, policymakers should focus on targeted interventions to address proven harms, not hypothetical risks.

“Most online services are working to remove harmful content, but the sheer amount of that content online means that some of it will inevitably slip through the cracks,” Ambrose said. “The issues we see in platforms today, like the incitement of violence, vandalism, and spreading harmful content and misinformation, will only continue on immersive platforms.”

“The metaverse is going to thrive on massive amounts of data, so we can assume that these issues will be pervasive — maybe even more pervasive than what we see today,” she added.

Safety by Design

Lulham agreed with the report’s contention that companies’ design decisions will shape the safety environment of the metaverse.

“In my view, the decisions companies make regarding online safety will be pivotal in creating a secure digital environment for children,” he said. “The current landscape is fraught with risks, and I believe companies have both the responsibility and power to reshape it.”

He maintained that user interface design is the first line of defense to protect kids. “Companies prioritizing intuitive, age-appropriate designs can fundamentally alter how children interact with online platforms,” he explained. “By crafting interfaces that naturally guide users towards and educate them on safer behaviors, we can significantly reduce harmful encounters.”

Content moderation is at a critical juncture, he added. “The volume of content demands a paradigm shift in our approach,” he observed. “While AI-powered tools are essential, they’re not a panacea. I argue that the future lies in a hybrid approach, combining advanced AI with human oversight to navigate the fine line between protection and censorship.”

Parental control tools are often overlooked but crucial, he maintained. These shouldn’t be mere add-ons but core features designed with the same attention as the main platform. “I envision a future where these tools are so intuitive and effective that they become integral to family digital life,” he said.

He contended that trust and safety strategies will differentiate thriving platforms from faltering ones. “Companies adopting a holistic approach, integrating robust age verification, real-time monitoring, and transparent reporting, will set the gold standard,” he declared. “Regular engagement with child safety experts and policymakers will be non-negotiable for companies serious about protecting young users.”

“In essence,” he continued, “I see the future of online safety for children as one where ‘safety by design’ isn’t just a buzzword but the fundamental principle driving all aspects of platform development.”

The report noted that children, as drivers of the metaverse, play a crucial role in the market adoption of immersive technologies.

Ensuring innovation can flourish in this nascent field while also creating a safe environment for all users of AR/VR technology will be a complex challenge, it acknowledged. Parents, corporations, and regulators all have roles to play in balancing privacy and safety concerns while creating engaging and innovative immersive experiences.
