
The Effects of Unregulated Content for Gen Z

The Internet as an Irreplaceable Tool

Gen Z is the first generation born into a world where the internet plays an irreplaceable role, and in some ways these children and adolescents are not just consumers but inhabitants of the digital society. Apart from school, Generation Z spends most of its time on cellphones, computers, and video game consoles, more than on playing sports outside or on any other activity that doesn’t require a technological device. Their fluency with technology surpasses that of prior generations, and this is not a “bad thing” in a world that is always evolving and pumping out new technologies. However, this unprecedented access to the internet exposes them to an almost infinite realm of experiences, both positive and negative.

Social Media

From primary school to university, many kids, adolescents, and young adults use social media daily. Apps like Instagram, Snapchat, and TikTok have resonated with Gen Z so strongly that they have become the cornerstone of modern connectivity, enabling constant interaction and cultivating relationships among young users. They offer a space to chat with friends, share bits of their lives, discover new hobbies and interests, stay informed about their favourite bands and artists, and so much more.

Video Games

Video games have also become a big pillar of Gen Z’s lives, especially among the male population. These forms of entertainment have evolved from their humble, single-player beginnings with games such as Pong, Pac-Man, and Mario Bros. into more modern and complex environments. Games like Call of Duty, Fortnite, and Minecraft have transcended mere entertainment, morphing into immersive experiences that stimulate skill development, strategic thinking, and social camaraderie. They offer a dynamic platform where friends and strangers can play together, interact to improve their characters, complete campaigns, and ultimately level up.

Videoconferencing Classes

The introduction of video conferencing for educational purposes has been revolutionary, particularly in unforeseen circumstances like the COVID-19 pandemic. It ensured uninterrupted learning when physical attendance at school wasn’t feasible, and it wouldn’t have been possible without the students’ expert knowledge of technology. That same expertise is what separated them from many of their teachers, who weren’t prepared to make videoconferencing one of the main spaces in which they spend their time. It is this disparity in skill that leads to some of the challenges covered next.

What’s negative about all of this?

While all of these environments can lead to positive experiences and improvements, they can also lead to extremely dark outcomes, especially for younger demographics. Take social media, for example. It’s fun; it lets young people keep up with family, friends, and the world, but it can also leave them vulnerable to fake profiles run by bad actors, harassment, offensive comments on their pictures, explicit content, and more.

Video games can be a great tool for cultivating strategic thinking and camaraderie, but they can also morph into a hub for hate. Chats with strangers can expose kids to offensive language, aggressive players, and worse.

And lastly, because of the disparity between teachers’ knowledge of technology and students’, kids are vulnerable to cyberbullying through chat groups and online harassment from other kids.

It’s clear by now that the unregulated nature of these digital spaces poses significant risks to young users.

What’s the solution?

When children are the concern, parents can employ many strategies to create a safe digital environment. Implementing app restrictions and managing screen time are effective measures. By utilising parental control apps or device settings, parents can limit access to apps or websites deemed inappropriate for their child’s age or maturity level. Setting time limits encourages a healthy balance between screen time and other activities, promoting overall well-being.

But what about online classes? Or about adolescents and young adults? This is where that type of protection falls short, and where raising awareness of content moderation technologies among tech companies becomes necessary.

The Crucial Role of Content Moderation

What is content moderation, and how can it protect Generation Z? In one of our latest articles, we define content moderation as “the strategic process of evaluating, filtering, and regulating user-generated content on digital ecosystems.”

Implementing efficient moderation tools, especially AI-driven solutions, becomes critical in shielding young users from potential harm. AI-based moderation offers real-time identification and mitigation of harmful content, serving as a proactive shield against explicit imagery, offensive language, and cyberbullying.
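
To make this concrete, here is a minimal sketch of what real-time, AI-driven pre-moderation can look like. It is an illustration only, not a description of any specific platform’s pipeline: it assumes the open-source Hugging Face transformers library and the publicly available unitary/toxic-bert classifier, and the 0.8 threshold and moderate_message helper are hypothetical choices made for this example.

```python
# Illustrative sketch only: screen a user message with an open-source toxicity
# classifier before it is published. Assumes the Hugging Face `transformers`
# library and the public `unitary/toxic-bert` model; the 0.8 threshold and the
# `moderate_message` helper are hypothetical choices for this example.
from transformers import pipeline

# Load the classifier once at startup; every label this model emits
# (toxic, insult, threat, etc.) describes a category of harmful content.
classifier = pipeline("text-classification", model="unitary/toxic-bert")

def moderate_message(text: str, threshold: float = 0.8) -> bool:
    """Return True if the message looks safe to publish, False to hold it."""
    result = classifier(text)[0]  # e.g. {"label": "toxic", "score": 0.97}
    return result["score"] < threshold

# Screen a chat message in real time, before other users ever see it.
message = "Have a great game, everyone!"
if moderate_message(message):
    print("Published:", message)
else:
    print("Held for human review:", message)
```

In practice, a platform would pair an automated check like this with human review of borderline cases and user reporting to catch what the model misses.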

Empowering platforms with effective content moderation tools is crucial for creating safer online environments for children and adolescents. Increasing awareness and advocating for their integration can catalyse the development of more secure digital spaces, ensuring the well-being and growth of young users.

Conclusion

The internet has unequivocally embedded itself into Generation Z’s lives, presenting both new opportunities and new difficulties. Kids, adolescents, and young adults face both rewarding experiences and potential threats as they navigate social media, video games, and online schooling.

While social media helps people connect with each other, it also exposes them to cyberbullying and inappropriate content. Video games, for their part, are entertaining and can teach strategic thinking, but they also give people a place to say hurtful things and behave aggressively. Lastly, the rapid adoption of videoconferencing in education has emphasised the gap between students’ technological ability and teachers’ preparation, leaving young minds vulnerable to cyberbullying.

Ultimately, raising awareness and advocating for comprehensive content moderation mechanisms are critical steps towards creating safer digital environments for current and future generations. AI-based moderation can recognise, report, and censor offensive comments, harassment, and bad actors before they reach other users, making it an irreplaceable tool for navigating the internet.
