5th Annual Cyberbullying Month – Closing Thoughts

As we finish the last day of our 5th Annual Cyberbullying Month activities in virtual reality, we are meeting in ENGAGE XR for a discussion of the month’s special event activities, workshops, guest speakers, and training.

My personal journey with cyberbullying – called online bullying in the early days of the web – has evolved over the decades. When we started Educators in VR, it was one of the most important topics I knew we would need to tackle. Over the years, we've learned so much, and connected and partnered with organizations doing amazing things, including the University of Maryland Global Campus Center for Security Studies, the Virtual World Society, and the Girls STEAM Institute, all working together to build awareness around privacy, safety, and security, and to develop the standards necessary for this emerging technology and world of the metaverse.

I thought I'd take a moment to share the things I've learned during this year's cyberbullying month as food for your own thoughts.

People Think It Won’t Happen to Them

Cyberbullying event in ENGAGE XR with the Center for Security Studies at the University of Maryland Global Campus.

Crime on the Internet began even before it had a name, finding its footing in the 1990s with personal as well as criminal attacks. The earliest days of Usenet groups saw bullying within the bulletin boards and forums. So why do people still believe that this won't happen to them?

I teach that cyberbullying and cybercrimes are a matter of when, not if, yet many people attending our events still believe they practice "safe internet," so to speak. While cyberbullying is personal most of the time, many cyberbullying attacks turn into cybercrimes. The conservative estimate is that 1 in 4 people will be bullied online in their lifetime, with some reports stating that 38% experience cyberbullying on social media daily. Repeated studies have found that 100% of us will be witnesses and bystanders to these attacks, which impacts each of us psychologically. These estimates are based upon reporting, yet such attacks are under-reported and the research focuses mostly on children. Add to this the estimates of 23,000 cyberattacks per second worldwide, impacting our work and personal lives, and you can guess at the numbers and how they affect you.

Yes, you. It will happen to you.

Cyberbullying Impacts Everyone

Bodyswaps team discusses training in VR for racism and microaggressions.

During many of our cyberbullying special events, the topic of the domino effect came up over and over again. When a person is cyberbullied, they are impacted. The fear, anxiety, and psychological trauma that may result affects not only them but also the people around them. Bullied people tend not to trust others, often pulling away from friends, family, and all social contact. They lose interest in what used to excite them in their personal and professional lives. With prolonged exposure, some people will even change schools or jobs and move to a new location.

As many keep the bullying private, fearing judgement, the people around them often feel helpless: they know something is wrong but don't know what it is or what to do about it.

Online privacy, safety, and security are often misunderstood and underestimated. When someone puts on a modern VR headset, about 6 million data points may be collected (PDF) within 20 minutes of usage. This data includes enough information to identify someone not just by their IP address, but by non-verbal behavior such as posture, eye gaze, gestures, facial expressions, interpersonal distance, and position within the virtual environment.

A new law in California requires data brokers to register and, upon request, delete California citizens' data from their databases – all databases – a response to the nearly impossible task of getting your personal and private data removed online by working the system yourself. I learned this personally when my company and I were doxxed: personal information sold by courthouses, mortgage companies, and other public utilities had been collected, collated, and made available on data broker sites going back over 30 years! For someone paranoid for decades about what personal data I share online, this was a shock. We need to do a better job explaining this to our young people and to ourselves.

More than the rules, laws, and procedures, we need to discuss the concept of microaggressions, the seemingly innocuous things we say that hurt each other. These are remarks that pick away at our psyche, often doing more harm than we realize, and rarely recognized as harmful. Examples include casually grouping people by race, gender, or sexual orientation with "those people" style comments, or asking a woman to "get the coffee" at work. We've all heard them. We need to discuss how these affect us and how to change our language to be more inclusive.

We need to make bullying, online and offline, a "dinner conversation." We must let people know they are not alone. If one person is being bullied by someone, that is not a singular act; bullies tend to go after multiple people. We need to make it safe for people to report, take action, and get the help and support they need. This is an "everyone" conversation, everywhere. It isn't a US or UK thing. Cyberbullying and cybercrimes are found at the highest levels in Russia, Brazil, Italy, South Korea, and Japan, as well as the UK and US.

No Standards, No Metaverse Police

The process for reporting a cybercrime or bullying in the metaverse that isn't resolved by the platform's reporting tools is:

  1. Block and report through personal platform tools.
  2. Report through tools or website of the VR platform or game directly.
  3. Report as a personal crime to the local police of the attacker.
  4. If money is involved or it is clearly a multiple user/player attack, report to the FBI, or your government’s cybercrime unit.

Cyberbullying event with Educators in VR on event moderation training in ENGAGE XR.

While Interpol is the most likely agency to handle metaverse policing, and they are working on training to be prepared for it, their job is to stop major criminal activities that cross international borders, not to deal with a creep who is stalking you online. Should the United Nations police the metaverse? What about the WHO, since several countries have declared this an international health crisis?

Yet some of these "creeps" go after multiple people and cross platform borders, moving across social VR platforms into games and back. These games and platforms do not have country borders, but maybe they should be treated as if they did.

Until we have standards, and international laws that focus on collaboration across country lines, as many say, the metaverse is the “Wild Wild West.”

The Metaverse Industry Must Step In and Up

Workshop session exploring the fine print on VR headsets and apps that enables data collection, and what happens with the data.

Phil Spencer of Xbox proposed a collaboration between major games to prevent bad actors booted from one game from continuing their reign of terror and harm on another, much like casinos do when dealing with a disruptive or cheating person. Tossed from one casino, you receive a warning and are watched in the next. Repeat the offense and get kicked out of that casino, and you are blocked at the door of the next one, and so on. With AI and algorithms that monitor these things at the platform level already in development for years, what would it take to collaborate with others to make this happen?

It's more than this, though. There is an entire culture that encourages rude, mean, and sometimes nasty behavior. It is expected that players will subject anyone who isn't a white male to vicious verbal or physical attacks. It is expected that some social VR platforms are free-for-alls with little or no regulation or active monitoring to discourage bad behavior, leaving it up to the community to self-police.

When the culture encourages socially negative behavior, and the platform or game doesn't offer human moderation to set a standard, how do you change it? Right now, we are seeing this with the closure of AltspaceVR and the dispersal of a community desiring a safe and secure social experience. Platforms with a reputation for toxicity, like VRChat, are changing as former Altspacers find a home there.

Changing the culture is part of the responsibility of the community, but it starts with the platform. Spatial VR’s Community Guidelines state clearly:

We do not allow the depiction, promotion, or trade of firearms, ammunition, firearm accessories, or explosive weapons. We also prohibit instructions on how to manufacture those weapons. Content as part of a museum’s collection, carried by a police officer, in a military parade, or used in a safe and controlled environment such as a shooting range may be allowed.

Do not post, upload, stream, or share: Content that displays firearms, firearm accessories, ammunition, or explosive weapons; Content that offers the purchase, sale, trade, or solicitation of firearms, accessories, ammunition, explosive weapons, or instructions on how to manufacture them.

However, finding these guidelines is a self-directed hunt. Some platforms have experimented with introducing their community standards and guidelines as part of the sign-up experience. Others have experimented with funneling those kicked out of events or worlds into a tutorial or support session with a counselor who helps the person understand what they did wrong and how to change their behavior to fit into the community instead of being outside it, alone. Unfortunately, most of these have failed, but Discord is making a fresh attempt at changing things.

Research by Discord found that the majority of bad actors on their social platform are teenagers who don't know any better, so they are starting a new system, the Teen Safety Assist program, that gives a warning with information on what the person did "wrong" and the consequences of not changing their behavior. Bans are temporary and scaled to the misdeed. Intentional and serious attacks are still treated as before, with a potential permanent ban, but Discord wants to allow teens to learn from their mistakes. Could this work in the metaverse? I'm watching.

Closing Thoughts

As we bring this year's 5th Annual Cyberbullying Month with Educators in VR to a close, know that this isn't the end of the discussion. Educators in VR is committed to providing the latest information on cyberbullying and cybercrimes. We recently started collaborating with the University of Maryland Global Campus Center for Security Studies on their GenCyber Teacher Camp, and with other organizations, to ensure we bring you the best workshops and training programs.

Educators in VR is also a consultant and advisor for educational institutions globally on privacy, safety, and security for immersive education. If you are interested in learning more, please contact us for more information.
