The Internet can be a volatile place, as seen on social media, where posts can quickly devolve into public discord. According to Statista, the online statistics portal, there are more than 2 billion people on Facebook, 1 billion on Instagram, and 68 million on Twitter. That adds up to over 3 billion people with whom one could potentially come into conflict. Conflict in online spaces can range from classic cyberbullying to full-on hate speech. All too often an onlooker might scroll right past this kind of content, wanting no part in it. But are there consequences to remaining silent in the face of hate, ignorance, and intolerance?
The Bystander Effect
In April, Facebook released a transparency statement reporting that it had removed 2.5 million pieces of hateful content. Facebook defines hateful content as anything that attacks a “protected characteristic” such as race, ethnicity, gender, sexual orientation, disability, and so on. While the company admits there is considerable room for improvement in its hate speech detection system, it is evident that hate speech remains highly prevalent on social media.
When scrolling through your newsfeed, you may come across someone getting harassed. It may even cross your mind to chime in. Maybe you do, adding yourself to the situation in the process, or perhaps you decide it’s none of your business. This common situation is an example of how people can experience the bystander effect on social media.
The bystander effect is a term that became mainstream after the brutal rape and murder of Kitty Genovese in 1964. Genovese was returning to her New York apartment late one night when she was attacked. After she had screamed for help for an hour, one person finally called the authorities. As police investigated the woman’s death, almost 40 neighbors admitted to hearing screams outside but choosing to do nothing. The neighbors had figured that the altercation would sort itself out or that someone else would call the police.
The term itself was popularized by social psychologists Bibb Latane and John Darley. The duo hypothesized that the more bystanders there are in a situation, the less likely any one of them is to take action, a phenomenon referred to as diffusion of responsibility. If you consider social media platforms as public spaces, the number of bystanders and the diffusion of responsibility in a conflict situation can be staggering.
Slander Hits Home
Recently Sami Al-AbdRabbuh announced that he was running for Benton County Commissioner in public Corvallis Facebook groups. The post soon had hundreds of comments.
“It has been so humbling to see the outpouring of comments since my announcement,” Al-AbdRabbuh said.
The comments, many of which spun off into side conversations about race, did include some negativity. Al-AbdRabbuh said of the more negative comments, “While it seemed that some comments came from a place of prejudice, there were friends and others challenging those comments. We need to challenge prejudice in ourselves and others.”
Listen, Affirm, Respond, Add
So what do you do the next time you log on and see someone being harassed online or being targeted by hate and discrimination? While there are many iterations of bystander training to help people navigate situations in real life, things are a little bit trickier online. The National Conference for Community Justice developed a useful strategy for dealing with people and conflict online called LARA, which stands for Listen, Affirm, Respond, Add. The LARA strategy suggests that social media users take a beat after reading something offensive on the Internet.
The first step, “Listen”, suggests users pretend that the offending post was written by someone whose opinion they hold in high regard. “Affirm”, the second step, asks the user to find some common ground with the offending poster before moving on to the “Respond” phase, where the user replies to the substance of the post. The last step, “Add”, calls for contributing facts, and where helpful, statistics, to the constructive dialog. Additionally, the Cultural Intelligence Center urges social media users to assume the best when preparing to respond to racist posts, since there is a chance the post in question was made out of ignorance. A response should aim to create a teachable moment, much like the Add phase of the LARA strategy described above.
Jill Schuster, a member of the moderator team for the Corvallis People Facebook group, mentioned how important it is for community members to encourage productive dialogue in online spaces. “We hope that members conduct themselves with consideration for other members,” Schuster says. “It’s helpful if other members would suggest some self-control.”
Corvallis People, the largest Facebook group for Corvallis-area locals, has over 12,800 members and sees approximately 1,000 posts with 15,000 comments per month. Since the moderator team is composed entirely of volunteers, it is especially important for members to put forth the extra effort to keep discourse civil. If a particular post or member of an online group violates the group’s ground rules and does not respond to civil engagement, contact the group’s admin team, or block the user if necessary.
Remember, people are unlikely to change their opinions based on a single online interaction; however, engaging in civil conversation online works to set a new norm. Empowering bystanders to speak up and speak out against racist or hateful speech takes power away from those perpetuating the negativity.
By Erica Johnson