Facebook: Colorado shooter’s fan page does not violate terms


Story Highlights

Facebook pages supporting the suspect in the Colorado theater murders have surfaced

James Holmes is accused of killing 12 spectators early Friday morning

A Facebook fan page for Holmes had more than 800 likes on Wednesday

A pop culture professor warns against reading too much into a tiny piece of the internet



CNN

The photo, posted to a Facebook fan page for the accused Colorado shooter, shows two young men in a movie theater turning around to tell the people behind them to quiet down.

“If you don’t shut up,” the caption reads, “we’ll James Holmes your a-.”

It’s nothing new for Facebook pages to pop up in support of accused killers and other unsavory figures. But a few dozen Holmes fan pages – including one with more than 800 followers that appeared the day Holmes allegedly opened fire on a theater in Aurora, Colorado, killing 12 people and injuring dozens – raise new questions about what constitutes free and appropriate speech in the digital age, especially on Facebook.

The network, which has 900 million monthly active users, has long been seen as a sort of brightly lit public square in an internet that has dark, seedy corners. People have to sign up for Facebook under a real identity, so the trolling habits that anonymity seems to encourage tend to be less tolerated on Facebook than on sites like 4chan or even YouTube.

Still, Facebook can be pretty lenient about what it will let people say on its network. The company has decided not to take down James Holmes’ fan pages, though employees are monitoring them closely for types of speech that would violate the social network’s Community Guidelines, spokesman Fred Wolens said.

The most popular page, “while incredibly obnoxious, does not violate our terms,” he said.

For the fan page to be taken down, it would have to post “credible threats” against specific people or content intended to incite violence, Wolens said.

Facebook declined to say whether any particular posts or comments on that particular page were removed because they met either of those two criteria. Facebook will sometimes delete comments, posts, or photos without deleting an entire page.

Wolens added that Holmes’ fan pages are not representative of the reaction of Facebook users. He pointed to several pages where Facebook users are rallying around Colorado shooting victims by posting memorials, messages of support and trying to raise funds for a victim who was shot and is now in an intensive care unit.

“We are encouraged that the vast majority of activity on Facebook surrounding this tragedy has focused on helping the community cope and beginning the healing process following these events,” the company said in an emailed statement.

In the past, Facebook has been criticized both for leaving some pages and images up and for deleting others.

Last year, the site sparked outrage when it deleted a support page for breastfeeding mothers because it featured breastfeeding photos; it reinstated the page two days later. The site’s guidelines now explicitly allow such images.

The social network in March 2011 took down a page calling for a Palestinian intifada after the Israeli government complained.

Facebook left up a Holocaust denial page in 2009, saying that being “offensive or objectionable” is not grounds for removal. At the time, Dallas, Texas, attorney Brian Cuban urged the network to implement tougher controls, saying, “There is no First Amendment right to free speech in the private domain. This is not a freedom-of-speech issue. Facebook is free to set whatever standards it wants.”

Facebook’s “Community Standards” document, which is posted online, addresses violent and threatening remarks as follows: “Safety is Facebook’s top priority. You may not credibly threaten to harm others or organize acts of real-world violence. We remove content and may escalate to law enforcement when we perceive a genuine risk of physical harm or a direct threat to public safety. We also prohibit promoting, planning or celebrating any of your actions if they have, or could result in, financial harm to others, including theft and vandalism.”

The site also does not tolerate language that incites self-harm, hate speech, bullying, “graphic content,” nudity or pornography, according to the same online policy.

Other tech companies have found themselves embroiled in similar debates about what is and isn’t appropriate communication on online platforms.

Apple is considered one of the tightest ships because it pre-approves every app sold in its App Store. But it, too, has been criticized for rejecting apps that would compete with the company’s own offerings or that it finds distasteful for one reason or another. In 2010, for example, the company initially rejected an app featuring the work of a Pulitzer Prize-winning political cartoonist. Apple later approved the app following press coverage.

Google+, the search giant’s social network, came under fire for initially requiring users to sign up under their real names. Some groups protested the policy, saying that political dissidents in authoritarian regimes, for example, could not use the service without fear of violence. The company relaxed the policy in January, allowing nicknames.

Twitter prohibits users from impersonating others, infringing on trademarks, using the service unlawfully, and threatening violence, according to its standards document, called the “Twitter Rules.” It prohibits pornographic images “in your profile picture or user background,” but does not appear to prohibit users from posting links to such content.

Facebook is letting the Holmes fan page saga unfold.

Many users said they found the page “disgusting” and “sickening.” “You don’t care about the death of these poor people? Seriously?” one user wrote.

Others urged Facebook users to ignore the page rather than give it more power. “He’s just a troll with nothing better to do than fish for negative attention on the internet. In fact, I feel sorry for him,” one commenter wrote.

The page’s administrator, who has not revealed his or her identity on the page, seems unconcerned.

“Whatever you have to say to me, I don’t care. Every time you report me, this page is unaffected. (I’ve been reported over a billion times and nothing has happened),” the page admin wrote, adding, “Also, I don’t believe in karma and I don’t believe in hell. Please keep this in mind when posting. Unless it’s something clever or funny, know: I’m just gonna laugh at you, and all you’re doing is wasting your time.”

While the page is whipping some people into a rage, Robert Thompson, a pop culture professor at Syracuse University, warned against reading too much into a tiny piece of the internet that has little bearing on the broader public discourse.

“The amount of attention we give to this stuff is probably totally out of proportion to how most people feel,” he said over the phone. “But because anyone with the internet has an international distribution system at their fingertips, if you start something that’s a pro-mass murder fan page, it’s going to get people’s attention.”

It’s understandable that people are outraged by the Facebook page, he said, especially since it pokes fun at the victims of a mass murder that’s not even a week old. But the page could be written in a tongue-in-cheek tone, he said, and it’s certainly not representative of prevailing views in America — or the opinion of anyone other than its creator.

“The mistake is made when people ask, ‘What does this page say about America?’” he said. “This page says nothing about America, except maybe that there are too many Facebook pages.”
