LiveLeak began in 2006 as an offshoot of the early internet shock site Ogrish. Along with others, Ogrish was a place people went when they wanted to see the worst the web had to offer. LiveLeak contained much of the same footage but framed it in a more respectable way: its creators presented it as a place for citizen journalists to post uncensored videos of world events. If you wanted to see footage of the Saddam Hussein execution, you went to LiveLeak. If a friend wanted to show you footage of a drug cartel beheading via chainsaw, they were showing you LiveLeak.

“The world has changed a lot over these last few years, the Internet alongside it, and we as people. Nothing lasts forever though and, as we did all those years ago, we felt LiveLeak had achieved all that it could and it was time for us to try something new and exciting,” LiveLeak co-founder Hayden Hewitt said in a blog post explaining the change. “I’m sat here now writing this with a mixture of sorrow, because LL has been not just a website or business but a way of life for me and many of the guys, but also genuine excitement at what’s next.”

In some cases, it’s not clear at the outset whether a video or other post violates Facebook’s standards, especially on a service with a range of languages and cultural norms. Indecision didn’t seem to be the case here, though. Facebook’s Sonderby said in Tuesday’s blog post that the company “designated both shootings as terror attacks, meaning that any praise, support and representation of the events” are violations. Facebook simply didn’t know about it in time.

Vaidhyanathan said Facebook’s live video feature has turned into a beast that Facebook can do little about “short of flipping the switch.” Though Facebook has hired more moderators to supplement its machine detection and user reports, “you cannot hire enough people” to police a service with 2.3 billion users.
To report live video, a user must know to click on a small set of three gray dots on the right side of the post. A user who clicks on “report live video” gets a choice of objectionable content types to select from, including violence, bullying and harassment. Users are also told to contact law enforcement if someone is in immediate danger.

Facebook also doesn’t appear to post any public information instructing law enforcement how to report dangerous or criminal video. The company does have a page titled “information for law enforcement authorities,” but it merely outlines procedures for making legal requests for user account records. Facebook didn’t immediately respond to a request for comment and questions about its communications with police.

Facebook uses artificial intelligence to detect objectionable material, while relying on the public to flag content that violates its standards. Those reports are then sent to human reviewers, the company said in a November video. The video also outlined how it uses “computer vision” to detect 97 percent of graphic violence before anyone reports it. However, it’s less clear how these systems apply to Facebook’s live streaming.

Experts say live video poses unique challenges, and complaints about live streaming suicides, murders and beatings regularly come up. Nonetheless, they say Facebook cannot deflect responsibility. “If they cannot handle the responsibility, then it’s their fault for continuing to provide that service,” said Mary Anne Franks, a law professor at the University of Miami. She calls it “incredibly offensive and inappropriate” to pin responsibility on users subjected to traumatic video.