Algorithms and artificial intelligence can only do so much to patrol the internet for offensive content. Nothing replaces a human being when it comes to making judgment calls about what’s appropriate and safe to share with the public, sources familiar with the process told Bloomberg BNA.
Facebook was confronted with another tragedy May 13 after a man in Tennessee set himself ablaze on Facebook Live. This came less than a month after footage of an elderly Cleveland man’s murder was posted on the platform.
Following the Cleveland incident, the company announced it would hire 3,000 people worldwide to monitor reports from users and remove content that violates its terms of service. The new moderators will have a lot to monitor: About 7.9 million videos were viewed on Facebook in March 2017 alone, according to Bloomberg data — roughly a million more views than a year earlier. Views spiked to about 10.2 million during the Thanksgiving and Christmas holiday season, according to the data.
Human moderators are necessary because “all the headscratching and quandaries will be in gray areas,” Ken Martin, the former senior vice president of program practices at CBS, told Bloomberg BNA May 15.
Hemanshu Nigam, founder and chief executive officer of SSP Blue, a cybersecurity company based in Santa Monica, Calif., agrees.
“Technology cannot process information like the human mind, so you still need humans,” Nigam told Bloomberg BNA May 15. “It’s going to be a combination of artificial intelligence, algorithm technology and human moderation.”
The more content there is online, the more “bad actors” there will be, Nigam, a former security chief at MySpace, said.
Facebook’s new hires will be joining the 4,500 current members of the company’s community operations teams, a spokeswoman told Bloomberg BNA in an email May 15. The company declined to comment further on the hiring process but said it uses a combination of user reporting, human review and automation to enforce the site’s content standards.
The human moderators review millions of reports every week in more than 40 languages, the spokeswoman said. Moderators are available around the clock, and they cover all time zones, she said.
Twitter Inc. and Snap Inc., the company behind Snapchat, have also come under fire for hosting content depicting violent acts and hate speech.
Community guidelines for Twitter and its live streaming app Periscope ban all threats and promotion of violence, Liz Kelley, a spokeswoman for Twitter, told Bloomberg BNA in a May 16 email. When users report violations, the reports go to a human moderator, and nothing is removed until a violation is confirmed, she said.
“Any content that’s removed is done by a member of our teams who are available 24/7,” Kelley said.
Snap didn’t immediately respond to Bloomberg BNA’s requests for comment on how it responds to content violations.
The amount of content is growing rapidly, and moderators may find themselves outmatched, Shontavia Johnson, director of the Intellectual Property Law Center at Drake University, told Bloomberg BNA.
People have different opinions about what is inappropriate, she said. A gun in a video may or may not be inappropriate based on the context, she said.
“Maybe there are internal standards, but human judgment is fluid,” Johnson said.
Standards are also hard to pin down because technology is evolving, and new platforms may require a different set of rules in the future, she said. But keeping users, moderators and tech companies engaged will help ensure progress on the matter, she said.
“We have a unique opportunity to learn from this stuff,” she said.
Nigam, whose company provides content moderators for tech companies, said moderators have to be well trained to handle the amount and type of content that is out there.
Moderators are hired from diverse backgrounds—from online commerce to computer engineering—and are screened to ensure they’re knowledgeable about various cultural norms and global events, he said.
“Different people bring different perspectives,” he said.
“You want the broadest spectrum of experience possible,” Martin said when asked about hiring people to maintain standards. A curious person who “keeps up to date and knows what’s going on in the world” is ideal for these positions, he said.
Martin said the standards department at CBS frequently hired from within the company—staffers who had worked as line editors or program coordinators and were knowledgeable about company and industry standards for permissible content.
Providing support for these workers is also important. Moderators at SSP Blue are offered daily and weekly meetings where they can express their feelings about the content they view, Nigam said.
Facebook is hoping to foster a similar community standard, with moderators doing more than simply removing content. When a girl posted a Facebook video threatening suicide in April, Facebook staff kept the video up and contacted law enforcement, who were able to save her life, Facebook CEO Mark Zuckerberg said during the company’s May 3 earnings call.
“So a lot of what we’re trying to do is not just about taking the content down but also helping people when they’re in need on the platform, and we take that very, very seriously,” Zuckerberg said.
To contact the reporter on this story: Patricio Chile in Washington at firstname.lastname@example.org
Copyright © 2017 The Bureau of National Affairs, Inc. All Rights Reserved.