EXCLUSIVE: Amazon to proactively remove more content that violates rules from its cloud service, sources say
Attendees at Amazon.com Inc's annual cloud computing conference walk past the Amazon Web Services logo in Las Vegas, Nevada, United States, November 30, 2017. REUTERS/Salvador Rodriguez/File Photo
Sept. 2 (Reuters) – Amazon.com Inc (AMZN.O) plans to take a more proactive approach to determining what types of content violate its cloud service policies, such as rules against promoting violence, and to enforce their removal, according to two sources, a move that could reignite debate over how much power tech companies should have to restrict free speech.
Over the next few months, Amazon will hire a small group of people in its Amazon Web Services (AWS) division to develop expertise and work with outside researchers to monitor future threats, one of the sources familiar with the matter said.
It could make Amazon, the world's largest cloud service provider with 40% market share according to research firm Gartner, one of the world's most powerful arbiters of what content is allowed on the internet, experts say.
A day after this story was published, an AWS spokesperson told Reuters the news agency's reporting "was false" and added, "AWS Trust & Safety does not intend to change its policies or processes, and the team has always existed."
A Reuters spokesperson said the news agency stood by its reporting.
Amazon made headlines in the Washington Post last week for shutting down a website hosted on AWS that featured Islamic State propaganda celebrating the suicide bombing that killed around 170 Afghans and 13 U.S. troops in Kabul last Thursday. It did so after the newspaper contacted Amazon, according to the Post.
The proactive approach to content comes after Amazon banned the social media app Parler from its cloud service shortly after the Jan. 6 Capitol riot for allowing content promoting violence.
Amazon declined to comment ahead of the publication of the Reuters story on Thursday. After publication, an AWS spokesperson said later that day, "AWS Trust & Safety strives to protect AWS customers, partners, and internet users from bad actors who attempt to use our services for abusive or illegal purposes. When AWS Trust & Safety becomes aware of abusive or illegal behavior on AWS services, they act quickly to investigate and engage with customers to take appropriate action."
The spokesperson added that "AWS Trust & Safety does not pre-review content hosted by our customers. As AWS continues to grow, we expect this team to continue to grow."
Activists and human rights groups are increasingly holding not only websites and apps accountable for harmful content, but also the underlying technological infrastructure that allows those sites to operate, even as conservative politicians denounce what they see as restrictions on free speech.
AWS already prohibits its services from being used in a variety of ways, such as for illegal or fraudulent activity, to incite or threaten violence, or to promote child sexual exploitation and abuse, according to its acceptable use policy.
Amazon first asks customers to remove content that violates its policies or to put a system in place to moderate the content. If Amazon cannot reach an acceptable agreement with the customer, it may take down the website.
Amazon aims to develop an approach to the content issues that it and other cloud providers are facing more frequently, such as determining when disinformation on a company's website reaches a scale that requires AWS action, the source said.
The new AWS team does not plan to sift through the vast amounts of content that businesses host in the cloud, but will aim to stay ahead of future threats, such as emerging extremist groups whose content could make its way onto the AWS cloud, the source added.
A job posting on Amazon's jobs website advertising a position for a "Global Head of Policy at AWS Trust & Safety," which Reuters last saw ahead of this story's publication on Thursday, was no longer available on the Amazon site on Friday.
The posting, which is still available on LinkedIn, describes the new role as one that will "identify policy gaps and propose scalable solutions," "develop frameworks to assess risk and guide decision making" and "develop effective problem escalation mechanisms."
The LinkedIn posting also states that the position "will make clear recommendations to the management of AWS."
The Amazon spokesperson said the job posting had been temporarily removed from the Amazon website for editing and should not have been published in its draft form.
AWS's offerings include cloud storage and virtual servers, and it counts major companies such as Netflix (NFLX.O), Coca-Cola (KO.N) and Capital One (COF.N) among its customers, according to its website.
PROACTIVE MOVES
Being better prepared for certain types of content could help Amazon avoid legal and public relations risks.
"If (Amazon) can proactively remove some of this content before it is discovered and becomes a big news story, it helps avoid that reputational damage," said Melissa Ryan, founder of CARD Strategies, a consulting firm that helps organizations understand extremism and online toxicity threats.
Cloud services such as AWS, along with other entities such as domain registrars, are considered the "backbone of the internet," but they have traditionally been politically neutral services, according to a 2019 report by Joan Donovan, a Harvard researcher who studies online extremism and disinformation campaigns.
But cloud providers have removed content before, such as in the aftermath of the 2017 alt-right rally in Charlottesville, Virginia, which helped slow the ability of alt-right groups to organize, Donovan wrote.
“Most of these companies naturally didn’t want to get into content and didn’t want to be the arbiter of thought,” Ryan said. “But when you talk about hate and extremism, you have to take a stand.”
Reporting by Sheila Dang in Dallas; Editing by Kenneth Li, Lisa Shumaker, Sandra Maler and William Mallard
Our Standards: Thomson Reuters Trust Principles.