Two and a half years ago, we outlined our approach to removing content from Google products and services. Our process hasn’t changed since then, but our recent decision to stop censoring search on Google.cn has raised new questions about when we remove content, and how we respond to censorship demands by governments. So we figured it was time for a refresher.
Censorship of the web is a growing problem. According to the Open Net Initiative, the number of governments that censor has grown from about four in 2002 to over 40 today. In fact, some governments are now blocking content before it even reaches their citizens. Even restrictions imposed with benign intentions can shade into real censorship. And repressive regimes are building firewalls and cracking down on dissent online -- dealing harshly with anyone who breaks the rules.
Increased government censorship of the web is undoubtedly driven by the fact that record numbers of people now have access to the Internet, and that they are creating more content than ever before. For example, over 24 hours of video are uploaded to YouTube every minute of every day. This creates big challenges for governments used to controlling traditional print and broadcast media. While everyone agrees that there are limits to what information should be available online -- child pornography being the clearest example -- many of the new government restrictions we are seeing today not only strike at the heart of an open Internet but also violate Article 19 of the Universal Declaration of Human Rights, which states: “Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”
We see these attempts at control in many ways. China is the most polarizing example, but it is not the only one. Google products -- from search and Blogger to YouTube and Google Docs -- have been blocked in 25 of the 100 countries where we offer our services. In addition, we regularly receive government requests to restrict or remove content from our properties. When we receive those requests, we examine them closely to ensure they comply with the law, and if we think they’re overly broad, we attempt to narrow them down. Where possible, we are also transparent with our users about what content we have been required to block or remove, so they understand that they may not be getting the full picture.
On our own services, we deal with controversial content in different ways, depending on the product. As a starting point, we distinguish between search (where we are simply linking to other web pages), the content we host, and ads. In a nutshell, here is our approach:
Search is the least restrictive of all our services, because search results are a reflection of the content of the web. We do not remove content from search globally except in narrow circumstances, like child pornography, certain links to copyrighted material, spam, malware, and results that contain sensitive personal information like credit card numbers. In particular, we don’t want to engage in political censorship. This is especially true in countries like China and Vietnam that lack democratic processes through which citizens can challenge censorship mandates. We carefully evaluate whether or not to establish a physical presence in countries where political censorship is likely to occur.
Some democratically elected governments in Europe and elsewhere do have national laws that prohibit certain types of content. Our policy is to comply with the laws of these democratic governments -- for example, those that make pro-Nazi material illegal in Germany and France -- and remove search results only from our local search engine (for example, www.google.de in Germany). We also comply with youth protection laws in countries like Germany by removing links to certain material that is deemed inappropriate for children, or by enabling SafeSearch by default, as we do in Korea. Whenever we do remove content, we display a notice telling our users how many results have been removed to comply with local law, and we report those removals to chillingeffects.org, a project run by the Berkman Center for Internet and Society that tracks online restrictions on speech.
Platforms that host content, like Blogger, YouTube, and Picasa Web Albums, have content policies that outline what is, and is not, permissible on those sites. A good example of content we do not allow is hate speech. Our enforcement of these policies results in the removal of more content from our hosted platforms than we remove from Google Search. Blogger, as a pure platform for expression, is among the most open of our services, allowing, for example, legal pornography, as long as it complies with the Blogger Content Policy. YouTube, as a community built around sharing, comments, and other user-to-user interactions, has its own Community Guidelines that define the rules of the road. Pornography, for example, is absolutely not allowed on YouTube.
We try to make it as easy as possible for users to flag content that violates our policies. Here’s a video explaining how flagging works on YouTube. We review flagged content across all our products 24 hours a day, seven days a week, and remove offending content from our sites. And if there are local laws where we do business that prohibit content that would otherwise be allowed, we restrict access to that content only in the country that prohibits it. For example, in Turkey, videos that insult the founder of modern Turkey, Mustafa Kemal Atatürk, are illegal. Two years ago, we were notified of such content on YouTube and blocked access, within Turkey, to the videos that violated local law. A Turkish court subsequently demanded that we block them globally, which we refused to do, arguing that Turkish law cannot apply outside Turkey. As a result, YouTube has been blocked there.
Finally, our ads products have the most restrictive policies, because they are commercial products intended to generate revenue.
These policies are always evolving. Decisions to allow, restrict or remove content from our services and products often require difficult judgment calls. We have spirited debates about the right course of action, whether it’s about our own content policies or the extent to which we resist a government request. In the end, we rely on the principles that sit at the heart of everything we do.
We’ve said them before, but in these particularly challenging times, they bear repeating: We have a bias in favor of people's right to free expression. We are driven by a belief that more information means more choice, more freedom and ultimately more power for the individual.