
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:media="http://search.yahoo.com/mrss/">
    <channel>
        <title><![CDATA[ The Cloudflare Blog ]]></title>
        <description><![CDATA[ Get the latest news on how products at Cloudflare are built, technologies used, and join the teams helping to build a better Internet. ]]></description>
        <link>https://blog.cloudflare.com</link>
        <atom:link href="https://blog.cloudflare.com/" rel="self" type="application/rss+xml"/>
        <language>en-us</language>
        <image>
            <url>https://blog.cloudflare.com/favicon.png</url>
            <title>The Cloudflare Blog</title>
            <link>https://blog.cloudflare.com</link>
        </image>
        <lastBuildDate>Mon, 13 Apr 2026 18:51:20 GMT</lastBuildDate>
        <item>
            <title><![CDATA[Digital Evidence Across Borders and Engagement with Non-U.S. Authorities]]></title>
            <link>https://blog.cloudflare.com/digital-evidence-across-borders-and-engagement-with-non-us-authorities/</link>
            <pubDate>Thu, 28 Feb 2019 13:00:00 GMT</pubDate>
            <description><![CDATA[ Since we first started reporting in 2013, our transparency report has focused on requests from U.S. law enforcement. Previous versions of the report noted that, as a U.S. company, we ask non-U.S. law enforcement agencies to obtain formal U.S. legal process before providing customer data.  ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Since we first started reporting in 2013, our transparency report has focused on requests from U.S. law enforcement. Previous versions of the report noted that, as a U.S. company, we ask non-U.S. law enforcement agencies to obtain formal U.S. legal process before providing customer data.</p><p>As more countries pass laws that seek to extend beyond their national borders and as we expand into new markets, the question of how to handle requests from non-U.S. law enforcement has become more complicated. It seems timely to talk about our engagement with non-U.S. law enforcement and how our practice is changing. But first, some background on the changes that we’ve seen over the last year.</p>
    <div>
      <h3>Law enforcement access to data across borders</h3>
      <a href="#law-enforcement-access-to-data-across-borders">
        
      </a>
    </div>
    <p>The explosion of cloud services -- and the fact that data may be stored outside the countries of residence of those who generated it -- has been a challenge for governments conducting law enforcement investigations. A number of U.S. laws, like the Stored Communications Act or the Electronic Communications Privacy Act, restrict companies from providing particular types of data, such as the content of communications, to any person or entity, including foreign law enforcement agencies, without U.S. legal process. To get access to electronic data stored outside their home borders, law enforcement agencies around the world have long used Mutual Legal Assistance Treaties (MLATs) that allow one country to ask for another country’s help to get access to evidence. Unfortunately, the MLAT process can be slow and cumbersome.</p><p>Countries frustrated by the inability of law enforcement to quickly gather evidence held outside their borders have taken matters into their own hands. Some have proposed laws mandating that important data about their citizens remain in country, where it can be easily accessed when requested. Others have proposed laws that would allow law enforcement to get access to data wherever it is stored, which puts companies in the position of potentially violating one country’s laws in order to comply with another’s.</p><p>In short, a new paradigm that allows law enforcement to access appropriate digital evidence across borders, with sufficient procedural safeguards to protect our users’ privacy and ensure due process, is long overdue.</p>
    <div>
      <h3>U.S. CLOUD Act</h3>
      <a href="#u-s-cloud-act">
        
      </a>
    </div>
    <p>In March 2018, the U.S. Congress passed the Clarifying Lawful Overseas Use of Data (CLOUD) Act as part of a large bill funding the government. The idea behind the law is that governments that protect their citizens’ due process rights and civil liberties should be able to get access to electronic content related to their citizens when conducting law enforcement investigations, wherever that data is stored.</p><p>The CLOUD Act anticipates that the U.S. government will enter into agreements with other countries’ governments to give each of the participating governments access to data stored in other participating countries for the purpose of investigating and prosecuting certain crimes. Under the law, the U.S. government will have to determine that a country has “robust substantive and procedural protections for privacy and civil liberties” before entering into an agreement with that country. After a country enters a formal agreement with the United States, U.S. companies would no longer be restricted by U.S. law from providing that country’s law enforcement with access to content data in response to a valid law enforcement request.</p><p>From a practical standpoint, the CLOUD Act envisions that U.S. companies like Cloudflare will be providing information directly to governments that have entered into agreements with the U.S. government. The idea is to change the relevant question away from “where is the data stored?” to “is the person being investigated a citizen or resident of the country asking for the information?”, recognizing every government’s right to investigate crimes that occur within its borders or affect its citizens.</p>
    <div>
      <h3>Movement in Europe</h3>
      <a href="#movement-in-europe">
        
      </a>
    </div>
    <p>Governments outside the United States have also moved forward with proposals that would give law enforcement agencies the authority to obtain information related to their citizens across borders. The United Kingdom, for example, has been working to update its laws and negotiate a bilateral agreement with the United States for access to data maintained by U.S. companies, consistent with the framework established in the CLOUD Act.</p><p>The European Union has also been active in moving forward with a framework on obtaining electronic evidence across borders. Much like the U.S. CLOUD Act, the European Commission’s eEvidence Regulation would allow EU Member States to seek digital evidence outside of their national borders provided that fundamental rights are protected. The European Commission also envisions entering into negotiations with U.S. authorities on data sharing arrangements under the mandate of EU Member States.</p>
    <div>
      <h3>So where does all of this leave us?</h3>
      <a href="#so-where-does-all-of-this-leave-us">
        
      </a>
    </div>
    <p>As a U.S. company that stores customer records inside the United States, Cloudflare has long held the view that non-U.S. governments should have to follow U.S. due process requirements in order to obtain any records about our customers. When non-U.S. governments have come to us requesting records, we have explained the nature of our service and, to the extent they were interested in obtaining data, encouraged them to submit a request to the U.S. Department of Justice through the MLAT process.</p><p>But these processes serve an important function; they are not just a way to delay the efforts of foreign law enforcement. They have helped us address some of the more challenging requests that we have seen. Let’s say, for example, law enforcement from an otherwise-respected nation sent us a court order demanding information about websites run by a vocal group of dissenters or even the organizers of a separatist referendum and also asked us to redirect that website to a location of their choosing. In that case, we would direct that foreign agency to submit an MLAT request. In situations like this, we might not receive subsequent legal process from the U.S. government, either because the requesting government declined to ask the Department of Justice for an MLAT related to activity that could be viewed as political or because the Department of Justice declined to process it.</p><p>With the changing legal and policy landscape, as well as our increased presence in non-U.S. locations, we think it’s time to take a step towards the new framework that is taking shape.</p>
    <div>
      <h3>What type of information could we provide to non-US law enforcement?</h3>
      <a href="#what-type-of-information-could-we-provide-to-non-us-law-enforcement">
        
      </a>
    </div>
    <p>The overwhelming majority of information that U.S. law enforcement seeks from Cloudflare through legal process is what we consider to be basic subscriber data -- the type of information that customers give us when they sign up for service. That includes things like name, email address, physical address, phone number, the means and source of payment, and non-content information about a customer’s account, such as data about login times and IP addresses used to log in to the account.</p><p>Although we consider this account information to be private customer data, worthy of protection, we share the commonly held view that it is less sensitive than information considered to be content, such as email communications or documents created by users. In fact, U.S. law allows law enforcement to compel us to provide basic subscriber data with a subpoena, a type of legal process that does not require prior judicial review.</p><p>Recent policy discussions have convinced us that there may be situations where it is appropriate to provide this type of basic subscriber information to non-U.S. law enforcement in response to non-U.S. legal process similar to a subpoena, a view in line with that of many other tech companies. We may therefore respond to requests for subscriber information if a government is seeking information about a crime in its country or about its citizens, we have employees in the country, and appropriate due process requirements and international standards have been met. We will also consider whether the country has signed a CLOUD Act agreement with the United States.</p><p>The CLOUD Act and other existing U.S. laws govern the provision of more sensitive content data to non-U.S. law enforcement. U.S. companies are legally prohibited from providing content data to a non-U.S. government absent a U.S. CLOUD Act agreement with that country. 
Given the nature of our service, however, we rarely have records that constitute content that we could provide to law enforcement regardless of jurisdiction.</p>
    <div>
      <h3>Overall Principles We Follow</h3>
      <a href="#overall-principles-we-follow">
        
      </a>
    </div>
    <p>When we talk about our relationship with law enforcement, we often say that it is not Cloudflare's intent to make law enforcement's work any harder or any easier. We respect both that law enforcement agencies have a job to do and that our customers have rights relating to how their data is shared with law enforcement.</p><p>Regardless of what government is asking, there are certain standards we believe must be followed before we turn over customer data. Our goal is to maintain a healthy and open relationship with law enforcement officials so that they understand and respect our positions on each of these standards. The principles which remain important to us are as follows:</p><ul><li><p><b>Require Due Process.</b> Cloudflare requires government entities seeking access to personal customer information to obtain appropriate legal process, including prior independent judicial review of any request for content.</p></li><li><p><b>Provide Notice.</b> We believe our customers deserve to be notified when we receive legal requests for their information, whether the requests come from law enforcement or private parties involved in civil litigation. We will provide that notice before we disclose the information, unless prohibited by law.</p></li><li><p><b>Protect Privacy and User Rights.</b> Whether inside or outside the United States, Cloudflare will fight law enforcement requests that we believe are overbroad, illegal, or wrongly issued. This includes requests to delay or prevent notice that appear unnecessarily broad, given the government interests at stake.</p></li><li><p><b>Be Transparent.</b> We believe the ability to report on the numbers and types of requests that we get from law enforcement, as well as how we respond, is critical to building trust with our customers. 
We will fight requests that unnecessarily restrict our ability to be transparent with our users.</p></li></ul><p>Consistent with the last standard, we also intend to update our transparency report to reflect any requests that we receive from non-U.S. law enforcement authorities, whether for user information or anything else.</p> ]]></content:encoded>
            <category><![CDATA[Policy & Legal]]></category>
            <category><![CDATA[Politics]]></category>
            <category><![CDATA[Abuse]]></category>
            <category><![CDATA[Due Process]]></category>
            <category><![CDATA[Community]]></category>
            <guid isPermaLink="false">4YcHdL78G4t1QL1hKNYsbS</guid>
            <dc:creator>Caroline Greer</dc:creator>
        </item>
        <item>
            <title><![CDATA[Out of the Clouds and into the weeds: Cloudflare’s approach to abuse in new products]]></title>
            <link>https://blog.cloudflare.com/out-of-the-clouds-and-into-the-weeds-cloudflares-approach-to-abuse-in-new-products/</link>
            <pubDate>Wed, 27 Feb 2019 13:00:00 GMT</pubDate>
            <description><![CDATA[ In a blogpost yesterday, we addressed the principles we rely upon when faced with numerous and various requests to address the content of websites that use our services.  ]]></description>
            <content:encoded><![CDATA[ <p></p><p>In a <a href="/unpacking-the-stack-and-addressing-complaints-about-content/">blogpost</a> yesterday, we addressed the principles we rely upon when faced with numerous and various requests to address the content of websites that use our services. We believe the building blocks that we provide for other people to share and access content online should be provided in a content-neutral way. We also believe that our users should understand the policies we have in place to address complaints and law enforcement requests, the type of requests we receive, and the way we respond to those requests. In this post, we do the dirty work of addressing how those principles are put into action, specifically with regard to Cloudflare’s expanding set of features and products.</p>
    <div>
      <h3>Abuse reports and new products</h3>
      <a href="#abuse-reports-and-new-products">
        
      </a>
    </div>
    <p>Currently, we receive abuse reports and law enforcement requests on fewer than one percent of the more than thirteen million domains that use Cloudflare’s network. Although the reports we receive run the gamut -- from phishing, malware or other technical abuses of our network to complaints about content -- the overwhelming majority are allegations of copyright violations or violations of other intellectual property rights. Most of the complaints that we receive do not identify concerns with particular Cloudflare services or products.</p><p>In the last year or so, we’ve also launched a variety of new products, including our video product (<a href="https://www.cloudflare.com/products/stream-delivery/">Cloudflare Stream</a>), a serverless edge computing platform (<a href="https://www.cloudflare.com/products/cloudflare-workers/">Cloudflare Workers</a>), a <a href="https://www.cloudflare.com/products/registrar/">self-serve registrar service</a>, and a privacy-focused recursive resolver (<a href="https://1.1.1.1/">1.1.1.1</a>), among others. Each of these services raises its own complex set of questions.  </p><p>There is no one-size-fits-all solution to address possible abuse of our products. Different types of services come with different expectations, as well as different legal and contractual obligations. Yet as we discussed in relation to our focus on transparency on <a href="/cloudflare-transparency-update-joining-cloudflares-flock-of-warrant-canaries-2/">Monday</a>, being fully transparent means being consistent and predictable so our users can anticipate how we will respond to new situations.</p>
    <div>
      <h3>Developing an approach to abuse</h3>
      <a href="#developing-an-approach-to-abuse">
        
      </a>
    </div>
    <p>To help us sort through how to address both complaints and law enforcement requests, when we introduce new products or features, we ask ourselves four basic sets of questions about the relationship between the service we’re providing and potential complaints about content:</p><ul><li><p>First, how are Cloudflare’s services interacting with the website content? For example, are we doing anything more than providing security and acting as a reliable conduit from one location to another? Are we providing definitive storage of content? Did we provide the website its domain name through our registrar service? Is the Cloudflare service or product doing anything that could be seen as organizing, analyzing, or promoting content?</p></li><li><p>Second, what type of action might a law enforcement agency or private complainant want us to take, and what would the consequences be? What sort of information might law enforcement request -- private information about the user, content of what was sent over the Internet, or logs that would track activity? Will third parties request information about a website, or ask for removal of content from the Internet? Would removing our services address the problem presented?</p></li><li><p>Third, what laws, regulations or contractual requirements apply? Does the nature of our interaction with the online content impact our legal obligations? Has the law enforcement request or regulation satisfied basic principles of the rule of law or due process?</p></li><li><p>Fourth, will our response to the matter presented scale to address the variety of different requests or complaints we may receive over time, covering a variety of different subject matters and viewpoints? Can we craft a principled and content-neutral process to respond to the request? 
Would our response have an overbroad impact, either by impacting more than the problematic content or changing the Internet in jurisdictions beyond the one that has issued the law or regulation at issue?</p></li></ul><p>Although those preliminary questions help us determine what actions we must take, we also do our best to think about the broader implications on the Internet of any steps we might take to address complaints.</p>
    <div>
      <h2>So how does this work in practice?</h2>
      <a href="#so-how-does-this-work-in-practice">
        
      </a>
    </div>
    
    <div>
      <h3>Response to abuse complaints for customers using our proxy and CDN services</h3>
      <a href="#response-to-abuse-complaints-for-customers-using-our-proxy-and-cdn-services">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/7fYyp9YRicdb7b4tQSIBnS/6ae08708e364e32a5c907f04d1b2459c/image5.png" />
            
            </figure><p>People often come to Cloudflare with abuse complaints because our network sits in front of our customers’ sites in order to protect them from cyber attacks and to improve the performance of their website.</p><p>There aren’t a lot of laws or regulations that impose obligations to address content on those providing security or <a href="https://www.cloudflare.com/learning/cdn/what-is-a-cdn/">CDN services</a>, for good reason. Most people complaining about content are looking for someone who can take that content off the Internet entirely. As we’ve talked about on <a href="/thoughts-on-abuse/">other</a> <a href="/anonymity-and-abuse-reports/">occasions</a>, Cloudflare is unable to remove content that we don’t host, so we therefore try to make sure that the complaint gets to its intended audience -- the hosting provider who has the ability to remove the material from the Internet. As described on <a href="https://www.cloudflare.com/abuse/">our abuse page</a>,  complaining parties automatically receive information about how to contact the hosting provider, and unless the complaining party requests otherwise, abuse complaints are automatically forwarded to both the website owner and the hosting company to allow them to take action.</p><p>This approach has another benefit, consistent with the fourth set of questions we ask ourselves. It prevents addressing content with an unnecessarily blunt tool. Cloudflare is unable to remove its security and CDN services from only a sliver of problematic content on a website.  If we remove our services, it has to be from an entire domain or subdomain, which may cause considerable collateral damage. For example, think of the vast array of sites that allow individual independent users to upload content (“user generated content”). 
A website owner or host may be able to curate or deal with specific content, but if companies like Cloudflare had to respond to allegations of abuse over a single user’s upload of a single piece of concerning content by removing our core services from an entire site, leaving it vulnerable to cyberattack, those sites would be much more difficult to operate, and the content contributed by all other users would be put at risk.</p><p>Similarly, there are a number of different infrastructure services that cooperate to make sure each connection on the Internet can happen successfully – DNS, <a href="https://www.cloudflare.com/learning/dns/glossary/what-is-a-domain-name-registrar/">registrars</a>, registries, security, etc. If each of the providers of those services, any one of which could put the entire transmission at risk, is applying blunt tools to address content, then the aperture of what content stays online will get smaller and smaller. Those are bad results for the Internet. Actions to address troubling content online should focus narrowly on the actual concern to avoid unintended collateral consequences.</p><p>While we are unable to remove content we do not host, we are able to take steps to address abuse of our services, such as phishing and malware attacks. Phishing attacks typically fall into two buckets -- a website that has been compromised (unintentional phishing) or a website solely dedicated to intentionally misleading others to gather information (intentional phishing). These buckets are treated differently.</p><p>We discussed earlier that we aim to use the most precise tools possible when addressing abuse, and we take a similar approach for unintentional phishing content. If a website has been compromised (typically through an outdated CMS), we can place a warning interstitial page in front of that specific phishing content to protect users from accidentally falling victim to the attack. 
In the majority of situations, this action is taken at a URL level of granularity.</p><p>In the case of intentional phishing attacks, such as a domain like my-totally-secure-login-page{.}com, once our Trust &amp; Safety team is able to confirm the presence of phishing content on the website, we take broader action, including a domain-wide interstitial warning page (effectively *my-totally-secure-login-page{.}com/*); in some cases we may terminate our services to the intentionally malicious domain. To be clear though, this does not remove the phishing content, which remains hosted by the website’s hosting provider. Ultimately, action still needs to be taken by the website owner or hosting provider to fully remove the underlying issue.</p>
    <div>
      <h3>Response to complaints about content stored definitively on our network</h3>
      <a href="#response-to-complaints-about-content-stored-definitively-on-our-network">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/2Mz81IWy2rQJhZgHnVwXJ9/df8e0f2ec7ca2a0d1240131009164bbc/image4.png" />
            
            </figure><p>We think our approach requires a different set of responses for the small, but growing, number of Cloudflare products that include some sort of storage. Cloudflare Stream, for example, allows users to store, transcode, distribute and play back their videos. And Cloudflare Workers may allow users to store certain content at the edge of our network without a core host server. Although we are not a website hosting provider, these products mean we may be the only place where a certain piece of content is stored in some cases.</p><p>When we are the definitive repository for content through any of our services, Cloudflare will carefully review any complaints about that content and may disable access to it in response to a valid legal takedown request from either government or private actors. Most often, these legal takedown requests are from individuals alleging copyright infringement. Under the U.S. Digital Millennium Copyright Act, there is a specific process online storage providers follow to remove or disable access to content alleged to infringe copyright and provide an opportunity for those who post the material to contest that it is infringing. We have already begun implementing this process for content stored on our network. That’s why we’ve added a new section to our <a href="https://cloudflare.invisionapp.com/share/RUPOO3MPDKH#/screens">transparency report</a> on requests for content takedown pursuant to U.S. copyright law for content that is stored on our network.</p><p>We haven’t received any government requests yet to take down content stored on our network. Given the significant potential impact on freedom of expression from a government ordering that content be removed, if we do receive those requests in the future, we will carefully analyze the factual basis and legal authority for the request. 
If we determine that the order is valid and requires Cloudflare action, we will do our best to address the request as narrowly as possible, for example, by clarifying overbroad requests or limiting blocking of access to the content to those areas where it violates local law, a practice known as “geo-blocking”. We will also update our transparency report on any government requests that we receive in the future and any actions we take.</p>
    <div>
      <h3>Response to complaints about our registrar service</h3>
      <a href="#response-to-complaints-about-our-registrar-service">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/6FxcoT7686OkzBPJTPM7tN/ed90c776932edafbc6b95d59377d1703/registrar.png" />
            
            </figure><p>If you sign up for our self-serve registrar service, you’re legally bound by the terms of our contract with the Internet Corporation for Assigned Names and Numbers (ICANN), a non-profit organization responsible for coordinating unique Internet identifiers across the world, as well as our contract with the relevant domain name registry.</p><p>Our registrar-focused <a href="https://www.cloudflare.com/products/registrar/abuse/">web page</a> for abuse reporting does not reference abuse complaints about a website’s content. In our role as a domain registrar, Cloudflare has no control or ability to remove particular content from a domain. We would be limited to simply revoking or suspending the domain registration altogether, which would remove the website owner’s control over the <a href="https://www.cloudflare.com/learning/dns/glossary/what-is-a-domain-name/">domain name</a>. Such actions would typically be taken only at the direction of the relevant domain name registry, in accordance with its registration rules for the <a href="https://www.cloudflare.com/learning/dns/top-level-domain/">Top Level Domain</a>, or, more commonly, to address incidents of abuse raised by the registry or ICANN. We therefore treat content-related complaints submitted based on our registrar services the same way we treat complaints about content for sites using our CDN or proxy services. We forward them to the website owner and the website hosting company to allow them to take action, or we work in tandem with the relevant registry and at its direction.</p><p>Running a registrar service comes with other legal obligations. As an ICANN accredited registrar, our contractual obligations include adhering to third party dispute resolution processes regarding trademark disputes, as handled by providers such as the World Intellectual Property Organization (WIPO) and the National Arbitration Forum. 
Also, we continue to be part of the ICANN community discussions on how best to handle the collection, publication and provision of access to personal data in the WHOIS database in a manner consistent with the EU’s General Data Protection Regulation (GDPR) and other privacy frameworks. We will provide more updates on that front when the discussions have ripened.</p>
    <div>
      <h3>Response to complaints about IPFS</h3>
      <a href="#response-to-complaints-about-ipfs">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/5T3SHdqfJMZSvtb0C4LBbo/84cd4798a1cb309eeae75972d2a3ca8e/ipfs.png" />
            
            </figure><p>Back in September, we <a href="/distributed-web-gateway/">announced</a> that Cloudflare would be providing a gateway to the InterPlanetary File System (IPFS). Cloudflare’s IPFS gateway is a way to access content stored on the IPFS peer-to-peer network. Because Cloudflare is not acting as the definitive storage for the IPFS network, we do not have the ability to remove content from that network. We simply operate as a cache in front of IPFS, much as we do for our more traditional customers.</p><p>Because content is stored on potentially dozens of nodes in IPFS, if one node that was caching content goes down, the network will just look for the same content on another node. That fact makes IPFS exceptionally resilient. That same resilience, however, means that unlike with our traditional customers, with IPFS, there is no single host to inform of a complaint about content stored on the IPFS network.  Cloudflare often has no knowledge of who the owner is of content being accessed through the gateway, and this makes it impossible to notify the specific owner when we receive a complaint.</p><p>The law hasn’t yet quite caught up with distributed networks like IPFS, and there’s a notable debate among IPFS users about how best to deal with abuse. Some argue that having problematic content stored on IPFS will discourage adoption of the protocol, and advocate for the development of lists of problematic hashes that  IPFS gateways could choose to block. Others point out that any mechanism intended to block IPFS content will itself be subject to abuse. We don’t have the answer to that debate, but it does demonstrate to us the importance of being thoughtful about how we proceed.</p><p>For the time being, our plan is to respond to U.S. court orders that require us to clear our cache of content stored on IPFS. 
More importantly, however, we intend to report in future transparency reports on any law enforcement requests we receive to clear our IPFS cache, to ensure continued public discussion.</p>
    <div>
      <h3>Cloudflare Resolvers: 1.1.1.1 and Resolver for Firefox</h3>
      <a href="#cloudflare-resolvers-1-1-1-1-and-resolver-for-firefox">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/atuUDCyhmzyh4RqbtOd6U/76647f964b85043f8d1296e5dd038dfd/1111-1.gif" />
            
            </figure><p>In April of last year, we <a href="/announcing-1111/">launched</a> our first DNS resolver, 1.1.1.1. In June, we partnered with Mozilla to provide direct DNS resolution from within the Firefox browser using the Cloudflare Resolver for Firefox. Our goal with both resolvers was to develop fast DNS services that were focused on user privacy.</p><p>We often get questions about how we deal with both abuse complaints and law enforcement requests related to our resolvers. Both of our resolvers are intended to provide only direct DNS resolution. In other words, Cloudflare does not block or filter content through either 1.1.1.1 or the Cloudflare Resolver for Firefox. If Cloudflare were to receive a request from a law enforcement or government agency to block access to domains or content through one of our resolvers, Cloudflare would fight that request. At this point, we have not received any government requests to block content through our resolvers. Cloudflare would also document any request to block content from our resolvers in our semi-annual transparency report, unless we were legally prohibited from doing so.</p><p>Similarly, Cloudflare has not received any government requests for data about the users of our resolvers, and would fight such a request if necessary. Given our public commitment not to retain any personally identifiable information for more than 24 hours, we believe it is unlikely that we would have any information even if asked. Nonetheless, if we were to receive a government request for data about a resolver user, we would document the request in our transparency report, unless legally prohibited from doing so.</p>
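To illustrate what “direct DNS resolution” looks like in practice, here is a minimal sketch of querying the 1.1.1.1 resolver through its public DNS-over-HTTPS JSON endpoint. The endpoint and the <code>name</code>/<code>type</code> parameters follow the documented DoH JSON API; the helper function names are our own, for illustration.

```python
# Minimal sketch: querying the 1.1.1.1 resolver over its public
# DNS-over-HTTPS JSON API. The endpoint and the name/type parameters
# match the documented API; the helper names here are illustrative.
import json
import urllib.parse
import urllib.request

DOH_ENDPOINT = "https://cloudflare-dns.com/dns-query"

def build_query_url(name: str, record_type: str = "A") -> str:
    """Build a DoH JSON API query URL for the given name and record type."""
    params = urllib.parse.urlencode({"name": name, "type": record_type})
    return f"{DOH_ENDPOINT}?{params}"

def resolve(name: str, record_type: str = "A") -> dict:
    """Resolve a name via 1.1.1.1 and return the parsed JSON response."""
    req = urllib.request.Request(
        build_query_url(name, record_type),
        headers={"accept": "application/dns-json"},  # request the JSON answer format
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example (requires network access):
#   answer = resolve("cloudflare.com")
#   for record in answer.get("Answer", []):
#       print(record["data"])
```

The resolver simply returns whatever the authoritative DNS data says; there is no filtering layer in the path, which is the design choice the paragraph above describes.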
    <div>
      <h3>The long road ahead</h3>
      <a href="#the-long-road-ahead">
        
      </a>
    </div>
    
            <figure>
            
            <img src="https://cf-assets.www.cloudflare.com/zkvhlag99gkb/52nr5Co31KS2aVzil4x90h/c2d650f2d18ca8c78d0a13a9148a9603/road.png" />
            
            </figure><p>Although the products we offer, as well as the legal and regulatory landscape, may change over the years, we expect that our approach to thinking about new products will stand the test of time. We’re guided by some central principles -- allowing our infrastructure to be as neutral as possible, following the rule of law or requiring due process, being open about what we’re doing, and making sure that we’re consistent regardless of the wide variety of issues we face. And we will work hard to make sure that doesn’t change, because even the smallest tweaks to the way we do things can have a significant impact at the scale we operate.</p> ]]></content:encoded>
            <category><![CDATA[Freedom of Speech]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <category><![CDATA[Politics]]></category>
            <category><![CDATA[Abuse]]></category>
            <category><![CDATA[Due Process]]></category>
            <category><![CDATA[Community]]></category>
            <guid isPermaLink="false">3TokDJcXCygYPTjnifbwUM</guid>
            <dc:creator>Justin Paine</dc:creator>
        </item>
        <item>
            <title><![CDATA[Unpacking the Stack and Addressing Complaints about Content]]></title>
            <link>https://blog.cloudflare.com/unpacking-the-stack-and-addressing-complaints-about-content/</link>
            <pubDate>Tue, 26 Feb 2019 13:00:00 GMT</pubDate>
            <description><![CDATA[ Although we are focused on protecting and optimizing the operation of the Internet, Cloudflare is sometimes the target of complaints or criticism about the content of a very small percentage of the more than thirteen million websites that use our service. ]]></description>
            <content:encoded><![CDATA[ <p></p><p>Although we are focused on protecting and optimizing the operation of the Internet, Cloudflare is sometimes the target of complaints or criticism about the content of a very small percentage of the more than thirteen million websites that use our service. Our termination of services to the Daily Stormer website a year and a half ago drew significant attention to our approach to these issues and prompted a lot of thinking on our part.</p><p>At the time, Matthew <a href="/why-we-terminated-daily-stormer/">wrote</a> that calls for service providers to reject some online content should start with a consideration of how the Internet works and how the services at issue up and down the stack interact with that content. He tasked Cloudflare’s policy team with engaging broadly to try to find an answer. With some time having passed, we want to take stock of what we’ve learned and where we stand in addressing problematic content online.</p>
    <div>
      <h3>The aftermath of the Daily Stormer decision</h3>
      <a href="#the-aftermath-of-the-daily-stormer-decision">
        
      </a>
    </div>
    <p>The weeks immediately following the decision in August 2017 were filled with conversations. Matthew made sure the Cloudflare team accepted every single invitation to talk about these issues; we didn’t simply put out a press release or “no comment” anyone. Our senior leadership team spoke with the media and with our employees -- some of whom had received threats related both to Cloudflare’s provision of services to the Daily Stormer and to the termination of those services. On the policy side, we spoke with a broad range of ideologically diverse advocacy groups who reached out to alternately congratulate or chastise us for the decision.</p><p>As the time stretched into months, the conversations changed. We spoke with organizations that have made it their mission to fight hate and intolerance, with human rights organizations that depend on access to the Internet, with tech companies doing their best to moderate content, with academics who think about and research all aspects of content online, and with interested government and non-governmental organizations on two continents. In the end, we spoke with hundreds of different experts, groups, and entities about how different companies and different types of services address troubling content at different places in the Internet stack.</p><p>Our overwhelming sense from these conversations is that the Internet, and the industry that has grown up around it, is at a crossroads. Policy makers and the public are rightly upset about misuse of the Internet. We heard repeatedly that the world is moving away from the Internet as a neutral platform for people to express themselves and access information. Many governments and many of the constituents they represent appear to want the Internet cleaned up and stripped of troubling content through any technical means necessary, even if it means that innovation will be stifled and legitimate voices will be silenced. 
And companies large and small seem to be going along with it.</p>
    <div>
      <h3>Moving forward</h3>
      <a href="#moving-forward">
        
      </a>
    </div>
    <p>We’ve thought long and hard about what’s next both for us and the Internet in general. Although we share concerns about the exploitation of online tools, we are convinced that there are ways forward that do not shortchange the security, availability, and promise of the Internet.</p><p>We think the right solution will take us out of the clouds and into the weeds. We have to figure out what core functions need to be protected to have the Internet we want, and we will have to get away from the idea that there’s a one-size-fits-all solution that will address the problems we see. If we really want to address risks online while maintaining the Internet as a forum for communication, commerce, and free expression, different kinds of services are going to have to deal with abuse differently.</p><p>The more we talked to people, the more we saw a fundamental split on the Internet between the services that substantively touch content and the infrastructure services that do not. It’s possible that, as a company that itself largely provides infrastructure services, we were looking for this distinction. But we believe the distinction is real and helps explain why different businesses make distinctly different choices. As we discuss in our blog posts on transparency this week, the approach to questions about abuse complaints will mean different things for different Cloudflare products. Although we are not at the point yet where Cloudflare’s products organize, analyze, or promote content, we are aware that this conclusion may have implications for us in the future.</p>
    <div>
      <h3>Content curators</h3>
      <a href="#content-curators">
        
      </a>
    </div>
    <p>The Internet has revolutionized the way we communicate and access information. Because of the way the Internet works, everyone online has the opportunity to create and consume the equivalent of their own newspaper or television network. Almost any content you could want is available, if you can find it. That idea is at the heart of the divide between services that curate content -- like social media platforms and search engines -- and basic Internet infrastructure services.</p><p>Content curators make content-based decisions for a business purpose. For a search engine, that might mean algorithmically reviewing content to best match what is sought by the user. For a social media site, it might be a review of content to help predict what content the user will want to see next or what advertising might be most appealing.</p><p>For these types of online products, users understand and generally expect that the services will vary based on content. Different search engines yield different results; different social media platforms will promote different content for you to review. These services are the Internet’s equivalents of the very small circle of newspaper editors or television network executives of old, making decisions about what you see online based on what they think you’ll want to see.</p><p>The value in these content curator services depends on how well they analyze, use, and make judgments about content. From a business perspective, that means that these services want the flexibility to include or exclude particular content from their platforms. For example, it makes perfect sense for a platform that advertises itself as building community to have rules that prevent the community from being disrupted with hate-filled messages and disturbing content.</p><p>We should expect content curator services to moderate content and should give them the flexibility to do so. 
If these services are transparent about what they allow and don’t allow, and how they make decisions about what to exclude, they can be held accountable the same way people hold other businesses to account. If people don’t like the judgments being made, they can take their business to a platform or service that’s a better fit.</p>
    <div>
      <h4>Basic Internet infrastructure services</h4>
      <a href="#basic-internet-infrastructure-services">
        
      </a>
    </div>
    <p>Basic Internet services, on the other hand, facilitate the business of other providers and website owners by providing infrastructure that enables access to the Internet. These types of services -- which Matthew described in detail in the Daily Stormer <a href="/why-we-terminated-daily-stormer/">blog post</a> -- include telecommunications services, hosting services, domain name services such as registry and <a href="https://www.cloudflare.com/learning/dns/glossary/what-is-a-domain-name-registrar/">registrar services</a>, and services to help optimize and secure Internet transmissions. The core expertise of these services is not content analysis, but providing the infrastructure needed for someone else to develop and analyze that content.</p><p>Because people expect these infrastructure services to be used to provide technical access to the Internet, the notion that these services might be used to monitor what you’re doing online or make decisions about what content you should be entitled to access feels like a misuse, or even an invasion of privacy.</p><p>Internet infrastructure is a lot like other kinds of physical infrastructure. At some basic level, we believe that everyone should be allowed to have housing, electricity, or telephone service, no matter what they plan to do with those services. Or that individuals should be able to send packages through FedEx or walk down the street wearing a backpack with a reasonable expectation they won’t be subject to unfounded search or monitoring. Much as we believe that the companies that provide these services should provide them to all, not just those with whom they agree, we continue to believe that basic Internet infrastructure services, which provide the building blocks for other people to create and access content online, should be provided in a content-neutral way.</p>
    <div>
      <h3>Complicated companies</h3>
      <a href="#complicated-companies">
        
      </a>
    </div>
    <p>Developing different expectations for content curation services and infrastructure services is tougher than it seems. Behemoths best known for content curation often provide infrastructure services as well. Alphabet, for example, provides content-neutral infrastructure services to millions of customers through Google Cloud and Google Domains, while also running one of the world’s largest content-curation sites in YouTube. And even if companies try to distinguish their infrastructure services from their content curation services, their customers may not.</p><p>In a world where content needs to be on a large network to stay online, there are only a handful of companies that can satisfy that need. Reducing that handful to those -- like Cloudflare -- that fall solely into the infrastructure bucket makes the number almost impossibly small. That is why we want to do a better job of talking about differences in expectations not by company, but by service.</p><p>And maybe we should also recognize that having only a small number of companies with robust enough networks to keep content online -- most of which do content curation -- is part of the problem. If you believe that the only way to be online is to be on a platform that curates content, you’re going to be rightly skeptical of that company’s right to take down content that it doesn’t want on its site. That doesn’t mean that a business that depends on analyzing content has to stop doing it, but it does make it that much more important that we have neutral infrastructure. It might be impossible for an alternate platform to be built, and for certain voices to have a presence online, without it.</p><p>The good news is that we’re not alone in our view of the fundamental difference between content curators and Internet infrastructure services. 
From the <a href="https://www.cloudflare.com/cloudflare-criticism/">criticism</a> we received for the Daily Stormer decision, to the <a href="https://www.techdirt.com/articles/20180819/00455840462/forget-about-social-media-content-moderation-get-ready-internet-infrastructure-content-moderation.shtml">commentary</a> of Mike Masnick at Techdirt, to the academic <a href="https://poseidon01.ssrn.com/delivery.php?ID=542020096000010096112083068071071102026044031032057003066126104028004098107027115066031056003008104040034096120064104017001089027091046046045108074101107103092011090089081106023090018070113114080075019004126030099064009084090096086093025085031070005&amp;EXT=pdf">analysis</a> of Yale Law Professor Jack Balkin, to the <a href="https://cyberstability.org/research/call-to-protect/">call</a> of the Global Commission on the Security of Cyberspace (GCSC) to protect the “public core” of the Internet, there’s an increasing awareness that not protecting neutral Internet infrastructure could undermine the Internet as we know it.</p>
    <div>
      <h3>Thoughts on due process</h3>
      <a href="#thoughts-on-due-process">
        
      </a>
    </div>
    <p>In his blog post on the Daily Stormer decision, Matthew talked about the importance of due process, the idea that you should be able to know the rules a system will follow if you participate in that system. But what we’ve learned in our follow-up conversations is that due process has a different meaning for content curators.</p><p>There has been a clamor for companies like Facebook and Google to explain how they make decisions about what to show their users, what they take down, and how someone can challenge those decisions. Facebook has even developed an “Oversight Board for Content Decisions” -- dubbed Facebook’s supreme court -- that is empowered to oversee the decisions the company makes based on its terms of service. Given that this process is based on terms of service, which the company can change at will to accommodate business decisions, it mostly seems like a way to build confidence in the company’s decision-making process. Instituting an internal review process may make users feel that the decisions are less arbitrary, which may help the company keep people in its community.</p><p>That idea of entirely privatized due process may make sense for content curators, who make content decisions by necessity, but we don’t believe it makes sense for those that provide infrastructure services. When access to basic Internet services is on the line, due process has to mean rules set and adjudicated by external decision-makers.</p>
    <div>
      <h3>Abuse on Internet infrastructure</h3>
      <a href="#abuse-on-internet-infrastructure">
        
      </a>
    </div>
    <p>Although we don’t believe it is appropriate for Cloudflare to decide which voices get to stay online by terminating basic Internet services because we think content is a problem, that’s far from the end of the story. Even for Internet infrastructure, there are other ways that problematic content online can be, and is, addressed.</p><p>Laws around the world provide mechanisms for addressing particular types of online content that governments decide are problematic. We can save for another day whether any particular law provides adequate due process and balances rights appropriately, but at a minimum, those who make these laws typically have a political legitimacy that infrastructure companies do not.</p><p>Tomorrow, we’ll talk about how we are operationalizing our view that it’s important to get into the weeds by considering how different laws apply to us on a service-by-service and function-by-function basis.</p> ]]></content:encoded>
            <category><![CDATA[Freedom of Speech]]></category>
            <category><![CDATA[Policy & Legal]]></category>
            <category><![CDATA[Politics]]></category>
            <category><![CDATA[Due Process]]></category>
            <category><![CDATA[Community]]></category>
            <guid isPermaLink="false">ZLdefAUX2U3eaijY9OeZe</guid>
            <dc:creator>Alissa Starzak</dc:creator>
        </item>
    </channel>
</rss>