
Control your traffic at the edge with Cloudflare

2016-09-29

4 min read

Today, we're introducing two new Cloudflare Traffic products to give customers control over how Cloudflare’s edge network handles their traffic, allowing them to shape and direct it for their specific needs.

More than 10 trillion requests flow through Cloudflare every month. More than 4 million customers and 10% of Internet requests benefit from our global network. Cloudflare's virtual backbone gives every packet improved performance, security, and reliability.

That's the macro picture.

What's more interesting is keeping each individual customer globally available. While every customer benefits from the network effect of Cloudflare, each customer is (appropriately) focused on their application uptime, security and performance.

Rate Limiting*

Cloudflare’s new Rate Limiting allows a customer to rate limit, shape or block traffic based on the number of requests per second per IP, cookie, or authentication token. Traffic can be controlled on a per-URI (with wildcards for greater flexibility) basis giving pinpoint control over a website, application, or API.
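To make the per-IP, per-URI idea concrete, here is a minimal sketch of the counting technique behind a rule like that: a fixed-window counter keyed by client IP and URI pattern. The window size, threshold, and key choice are illustrative assumptions, not the product's actual internals or rule schema.

```python
import time
from collections import defaultdict
from typing import Optional

# Illustrative fixed-window rate limiter. The 10-requests-per-10-seconds
# threshold and the (client IP, URI) key are hypothetical values chosen
# for this example.
WINDOW_SECONDS = 10
MAX_REQUESTS = 10

_counters = defaultdict(int)  # (client_ip, uri, window_index) -> request count

def allow_request(client_ip: str, uri: str, now: Optional[float] = None) -> bool:
    """Return True if the request is under the limit, False if it should be blocked."""
    now = time.time() if now is None else now
    window_index = int(now) // WINDOW_SECONDS
    key = (client_ip, uri, window_index)
    _counters[key] += 1
    return _counters[key] <= MAX_REQUESTS

# The 11th request from the same IP within one window gets rejected.
for i in range(12):
    if not allow_request("203.0.113.7", "/api/v1/*", now=1000.0):
        print(f"request {i + 1} blocked")
```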

Cloudflare Traffic Control

Customers seek reliability and availability in the face of popularity or unexpected traffic such as slow brute force attacks on a WordPress site, Denial of Service attacks against dynamic pages, or the stampede of requests that comes with success. We are the leader at stopping significant DDoS attacks and offer a comprehensive WAF to target specific application-level attacks.

Now we are adding the capability to give each customer fine-grained control over the traffic that reaches their origin servers.

Even well-engineered applications have a resource limit. Any dynamic endpoint either has a hard system limit or an economic limit on the number of servers you can afford. Those expensive endpoints need additional protection against floods of traffic, including legitimate visitors. You can provision for the worst case...but when you find the next Pokémon Go on your hands, even the best case hurts.

To shield origins from attack and preserve uptime, Cloudflare Rate Limiting lets you throttle, block, and otherwise control the flow of traffic to maintain availability, limit economic impact, and preserve performance. All of this can be done with thoughtful configuration: test rules to measure their impact, then apply changes globally within seconds.

That solves several problems. Rate Limiting protects APIs as well as web pages. Different versions of your API can not only have different rate limit triggers, but can also return custom JSON responses or response codes if, for instance, you want to obfuscate the standard HTTP 429 error code.
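As a sketch of what such per-version policies might look like, the rule definitions below use hypothetical field names (match, threshold, period, action) and a hypothetical "simulate" mode for measuring a rule's impact before enforcing it; they illustrate the idea rather than Cloudflare's published configuration format.

```python
# Hypothetical rate-limit rules for two API versions. The field names and
# values are illustrative only, not Cloudflare's actual rule schema.
rules = [
    {
        "match": {"url": "example.com/api/v1/*", "methods": ["GET", "POST"]},
        "threshold": 300,    # requests...
        "period": 60,        # ...per 60 seconds, per client
        "action": {
            "mode": "ban",
            "timeout": 600,  # block the offending client for 10 minutes
            # Obscure the standard 429 by returning a custom code and body.
            "response": {
                "status": 403,
                "content_type": "application/json",
                "body": '{"error": "quota exceeded, try again later"}',
            },
        },
    },
    {
        # A newer API version can carry a different, tighter trigger.
        "match": {"url": "example.com/api/v2/*", "methods": ["POST"]},
        "threshold": 100,
        "period": 60,
        # Hypothetical "simulate" mode: log what would be blocked, block
        # nothing, so the rule's impact can be measured before enforcement.
        "action": {"mode": "simulate"},
    },
]
```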

Static pages are easy to cache at Cloudflare's edge, so high traffic on a home page is welcome. But a competitor scraping your search results is different, and can cost real money in server resources in addition to disrupting business as usual. So Rate Limiting lets you define specific URLs with lower limits and different policies.

Similarly, rules designed to protect a login endpoint enable real users to keep accessing your application while defending against brute-force attacks designed to break into your system or simply exhaust your resources. Rate Limiting gives customers this power, in part, by distinguishing between POSTs and GETs and identifying authentication failures through the server response code.
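Here is a sketch of how such a login rule might be expressed, again with hypothetical field names: only POSTs that the origin answers with an authentication-failure code count toward the limit.

```python
# Hypothetical rule protecting a login endpoint. Only failed login attempts
# (POSTs that the origin answers with 401 or 403) count toward the limit, so
# users who sign in successfully are unaffected. Field names are illustrative,
# not Cloudflare's actual rule schema.
login_rule = {
    "match": {
        "url": "example.com/login",
        "methods": ["POST"],                  # GETs of the login page don't count
        "origin_response_codes": [401, 403],  # count only authentication failures
    },
    "threshold": 5,     # five failed attempts...
    "period": 300,      # ...within five minutes...
    "action": {
        "mode": "ban",  # ...then block that client
        "timeout": 900,
    },
}
```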

From rate limiting within your applications to replacing hardware capabilities at the top of your rack, Rate Limiting solves a problem that otherwise requires several tools and custom development to solve. In the future, Rate Limiting will become even more intelligent, automatically enabling caching on your marketing site on a product launch, queueing customers on Black Friday to ensure your e-commerce system handles the demand, and helping to maximize the return on your IT investments by protecting them from damaging spikes and traffic patterns.

Traffic Manager

For many customers, gone are the days of running a single server for their web application. Two scenarios are common: a single datacenter or cloud provider running multiple load-balanced servers, and replication of that infrastructure across multiple geographies.

Customers have moved to load balanced infrastructure to provide reliability, absorb traffic spikes, and serve traffic locally in different regions of the world.

The beauty of the single server approach was that it was simple to manage: all traffic from everywhere on the Internet hit the same server. Unfortunately, that doesn’t scale to today’s Internet and so controls are needed to handle load balancing across servers and locations.

Cloudflare’s new Traffic Manager enables a customer to keep their application running during a failure or better handle unexpected spikes in traffic by load balancing across multiple servers, datacenters, and geographies.

Traffic Manager has four major features: health checks, load balancing, failover and geo-steering.

Health checks automatically test the availability of individual origin servers so that Traffic Manager has a real-time view of the health of your origin servers. This information is used for failover and load balancing.

Load balancing automatically shares traffic across a collection of origin servers; if an origin server fails a health check, it is automatically removed and load is shared across the remaining servers. For more complex environments, an entire collection of origin servers can be removed from receiving traffic if the number of healthy servers in the collection falls below a safe threshold.

Geo-steering allows a customer to configure traffic delivery to specific origin server groups based on the physical location of the visitor.

Health checks, load balancing, failover, and geo-steering work together to give customers fine grained control over their traffic.
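The sketch below models how those four features could interact: health checks mark origins up or down, a pool drops out of rotation when too few of its origins are healthy, and traffic is spread across whatever remains. Pool names, origin IPs, thresholds, and the selection logic are assumptions for illustration, not Traffic Manager's actual configuration format.

```python
import random
from typing import List, Optional

# Illustrative model of health checks, failover, and load balancing working
# together. The health booleans stand in for real health-check results.
pools = {
    "us-east": {"origins": {"198.51.100.10": True, "198.51.100.11": True},
                "min_healthy": 1},
    "eu-west": {"origins": {"203.0.113.20": True, "203.0.113.21": False},
                "min_healthy": 1},
}

def pool_is_healthy(pool: dict) -> bool:
    # Failover at the pool level: a pool drops out of rotation when the
    # number of healthy origins falls below its safe threshold.
    healthy = sum(1 for up in pool["origins"].values() if up)
    return healthy >= pool["min_healthy"]

def pick_origin(preferred_pools: List[str]) -> Optional[str]:
    # Walk the preferred pool order (e.g. chosen by geo-steering) and spread
    # traffic across the healthy origins of the first usable pool.
    for name in preferred_pools:
        pool = pools[name]
        if pool_is_healthy(pool):
            healthy = [ip for ip, up in pool["origins"].items() if up]
            return random.choice(healthy)
    return None  # every pool failed its health checks

print(pick_origin(["eu-west", "us-east"]))
```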

Cloudflare Traffic Manager checks the health of applications from 100+ locations around the world, taking automatic action to route around failure, based on policies crafted by each customer.

Fiber cut in Malaysia? Host not responding for customers in Minneapolis? Checking from every location on a network with more public Internet exchanges than any other company means you get localized, specific decision making. When a problem crops up on the network path to one of our customers' servers, we'll route that traffic instantly to another healthy, available server -- without changing the policy for other, healthy routes.

Sometimes, it's not the network. With Traffic Manager, you can load balance across individual hosts and across entire datacenters or providers. Application down on AWS? Send those visitors to your Rackspace-hosted application in seconds. Failover from a cloud provider to your own datacenter, and back again as soon as your primary location is healthy again.

Other times, customers run different instances to account for the speed of light, and be as close as possible to their customers -- much as Cloudflare does. With Traffic Manager, you can route visitors to your site to the nearest host, based on their region. Choose to send visitors in Munich to your datacenter in Amsterdam, and visitors from Kansas City to your St. Louis datacenter. These policies can be combined, so if your European datacenter is down, then and only then send that traffic to your United States datacenter.
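As a small sketch of that combined policy, with hypothetical region and pool names: each region gets an ordered preference list, and traffic only falls through to the next entry when the preferred pool is unhealthy.

```python
from typing import Optional

# Geo-steering sketch: map a visitor's region to an ordered list of pools,
# falling through to the next entry only when the preferred pool is unhealthy.
# Region names, pool names, and the health data are illustrative assumptions.
region_policy = {
    "EU": ["amsterdam", "st-louis"],  # Munich visitors land in Amsterdam first
    "NA": ["st-louis", "amsterdam"],  # Kansas City visitors land in St. Louis first
}

pool_healthy = {"amsterdam": False, "st-louis": True}  # e.g. the EU datacenter is down

def steer(visitor_region: str) -> Optional[str]:
    for pool in region_policy.get(visitor_region, []):
        if pool_healthy[pool]:
            return pool
    return None

# With Amsterdam down, EU visitors fail over to St. Louis; NA visitors are unaffected.
print(steer("EU"))  # -> st-louis
print(steer("NA"))  # -> st-louis
```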

Many of our customers put significant investment into the availability of their infrastructure. Traffic Manager extends that investment across Cloudflare's ever-growing global network.

Early Access

Cloudflare Traffic is now available in Early Access for all customers, and will be available publicly before the end of the year. Read more about Traffic Manager and Rate Limiting and request access in your Cloudflare dashboard, in the Traffic app.

Cloudflare dashboard, Traffic app

*Note: This post was updated 4/13/17 to reflect the current product name. All references to Traffic Control have been changed to Rate Limiting.
