
Bringing AI to the edge with NVIDIA GPUs

2021-04-13

2 min read

Cloudflare has long used machine learning for bot detection, anomaly detection, customer support, and business intelligence. And internally we have a cluster of GPUs used for model training and inference.

For even longer we’ve been running code “at the edge” in more than 200 cities worldwide. Initially, that was code that we wrote and any customization was done through our UI or API. About seven years ago we started deploying custom code, written in Lua, for our enterprise customers.

But it’s quite obvious that using a language that isn’t widely understood, and going through an account executive to get code written, isn’t a viable solution, so four years ago we announced Cloudflare Workers. Workers allows anyone, on any plan, to write code that gets deployed to our edge network. And they can do it in the language they choose.

After launching Workers we added storage through Workers KV as programs need algorithms plus data. And we’ve continued to add to the Workers platform with Workers Unbound, Durable Objects, Jurisdictional Restrictions and more.

But many of today’s applications need access to the latest machine learning and deep learning methods. Those applications need three things: to scale easily, to securely handle models and to use familiar tools.

To help anyone build AI-based applications Cloudflare is extending the Workers platform to include support for NVIDIA GPUs and TensorFlow. Soon you’ll be able to build AI-based applications that run across the Cloudflare network using pre-built or custom models for inference.

NVIDIA + TensorFlow

For many years we’ve looked for appropriate hardware to run on our edge network to enable AI-powered applications. We’ve examined a wide variety of dedicated AI accelerator chips, as well as approaches that use clever encoding of the models to make them run efficiently on CPUs.

Ultimately, we decided that, with our large footprint of servers in more than 200 cities worldwide and the strong support for NVIDIA GPUs in AI toolkits, the best solution was to deploy NVIDIA GPUs to enable AI-based applications. Today we announced that we are partnering with NVIDIA to bring AI to our global edge network.

Previously, machine learning models were deployed on expensive centralized servers or on cloud services that limited them to “regions” around the world. Cloudflare and NVIDIA are putting machine learning at the edge, within milliseconds of the global online population, enabling high-performance, low-latency AI to be deployed by anyone.

Because the models themselves remain in Cloudflare’s data centers, developers can deploy custom models without putting them on end user devices where they might risk being stolen. And because we’ll be deploying models across our large network, scaling becomes trivial and built in.

TensorFlow has become one of the de facto standard libraries and toolsets for building and running AI models. Cloudflare’s edge AI will use TensorFlow, allowing developers to train and optimize models using familiar tools and tests before deploying them to our edge network.
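To give a flavor of what that workflow looks like, here is a minimal, hypothetical TensorFlow/Keras sketch (not Cloudflare’s actual pipeline): fine-tune a small pre-trained image classifier locally and save it in a format that can later be served for inference. The dataset path, image size, and file names are placeholders.

```python
import tensorflow as tf

# Hypothetical local dataset layout: data/train/<class_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", label_mode="binary", image_size=(224, 224), batch_size=32)

# Reuse ImageNet features; train only a small classification head.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # binary label
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)

# Save as a TensorFlow SavedModel, ready to be loaded elsewhere for inference.
model.save("my_model")
```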

In addition to building your own models using TensorFlow, we plan to offer pre-trained models for tasks such as image labeling/object recognition, text detection, and more.
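As an illustration of the image-labeling case, here is a short sketch using an off-the-shelf ImageNet classifier from tf.keras; the file name is a placeholder, and the pre-trained models we ultimately offer may differ.

```python
import numpy as np
import tensorflow as tf

# Off-the-shelf ImageNet classifier; downloads weights on first use.
model = tf.keras.applications.MobileNetV2(weights="imagenet")

# Load and preprocess a single image (placeholder file name).
img = tf.keras.utils.load_img("photo.jpg", target_size=(224, 224))
x = tf.keras.utils.img_to_array(img)[np.newaxis, ...]
x = tf.keras.applications.mobilenet_v2.preprocess_input(x)

# Print the top-3 labels with confidence scores.
preds = model.predict(x)
for _, label, score in tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=3)[0]:
    print(f"{label}: {score:.2f}")
```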

Nata Or Not

As a demonstration of a real AI-based application running on Cloudflare’s infrastructure, the team in the Cloudflare Lisbon office built a website: nataornot.com. Upload a picture of food and it’ll tell you whether it’s one of Portugal’s delicious pastéis de nata (an egg custard tart pastry dusted with cinnamon) or not.

The site uses a TensorFlow model, trained on thousands of pictures of pastéis and other foods, running on a Cloudflare server with an NVIDIA A100 Tensor Core GPU. If you want to build your own pastel de nata recognizer, we’ve open sourced the TensorFlow model here.
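If you do grab the model, inference boils down to something like the following sketch. It assumes the classifier is a Keras binary model saved as a SavedModel that takes 224x224 RGB images; the file names and the 0.5 threshold are illustrative, not taken from the actual project.

```python
import numpy as np
import tensorflow as tf

# Load the saved classifier (path is a placeholder).
model = tf.keras.models.load_model("nata_model")

# Preprocess the uploaded picture to the input size the model expects.
img = tf.keras.utils.load_img("upload.jpg", target_size=(224, 224))
x = tf.keras.utils.img_to_array(img)[np.newaxis, ...]

# Single sigmoid output: closer to 1 means "pastel de nata".
score = float(model.predict(x)[0][0])
print("pastel de nata!" if score > 0.5 else "not a pastel de nata")
```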

If you simply want to know whether the picture of food you have is a pastel de nata or not visit nataornot.com.

Going Forward

If you are interested in being notified when Workers AI becomes available please sign up here. And if you’re interested in working on AI at scale we’re hiring.
