The Cloudflare Stack: The Performant, Cost-Effective, Zero-Maintenance Way To Build Your Serverless Web Apps in 2023 | by Bharat Arimilli | Nov, 2022

Cloudflare’s broad spectrum of cloud computing offerings gives developers fewer compromises and incredible new opportunities

Product page for Cloudflare Workers

The promise of serverless computing was always the idea of deploying apps with little to no maintenance burden, low cost, and instant scalability. However, significant issues have always made it much less compelling in practice. Concerns about vendor lock-in, poor integration with relational databases, cold starts, and high costs all resulted in serverless becoming a niche offering only suitable for specific use cases.

In recent years, however, Cloudflare’s foray into building a full-stack cloud computing platform has finally brought us closer to fulfilling the early promise of serverless. Built on the power of its edge computing network, Cloudflare’s platform finally delivers serverless that combines performance, cost-effectiveness, and low maintenance.

The foundation of Cloudflare’s existence is its edge computing network, computing power distributed across the globe so you can serve content and run code as close as possible to your users. It powers Cloudflare’s well-known and long-running content-delivery network (CDN), and is what they later built their cloud offerings on.

What does this mean for their cloud platform? Instead of deploying resources to a specific region, your code/content/data will always be served from the server closest to your users. You get incredible global scale performance without worrying about the operational burden of deploying your app or resources to multiple regions or manually scaling your resources to match demand.

This key design feature is what makes Cloudflare’s offering so compelling and unique compared to many other serverless platforms. Let’s review what this means for some of their offerings and discuss some of the pros and cons.

Cloudflare Workers

Cloudflare Workers landing page

Cloudflare’s evolution towards building a full-stack cloud computing platform began with Workers, its serverless functions offering. Unlike container-based platforms such as AWS Lambda and Google Cloud Functions, Workers runs code as V8 isolates (the same lightweight execution contexts used by Chrome’s JavaScript engine), separate pieces of code that run without a container and are built on Web APIs instead of Node.js. This means incredibly fast performance with zero cold starts, because you don’t need to wait for a container to spin up to serve a request. It solves a major performance problem that has plagued many serverless compute platforms since their inception.
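
To make this concrete, below is a minimal sketch of a Worker using the module syntax. Everything it touches (Request, Response, URL) is a standard Web API rather than a Node.js API:

```typescript
// A minimal Worker, module syntax. Request, Response, and URL are standard
// Web APIs provided by the Workers runtime — no Node.js APIs are involved.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === "/hello") {
      return new Response(JSON.stringify({ message: "Hello from the edge" }), {
        headers: { "Content-Type": "application/json" },
      });
    }
    return new Response("Not found", { status: 404 });
  },
};
```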

Following are the advantages of this architecture:

  • Zero cold starts
  • Code will always run closest to the user thanks to Cloudflare’s edge network, bringing optimal performance without any developer configuration or intervention
  • Using standards-compliant Web APIs means your serverless functions are portable, reducing vendor lock-in (the Cloudflare Workers runtime is also open source, meaning you can self-host Workers if you want)

However, there are notable drawbacks:

  • No Node.js support means Workers are only compatible with a small subset of NPM packages
  • Because the code does not run in full containers, you cannot make direct TCP connections (for example, to a PostgreSQL database) and will instead need to route requests through an HTTP-based proxy, as sketched below.
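
Here is a rough sketch of what that workaround looks like in practice: the Worker talks to a hypothetical HTTP proxy sitting in front of Postgres rather than opening a TCP connection itself. The endpoint, payload shape, and auth scheme below are illustrative assumptions, not any real service’s API:

```typescript
// Sketch: querying Postgres from a Worker via a hypothetical HTTP-based SQL proxy,
// since Workers cannot open raw TCP connections. The endpoint and payload shape
// shown here are illustrative only.
export default {
  async fetch(request: Request): Promise<Response> {
    const proxyResponse = await fetch("https://sql-proxy.example.com/query", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: "Bearer <token>", // hypothetical auth scheme
      },
      body: JSON.stringify({
        sql: "SELECT id, email FROM users WHERE active = $1",
        params: [true],
      }),
    });

    const rows = await proxyResponse.json();
    return new Response(JSON.stringify(rows), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```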

Despite these limitations, some recent developments have dramatically changed this equation. The rise of Web API-based runtimes such as Deno and the increasingly prevalent use of ESM (ES Modules) within the JavaScript ecosystem have made it more common for popular JavaScript libraries to support runtimes such as Workers.

The other major development is that Cloudflare is now rolling out a whole stack of serverless cloud offerings to complement Workers, so you can easily build full-stack apps without needing to leave the platform.

Cloudflare Pages

Cloudflare Pages landing page

Cloudflare Pages is a deployment platform for frontends, where your frontends are deployed instantly to Cloudflare’s edge network. This is not a unique offering, as similar services like Vercel and Netlify exist. You get Vercel-like functionality such as automated deployments, preview builds, deployment to an edge network, and full-stack support with serverless functions.

However, the real value proposition with Pages is that it resides on Cloudflare’s edge network and within Cloudflare’s cloud stack. You have the performance and reliability of a network through which some of the world’s largest websites drive their traffic, as well as integration with Cloudflare’s other excellent cloud services.

Pros:

  • Your frontend resides on Cloudflare’s excellent edge network, which means great performance
  • Can deploy serverless functions as part of your frontend via Pages Functions (based on Cloudflare Workers) to build full-stack apps (see the sketch after this list)
  • No cap on number of requests or bandwidth
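
As a quick illustration of Pages Functions, a file placed under a Pages project’s functions directory is served as a serverless function on the matching route. The file path and handler below are a sketch assuming the standard file-based routing:

```typescript
// functions/api/hello.ts — served at /api/hello by Cloudflare Pages Functions.
// Built on the Workers runtime, so the same Web APIs (Request, Response) apply.
// The context type is simplified here for illustration.
export async function onRequest(context: { request: Request }): Promise<Response> {
  const name = new URL(context.request.url).searchParams.get("name") ?? "world";
  return new Response(JSON.stringify({ greeting: `Hello, ${name}!` }), {
    headers: { "Content-Type": "application/json" },
  });
}
```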

Cons:

  • You might not be able to use your favorite frameworks, UI components, or libraries with full-stack Cloudflare Pages projects (due to the lack of Node.js support in Cloudflare Workers). This is not a problem for projects without backend components or server-side rendering, as those do not run on Cloudflare Workers; Cloudflare Pages will happily build frontend-only projects with Node.js tooling.
  • Next.js support is still in its early stages, and once it’s ready, Vercel’s deeper integration with the framework may still be more compelling.

Cloudflare D1

D1 post on Cloudflare’s blog

The inability of Cloudflare Workers to connect directly to databases over TCP significantly limits the database story within Cloudflare’s stack. To help mitigate this and provide a well-integrated database solution for Workers, Cloudflare recently announced its D1 SQL database service.

In typical Cloudflare fashion, it is a serverless offering based on its edge computing network. The service is built on SQLite, an incredibly lightweight SQL database that runs as a single file (it’s so lightweight it can even run on embedded and mobile devices).

This architecture means D1 can deliver some incredible functionality with zero maintenance burden and zero long-running servers to worry about, including automatic read replication and automatic backups with redundancy and restore functionality.

There’s no region to choose from, no replication to enable, and no database servers to scale. You get world class performance out of the box. This is functionality that is expensive and/or non-trivial to enable with traditional databases. And, because D1 is based on SQLite, an open-source database, your database is still completely portable should you choose to take it elsewhere.
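
As a concrete sketch of how D1 fits into the stack, here is roughly what querying it from a Worker looks like with the alpha client API, assuming a D1 binding named DB configured in wrangler.toml (the API may change as D1 matures):

```typescript
// A Worker with a D1 binding named `DB` (configured in wrangler.toml).
// D1Database comes from Cloudflare's Workers type definitions; this sketch
// uses the alpha D1 client API, which may change.
export interface Env {
  DB: D1Database;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Parameterized query against the SQLite-backed D1 database.
    const { results } = await env.DB.prepare(
      "SELECT id, title FROM posts WHERE published = ? ORDER BY id DESC LIMIT 10"
    )
      .bind(1)
      .all();

    return new Response(JSON.stringify(results), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```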

Pros:

  • Global-scale performance out-of-the-box with no additional configuration
  • Being based on SQLite, the database is completely portable. You can easily migrate in and out of D1.

Cons:

  • D1 can only be accessed from Cloudflare Workers, and it’s unclear if this will ever change. Allowing access to D1 data from outside Cloudflare will require building a public-facing API with Cloudflare Workers.
  • SQLite is not as fully featured as other SQL flavors like Postgres

D1 is currently in alpha. Some of the functionality mentioned above is still in the works and may change.

Cloudflare R2

Product page for Cloudflare R2

R2 is Cloudflare’s object storage offering, built with an AWS S3-compatible API. The biggest issue that R2 solves among its storage competitors is eliminating egress fees. An egress fee is charged for data leaving a provider’s cloud network. Traditionally, this meant that you were charged not only for writes, reads, and storage, but also each time the data was sent anywhere else.

Not only does this cause you to pay twice for your data, but it also discourages you from moving your data to a different provider, as you’ll have to pay exorbitant egress fees to move that data outside your current provider’s network. Cloudflare has been pushing back against egress fees for a few years now through its Bandwidth Alliance, and now they’ve eliminated them with their own storage offering.

Pros:

  • Much cheaper than S3 for frequently accessed data
  • No egress fees
  • Direct integration with Cloudflare Workers (see the sketch after this list)
  • S3-compatible API
  • No need to choose a region for each bucket. In the future, this will allow R2 to automatically optimize where data is stored by analyzing data access patterns.
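
To illustrate the Workers integration mentioned above, here is a minimal sketch of reading and writing objects through an R2 bucket binding, assuming a binding named MY_BUCKET configured in wrangler.toml:

```typescript
// A Worker with an R2 bucket binding named `MY_BUCKET` (configured in wrangler.toml).
// R2Bucket comes from Cloudflare's Workers type definitions.
export interface Env {
  MY_BUCKET: R2Bucket;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // e.g. a request to /images/logo.png maps to the object key "images/logo.png"
    const key = new URL(request.url).pathname.slice(1);

    if (request.method === "PUT") {
      // Store the request body as an object; reading it back out incurs no egress fee.
      await env.MY_BUCKET.put(key, request.body);
      return new Response(`Stored ${key}`, { status: 201 });
    }

    const object = await env.MY_BUCKET.get(key);
    if (object === null) {
      return new Response("Not found", { status: 404 });
    }
    return new Response(object.body);
  },
};
```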

Cons:

  • Limited configuration options available compared to S3
  • Not at full feature parity with S3’s API
  • No archival storage tiers (S3 will probably be cheaper for infrequently accessed data)

Cloudflare’s offering now allows you to build full-stack apps entirely within their ecosystem for the first time ever. While some of the offerings are still new and have notable limitations, Cloudflare’s architecture makes the early promise of serverless computing more real than ever. While the stack doesn’t make sense for every app, many apps today can benefit from it. The entire platform is also evolving very quickly, with new services and features being added constantly.

I think Cloudflare’s general vision is something many developers can get on board with. Instead of trying to match their larger cloud competitors service for service, they are building a handful of well-designed services that address specific developer pain points.

There are also other values which I believe make them more developer-friendly than other platforms. As a company, Cloudflare has made a big effort to be more cost-effective than its competitors. Their services usually don’t charge egress fees, their pricing is extremely competitive, and their pricing tiers are usually more affordable than those of their competitors.

They have also consistently pushed for open standards, both within the wider developer ecosystem and within their own offerings. As noted above, many of Cloudflare’s offerings specifically mitigate concerns of vendor lock-in in this way.

If you’re interested in serverless computing but haven’t been impressed by what’s already out there, Cloudflare is worth checking out. While the stack doesn’t work for all use cases, especially in its early stages today, I’d argue that it can provide enormous value for the right kinds of apps. Even if it doesn’t work for you today, the platform’s fast pace of development means it could very soon.
