Announcing Deno Queues

In the ever-evolving world of cloud software, Deno aims to radically simplify. Leveraging public cloud infrastructure has traditionally demanded sifting through layers of boilerplate code and intricate configurations, often monopolizing a significant chunk of the developer’s time and energy. Our goal is to distill these intricacies into user-friendly primitives, enabling developers to design, refine, and launch their projects with unmatched speed.

Watch our Deno Queues announcement.

With this in mind, we rolled out Deno KV a few months ago (currently in open beta). Anchored on the robust capabilities of FoundationDB, Deno KV is more than just a new persistence option for apps. It’s about transforming the developer experience by eliminating redundant configurations and offering a refreshingly streamlined API.

Building upon this foundation (pun intended), we are elated to unveil Deno Queues today. This tool is set to revolutionize scalable messaging and elevate the management of background processing in your applications.

const db = await Deno.openKv();

// Process each message as it becomes available for delivery.
db.listenQueue(async (msg) => {
  await postToSlack(msg.channel, msg.text);
});

// Enqueue a message to be delivered in 60 seconds.
await db.enqueue({ channel: "C123456", text: "Slack message" }, {
  delay: 60000,
});

In this post, we’ll cover key aspects of Deno Queues:

  • What are Deno Queues?
  • Use cases and examples
  • Pricing for Deno Queues
  • What’s next

What are Deno Queues?

Deno Queues, built on Deno KV, let you offload parts of your application to run asynchronously, or schedule work for the future, using two simple new APIs that require zero configuration or infrastructure to maintain:

  • .enqueue(): Pushes new messages into the queue for guaranteed delivery immediately or at a time in the future.
  • .listenQueue(): Registers a handler for processing new messages from the queue.

Since Queues are built on Deno KV, they use SQLite when running locally and FoundationDB when running on Deno Deploy for maximum availability and throughput.

Running Queues on Deno Deploy is optimized for performance. Deno Deploy automatically spins up V8 isolates on demand and dispatches messages when they’re available for processing. Your application code simply listens for new messages with a listenQueue handler, and Deno Deploy handles the rest.

Deno Queues guarantees at-least-once delivery. For most enqueued messages, the listenQueue handler will be invoked exactly once. In some failure scenarios, however, the handler may be invoked multiple times for the same message to ensure delivery, so it’s important to design your applications to handle duplicate messages correctly.
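
One common way to handle duplicates is to make the handler idempotent, for example by recording an identifier for each processed message in KV and skipping messages you have already seen. Below is a minimal sketch of that idea; the msg.id field and notifyUser function are illustrative, and the nonce example later in this post shows how to make the check-and-update step fully atomic:

const kv = await Deno.openKv();

// Illustrative side effect; a real app would do its actual work here.
async function notifyUser(msg) {
  console.log("notifying user for", msg);
}

kv.listenQueue(async (msg) => {
  // `msg.id` is an illustrative unique identifier attached by the producer.
  const seen = await kv.get(["processed", msg.id]);
  if (seen.value !== null) {
    // Duplicate delivery: this message was already handled, so skip it.
    return;
  }

  await notifyUser(msg);

  // Record the message so a redelivery becomes a no-op.
  await kv.set(["processed", msg.id], true);
});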

You can also combine Queues with KV atomic transaction primitives to unlock powerful workflows. For example, you can add messages to the queue as part of a KV transaction that succeeds or fails atomically:

const kv = await Deno.openKv();
const change = 10;

const bob = await kv.get(["balance", "bob"]);
const liz = await kv.get(["balance", "liz"]);
if (bob.value < change) {
  throw "not enough balance";
}

const success = await kv.atomic()
  .check(bob, liz) // balances did not change
  .set(["balance", "bob"], bob.value - change)
  .set(["balance", "liz"], liz.value + change)
  // Enqueue a message to notify Liz and Bob
  .enqueue({ type: "notify", name: "liz", amount: change })
  .enqueue({ type: "notify", name: "bob", amount: -change })
  .commit();
Enqueue new messages as part of an atomic transaction: they will only be enqueued if the entire transaction succeeds.

You can also update Deno KV state from your listenQueue handler. For instance, if you want to ensure that the updates for each message are applied only once, you can combine the Queue API with KV atomic transactions:

const db = await Deno.openKv();

db.listenQueue(async (msg) => {
  const nonce = await db.get(["nonces", msg.nonce]);
  if (nonce.value === null) {
    // This message was already processed.
    return;
  }

  const change = msg.change;
  const bob = await db.get(["balance", "bob"]);
  const liz = await db.get(["balance", "liz"]);

  const success = await db.atomic()
    // Ensure this message was not yet processed
    .check({ key: nonce.key, versionstamp: nonce.versionstamp })
    .delete(nonce.key)
    .sum(["processed_count"], 1n)
    .check(bob, liz) // balances did not change
    .set(["balance", "bob"], bob.value - change)
    .set(["balance", "liz"], liz.value + change)
    .commit();
});

const nonce = crypto.randomUUID();
await db
  .atomic()
  .check({ key: ["nonces", nonce], versionstamp: null })
  .enqueue({ nonce, change: 10 })
  .set(["nonces", nonce], true)
  .sum(["enqueued_count"], 1n)
  .commit();
This example uses KV atomic transactions to ensure that the updates for each message are applied only once.

Additionally, if your listenQueue handler throws an exception, the runtime will automatically retry calling the handler until it succeeds or the maximum number of retry attempts is reached. If the maximum (currently 5 by default) is reached, the message will be dropped.
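
A practical consequence is that you can influence retries from inside the handler: rethrowing an error triggers another delivery attempt, while returning normally acknowledges the message. Here is a minimal sketch of that pattern; the sendEmail and isTransient helpers are illustrative placeholders:

const kv = await Deno.openKv();

// Illustrative helpers; a real application would supply its own.
async function sendEmail(to) {
  console.log("sending email to", to);
}
function isTransient(err) {
  return err instanceof Error && err.message.includes("timeout");
}

kv.listenQueue(async (msg) => {
  try {
    await sendEmail(msg.to);
  } catch (err) {
    if (isTransient(err)) {
      // Rethrow so the runtime retries this message later.
      throw err;
    }
    // Permanent failure: log it and return normally so the message
    // is acknowledged instead of being retried.
    console.error("giving up on message", msg, err);
  }
});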

Use cases and examples

Queues help scale applications by allowing servers to offload asynchronous processing and schedule work for the future.

Below are a few examples.

Scheduled email notifications

Sometimes a job or task initiated by a user takes long enough that you don’t want to make them wait for a “task complete” response, or there’s simply no need to send them a response at all. In these cases you can offload the work to a queue to keep your server or app responsive.

Here’s how you would use Queues to send email notifications:

const db = await Deno.openKv();

db.listenQueue(async (msg) => {
  if (msg.type === "welcome_email") {
    await sendWelcomeEmail(msg.customer_id);
  } else if (msg.type === "survey_email") {
    await sendSurveyEmail(msg.customer_id);
  }
});

await db.enqueue(
  { type: "welcome_email", customer_id: 123 },
);

await db.enqueue(
  { type: "survey_email", customer_id: 123 },
  { delay: 259200000 }, // deliver in 3 days
);

Reliable webhook processing

Another extremely common use of queues on the web is processing webhooks. Here’s an example using Oak and Queues to handle webhooks asynchronously:

import { Application, Router } from "https://deno.land/x/oak@v12.6.1/mod.ts";

const db = await Deno.openKv();

db.listenQueue(async (msg) => {
  await processWebHook(msg.webhook_body);
});

const router = new Router();
router.post("/webhook", async (ctx) => {
  await db.enqueue({ webhook_body: await ctx.request.body().value });
  ctx.response.status = 200;
});

const app = new Application();
app.use(router.routes());
app.use(router.allowedMethods());

await app.listen({ port: 8000 });

Slack Reminder Bot

Queues are great for building bots for Discord or Slack.

Here’s an example of using Deno Queues to create a simple reminder app in Slack.

Receiving a reminder in Slack
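
A minimal sketch of the reminder flow could look like the following; the postToSlack helper and the message shape are illustrative stand-ins for the real Slack API calls used in the demo:

const kv = await Deno.openKv();

// Illustrative helper; a real bot would call the Slack Web API here.
async function postToSlack(channel, text) {
  console.log(`[slack:${channel}] ${text}`);
}

// Deliver reminders when their messages become available.
kv.listenQueue(async (msg) => {
  if (msg.type === "reminder") {
    await postToSlack(msg.channel, `Reminder: ${msg.text}`);
  }
});

// Schedule a reminder, e.g. from a "/remind" slash command handler.
await kv.enqueue(
  { type: "reminder", channel: "C123456", text: "Stand-up starts now" },
  { delay: 60 * 60 * 1000 }, // deliver in 1 hour
);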

And this is a Discord bot that uses Deno Queues to create giveaways and allow users to join with a single click.

Creating a giveaway in Discord
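
A sketch of how such a giveaway could be wired up with Queues is shown below; the key layout, message shape, and function names are illustrative rather than the demo bot’s actual code:

const kv = await Deno.openKv();

// When the delayed message arrives, close the giveaway and pick a winner.
kv.listenQueue(async (msg) => {
  if (msg.type !== "end_giveaway") return;
  const entrants = [];
  for await (const entry of kv.list({ prefix: ["giveaway", msg.id, "entrants"] })) {
    entrants.push(entry.key.at(-1));
  }
  const winner = entrants[Math.floor(Math.random() * entrants.length)];
  // A real bot would announce the winner in the Discord channel.
  console.log(`Giveaway ${msg.id} winner:`, winner);
});

// Record an entrant when they click the join button.
async function join(giveawayId, userId) {
  await kv.set(["giveaway", giveawayId, "entrants", userId], true);
}

// Start a giveaway that ends in 24 hours.
async function startGiveaway(giveawayId) {
  await kv.enqueue({ type: "end_giveaway", id: giveawayId }, { delay: 24 * 60 * 60 * 1000 });
}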

More examples

More examples of queue usage can be found at docs.deno.com.

Pricing for Deno Queues

As you explore the capabilities of Queues, it’s important to understand the cost implications. Queues have no specific cost of their own; instead, they are charged in terms of Deno KV operations and Deno Deploy requests (for listening). Specifically:

  • Enqueuing a message: each enqueue action translates into a KV write operation.
  • Receiving a message: every received message entails a KV write and a single request charge.
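
As an illustration of how this adds up: enqueuing 10,000 messages and then processing all of them (assuming each is delivered exactly once) would count as 10,000 KV writes for the enqueues, plus another 10,000 KV writes and 10,000 request charges for delivery, for a total of 20,000 KV writes and 10,000 requests.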

This transparent pricing structure ensures you’re only billed for the operations you use, aligning with our commitment to efficiency and simplicity.

What’s next

Building scalable apps and servers on the web requires offloading background tasks to queues, but configuring them has traditionally involved many steps. Deno Queues, built right into the runtime and on top of the robust infrastructure of Deno Deploy, lets you use serverless, distributed queues in only a few lines of code.

Deno Queues joins Deno KV, web standards APIs, npm, and all-in-one modern tooling as key building blocks that make creating for the web simpler and more productive. We still have a long way to go toward our goal and have many more exciting features on the roadmap. Stay tuned.

We’re always open to feedback and feature requests! Feel free to join our growing Discord or create an issue here.