Sometimes you need backend logic to do more than respond to a request. For example, you might want to process a batch of files and upload the results to Supabase Storage, or read multiple entries from a database table and generate embeddings for each one.
With the introduction of background tasks, executing these long-running workloads with Edge Functions is super easy.
We've introduced a new method, `EdgeRuntime.waitUntil`, which accepts a promise. It ensures that the function isn't terminated until the promise settles.
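As a sketch of how this fits into a handler (the `processBatch` helper and the handler shape are illustrative assumptions, not part of the announced API):

```ts
// Illustrative long-running work: counting stands in for real work such as
// processing files or generating embeddings per item.
async function processBatch(items: string[]): Promise<number> {
  let done = 0
  for (const _item of items) {
    done += 1 // the slow per-item work would go here
  }
  return done
}

// Inside an Edge Function you would write roughly:
// Deno.serve((req) => {
//   // keep the worker alive until processBatch settles, without blocking the response
//   EdgeRuntime.waitUntil(processBatch(['a.png', 'b.png']))
//   return new Response('accepted', { status: 202 }) // respond immediately
// })
```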
Free projects can run background tasks for a maximum of 150 seconds (2m 30s). If you are on a paid plan, this limit increases to 400 seconds (6m 40s). We plan to introduce more flexible limits in the coming months.
You can subscribe to notifications when the function is about to be shut down by listening to the `beforeunload` event. Read the guide for more details on how to use background tasks.
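A minimal sketch of such a listener, assuming you want to persist progress before shutdown (`saveCheckpoint` is a hypothetical helper, not part of the API):

```ts
// Hypothetical helper: serialize progress so a later invocation could resume it
function saveCheckpoint(filesProcessed: number): string {
  return JSON.stringify({ filesProcessed })
}

// In the edge runtime you would register the listener like this:
// addEventListener('beforeunload', () => {
//   // persist intermediate state before the instance is terminated
//   console.log('function is shutting down, saving checkpoint')
//   saveCheckpoint(42)
// })
```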
Edge Function invocations now have access to ephemeral storage. This is useful for background tasks, as it allows you to read and write files in the /tmp directory to store intermediate results.
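For example, a handler might stage an incoming upload in `/tmp` before processing it. A sketch, where `tempPathFor` is an illustrative helper of our own:

```ts
// Illustrative helper: derive a per-invocation path under ephemeral storage
function tempPathFor(uploadId: string): string {
  return `/tmp/${uploadId}.zip`
}

// Inside a handler:
// const uploadId = crypto.randomUUID()
// const filepath = tempPathFor(uploadId)
// // write the request body to ephemeral storage for later processing
// await Deno.writeFile(filepath, new Uint8Array(await req.arrayBuffer()))
```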
### Example: Extracting a zip file and uploading its content to Supabase Storage
Let's look at a real-world example using Background Tasks and Ephemeral Storage.
Imagine you're building a photo album app: your users upload their photos as a zip file, and an Edge Function extracts them and uploads them to Supabase Storage.
One of the most straightforward ways to implement this is to use streams:
```ts
import { ZipReaderStream } from 'https://deno.land/x/zipjs/index.js'
import { createClient } from 'jsr:@supabase/supabase-js@2'

const supabase = createClient(
  Deno.env.get('SUPABASE_URL'),
  Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')
)

Deno.serve(async (req) => {
  const uploadId = crypto.randomUUID()

  // read each zip entry from the request body and stream it to Storage
  for await (const entry of req.body.pipeThrough(new ZipReaderStream())) {
    // write file to Supabase Storage
    const { error } = await supabase.storage
      .from(uploadId)
      .upload(entry.filename, entry.readable, {})
    if (error) {
      console.error('failed to upload', entry.filename, error)
      continue
    }
    console.log('uploaded', entry.filename)
  }

  return new Response(JSON.stringify({ uploadId }), {
    headers: {
      'content-type': 'application/json',
    },
  })
})
```
If you test the streaming version, it runs into memory limit errors when you upload zip files over 100MB, because the streaming version has to keep every file in the zip archive in memory.
Instead, we can write the zip file to a temporary file, then use a background task to extract it and upload its contents to Supabase Storage. This way, only parts of the zip file are read into memory at a time.
```ts
import { BlobWriter, ZipReader, ZipReaderStream } from 'https://deno.land/x/zipjs/index.js'
import { createClient } from 'jsr:@supabase/supabase-js@2'

const supabase = createClient(
  Deno.env.get('SUPABASE_URL'),
  Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')
)

let numFilesUploaded = 0

// Runs as a background task: reads the zip from ephemeral storage one entry
// at a time, so only the current file is held in memory
async function processZipFile(uploadId, filepath) {
  const file = await Deno.open(filepath, { read: true })
  const zipReader = new ZipReader(file.readable)
  for (const entry of await zipReader.getEntries()) {
    const blob = await entry.getData(new BlobWriter())
    await supabase.storage.from(uploadId).upload(entry.filename, blob, {})
    numFilesUploaded += 1
  }
  await zipReader.close()
}
```
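The piece that ties this together is the handler: it stages the upload in `/tmp`, hands extraction to a background task via `EdgeRuntime.waitUntil`, and responds immediately. A sketch, with a small testable helper for the response body (`uploadAccepted` is our own illustrative name):

```ts
// Illustrative helper: shape of the JSON response returned while work continues
function uploadAccepted(uploadId: string): string {
  return JSON.stringify({ uploadId })
}

// Handler sketch (runs in the Deno edge runtime):
// Deno.serve(async (req) => {
//   const uploadId = crypto.randomUUID()
//   const filepath = `/tmp/${uploadId}.zip`
//   // stage the zip in ephemeral storage instead of holding it in memory
//   await Deno.writeFile(filepath, new Uint8Array(await req.arrayBuffer()))
//   // extract and upload in the background; respond right away
//   EdgeRuntime.waitUntil(processZipFile(uploadId, filepath))
//   return new Response(uploadAccepted(uploadId), {
//     headers: { 'content-type': 'application/json' },
//   })
// })
```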
Edge Functions now support establishing both inbound (server) and outbound (client) WebSocket connections. This enables a variety of new use cases.
### Example: Building an authenticated relay to OpenAI Realtime API
OpenAI recently introduced a Realtime API, which uses WebSockets. This is tricky to implement purely client-side because you'd need to expose your OpenAI key publicly. OpenAI recommends building a server to authenticate requests.
With our new support for WebSockets, you can easily do this in Edge Functions without standing up any infrastructure. Additionally, you can use Supabase Auth to authenticate users and protect your OpenAI usage from being abused.
```ts
import { createClient } from 'jsr:@supabase/supabase-js@2'
```
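The rest of the relay might look like the sketch below. The auth flow shown is an assumption on our part (the `jwtFromUrl` helper and passing the token as a query parameter are illustrative); consult OpenAI's Realtime API documentation for the exact connection details:

```ts
// Illustrative helper: read the user's Supabase JWT from a query parameter
function jwtFromUrl(url: string): string | null {
  return new URL(url).searchParams.get('jwt')
}

// Relay sketch (Deno edge runtime):
// Deno.serve(async (req) => {
//   // reject callers that don't present a valid Supabase Auth token
//   const jwt = jwtFromUrl(req.url)
//   const { data, error } = await supabase.auth.getUser(jwt ?? '')
//   if (error || !data.user) return new Response('Unauthorized', { status: 401 })
//
//   // inbound: upgrade the caller's connection to a WebSocket
//   const { socket, response } = Deno.upgradeWebSocket(req)
//   // outbound: connect to OpenAI, keeping the API key server-side
//   const upstream = new WebSocket('wss://api.openai.com/v1/realtime?model=...')
//   socket.onmessage = (e) => upstream.send(e.data)
//   upstream.onmessage = (e) => socket.send(e.data)
//   return response
// })
```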
In the past few months, we have made many performance, stability, and DX improvements to Edge Functions. While these improvements often aren't visible to end users, they are the foundation of the new features we are announcing today.
We have a very exciting roadmap planned for 2025. One of the main priorities is to provide customizable compute limits (memory, CPU, and execution duration). We will soon announce an update on it.
Stay tuned for the rest of this week's launches. You will see how all these pieces fit together like Lego bricks to make your developer life easier.