Presigned URLs

Presigned URLs via edge scripting for bunny.net 🐰

Introduction

With the addition of edge scripting, it is now possible to intercept requests and responses to and from bunny.net pull zones.

In this guide, we will write our own standalone edge script to sign URLs and upload files using the bunny-presigned-urls package.

Why standalone scripts?

Currently, there is a bug with middleware scripts that blocks all POST requests when the middleware script is connected to a pull zone that has a storage zone as its origin. That bug makes middleware scripts a nonstarter for this use case.

In addition, using a standalone script minimizes latency on your pull zone and storage zone.

Authentication

While presigned URLs can be authenticated with their signatures, you still need to authenticate the users who create them.

This guide authenticates users with a custom AccessKey: a secret string you set as an environment variable. On each request, the AccessKey header is compared against that environment variable.

Depending on your app, you may decide to implement cookie or token authentication.

Those implementation details are left to you so that you can use your existing authentication implementation or the authentication implementation of your choice.

If you are using token authentication, please check out the jose (JSON Object Signing and Encryption) package.
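For example, a token check with jose might look like the sketch below. The JWT_SECRET environment variable, the Authorization: Bearer header format, and the jose version URL are assumptions for illustration, not part of this guide:

import { jwtVerify } from 'https://esm.sh/jose@5.9.6'
import * as process from 'node:process'

const jwtSecret = process.env.JWT_SECRET

if (!jwtSecret) {
  throw new Error('JWT_SECRET is not set')
}

// HMAC secret for verifying tokens signed with HS256
const secret = new TextEncoder().encode(jwtSecret)

async function isAuthorized(request: Request): Promise<boolean> {
  const authorization = request.headers.get('Authorization')

  if (!authorization?.startsWith('Bearer ')) {
    return false
  }

  try {
    // throws if the signature is invalid or the token is expired
    await jwtVerify(authorization.slice('Bearer '.length), secret)
    return true
  } catch {
    return false
  }
}

You would then call isAuthorized(request) in place of the AccessKey comparison shown later in this guide.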

Limitations

Platform limitations

The main limitations are from the edge scripting platform:

  • CPU Time per request is limited to 30s
  • Active memory is limited to 128MB

While the bunny-presigned-urls package uses ReadableStreams to minimize memory usage, the maximum file upload size is still limited by how long the script takes to upload the file to your storage zone. This time is variable, depending on the storage zone region, storage zone type, and edge script location. Local upload testing from the United States is confirmed to work with files up to 1GB. Your mileage may vary.

Nonetheless, this example with bunny-presigned-urls is well suited for image uploads, text-based files, and other files of modest size.

If you are dealing with videos, you should use Bunny Stream and its presigned URLs with the TUS Resumable Uploads endpoint instead.

Default package limits

In addition to the platform limitations from bunny.net, the bunny-presigned-urls package introduces sensible default limits of its own.

By default, the package:

  • limits the maximum upload size (maxSize) to 10MB
  • limits the upload time (expires) to 1hr
  • requires a checksum to validate the exact file uploaded

Remove these limits by setting:

  • maxSize to Infinity
  • expires to 1000yr
  • checksum to false

Please note:

  • You must set the custom maxSize and expires options in both the signUrl and uploadFile functions
  • When maxSize is less than Infinity, you must pass the fileSizeInBytes option to the signUrl function
  • You must set the custom checksum option in the signUrl function
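For reference, here is a sketch of removing those limits in the signUrl call; every value below is a placeholder, and the option names come from the list above and the full calls later in this guide:

import { signUrl } from 'https://cdn.jsdelivr.net/npm/bunny-presigned-urls@0.0.5/dist/index.js'

// placeholder values for illustration only
const signedUrlResponse = await signUrl({
  baseUrl: 'https://my-pull-zone.b-cdn.net/upload',
  checksum: false, // skip checksum validation
  expires: '1000yr', // effectively never expires
  filePath: '/images/example.png',
  key: 'd377c532a8f1ebe57e15983d8902f5c4',
  maxSize: Infinity, // no size limit, so fileSizeInBytes is not required
  storageZone: {
    name: 'my-storage-zone',
    password: 'my-storage-zone-password',
    storageHostname: 'storage.bunnycdn.com',
  },
})

// remember: uploadFile must also receive the same custom expires and maxSize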

Preparing the edge script

Creating the script

Follow bunny.net’s quickstart guide to create a standalone script.

Currently, there is a bug with edge scripts that breaks all logging. Check your browser console for this error to confirm it:

WebSocket connection to '<URL>' failed: WebSocket is closed before the connection is established.

Setting environment variables

Navigate to the Environment variables tab and set these environment variables:

  • Set KEY to a random 32-character hex string
  • Set ACCESS_KEY to any secret string
  • Set STORAGE_ZONE_NAME to your storage zone name
  • Set STORAGE_ZONE_PASSWORD to your storage zone password
  • Set STORAGE_ZONE_STORAGE_HOSTNAME to your storage zone hostname

For your KEY and ACCESS_KEY values, feel free to generate them with npx --yes bunny-presigned-urls@latest
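Alternatively, if you already have Node.js handy, you can generate a value yourself; 16 random bytes encode to a 32-character hex string:

import { randomBytes } from 'node:crypto'

// 16 random bytes -> 32 hexadecimal characters, suitable for KEY
console.log(randomBytes(16).toString('hex'))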

To find the storage zone values, visit:

  • The new Dashboard > Storage > Storage Zone Name > FTP & API Access
  • The old Panel > Storage > Storage Zone Name > FTP & API Access

Writing the edge script

Adding imports

Start by importing the following packages. Because edge scripting runs on a modified Deno runtime, URL imports are permitted.

import * as BunnySDK from 'https://esm.sh/@bunny.net/edgescript-sdk@0.11.2'
import {
  signUrl,
  uploadFile,
} from 'https://cdn.jsdelivr.net/npm/bunny-presigned-urls@0.0.5/dist/index.js'
import * as process from 'node:process'
import { z } from 'https://esm.run/zod@3.23.8'

Configuration

Parse your env values and set other config values:

const maxSize = '10MB'
const expires = '1hr'
const signPathname = '/sign'
const uploadPathname = '/upload'

const configSchema = z.object({
  accessKey: z.string(),
  key: z.string(),
  storageZone: z.object({
    name: z.string(),
    password: z.string(),
    storageHostname: z.string(),
  }),
})

type Config = z.infer<typeof configSchema>

function readConfigFromEnv(): Config {
  const config = configSchema.parse({
    accessKey: process.env.ACCESS_KEY,
    key: process.env.KEY,
    storageZone: {
      name: process.env.STORAGE_ZONE_NAME,
      password: process.env.STORAGE_ZONE_PASSWORD,
      storageHostname: process.env.STORAGE_ZONE_STORAGE_HOSTNAME,
    },
  })

  return config
}

const config = readConfigFromEnv()

// validate user inputs
const parametersSchema = z.object({
  checksum: z.string().length(64),
  filePath: z.string(),
  fileSizeInBytes: z.number().int().positive(),
})

For the filePath, consider (see the sketch after this list):

  • validating it matches a pattern /images/*
  • generating it from the checksum
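Both ideas could be sketched like this, reusing the zod import from the configuration step; the /images/ pattern and .png extension are only illustrations:

// stricter variant of parametersSchema: only allow paths matching /images/*
const strictParametersSchema = z.object({
  checksum: z.string().length(64),
  filePath: z.string().regex(/^\/images\/[\w.-]+$/),
  fileSizeInBytes: z.number().int().positive(),
})

// or ignore the client-supplied path entirely and derive it from the checksum
function filePathFromChecksum(checksum: string): string {
  return `/images/${checksum}.png`
}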

Create the serve function handler

Prepare the default routes and responses:

BunnySDK.net.http.serve(async (request: Request): Promise<Response> => {
  try {
    // workaround bug where request.url protocol is http://, not https://
    const requestUrl = request.url.replace('http://', 'https://')
    const url = new URL(requestUrl)

    // sign

    // upload

    // fallback for all other routes
    return new Response(undefined, {
      status: 404,
      statusText: 'Not Found',
    })
  } catch {
    // hide 500 errors for security
    return new Response(undefined, {
      status: 500,
      statusText: 'Internal Server Error',
    })
  }
})

Configure signed URLs

// sign
if (request.method === 'POST' && url.pathname === signPathname) {
  // authorize user
  if (request.headers.get('AccessKey') !== config.accessKey) {
    return new Response(undefined, {
      status: 401,
      statusText: 'Unauthorized',
    })
  }

  // validate inputs
  const parameters = parametersSchema.safeParse(await request.json())

  if (!parameters.success) {
    return new Response('Invalid parameters', {
      status: 400,
      statusText: 'Bad Request',
    })
  }

  // return signed url response
  return await signUrl({
    baseUrl: url.origin + uploadPathname,
    checksum: parameters.data.checksum,
    expires,
    filePath: parameters.data.filePath,
    fileSizeInBytes: parameters.data.fileSizeInBytes,
    key: config.key,
    maxSize,
    storageZone: config.storageZone,
  })
}

Configure file uploads

// upload
if (request.method === 'POST' && url.pathname === uploadPathname) {
  // optionally, validate the file type before upload, but be aware of gotchas

  // return uploaded file response
  return await uploadFile({
    body: request.body,
    expires,
    key: config.key,
    maxSize,
    storageZone: config.storageZone,
    url: requestUrl,
  })
}

If you choose to validate file uploads, please be aware that:

  • Converting the request.body from a ReadableStream to a Uint8Array via await request.bytes() may cause your edge script to run out of its 128MB of memory
  • Splitting the request.body into two streams via request.body.tee() signals backpressure at the rate of the faster consumer, meaning unread data is buffered internally for the slower branch without any limit or backpressure

File size and file checksum validation is handled internally by the package.
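If you do decide to validate the file type, one approach that avoids buffering the whole body is to peek at only the first chunk's magic bytes and then reassemble the stream before handing it to uploadFile. The helper below is a sketch, not part of the package, and the PNG signature is just an example:

// hypothetical helper: inspect the first chunk, then rebuild a stream that
// replays that chunk followed by the rest of the body
async function withMagicByteCheck(
  body: ReadableStream<Uint8Array>,
): Promise<ReadableStream<Uint8Array>> {
  const reader = body.getReader()
  const first = await reader.read()

  if (first.done) {
    throw new Error('Empty request body')
  }

  const firstChunk = first.value

  // example check: PNG files start with 0x89 0x50 0x4e 0x47
  const pngSignature = [0x89, 0x50, 0x4e, 0x47]
  const isPng =
    firstChunk.length >= pngSignature.length &&
    pngSignature.every((byte, index) => firstChunk[index] === byte)

  if (!isPng) {
    await reader.cancel()
    throw new Error('Unsupported file type')
  }

  return new ReadableStream<Uint8Array>({
    start(controller) {
      // replay the chunk we already consumed
      controller.enqueue(firstChunk)
    },
    async pull(controller) {
      // forward the remaining chunks unchanged
      const { done, value } = await reader.read()

      if (done) {
        controller.close()
      } else {
        controller.enqueue(value)
      }
    },
  })
}

It could then be wired into the upload route as body: await withMagicByteCheck(request.body), after first checking that request.body is not null.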

Writing the upload script

Browser

In the browser, you will likely receive a File object from a file input, the File System API, or OPFS.

To get a ReadableStream from the File object:

file.stream()

To get the fileSizeInBytes from the File object:

file.size
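For example, pulling a File from a file input (the selector below is an assumption):

// grab the first selected file from an <input type="file"> element
const input = document.querySelector<HTMLInputElement>('input[type="file"]')
const file = input?.files?.[0]

if (file) {
  const stream = file.stream() // ReadableStream<Uint8Array>
  const fileSizeInBytes = file.size
}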

Node.js

In Node.js, there are many different ways to access files and their stats.

To get a ReadableStream:

import { createReadStream } from 'node:fs'
import path from 'node:path'
import { Readable } from 'node:stream'

Readable.toWeb(
  createReadStream(path.resolve(filePath)),
) as ReadableStream<Uint8Array>

To get the fileSizeInBytes:

import { stat } from 'node:fs/promises'
import path from 'node:path'

const stats = await stat(path.resolve(filePath))

stats.size
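Putting the two together for a local file (the filePath value is a placeholder):

import { createReadStream } from 'node:fs'
import { stat } from 'node:fs/promises'
import path from 'node:path'
import { Readable } from 'node:stream'

const filePath = './images/example.png' // placeholder
const resolvedPath = path.resolve(filePath)

// web ReadableStream for the upload body
const body = Readable.toWeb(
  createReadStream(resolvedPath),
) as ReadableStream<Uint8Array>

// file size for the fileSizeInBytes option
const { size: fileSizeInBytes } = await stat(resolvedPath)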

Example

This example is for the Browser. For Node.js, swap out the ReadableStream and fileSizeInBytes for their equivalents above.

This example uses top-level await. If your runtime does not support top-level await, you will need to wrap the code in an async function.

Regardless of the Browser or Node.js environment, a stream can only be read once. The example below calls file.stream() twice, which works because each call returns a new ReadableStream; reusing a single stream instance would fail.

import { checksumFromReadableStream } from 'bunny-presigned-urls'

// retrieve the file
let file: File

// this example authenticates with the AccessKey header
// however, you will likely use token or cookie authentication instead
const AccessKey = 'd377c532a8f1ebe57e15983d8902f5c4'

// copy the url for the edge script
const baseUrl = 'https://my-presigned-urls-standalone-ltz3z.b-cdn.net'

const signedUrlResponse = await fetch(`${baseUrl}/sign`, {
  body: JSON.stringify({
    checksum: await checksumFromReadableStream(file.stream()),
    filePath: `/images/${file.name}`,
    fileSizeInBytes: file.size,
  }),
  headers: {
    AccessKey,
    Referer: `${baseUrl}/`,
  },
  method: 'POST',
})

const message = await signedUrlResponse.text()

if (URL.canParse(message)) {
  const uploadResponse = await fetch(message, {
    body: file.stream(),
    duplex: 'half',
    headers: {
      Referer: `${baseUrl}/`,
    },
    method: 'POST',
  })

  const { status, statusText } = uploadResponse

  if (status === 201) {
    console.log('Done')
  } else {
    console.error({
      message: await uploadResponse.text(),
      status,
      statusText,
    })
  }
} else {
  const { status, statusText } = signedUrlResponse

  console.error({
    message,
    status,
    statusText,
  })
}