
@repo/storage

File storage package for managing file uploads with S3-compatible storage backends.

Installation

pnpm add @repo/storage

Setup

1. Database Schema

Add the files table to your database schema (tenant or global):

import { createFilesTable } from '@repo/storage/database'

// For global database
export const filesTable = createFilesTable()

// For tenant-scoped database
import { tenantSchema } from '@repo/auth/database'
export const filesTable = createFilesTable(tenantSchema)
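
The examples later in this README link uploads to application rows through a fileId column. A rough sketch of such a table, assuming a Drizzle/Postgres setup (myTable, its columns, and the uuid types are illustrative, not part of the package):

// Sketch only: an application table that references uploaded files by id.
// `myTable` and its columns are assumptions used for the examples below.
import { pgTable, text, uuid } from 'drizzle-orm/pg-core'
import { filesTable } from './files'

export const myTable = pgTable('my_table', {
  id: uuid('id').primaryKey().defaultRandom(),
  name: text('name').notNull(),
  fileId: uuid('file_id').references(() => filesTable.id),
})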

2. Environment Variables

S3_ACCESS_KEY_ID=your_access_key
S3_SECRET_ACCESS_KEY=your_secret_key
PRIVATE_S3_ENDPOINT=http://localhost:9000
BODY_SIZE_LIMIT=100M

3. Server File Manager

Create a server-side file manager instance:

// lib/files.server.ts
import { env } from '$env/dynamic/private'
import { getGlobalDb } from '$lib/db/managedDb.server'
import { S3Client } from '@aws-sdk/client-s3'
import { FilesRepository, ServerFileManager } from '@repo/storage/server'

export const serverFileManager = new ServerFileManager({
  repository: new FilesRepository(getGlobalDb),
  s3: {
    client: new S3Client({
      endpoint: env.PRIVATE_S3_ENDPOINT,
      region: 'us-west-1',
      forcePathStyle: true,
      credentials: {
        accessKeyId: env.S3_ACCESS_KEY_ID!,
        secretAccessKey: env.S3_SECRET_ACCESS_KEY!,
      },
    }),
  },
})

Single File

1. Define Schema

import { FileInput } from '@repo/storage'
import { z } from 'zod'

export const myFormSchema = z.object({
  name: z.string(),
  file: zaf(z.file(), {
    component: FileInput.Root,
  }),
})

2. Remote Function (Create)

export const createForm = form(myFormSchema, async ({ file, ...data }) => {
  await serverFileManager.createBucket('my-bucket')
  const { createdFile } = await serverFileManager.sync('my-bucket', { file })
  // createdFile is typed as FileEntry
  await db.insert(myTable).values({ ...data, fileId: createdFile.id })
})

3. Remote Function (Update)

When updating, fetch the existing oldFileId from the database so the previous file can be replaced:

export const updateForm = form(myFormSchema, async ({ file, ...data }) => {
  if (file) {
    const oldFileId = await db.select({ fileId: myTable.fileId }).from(myTable).where(...).then(r => r[0]?.fileId)
    const { createdFile } = await serverFileManager.sync('my-bucket', { file, oldFileId })
    // createdFile replaces the old file
  }
})
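
When the parent record is deleted, you will usually want to remove its stored file as well. A rough sketch, assuming Drizzle-style queries (eq from drizzle-orm) and the same illustrative db/myTable helpers as above; deleteMyResource is not part of the package:

export const deleteMyResource = form(z.object({ id: z.string() }), async ({ id }) => {
  // Look up the linked file before removing the row (illustrative query).
  const row = await db.select({ fileId: myTable.fileId }).from(myTable).where(eq(myTable.id, id)).then(r => r[0])
  await db.delete(myTable).where(eq(myTable.id, id))
  // Delete the stored object and its files-table entry.
  if (row?.fileId) await serverFileManager.delete(row.fileId)
})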

4. Editing - Pass Existing File

<FileInput.Provider fileMeta={data.existingFile}>
  <TypedForm.Remote form={updateForm} enctype='multipart/form-data' />
</FileInput.Provider>

Multiple Files

1. Define Schema with File Input

import { FileInput } from '@repo/storage'
import { z } from 'zod'

export const myFormSchema = z.object({
  name: z.string(),
  // Add file upload fields
  ...FileInput.getMultipleFileSchema({
    getUrl: file => `/my-resource/${resourceId}/files/${file.id}`,
  }),
})

Important: Only one file field per form is currently supported.

The getMultipleFileSchema function adds two fields to your schema:

  • files: The file upload field (array if multiple: true)
  • deleteFileIds: Tracks which existing files should be deleted
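
As a rough sketch of the resulting shape (the exact types come from the package; the commented structure below is an approximation for illustration):

// Sketch: approximate shape of the parsed form data after adding the two fields.
type MyFormData = z.infer<typeof myFormSchema>
// {
//   name: string
//   files?: File[]            // newly selected uploads
//   deleteFileIds?: string[]  // ids of existing files the user removed
// }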

2. Handle Files in Remote Functions

export const createMyResourceForm = form(myFormSchema, async (data) => {
  const { files, deleteFileIds, ...restData } = data

  // Create bucket if it doesn't exist
  await serverFileManager.createBucket('my-bucket')

  // Sync files (upload new, delete removed)
  await serverFileManager.sync('my-bucket', {
    files,
    deleteFileIds,
  })
})

Advanced Usage / Editing Flow

Serving Private Files

Create a server endpoint to stream files with access control:

// routes/files/[fileId]/+server.ts or any other path.
import type { RequestHandler } from './$types'
import { serverFileManager } from '$lib/files.server'
import { validateSession } from '$lib/library/authHelpers.server'
import { error } from '@sveltejs/kit'

export const GET: RequestHandler = async ({ params }) => {
  // Add your access control here, e.g. validate the session and call error(403) if denied.
  return await serverFileManager.streamFile(params.fileId)
}

The streamFile method returns a Response with appropriate headers for content type, caching, and inline display.
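
If your getUrl points at a resource-scoped route (as in the /my-resource/${resourceId}/files/${file.id} example above), the same pattern applies. A sketch, where userCanAccessResource is a hypothetical helper standing in for your own authorization logic:

// routes/my-resource/[resourceId]/files/[fileId]/+server.ts
import type { RequestHandler } from './$types'
import { serverFileManager } from '$lib/files.server'
import { userCanAccessResource } from '$lib/library/authHelpers.server' // hypothetical helper
import { error } from '@sveltejs/kit'

export const GET: RequestHandler = async ({ params }) => {
  // Deny access unless the current user may read this resource's files.
  const allowed = await userCanAccessResource(params.resourceId)
  if (!allowed) error(403)

  return await serverFileManager.streamFile(params.fileId)
}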

Editing Current Files

You can pass the currently stored files to the provider; they can then be shown, selected, and replaced in the form. For images, the served file URL can be used as a preview.

<script>
  import { FileInput } from '@repo/storage'

  let { data } = $props()
</script>

<FileInput.Provider fileMeta={data.fileMeta}>
  <TypedForm.Remote form={createMyResourceForm} schema={myFormSchema} enctype='multipart/form-data' />
</FileInput.Provider>
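
Where data.fileMeta comes from is up to your app. A rough sketch of a load function, assuming Drizzle-style queries, a hypothetical resourceFiles join table linking resources to file rows, and that FileInput.Provider accepts rows from the files table as fileMeta:

// routes/my-resource/[id]/edit/+page.server.ts
import { getGlobalDb } from '$lib/db/managedDb.server'
import { filesTable, resourceFiles } from '$lib/db/schema' // illustrative schema module
import { eq } from 'drizzle-orm'
import type { PageServerLoad } from './$types'

export const load: PageServerLoad = async ({ params }) => {
  const db = getGlobalDb()

  // Fetch the file rows attached to this resource so FileInput can list them.
  const rows = await db
    .select({ file: filesTable })
    .from(resourceFiles)
    .innerJoin(filesTable, eq(filesTable.id, resourceFiles.fileId))
    .where(eq(resourceFiles.resourceId, params.id))

  return { fileMeta: rows.map(r => r.file) }
}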

Public Buckets & Client File Manager

For public files that don't require authentication, you can make buckets public and use the client file manager to generate URLs.

Making Buckets Public

// Make bucket public when creating it
await serverFileManager.createBucket('public-bucket', { public: true })

// Or make an existing bucket public
await serverFileManager.makeBucketPublic('public-bucket')

Client File Manager

// lib/files.client.ts
import { env } from '$env/dynamic/public'
import { ClientFileManager } from '@repo/storage'

export const clientFileManager = new ClientFileManager({
  s3: {
    endpoint: env.PUBLIC_S3_ENDPOINT,
  },
})

// Generate public URL
const url = clientFileManager.getClientFileUrl({
  key: file.key,
  bucket: file.bucket,
})

Local Development with MinIO

Use MinIO to mock S3 locally:

# docker-compose.yml
services:
  minio:
    image: minio/minio:latest
    restart: always
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: ${S3_ACCESS_KEY_ID:-admin}
      MINIO_ROOT_PASSWORD: ${S3_SECRET_ACCESS_KEY:-secret_access_key}
    volumes:
      - minio-data:/data
    ports:
      - '9000:9000' # S3 API
      - '9001:9001' # Web Console
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:9000/minio/health/live"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  minio-data:

Access the MinIO console at http://localhost:9001.
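
A matching local .env might look like the following sketch; the values mirror the defaults in the compose file above, and PUBLIC_S3_ENDPOINT is only needed if you use the client file manager:

# .env (local development)
S3_ACCESS_KEY_ID=admin
S3_SECRET_ACCESS_KEY=secret_access_key
PRIVATE_S3_ENDPOINT=http://localhost:9000
PUBLIC_S3_ENDPOINT=http://localhost:9000
BODY_SIZE_LIMIT=100M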

API Reference

ServerFileManager

// Sync files (upload new, delete removed)
await serverFileManager.sync(bucket: string, options?: {
  newFiles?: File[]
  deleteFileIds?: string[]
  db?: FilesDbContext
})

// Delete a file
await serverFileManager.delete(id: string, options?: { db?: FilesDbContext })

// Create a bucket
await serverFileManager.createBucket(bucket: string, options?: { public?: boolean })

// Make bucket public
await serverFileManager.makeBucketPublic(bucket: string)

// Stream a file (returns Response)
await serverFileManager.streamFile(fileId: string)
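
The db option lets the files-table writes run inside an existing database context. A rough sketch, assuming a Drizzle-style transaction handle is accepted as FilesDbContext (an assumption, not documented API) and the same illustrative db/myTable helpers as in the form examples above:

// Sketch: keep the files-table writes and your own insert in one transaction.
await db.transaction(async tx => {
  const { createdFile } = await serverFileManager.sync('my-bucket', { file, db: tx })
  await tx.insert(myTable).values({ ...data, fileId: createdFile.id })
})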