
Storage Backends

Django-CFG backup supports multiple storage backends for flexibility in different environments.


Overview

| Backend | Use Case | Cost | Durability |
|---|---|---|---|
| Local | Development, small projects | Free | Single server |
| AWS S3 | Enterprise, compliance | $$ | 99.999999999% |
| Cloudflare R2 | Cost-effective production | $ | High |
| MinIO | Self-hosted, air-gapped | Free | Configurable |
| DigitalOcean Spaces | Simple cloud storage | $ | High |

Local Storage

Best for development and small deployments where backups stay on the same server.

```python
from django_cfg import BackupConfig, BackupStorageConfig

backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(
        backend="local",
        local_path="backups/",  # Relative to BASE_DIR
    ),
)
```

Absolute Path

```python
storage=BackupStorageConfig(
    backend="local",
    local_path="/var/backups/myapp/",
)
```

Best Practices

Local Storage Risks: Local storage provides no protection against:

  • Server disk failure
  • Server compromise
  • Datacenter issues

For production, always use off-site storage (S3, R2, etc.).

Recommended setup for local development:

```python
backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(
        backend="local",
        local_path="backups/",
    ),
    schedule=BackupScheduleConfig(enabled=False),  # Manual only
    retention=BackupRetentionConfig(
        keep_daily=3,
        keep_weekly=0,
        keep_monthly=0,
    ),
)
```
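django-cfg applies retention internally, but the effect of a `keep_daily=3` policy can be sketched as follows. This is a conceptual illustration, not the library's actual pruning code:

```python
from datetime import date, timedelta

def apply_daily_retention(backup_dates: list[date], keep_daily: int) -> list[date]:
    """Keep only the N most recent daily backups; older ones would be pruned.

    Conceptual sketch of a keep_daily policy, not django-cfg's implementation.
    """
    return sorted(backup_dates, reverse=True)[:keep_daily]

today = date(2024, 6, 10)
dates = [today - timedelta(days=i) for i in range(7)]  # a week of daily backups
kept = apply_daily_retention(dates, keep_daily=3)
# kept contains only the three newest dates
```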

AWS S3

Industry-standard cloud storage with eleven nines (99.999999999%) of durability.

Standard S3

```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-company-backups",
    s3_access_key="${AWS_ACCESS_KEY_ID}",
    s3_secret_key="${AWS_SECRET_ACCESS_KEY}",
    s3_region="us-east-1",
    s3_prefix="database-backups/production/",
)
```
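The `${AWS_ACCESS_KEY_ID}` syntax refers to environment variables, so secrets stay out of your config file. Conceptually, the expansion works like this sketch (django-cfg handles the real resolution, which may differ in detail):

```python
import os
import re

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Illustrative sketch of how ${...} references are presumably resolved;
    django-cfg's own mechanism may differ.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)

os.environ["AWS_ACCESS_KEY_ID"] = "AKIA-demo"  # demo value only
expand_env("${AWS_ACCESS_KEY_ID}")  # resolves to the demo value above
```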

IAM Role (EC2)

When running on EC2 with an IAM instance role attached, you can omit explicit credentials:

```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-company-backups",
    s3_region="us-east-1",
    s3_prefix="database-backups/production/",
    # Credentials automatically from IAM role
)
```

Required IAM Permissions

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-company-backups",
        "arn:aws:s3:::my-company-backups/*"
      ]
    }
  ]
}
```
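Before attaching the policy, you can sanity-check locally that it grants every action the backup job needs. This is a plain JSON check, not an AWS API call:

```python
import json

# The policy document from above, as a string
policy_json = """
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-company-backups",
        "arn:aws:s3:::my-company-backups/*"
      ]
    }
  ]
}
"""

needed = {"s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"}
granted = {
    action
    for stmt in json.loads(policy_json)["Statement"]
    if stmt["Effect"] == "Allow"
    for action in stmt["Action"]
}
missing = needed - granted  # empty set means all required actions are granted
```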

S3 Storage Classes

For cost optimization, configure lifecycle rules in AWS Console:

| Class | Use Case | Retrieval Time |
|---|---|---|
| Standard | Recent backups (< 30 days) | Immediate |
| Standard-IA | Weekly/monthly backups | Immediate |
| Glacier | Archive (> 90 days) | Minutes to hours |
| Glacier Deep Archive | Long-term compliance | 12-48 hours |
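Instead of the console, lifecycle rules can also be applied with the AWS CLI (`aws s3api put-bucket-lifecycle-configuration`). A sketch of a configuration matching the tiers above; the prefix and day thresholds are illustrative and should match your own retention needs:

```json
{
  "Rules": [
    {
      "ID": "tier-old-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "database-backups/" },
      "Transitions": [
        { "Days": 30, "StorageClass": "STANDARD_IA" },
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```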

Cloudflare R2

S3-compatible storage with zero egress fees, making it ideal for backups you may need to restore frequently.

```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-backups",
    s3_endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    s3_access_key="${R2_ACCESS_KEY_ID}",
    s3_secret_key="${R2_SECRET_ACCESS_KEY}",
    s3_region="auto",  # Always "auto" for R2
    s3_prefix="db-backups/",
)
```

Getting R2 Credentials

  1. Go to Cloudflare Dashboard → R2
  2. Create a bucket (e.g., my-backups)
  3. Go to “Manage R2 API Tokens”
  4. Create token with Object Read & Write permissions
  5. Copy Access Key ID and Secret Access Key

Cost Comparison (10 GB backups, 5 restores/month)

| Provider | Storage | Egress | Total |
|---|---|---|---|
| AWS S3 | $0.23 | $4.50 | ~$5/mo |
| Cloudflare R2 | $0.15 | $0 | ~$0.15/mo |
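The table's totals follow from roughly these published rates: S3 Standard at about $0.023/GB-month storage and about $0.09/GB egress, R2 at about $0.015/GB-month storage with $0 egress. Rates change, so treat the numbers as an assumption and check the current pricing pages:

```python
def monthly_cost(gb: float, restores: int, storage_rate: float, egress_rate: float) -> float:
    """Monthly cost in USD: storage plus egress for full-backup restores."""
    storage = gb * storage_rate
    egress = restores * gb * egress_rate
    return round(storage + egress, 2)

# 10 GB of backups, 5 full restores per month
s3_total = monthly_cost(10, 5, storage_rate=0.023, egress_rate=0.09)  # 4.73, i.e. ~$5/mo
r2_total = monthly_cost(10, 5, storage_rate=0.015, egress_rate=0.0)   # 0.15
```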

R2 for Backups: R2's zero egress fees make it ideal for database backups where you may need to restore data frequently during disaster recovery.


MinIO (Self-Hosted)

S3-compatible storage you can run on your own infrastructure.

```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="backups",
    s3_endpoint_url="https://minio.internal.company.com",
    s3_access_key="${MINIO_ACCESS_KEY}",
    s3_secret_key="${MINIO_SECRET_KEY}",
    s3_region="us-east-1",  # Can be any value for MinIO
    s3_prefix="django-backups/",
)
```

MinIO Docker Setup

docker-compose.yml

```yaml
services:
  minio:
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    volumes:
      - minio_data:/data

volumes:
  minio_data:
```

Use Cases

  • Air-gapped environments
  • Data sovereignty requirements
  • On-premise infrastructure
  • Development/testing S3 integration

DigitalOcean Spaces

S3-compatible object storage from DigitalOcean.

```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-space-name",
    s3_endpoint_url="https://nyc3.digitaloceanspaces.com",
    s3_access_key="${DO_SPACES_KEY}",
    s3_secret_key="${DO_SPACES_SECRET}",
    s3_region="nyc3",
    s3_prefix="backups/",
)
```

Regions

| Region | Endpoint |
|---|---|
| NYC3 | https://nyc3.digitaloceanspaces.com |
| SFO3 | https://sfo3.digitaloceanspaces.com |
| AMS3 | https://ams3.digitaloceanspaces.com |
| SGP1 | https://sgp1.digitaloceanspaces.com |
| FRA1 | https://fra1.digitaloceanspaces.com |
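Every Spaces endpoint follows the same pattern, so it can be derived from the region slug. A small helper for illustration; verify the result against the table above:

```python
def spaces_endpoint(region: str) -> str:
    """Build a DigitalOcean Spaces endpoint URL from a region slug."""
    return f"https://{region.lower()}.digitaloceanspaces.com"

spaces_endpoint("nyc3")  # -> "https://nyc3.digitaloceanspaces.com"
```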

Environment-Based Configuration

Use different storage backends per environment:

config.py
```python
from django_cfg import DjangoConfig, BackupConfig, BackupStorageConfig
import os


def get_backup_storage() -> BackupStorageConfig:
    """Return storage config based on environment."""
    env = os.getenv("ENV_MODE", "development")
    if env == "production":
        return BackupStorageConfig(
            backend="s3",
            s3_bucket="${BACKUP_S3_BUCKET}",
            s3_endpoint_url="${BACKUP_S3_ENDPOINT}",
            s3_access_key="${BACKUP_S3_ACCESS_KEY}",
            s3_secret_key="${BACKUP_S3_SECRET_KEY}",
            s3_region="auto",
        )
    else:
        return BackupStorageConfig(
            backend="local",
            local_path="backups/",
        )


class MyConfig(DjangoConfig):
    backup = BackupConfig(
        enabled=True,
        storage=get_backup_storage(),
    )
```

Storage Security

Encryption at Rest

Most S3-compatible services support server-side encryption:

  • AWS S3: Enable SSE-S3 or SSE-KMS in bucket settings
  • R2: Encrypted by default
  • MinIO: Configure encryption in MinIO settings

Client-Side Encryption

For additional security, enable backup encryption:

```python
backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(backend="s3", ...),
    encryption_key="${BACKUP_ENCRYPTION_KEY}",  # GPG-compatible
)
```

Network Security

  • Use HTTPS endpoints (all examples above use HTTPS)
  • Configure VPC endpoints for AWS S3 in production
  • Use private networking for MinIO

Troubleshooting

Common Issues

Access Denied

Symptoms: AccessDenied error when uploading

Solutions:

  1. Verify IAM permissions include s3:PutObject
  2. Check bucket policy allows your IAM user/role
  3. Verify access key and secret are correct
  4. For R2: Ensure token has Write permissions

Bucket Not Found

Symptoms: NoSuchBucket error

Solutions:

  1. Verify bucket name is correct (case-sensitive)
  2. Check bucket exists in the correct region
  3. For AWS S3: Bucket names are globally unique, so the name must match exactly; R2 bucket names are unique per account

Endpoint Error

Symptoms: Connection timeout or SSL error

Solutions:

  1. Verify endpoint URL is correct
  2. Check network connectivity to endpoint
  3. Ensure HTTPS is used
  4. For MinIO: Verify server is running
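A quick way to separate network problems from credential problems is a raw reachability check against the endpoint, independent of any S3 client. In this sketch, an HTTP error response (403, 404, etc.) still counts as reachable, because it proves the network path works:

```python
import urllib.error
import urllib.request

def endpoint_reachable(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers at all (any HTTP status counts)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded with an error status, but it is reachable
    except (ValueError, OSError):
        return False  # bad URL, DNS failure, refused connection, timeout, ...

endpoint_reachable("not-a-valid-url")  # -> False
```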

Testing Storage Connection

```bash
# Test backup creation
python manage.py db_backup -v 2

# Check backup was created
python manage.py db_backup --list
```

See Also