# Storage Backends
Django-CFG backup supports multiple storage backends for flexibility in different environments.
## Overview
| Backend | Use Case | Cost | Durability |
|---|---|---|---|
| Local | Development, small projects | Free | Single server |
| AWS S3 | Enterprise, compliance | $$ | 99.999999999% |
| Cloudflare R2 | Cost-effective production | $ | High |
| MinIO | Self-hosted, air-gapped | Free | Configurable |
| DigitalOcean Spaces | Simple cloud storage | $ | High |
## Local Storage
Best for development and small deployments where backups stay on the same server.
```python
from django_cfg import BackupConfig, BackupStorageConfig

backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(
        backend="local",
        local_path="backups/",  # Relative to BASE_DIR
    ),
)
```

### Absolute Path
```python
storage=BackupStorageConfig(
    backend="local",
    local_path="/var/backups/myapp/",
)
```

### Best Practices
**Local Storage Risks:** Local storage provides no protection against:
- Server disk failure
- Server compromise
- Datacenter issues
For production, always use off-site storage (S3, R2, etc.)
Recommended setup for local development:
```python
backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(
        backend="local",
        local_path="backups/",
    ),
    schedule=BackupScheduleConfig(enabled=False),  # Manual only
    retention=BackupRetentionConfig(
        keep_daily=3,
        keep_weekly=0,
        keep_monthly=0,
    ),
)
```

## AWS S3
Industry-standard cloud storage with 11 nines durability.
### Standard S3
```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-company-backups",
    s3_access_key="${AWS_ACCESS_KEY_ID}",
    s3_secret_key="${AWS_SECRET_ACCESS_KEY}",
    s3_region="us-east-1",
    s3_prefix="database-backups/production/",
)
```

### IAM Role (EC2)
When running on EC2 with an IAM role, you can omit credentials:
```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-company-backups",
    s3_region="us-east-1",
    s3_prefix="database-backups/production/",
    # Credentials automatically from IAM role
)
```

### Required IAM Permissions
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::my-company-backups",
        "arn:aws:s3:::my-company-backups/*"
      ]
    }
  ]
}
```

### S3 Storage Classes
For cost optimization, configure lifecycle rules in the AWS Console:
| Class | Use Case | Retrieval Time |
|---|---|---|
| Standard | Recent backups (< 30 days) | Immediate |
| Standard-IA | Weekly/monthly backups | Immediate |
| Glacier | Archive (> 90 days) | Minutes to hours |
| Glacier Deep | Long-term compliance | 12-48 hours |
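Beyond the AWS Console, the tiering above can be expressed as a lifecycle policy and applied with boto3. A minimal sketch — the bucket name, prefix, and day thresholds are illustrative assumptions, not django-cfg settings:

```python
# Sketch: build an S3 lifecycle policy that tiers backups by age.
# Prefix and day thresholds are illustrative assumptions.
def backup_lifecycle_rules(prefix: str = "database-backups/") -> dict:
    """Lifecycle config: Standard -> Standard-IA at 30d, Glacier at 90d."""
    return {
        "Rules": [
            {
                "ID": "tier-old-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    }

# Applying it would look like (requires boto3 and credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-company-backups",
#     LifecycleConfiguration=backup_lifecycle_rules(),
# )
```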
## Cloudflare R2
S3-compatible storage with zero egress fees, making it ideal for backups you may need to restore frequently.
```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-backups",
    s3_endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    s3_access_key="${R2_ACCESS_KEY_ID}",
    s3_secret_key="${R2_SECRET_ACCESS_KEY}",
    s3_region="auto",  # Always "auto" for R2
    s3_prefix="db-backups/",
)
```

### Getting R2 Credentials
1. Go to Cloudflare Dashboard → R2
2. Create a bucket (e.g., `my-backups`)
3. Go to "Manage R2 API Tokens"
4. Create a token with **Object Read & Write** permissions
5. Copy the Access Key ID and Secret Access Key
### Cost Comparison (10 GB backups, 5 restores/month)
| Provider | Storage | Egress | Total |
|---|---|---|---|
| AWS S3 | $0.23 | $4.50 | ~$5/mo |
| Cloudflare R2 | $0.15 | $0 | ~$0.15/mo |
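The totals in the table reduce to simple arithmetic. A sketch with per-GB rates that approximate published pricing — the exact figures are assumptions and change over time:

```python
# Rough monthly backup cost: storage + egress (restore downloads).
# Per-GB rates below approximate published pricing; treat them as assumptions.
def monthly_cost(gb: float, restores: int,
                 storage_rate: float, egress_rate: float) -> float:
    storage = gb * storage_rate            # cost of keeping the backups
    egress = gb * restores * egress_rate   # cost of downloading them
    return round(storage + egress, 2)

s3_total = monthly_cost(10, 5, storage_rate=0.023, egress_rate=0.09)  # 4.73
r2_total = monthly_cost(10, 5, storage_rate=0.015, egress_rate=0.0)   # 0.15
```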
**R2 for Backups:** R2's zero egress fees make it ideal for database backups where you may need to restore data frequently during disaster recovery.
## MinIO (Self-Hosted)
S3-compatible storage you can run on your own infrastructure.
```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="backups",
    s3_endpoint_url="https://minio.internal.company.com",
    s3_access_key="${MINIO_ACCESS_KEY}",
    s3_secret_key="${MINIO_SECRET_KEY}",
    s3_region="us-east-1",  # Can be any value for MinIO
    s3_prefix="django-backups/",
)
```

### MinIO Docker Setup
```yaml
services:
  minio:
    image: minio/minio:latest
    command: server /data --console-address ":9001"
    ports:
      - "9000:9000"
      - "9001:9001"
    environment:
      MINIO_ROOT_USER: minioadmin      # Change these defaults in production
      MINIO_ROOT_PASSWORD: minioadmin
    volumes:
      - minio_data:/data

volumes:
  minio_data:
```

### Use Cases
- Air-gapped environments
- Data sovereignty requirements
- On-premise infrastructure
- Development/testing S3 integration
## DigitalOcean Spaces
S3-compatible object storage from DigitalOcean.
```python
storage=BackupStorageConfig(
    backend="s3",
    s3_bucket="my-space-name",
    s3_endpoint_url="https://nyc3.digitaloceanspaces.com",
    s3_access_key="${DO_SPACES_KEY}",
    s3_secret_key="${DO_SPACES_SECRET}",
    s3_region="nyc3",
    s3_prefix="backups/",
)
```

### Regions
| Region | Endpoint |
|---|---|
| NYC3 | https://nyc3.digitaloceanspaces.com |
| SFO3 | https://sfo3.digitaloceanspaces.com |
| AMS3 | https://ams3.digitaloceanspaces.com |
| SGP1 | https://sgp1.digitaloceanspaces.com |
| FRA1 | https://fra1.digitaloceanspaces.com |
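As the table suggests, Spaces endpoints follow a single pattern, so the endpoint URL can be derived from the region name. A convenience sketch, not part of django-cfg:

```python
# Spaces endpoints all follow https://<region>.digitaloceanspaces.com
def spaces_endpoint(region: str) -> str:
    return f"https://{region.lower()}.digitaloceanspaces.com"
```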
## Environment-Based Configuration
Use different storage backends per environment:
```python
from django_cfg import DjangoConfig, BackupConfig, BackupStorageConfig
import os


def get_backup_storage() -> BackupStorageConfig:
    """Return storage config based on environment."""
    env = os.getenv("ENV_MODE", "development")

    if env == "production":
        return BackupStorageConfig(
            backend="s3",
            s3_bucket="${BACKUP_S3_BUCKET}",
            s3_endpoint_url="${BACKUP_S3_ENDPOINT}",
            s3_access_key="${BACKUP_S3_ACCESS_KEY}",
            s3_secret_key="${BACKUP_S3_SECRET_KEY}",
            s3_region="auto",
        )
    else:
        return BackupStorageConfig(
            backend="local",
            local_path="backups/",
        )


class MyConfig(DjangoConfig):
    backup = BackupConfig(
        enabled=True,
        storage=get_backup_storage(),
    )
```

## Storage Security
### Encryption at Rest
Most S3-compatible services support server-side encryption:
- AWS S3: Enable SSE-S3 or SSE-KMS in bucket settings
- R2: Encrypted by default
- MinIO: Configure encryption in MinIO settings
### Client-Side Encryption
For additional security, enable backup encryption:
```python
backup = BackupConfig(
    enabled=True,
    storage=BackupStorageConfig(backend="s3", ...),
    encryption_key="${BACKUP_ENCRYPTION_KEY}",  # GPG-compatible
)
```

### Network Security
- Use HTTPS endpoints (all examples above use HTTPS)
- Configure VPC endpoints for AWS S3 in production
- Use private networking for MinIO
## Troubleshooting
### Common Issues
#### Access Denied

**Symptoms:** `AccessDenied` error when uploading

**Solutions:**
- Verify IAM permissions include `s3:PutObject`
- Check the bucket policy allows your IAM user/role
- Verify the access key and secret are correct
- For R2: ensure the token has Write permissions
#### Bucket Not Found

**Symptoms:** `NoSuchBucket` error

**Solutions:**
- Verify the bucket name is correct (case-sensitive)
- Check the bucket exists in the correct region
- For S3: bucket names are globally unique
#### Endpoint Error

**Symptoms:** Connection timeout or SSL error

**Solutions:**
- Verify endpoint URL is correct
- Check network connectivity to endpoint
- Ensure HTTPS is used
- For MinIO: Verify server is running
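When triaging these errors, it can help to map the raw S3 error code to a likely fix before digging into config. A small sketch — the suggestions simply restate the lists above, and the boto3 probe in the comment is an assumption about your environment:

```python
# Map common S3 error codes to a likely fix (restates the lists above).
FIXES = {
    "AccessDenied": "check IAM permissions (s3:PutObject) and the bucket policy",
    "NoSuchBucket": "verify the bucket name and region",
    "SignatureDoesNotMatch": "verify the access key / secret pair",
}

def suggest_fix(error_code: str) -> str:
    return FIXES.get(error_code, "verify the endpoint URL and network connectivity")

# Probing the bucket directly (requires boto3 and credentials):
# import boto3, botocore.exceptions
# try:
#     boto3.client("s3").head_bucket(Bucket="my-company-backups")
#     print("ok")
# except botocore.exceptions.ClientError as exc:
#     print(suggest_fix(exc.response["Error"]["Code"]))
```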
### Testing Storage Connection
```bash
# Test backup creation
python manage.py db_backup -v 2

# Check backup was created
python manage.py db_backup --list
```

## See Also
- Configuration Guide: complete `BackupConfig` reference
- Overview: architecture and concepts
- AWS S3 Documentation
- Cloudflare R2 Documentation