Backup and Restore¶
Procedures for backing up and restoring Tempo data.
Overview¶
Tempo stores data in two locations:

1. Database - PostgreSQL database with workout data, settings, and metadata
2. Media Files - Filesystem storage for photos and videos
Both must be included for a complete backup.
Tempo also provides a Data Export feature that creates a user-friendly backup in a portable ZIP format. This can be used as an alternative or supplement to database backups.
Backup Procedures¶
Data Export as Backup¶
Tempo's export feature provides a user-friendly way to back up your data directly from the web interface. This method is ideal for:
- User-level backups - No database access required
- Portable backups - Single ZIP file contains everything
- Quick backups - Simple one-click export from Settings page
- Migration - Easy to move data between instances
Using the Export Feature¶
- Log into Tempo
- Navigate to Settings
- Find the "Export / Import" section
- Click "Export All Data"
- Wait for the download to complete
The export includes:

- All workouts with complete data
- All media files (photos and videos)
- All shoes and settings
- Best efforts and calculated data
- Raw workout files (GPX, FIT, CSV)
Export vs. Database Backup¶
| Feature | Data Export | Database Backup |
|---|---|---|
| Ease of Use | User-friendly, web interface | Requires database access |
| Portability | Single ZIP file | SQL dump + media files |
| Completeness | All data included | All data included |
| Format | Structured JSON + binary files | SQL + filesystem files |
| Restore | Import feature (user-friendly) | Direct database restore |
| Automation | Manual only | Can be automated |
Recommendation: Use both methods for a comprehensive backup strategy:

- Data Export - Regular user-level backups, easy to create and store, can be restored via the import feature
- Database Backup - System-level backups, can be automated, faster restore for large datasets
Database Backup¶
Using Docker¶
```bash
# Backup database
docker exec tempo-postgres pg_dump -U postgres tempo > backup-$(date +%Y%m%d-%H%M%S).sql

# Or with compression
docker exec tempo-postgres pg_dump -U postgres tempo | gzip > backup-$(date +%Y%m%d-%H%M%S).sql.gz
```
Using PostgreSQL Client¶
```bash
# Backup database
pg_dump -h localhost -U postgres tempo > backup-$(date +%Y%m%d-%H%M%S).sql

# Or with compression
pg_dump -h localhost -U postgres tempo | gzip > backup-$(date +%Y%m%d-%H%M%S).sql.gz
```
Media Files Backup¶
Using tar¶
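A minimal sketch, assuming media is stored in `./media` (the path used by the complete backup script in this document); the `mkdir -p` line only keeps the example self-contained:

```shell
# Ensure the media directory exists (illustration only; in practice it already does)
mkdir -p ./media

# Archive the media directory with compression, timestamped like the database backups
tar -czf media-$(date +%Y%m%d-%H%M%S).tar.gz ./media
```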
Using rsync¶
Complete Backup Script¶
Create a backup script that backs up both:
```bash
#!/bin/bash
BACKUP_DIR="/backup/tempo"
DATE=$(date +%Y%m%d-%H%M%S)

# Create backup directory
mkdir -p "$BACKUP_DIR"

# Backup database
docker exec tempo-postgres pg_dump -U postgres tempo | gzip > "$BACKUP_DIR/db-$DATE.sql.gz"

# Backup media
tar -czf "$BACKUP_DIR/media-$DATE.tar.gz" ./media

# Keep only last 30 days of backups
find "$BACKUP_DIR" -name "*.gz" -mtime +30 -delete

echo "Backup completed: $DATE"
```
Automated Backups¶
Cron Job¶
Add to crontab for daily backups:
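For example, an entry that runs the backup script every night at 2:00 AM (the script path and log file are illustrative):

```
0 2 * * * /usr/local/bin/tempo-backup.sh >> /var/log/tempo-backup.log 2>&1
```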
Systemd Timer¶
Create a systemd service and timer for automated backups.
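A minimal sketch (the unit names and script path are illustrative):

```ini
# /etc/systemd/system/tempo-backup.service
[Unit]
Description=Tempo backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/tempo-backup.sh

# /etc/systemd/system/tempo-backup.timer
[Unit]
Description=Daily Tempo backup

[Timer]
OnCalendar=daily
Persistent=true

[Install]
WantedBy=timers.target
```

Enable the timer with `systemctl enable --now tempo-backup.timer`.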
Restore Procedures¶
Data Import as Restore¶
Tempo's import feature provides a user-friendly way to restore data from an export ZIP file. This method is ideal for:
- User-level restore - No database access required
- Selective restore - Can import into existing instance with duplicate detection
- Cross-instance migration - Easy to move data between Tempo instances
- Quick restore - Simple drag-and-drop interface from Settings page
Using the Import Feature¶
- Log into Tempo
- Navigate to Settings
- Find the "Export / Import" section
- In the "Import Data" area, click or drag and drop your export ZIP file
- Wait for the import to complete
- Review the import summary
The import will:

- Restore all workouts, media, shoes, and settings
- Preserve GUIDs and timestamps from the export
- Automatically skip duplicates
- Provide detailed statistics and error reporting
Note: Import supports export format version 1.0.0. Files up to 500MB are supported.
Import vs. Database Restore¶
| Feature | Data Import | Database Restore |
|---|---|---|
| Ease of Use | User-friendly, web interface | Requires database access |
| Selective Restore | Can import into existing data | Replaces entire database |
| Duplicate Handling | Automatic duplicate detection | No duplicate handling |
| Cross-Instance | Easy migration between instances | Requires database compatibility |
| Speed | Slower for very large datasets | Faster for large datasets |
| Automation | Manual only | Can be automated |
Recommendation:

- Use Data Import for user-level restores, migrations, and selective restores
- Use Database Restore for complete system restores and automated backup strategies
Database Restore¶
Using Docker¶
```bash
# Restore database
docker exec -i tempo-postgres psql -U postgres tempo < backup.sql

# Or from compressed backup
gunzip -c backup.sql.gz | docker exec -i tempo-postgres psql -U postgres tempo
```
Using PostgreSQL Client¶
```bash
# Restore database
psql -h localhost -U postgres tempo < backup.sql

# Or from compressed backup
gunzip -c backup.sql.gz | psql -h localhost -U postgres tempo
```
Important: Restore into an empty database, or drop and recreate the existing database first.
Media Files Restore¶
Using tar¶
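A minimal sketch; extracting the archive recreates `./media` in the current working directory. The setup lines below only make the example self-contained by producing a sample archive like the backup step does; in practice you would run only the final `tar -xzf`:

```shell
# Setup for illustration only: create a sample archive as the backup step would
mkdir -p ./media && touch ./media/sample.jpg
tar -czf media-backup.tar.gz ./media
rm -rf ./media

# Restore: extract the archive, recreating ./media in the current directory
tar -xzf media-backup.tar.gz
```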
Using rsync¶
Complete Restore¶
- Stop Tempo services
- Restore database
- Restore media files
- Verify file permissions
- Start Tempo services
- Verify data integrity
Backup Verification¶
Verify Database Backup¶
Note: `pg_restore` only reads custom-format dumps (`pg_dump -Fc`); the plain SQL dumps created above are inspected and restored with `psql`:

```bash
# Check the backup file (plain SQL dump)
head -20 backup.sql

# Test restore to a temporary database
createdb tempo_test
psql -d tempo_test -f backup.sql
```
Verify Media Backup¶
```bash
# List contents
tar -tzf media-backup.tar.gz | head -20

# Check file count
tar -tzf media-backup.tar.gz | wc -l
```
Backup Storage¶
Local Storage¶
- Store backups on separate disk
- Use different physical location if possible
- Encrypt sensitive backups
Remote Storage¶
- Cloud storage (S3, Google Cloud Storage, etc.)
- Remote server via rsync/SSH
- Network-attached storage (NAS)
Encryption¶
Encrypt backups before storing:
```bash
# Encrypt backup
gpg --symmetric --cipher-algo AES256 backup.sql.gz

# Decrypt backup
gpg --decrypt backup.sql.gz.gpg > backup.sql.gz
```
Retention Policy¶
Recommended Retention¶
- Daily backups: Keep 7 days
- Weekly backups: Keep 4 weeks
- Monthly backups: Keep 12 months
- Yearly backups: Keep indefinitely
Cleanup Script¶
```bash
#!/bin/bash
BACKUP_DIR="/backup/tempo"

# Remove backups older than 30 days
find "$BACKUP_DIR" -name "*.sql.gz" -mtime +30 -delete
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +30 -delete
```
Disaster Recovery¶
Recovery Plan¶
- Assess damage: Determine what data is lost
- Stop services: Prevent further data loss
- Restore from backup: Use most recent backup
- Verify integrity: Check data completeness
- Resume services: Start Tempo services
- Monitor: Watch for issues
Testing Restores¶
Regularly test restore procedures:
- Test database restore to temporary database
- Verify media files restore correctly
- Test complete restore procedure
- Document any issues
Best Practices¶
Backup Frequency¶
- Active use: Daily backups recommended
- Light use: Weekly backups may suffice
- Critical data: Consider multiple daily backups
Backup Location¶
- Store backups off-site
- Use multiple backup locations
- Test backup accessibility regularly
Documentation¶
- Document backup procedures
- Keep restore procedures accessible
- Update procedures as needed
- Train team members
Troubleshooting¶
Backup Fails¶
- Check disk space
- Verify database is accessible
- Check file permissions
- Review error messages
Restore Fails¶
- Verify backup file integrity
- Check database connection
- Ensure sufficient disk space
- Review error messages
Data Mismatch¶
- Verify backup date matches restore date
- Check for missing media files
- Verify database schema matches
- Review migration history