remote-dump - Database Dump
Create a local copy of the remote environment’s database for development and testing.
Purpose
The remote-dump command connects to your configured remote PrestaShop Enterprise environment’s database and creates a local dump file that can be used for local development and testing.
When to Use
- Initial environment setup: Getting database content for a new local environment
- Fresh data for testing: Obtaining current production/staging data for local testing
- Troubleshooting: Getting production-like data to reproduce and debug issues
- Data synchronization: Keeping local database in sync with remote environment
Prerequisites
Before running the remote-dump command:
- Authentication: Must be authenticated with ps-enterprise auth
- Configuration: Must have a valid environment configuration from ps-enterprise config
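If you are setting up a machine for the first time, a reasonable pre-flight is to run both prerequisite commands in order before the dump; exact prompts depend on your setup:
# Pre-flight: authenticate, then set up or verify the environment configuration
ps-enterprise auth
ps-enterprise config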
Usage
Basic Database Dump
ps-enterprise remote-dump
Custom Output Location
# Save to specific location
ps-enterprise remote-dump --output /path/to/custom/dump.sql
Options
| Flag | Short | Description | Required |
| --- | --- | --- | --- |
| --output | -o | Specify custom output file path | No |
| --help | -h | Show command help | No |
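Using the short flag, the custom-output example above can also be written as:
# Same as --output, using the short flag
ps-enterprise remote-dump -o /path/to/custom/dump.sql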
Process
1. Current Dump Check
- Asks you to choose between reusing the current dump or creating a new one
- If the existing dump is selected, the local database is cleaned and the dump is re-imported
2. Remote Dump Creation
- Creates a complete database dump from the remote environment
- Includes all tables, data, and structure
- Applies compression to reduce file size
3. Local Storage
- Saves the uncompressed dump to the specified location (default: ./tmp/dump.sql)
- Overwrites previous dump files
- Sets appropriate file permissions
- Anonymizes sensitive user data
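After a run completes, you can check that the dump file is present and non-empty at the default location (adjust the path if you used --output):
# Confirm the dump exists and check its size
ls -lh ./tmp/dump.sql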
Examples
Standard Development Setup
# Get fresh database dump for development
cd my-prestashop-project
ps-enterprise remote-dump
# Dump will be saved to ./tmp/dump.sql
# and imported into the local database
Custom Backup Location
# Save to organized backup directory
mkdir -p ~/backups/$(date +%Y-%m-%d)
ps-enterprise remote-dump --output ~/backups/$(date +%Y-%m-%d)/myproject-dump.sql
Automated Backup Script
#!/bin/bash
# Daily backup script
PROJECT_NAME="myproject"
BACKUP_DIR="$HOME/backups/$PROJECT_NAME"
DATE=$(date +%Y-%m-%d_%H-%M-%S)
mkdir -p "$BACKUP_DIR"
ps-enterprise remote-dump --output "$BACKUP_DIR/dump-$DATE.sql"
Success Indicators
After successful database dump:
✅ A previous dump of <environment> has been restored
or
✅ A fresh dump of <environment> has been imported
Troubleshooting
Database Connection Failed
❌ Database connection failed: Access denied for user
Solution:
- Verify environment configuration:
ps-enterprise config
- Check authentication:
ps-enterprise auth
- Contact administrator for database access permissions
Network Timeout
❌ Connection timeout: Unable to reach database server
Solution:
- Check network connectivity
- Verify firewall settings allow database connections
- Try again during off-peak hours for large databases
- Contact network administrator if issues persist
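For connections that drop intermittently, a small retry loop around the command can help; this is a sketch, so tune the attempt count and delay as needed:
# Retry the dump up to 3 times, waiting 60 seconds between attempts
for attempt in 1 2 3; do
  ps-enterprise remote-dump && break
  echo "Attempt $attempt failed, retrying in 60s..." >&2
  sleep 60
done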
Insufficient Disk Space
❌ Dump failed: No space left on device
Solution:
- Check available space:
df -h
- Free up disk space by removing old files
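If old dumps are taking up the space, the dated files produced by the backup script above can be pruned, for example:
# Remove backup dumps older than 14 days (adjust the directory and age to your setup)
find ~/backups -name 'dump-*.sql' -mtime +14 -delete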
Permission Denied
❌ Permission denied: Cannot write to dump file
Solution:
- Check the permissions of the output directory (default ./tmp):
ls -ld ./tmp
- Fix the permissions, or choose a writable path with --output:
chmod u+w ./tmp
Integration with Other Commands
The remote-dump command integrates with the full CLI workflow:
- Requires: Valid authentication (auth) and configuration (config)
- Followed by: Usually patch, to prepare the database for local use
- Works with: the start command, which imports the dump into local database containers
- Automation: Often used in automated setup workflows
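Putting it together, a first-time local setup might look like the following sketch; the exact ordering of patch and start can differ depending on your workflow:
# Possible end-to-end sequence for a fresh local environment (illustrative)
ps-enterprise auth          # authenticate with the Enterprise platform
ps-enterprise config        # select the remote environment
ps-enterprise remote-dump   # pull and store the database dump
ps-enterprise patch         # prepare the database for local use
ps-enterprise start         # start local containers and import the dump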
Best Practices
Regular Updates
- Schedule Regular Dumps: Keep local data reasonably current
- Document Changes: Note when dumps were taken and from which environment
- Version Control: Consider versioning important dump files
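One lightweight way to document a dump is to write a small metadata file next to it; the path and environment name here are illustrative:
# Record when the dump was taken and from which environment
echo "dumped $(date -u +%Y-%m-%dT%H:%M:%SZ) from staging" > ./tmp/dump.meta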
Development Workflow
- Fresh Data: Start development sessions with recent data
- Data Isolation: Use separate dumps for different development branches (see the sketch after this list)
- Testing Data: Use staging dumps for testing, production for debugging
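As a sketch of the Data Isolation point above, the output file can be named after the current Git branch (assumes a Git checkout):
# Keep one dump per development branch
BRANCH=$(git rev-parse --abbrev-ref HEAD)
ps-enterprise remote-dump --output "./tmp/dump-$BRANCH.sql"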
Related Commands
auth - Required for database access
config - Determines which database to dump
patch - Apply local settings after getting dump
start - Import dump into local environment
Database Size: Large databases may take significant time to download. Ensure stable network connection and adequate disk space.
Pro Tip: For large databases, consider using staging environment dumps for regular development work, and production dumps only when debugging specific issues.