Microsoft SQL Databases

These are typically hosted in a SQL Server instance running in a Linux container, but they may also be hosted on an existing SQL Server instance.

Usual backup best practices apply, such as scheduled daily backups. You may use SQL Server Management Studio (SSMS) or a third-party tool to automate them; a scripted example is sketched below.

These SQL databases contain the data that your users have uploaded to the system as well as most configurations and other data that the system stores.
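
If you script the backups rather than run them from SSMS, a minimal sketch using sqlcmd might look like the following. The database name, credentials and paths are placeholders, the sqlcmd path shown is typical for the SQL Server Linux container, and the backup path is as seen by the SQL Server process (inside the container, if containerized):

#!/bin/bash

# Sketch of a nightly SQL Server backup via sqlcmd - adjust names and paths.
SQLCMD=/opt/mssql-tools/bin/sqlcmd       # typical path in the SQL Server Linux container
DB_NAME="your_database"
BACKUP_FILE="/var/opt/mssql/backup/${DB_NAME}_$(date +%Y%m%d_%H%M%S).bak"

# SA_PASSWORD is assumed to be set in the environment
"$SQLCMD" -S localhost -U sa -P "$SA_PASSWORD" \
  -Q "BACKUP DATABASE [$DB_NAME] TO DISK = N'$BACKUP_FILE' WITH INIT"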

PostgreSQL Vector

This is hosted on a Linux container.

This database stores the uploaded data in plain text, along with a vector representation of it for faster searching.

Backups should be performed at the same time as the SQL Server backups, as the two databases reference each other’s content.
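
For example, a small wrapper run from cron can invoke both backup scripts back to back so the two dumps are taken from roughly the same point in time (the script names and paths are placeholders; the PostgreSQL script is described in the procedure below):

#!/bin/bash
# Hypothetical wrapper: run the SQL Server and PostgreSQL backups together.
set -e
/path/to/backup_mssql.sh
/path/to/backup_postgres.sh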

Suggested Backup Procedure

PostgreSQL Regular Backup Instructions Using pg_dump

1. Prerequisites

  • Install PostgreSQL Client Tools:

    # For Debian/Ubuntu
    sudo apt-get update
    sudo apt-get install postgresql-client
    
    # For CentOS/RHEL
    sudo yum install postgresql
  • Ensure Database Access:

    • The backup user must have read permissions on the target database (CONNECT on the database plus SELECT on the tables being dumped); a sketch for granting this follows below.
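
    A minimal sketch, assuming PostgreSQL 14 or later (which provides the pg_read_all_data role); the role name and password are placeholders:

    # Hypothetical role and password - run as a PostgreSQL superuser
    psql -U postgres -d your_database -c "CREATE ROLE backup_user LOGIN PASSWORD 'change_me';"
    psql -U postgres -d your_database -c "GRANT CONNECT ON DATABASE your_database TO backup_user;"
    psql -U postgres -d your_database -c "GRANT pg_read_all_data TO backup_user;"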

2. Configure Password-less Authentication

  1. Create .pgpass File:

    touch ~/.pgpass
    chmod 600 ~/.pgpass
  2. Add Database Credentials:

    hostname:port:database:username:password

    Example:

    localhost:5432:your_database:your_username:your_password

3. Create Backup Script

Example Script (backup_postgres.sh):

#!/bin/bash

# Fail the pipeline if pg_dump fails, not just the final gzip
set -o pipefail

# Variables
DB_NAME="your_database"
DB_USER="your_username"
BACKUP_DIR="/path/to/backup"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="$BACKUP_DIR/${DB_NAME}_$DATE.sql.gz"
RETENTION_DAYS=30
EMAIL="admin@example.com"

# Create Backup Directory
mkdir -p "$BACKUP_DIR"

# Perform Backup
pg_dump -U "$DB_USER" "$DB_NAME" | gzip > "$BACKUP_FILE"

# Check Backup Success
if [ $? -eq 0 ]; then
    echo "Backup successful: $BACKUP_FILE"
else
    echo "Backup failed for $DB_NAME" | mail -s "PostgreSQL Backup Failure" "$EMAIL"
    exit 1
fi

# Remove Old Backups
find "$BACKUP_DIR" -type f -name "${DB_NAME}_*.sql.gz" -mtime +$RETENTION_DAYS -exec rm {} \;

# Optional: Log the Backup
echo "$(date): Backup completed for $DB_NAME" >> "$BACKUP_DIR/backup.log"
  • Make Script Executable:

    chmod +x /path/to/backup_postgres.sh

4. Schedule Backups with Cron

  1. Edit Crontab:

    crontab -e
  2. Add Cron Job (Daily at 2 AM):

    0 2 * * * /path/to/backup_postgres.sh >> /path/to/backup.log 2>&1

5. Verify Backups

  • Check Backup Directory:

    Ensure new .sql.gz files are appearing as scheduled.

  • Test a Backup Restoration:

    createdb test_restore_db
    gunzip -c /path/to/backup/your_database_YYYYMMDD_HHMMSS.sql.gz | psql -U your_username -d test_restore_db
    • Verify Data:

      psql -U your_username -d test_restore_db -c "\dt"
    • Remove Test Database:

      dropdb test_restore_db

6. Optional: Secure and Offsite Storage

  • Encrypt Backups (Using GPG):

    gpg --symmetric --cipher-algo AES256 "$BACKUP_FILE"
    rm "$BACKUP_FILE"

    Note: --symmetric prompts for a passphrase interactively; for unattended use, supply it non-interactively (for example with --batch and --passphrase-file).
  • Sync to Remote Server:

    Add to the backup script:

    rsync -avz "$BACKUP_FILE.gpg" remote_user@remote_host:/remote/backup/dir/
    • Ensure SSH keys are set up for password-less access.

7. Monitoring and Alerts

  • Check Logs Regularly:

    Review /path/to/backup.log for backup statuses.

  • Set Up Email Alerts:

    Ensure the script sends an email on failure, as shown in the backup script above; a simple freshness check is sketched below.
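
A minimal sketch of such a freshness check, reusing the placeholder paths and address from the backup script above (schedule it from cron, e.g. each morning):

#!/bin/bash
# Hypothetical freshness check: alert if no backup newer than 24 hours exists.
BACKUP_DIR="/path/to/backup"
EMAIL="admin@example.com"
if [ -z "$(find "$BACKUP_DIR" -type f -name '*.sql.gz*' -mtime -1)" ]; then
    echo "No PostgreSQL backup created in the last 24 hours" | mail -s "PostgreSQL Backup Missing" "$EMAIL"
fi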


Restoration Steps

  1. Create a New Database:

    createdb restored_db
  2. Restore from Backup:

    gunzip -c /path/to/backup/your_database_YYYYMMDD_HHMMSS.sql.gz | psql -U your_username -d restored_db
    • If Encrypted:

      gpg --decrypt /path/to/backup/your_database_YYYYMMDD_HHMMSS.sql.gz.gpg | gunzip | psql -U your_username -d restored_db
  3. Verify Restoration:

    psql -U your_username -d restored_db -c "\dt"
  4. Remove Test Database (if applicable):

    dropdb restored_db

Dashboard / Ingestor

These servers don’t need regular backups as they don’t contain user information.

Backing up the service directories before upgrades is recommended so you can roll back easily if required.

Configuration Backups

Windows Server

It’s recommended to back up the configuration files at the following locations for easy reinstallation.

Dashboard: C:\inetpub\BGDDashboard\Configuration

Ingestor Service: C:\AGAT\BGService\Configuration

Paths may differ depending on installation choices.

Linux Server

The site-specific configuration is stored in docker.env in the home or Gateway directory.

If modifications were made to the docker-compose.yml or docker-compose.override.yml files, these should also be backed up before upgrades, as shown below.
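
A minimal sketch for archiving these files before an upgrade (the directory is a placeholder; use wherever docker.env and the compose files live):

    # Hypothetical path - archive the site configuration before upgrading
    cd /home/gateway
    tar -czf ~/gateway_config_$(date +%Y%m%d).tar.gz docker.env docker-compose*.yml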
