Forum Discussion

williammyer1 · Nimbostratus
May 29, 2024

BigIP UCS Backup script; looking for some guidance on design



I've begun work on a bash script intended to be run locally on each F5 appliance via a cron task.

The criteria for this script are:

  • Saves the UCS with encryption, using the {Hostname}-YYYY-MM-DD.ucs naming format.
  • Uploads the generated UCS file to an SFTP server
    • Native SFTP commands are a MUST; SCP will not work due to its reliance on a command shell/login.
  • Rolls over after X number of saved files, to prevent storage exhaustion on the target SFTP server
    • I strongly doubt any form of deduplication will work with an encrypted UCS
  • Sends an email notification if the backup failed
  • Sends an email notification if the backup failed
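To make the first criterion concrete, here's a minimal sketch of how the archive name can be derived on-box (the tmsh call is shown commented out, since it only exists on the BIG-IP itself):

```shell
# Sketch: build the {Hostname}-YYYY-MM-DD.ucs name described above.
HOST=$(hostname -s)            # short hostname of the appliance
DATE=$(date +%Y-%m-%d)
UCS_FILE="${HOST}-${DATE}.ucs"
echo "$UCS_FILE"
# On the BIG-IP itself the save would then be:
#   tmsh save /sys ucs "$UCS_FILE" passphrase "$ENCRYPTION_PASSPHRASE"
```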


I've so far written a script that addresses the first three criteria, and am waiting for those to go through their paces in testing before adding the notification logic.
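For the notification piece that's still pending, a minimal sketch; the `mail` client and the alert address are assumptions about the local environment, so the actual send is left commented out:

```shell
# Hypothetical failure-notification helper; ALERT_TO is a placeholder address.
ALERT_TO="netops@example.com"
LOG_FILE="${LOG_FILE:-/var/log/ucs_backup.log}"

notify_failure() {
    local subject="UCS backup FAILED on $(hostname -s)"
    local body="Backup failed at $(date +'%F %T'); see ${LOG_FILE} for details."
    printf '%s\n' "$body"
    # Real send, assuming a working local MTA / mailx on the appliance:
    #   printf '%s\n' "$body" | mail -s "$subject" "$ALERT_TO"
}

notify_failure
```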

The commands and logic have grown more complex the further I've gotten into the script's development. This has led to some concerns about whether this is the best approach, given that the F5 BIG-IP systems are vendor appliances: there's a real possibility commands may stop working correctly after a major x. version upgrade, requiring an overhaul of a fairly complex script.

I'm almost wondering if setting up an AWX/Tower host in our environment, and then using the f5networks Ansible modules for the majority of the heavy lifting followed by some basic logic for file rotation, would be a better long-term approach. Ansible would also be a bit more flexible in that I wouldn't have to hardcode values that diverge between individual hosts into the script itself. It's not clear, however, whether the f5networks Ansible modules support SFTP, as I only see SCP referenced.


Advice and insight is much appreciated!

#!/bin/bash
# F5 backup script based on

# User-configurable Variables
ENCRYPTION_PASSPHRASE=''  # Leave blank to skip encrypting the UCS backup.
MAX_FILES=45              # Maximum number of backup files to keep
REMOTE_USER='backupsvc'           # SFTP account (placeholder - set per environment)
REMOTE_HOST='sftp.example.com'    # SFTP server (placeholder)
REMOTE_DIR='/backups/f5'          # Remote target directory (placeholder)
SSH_KEY='/root/.ssh/id_rsa'       # Key used for SFTP authentication (placeholder)
LOG_FILE='/var/log/ucs_backup.log'

# Dynamic Variables (do not edit)
HOSTNAME=$(hostname -s)
DATE=$(date +%Y-%m-%d)
UCS_FILE="${HOSTNAME}-${DATE}.ucs"
BATCH_FILE=$(mktemp)
CLEANUP_BATCH_FILE=$(mktemp)

# Start logging
echo "$(date +'%Y-%m-%d %H:%M:%S') - Starting backup script." >> "${LOG_FILE}"

# Save the UCS backup file
if [ -n "${ENCRYPTION_PASSPHRASE}" ]; then
    echo "Running the UCS save operation (encrypted)." >> "${LOG_FILE}"
    tmsh save /sys ucs "${UCS_FILE}" passphrase "${ENCRYPTION_PASSPHRASE}" >> "${LOG_FILE}" 2>&1
else
    echo "Running the UCS save operation (not encrypted)." >> "${LOG_FILE}"
    tmsh save /sys ucs "${UCS_FILE}" >> "${LOG_FILE}" 2>&1
fi

# Create a temporary batch file for SFTP commands
# (tmsh writes UCS archives to /var/local/ucs, hence the lcd)
echo "lcd /var/local/ucs" > "$BATCH_FILE"
echo "cd ${REMOTE_DIR}" >> "$BATCH_FILE"
echo "put ${UCS_FILE}" >> "$BATCH_FILE"
echo "bye" >> "$BATCH_FILE"

# Log that the transfer is starting
echo "Starting SFTP transfer." >> ${LOG_FILE}

# Execute SFTP command and capture the output and exit status
transfer_command_output=$(sftp -b "$BATCH_FILE" -i "${SSH_KEY}" -oBatchMode=no "${REMOTE_USER}@${REMOTE_HOST}" 2>&1)
transfer_status=$?

# Extract the "Transferred:" line
transfer_summary=$(echo "$transfer_command_output" | grep "^Transferred: sent")

if [ $transfer_status -eq 0 ]; then
    if [ -n "$transfer_summary" ]; then
        echo "UCS file copied to the SFTP server successfully (remote:${REMOTE_HOST}:${REMOTE_DIR}/${UCS_FILE}). $transfer_summary" >> "${LOG_FILE}"
    else
        echo "UCS file copied to the SFTP server successfully (remote:${REMOTE_HOST}:${REMOTE_DIR}/${UCS_FILE}). Please check the log for details." >> "${LOG_FILE}"
    fi
else
    echo "$transfer_command_output" >> "${LOG_FILE}"
    echo "UCS SFTP copy operation failed. Please read the log for details." >> "${LOG_FILE}"
    rm -f "$BATCH_FILE"
    exit 1
fi

# Clean up the temporary batch file
rm -f "$BATCH_FILE"

# Rollover backup files if the number exceeds MAX_FILES
echo "Checking and maintaining the maximum number of backup files." >> ${LOG_FILE}

# Create a list of files to delete
sftp -i "${SSH_KEY}" -oBatchMode=no "${REMOTE_USER}@${REMOTE_HOST}" <<EOF > file_list.txt
cd ${REMOTE_DIR}
ls -1 ${HOSTNAME}-*.ucs
bye
EOF

# Filter out unwanted lines and sort the files alphanumerically
grep -v 'sftp>' file_list.txt | grep -v '^cd ' | sort > filtered_file_list.txt

# Determine files to delete
files_to_delete=$(head -n -${MAX_FILES} filtered_file_list.txt)

if [ -n "$files_to_delete" ]; then
    # Create a temporary batch file for SFTP cleanup commands
    echo "cd ${REMOTE_DIR}" > "$CLEANUP_BATCH_FILE"
    for file in $files_to_delete; do
        echo "Deleting $file" >> "${LOG_FILE}"
        echo "rm $file" >> "$CLEANUP_BATCH_FILE"
    done
    echo "bye" >> "$CLEANUP_BATCH_FILE"

    # Execute SFTP cleanup command and log the output
    cleanup_command_output=$(sftp -b "$CLEANUP_BATCH_FILE" -i "${SSH_KEY}" -oBatchMode=no "${REMOTE_USER}@${REMOTE_HOST}" 2>&1)
    echo "$cleanup_command_output" >> "${LOG_FILE}"

    # Clean up the temporary batch file
    rm -f "$CLEANUP_BATCH_FILE"
else
    echo "No files to delete. Total files within limit." >> "${LOG_FILE}"
fi

# Clean up the file lists
rm -f file_list.txt filtered_file_list.txt

# Delete the local copy of the UCS archive
tmsh delete /sys ucs ${UCS_FILE} >> ${LOG_FILE} 2>&1

echo "$(date +'%Y-%m-%d %H:%M:%S') - Backup script completed." >> ${LOG_FILE}
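The `head -n -${MAX_FILES}` rollover trick used above can be sanity-checked in isolation, away from any SFTP server (the filenames below are made up):

```shell
# Simulate the remote listing: 5 dated archives, keep the newest 3.
MAX_FILES=3
printf '%s\n' \
    "bigip1-2024-05-25.ucs" "bigip1-2024-05-26.ucs" "bigip1-2024-05-27.ucs" \
    "bigip1-2024-05-28.ucs" "bigip1-2024-05-29.ucs" > file_list.txt

# Same logic as the script: sort, then drop everything but the last MAX_FILES.
files_to_delete=$(sort file_list.txt | head -n -${MAX_FILES})
echo "$files_to_delete"   # prints the two oldest: 2024-05-25 and 2024-05-26
rm -f file_list.txt
```

Because the YYYY-MM-DD format sorts lexicographically in date order, the plain `sort` is enough to put the oldest archives first.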


2 Replies

  • williammyer1 I would like to add that whenever you do a code upgrade or patch you will have to configure your cron job all over again because that configuration doesn't carry over during upgrades. You are definitely better off going the route that JRahm has mentioned.
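One hedged workaround for the cron-doesn't-survive-upgrades problem: register the script as an iCall periodic handler instead, since iCall objects live in the BIG-IP configuration. The names and path below are made up, and the syntax should be verified against your TMOS version; iCall scripts are Tcl, hence the `exec` wrapper.

```
# Hypothetical: run the backup script daily via an iCall periodic handler.
tmsh create sys icall script ucs_backup_script definition { exec /bin/bash /shared/scripts/ucs_backup.sh }
tmsh create sys icall handler periodic ucs_backup_handler script ucs_backup_script interval 86400
tmsh save sys config
```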

  • Hi williammyer1. Nice work on your backup script! What you point out is true, though: you do need to adjust as you upgrade, and potentially manage device-specific data, so the opex for that is going to be higher. Ansible is a better approach for those reasons. You can use the bigip_ucs and bigip_ucs_fetch modules to create the UCS and download it to your Ansible hub, then use/write an SFTP module to move the files to their destination.