Creating manual backups of a website can be a time-consuming task. On Linux, you can automate the entire download using the handy utility wget (Ubuntu users can install it with sudo apt-get install wget).
Here’s the command to use for downloading your backup:
wget ftp://yourwebsite.com/some/directory -nv -r -N -l inf -nH --ftp-user='username' --ftp-password='password'
Here's what the options do: -r downloads recursively, -l inf removes the default five-level recursion depth limit, -N only fetches files that have changed since the last run (handy for repeat backups), -nH skips creating a hostname directory so the local copy mirrors the remote folder structure, and -nv keeps the output brief. Make sure to substitute your own FTP host, directory, username, and password above! You can extend this further by using cron jobs to run the command on a set schedule, compressing the files into a single archive, and more.
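As a sketch of the scheduling and compression ideas, the command can be wrapped in a small script like the one below. The paths, filenames, and schedule here are placeholders of my own choosing, not anything prescribed by the post:

```shell
#!/bin/sh
# backup.sh -- hypothetical wrapper: download the site into BACKUP_DIR,
# then compress the result into a single dated archive.
BACKUP_DIR="$HOME/site-backup"
ARCHIVE="$HOME/site-backup-$(date +%Y%m%d).tar.gz"

mkdir -p "$BACKUP_DIR"

# The wget command from above goes here; wget's -P option tells it
# where to save the download (uncomment and fill in your details):
# wget ftp://yourwebsite.com/some/directory -nv -r -N -l inf -nH \
#     --ftp-user='username' --ftp-password='password' -P "$BACKUP_DIR"

# Compress the downloaded tree into one timestamped archive.
tar -czf "$ARCHIVE" -C "$BACKUP_DIR" .
```

To run it on a schedule, make the script executable (chmod +x backup.sh) and add a crontab entry via crontab -e, for example 0 2 * * * /path/to/backup.sh to run it nightly at 2 a.m.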