Good old FTP (and cloud!) to protect your vSphere environment

Lately, I had to rebuild my personal lab after a major crash of the storage system I use, and even though I was able to restore everything, the procedure involved a lot of manual rebuilding of resources. This time, I decided that everything had to be protected in a way that would make it really easy to restore every component of the lab. I started with the basic infrastructure based on VMware vSphere, and I realized that, as much as it sounds like something from the Jurassic age, the best way to create backups of those resources was to use an FTP server!

FTP?

Yes, FTP. You read that right. In my lab I run an almost complete vSphere stack, and for this reason I have different systems to protect. Even though they are virtual machines, so it would be easy to simply back up the VMs, some of them offer an internal configuration backup that can be very useful.
Let’s start with vCenter: I now run a vCenter Server Appliance 6.7 in my lab, and it has its own internal backup options, which can be reached from the management interface:
As you can see, the available protocols are FTPS, HTTPS, SCP, FTP and HTTP. So there are multiple options here, but I also have to back up the NSX Manager in my lab, and there the choices are more limited:
Only FTP is available here, so it was easy to choose it as the common protocol to back up both platforms. To do this, I installed FileZilla Server on one of my management machines, leaving the default option to run it as a service:
Once installed, I created c:\backups as the root folder, with two subdirectories named vcsa and nsx. Then I created two users and mapped each one to the corresponding directory, with the proper permissions:
NOTE: grant the “create directories” permission to the VCSA user, since the vCenter backup tries to create additional subfolders in its home directory. Otherwise the backup will fail, but the error will not mention permissions, which can be confusing.
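Before pointing the appliances at the FTP server, it is worth checking that each backup user really can create a subdirectory in its home. A minimal Python sketch with the standard `ftplib` module — host name and credentials are placeholders for your own FileZilla setup:

```python
# Sketch: verify that an FTP backup user can create (and remove) a
# subdirectory, which is exactly what the VCSA backup job needs.
# "ftp.lab.local" and the credentials below are placeholders.
from ftplib import FTP

def check_mkdir_permission(host, user, password, test_dir="perm-test"):
    """Return True if the user can create and delete a directory."""
    try:
        with FTP(host, timeout=5) as ftp:
            ftp.login(user, password)
            ftp.mkd(test_dir)   # fails with error_perm if the
            ftp.rmd(test_dir)   # "create directories" box is unchecked
        return True
    except Exception:           # connection refused, bad login, no permission
        return False

if __name__ == "__main__":
    ok = check_mkdir_permission("ftp.lab.local", "vcsa", "secret")
    print("create-directory permission:", "OK" if ok else "MISSING")
```

Running this once per user saves a confusing troubleshooting session later, since a failure here is unambiguous.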
Once the FTP server was ready, I went back to the two configuration panes of both vcsa and nsx and configured the backup options; first vcsa:
and then nsx:
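Besides the scheduled job configured in the UI, the VCSA also exposes backup as a REST call, which is handy for ad-hoc backups before maintenance. A sketch of the request body and endpoint, based on the vSphere Automation API; the FTP location, credentials and session handling are placeholders you would adapt to your environment:

```python
# Sketch: trigger an on-demand VCSA backup over its REST API.
# Field names follow the vSphere Automation API backup job spec;
# the location URL and credentials below are placeholders.
import json
import urllib.request

def build_backup_spec(location, user, password):
    """Body for POST https://<vcsa>/rest/appliance/recovery/backup/job."""
    return {"piece": {
        "location_type": "FTP",      # same protocol chosen in the UI
        "location": location,        # e.g. ftp://ftp.lab.local/vcsa
        "location_user": user,
        "location_password": password,
    }}

def start_backup(vcsa_host, session_id, spec):
    """Fire the backup job; session_id comes from the cis/session endpoint."""
    req = urllib.request.Request(
        f"https://{vcsa_host}/rest/appliance/recovery/backup/job",
        data=json.dumps(spec).encode(),
        headers={"vmware-api-session-id": session_id,
                 "Content-Type": "application/json"},
        method="POST")
    return urllib.request.urlopen(req)   # raises on HTTP errors
```

The spec mirrors what the management interface collects in its form, so the FTP user and directory created earlier are reused as-is.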

… and to the Cloud!

After the first night had passed, the first backups were stored on the FTP server. But the FTP server is just another virtual machine inside my lab, so if anything happened to it I would lose all those backups. To solve this, I decided to store my backups in a remote location using Amazon S3. The quickest way to do so would be one of the many “mount S3 as a local filesystem” software solutions, possibly a free one. There are many cloud storage options out there, from Dropbox to OneDrive to Google Drive, but since I already have an AWS account, I decided to use S3 Browser to upload my files to a new S3 bucket via its command line tool.
I created a new AWS IAM user with the minimum set of permissions, allowed to work only on the dedicated bucket I created for this job, and also restricted to connecting only from my lab’s source IP, since my lab has some fixed IP addresses. AWS security is out of the scope of this post, but you can find endless resources about how to securely configure an S3 bucket.
Upon opening S3 Browser for the first time, it automatically asks to register a new S3 account, which is what I did, adding the newly created IAM user to the software:
This is useful because the S3 Browser command line tool can use this information directly. Time to set up the new sync job. Let’s open a cmd console and test the command:
“C:\Program Files\S3 Browser\s3browser-con.exe” sync vccr-lab-backups:password c:\backups s3:vccr-lab-backups ncdhs
NOTE: replace “password” with the password that was used to encrypt the access keys in the previous screenshot.
The tool syncs the entire content of the c:\backups folder, so at the end the backups of both NSX and VCSA are stored in AWS S3. Also, thanks to the “d” option, old backups that are deleted locally will also be deleted from the S3 bucket.
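If you prefer to drive the sync from a script rather than a raw command line (easier to reuse, log, and check the exit code), the same invocation can be wrapped like this. The account name, password and bucket are the placeholders used earlier in this post:

```python
# Sketch: wrap the s3browser-con sync invocation from this post so it
# can be reused and its exit code checked. Account, password and bucket
# below are the placeholder values used in the article.
import subprocess

S3BROWSER = r"C:\Program Files\S3 Browser\s3browser-con.exe"

def sync_command(account, password, local_dir, bucket, flags="ncdhs"):
    """argv list: sync <account>:<password> <local_dir> s3:<bucket> <flags>"""
    return [S3BROWSER, "sync", f"{account}:{password}",
            local_dir, f"s3:{bucket}", flags]

def run_sync():
    cmd = sync_command("vccr-lab-backups", "password",
                       r"c:\backups", "vccr-lab-backups")
    return subprocess.call(cmd)   # 0 means the sync completed
```

Passing the command as an argv list (instead of one quoted string) also sidesteps the quoting needed for the space in "Program Files".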
As a last step, the script is scheduled using the Windows Task Scheduler so that it is executed daily. Create a new task with these options:
  • General: choose to run whether the user is logged on or not
  • Trigger: set a time of day, and run daily
  • Action: Start a program
  • Program/script: “C:\Program Files\S3 Browser\s3browser-con.exe”
  • Add arguments: sync vccr-lab-backups:password c:\backups s3:vccr-lab-backups ncdhs
  • Click OK to create the scheduled task
After the first execution, we can check the logs at %APPDATA%\S3Browser\logs by reading the s3browser-con* log files, and verify that the sync is working correctly.
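That log check can also be automated, so the scheduled task's health shows up without opening files by hand. A small sketch that picks the newest s3browser-con log and flags any line mentioning an error; the %APPDATA% path expansion assumes the same Windows machine, and the filename pattern matches the files mentioned above:

```python
# Sketch: find the newest s3browser-con* log and flag error lines.
# The log directory default assumes the Windows %APPDATA% layout
# described in this post.
import glob
import os

def newest_sync_log(log_dir):
    """Path of the most recently modified s3browser-con* file, or None."""
    logs = sorted(glob.glob(os.path.join(log_dir, "s3browser-con*")),
                  key=os.path.getmtime)
    return logs[-1] if logs else None

def error_lines(path):
    """Lines from the log that mention an error, case-insensitively."""
    with open(path, errors="replace") as f:
        return [line.rstrip() for line in f if "error" in line.lower()]

if __name__ == "__main__":
    log_dir = os.path.expandvars(r"%APPDATA%\S3Browser\logs")
    log = newest_sync_log(log_dir)
    print(log, error_lines(log) if log else "no logs yet")
```

An empty list of error lines after the nightly run is a quick confirmation that both the FTP backups and the S3 upload chain are healthy.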