Veeam Availability Console series #1: Installation and initial configuration

Luca Dell'Oca, July 26, 2017

As Veeam is about to release the final version of a new solution called Veeam Availability Console, I started studying this software, since it's a key component of Veeam's strategy for service providers, which is my main focus as a Veeam employee. In this series of posts, I will explore the software, its architecture, how it works, and what can be done with it. In this first post, we'll start with a bit of theory, and then we'll see how to install and configure it.

VAC: what it is

Veeam Availability Console (from now on, VAC) is a management platform for service providers to offer Veeam-powered services to end users. From a technical point of view, it is a web-based, fully multi-tenant solution where providers can onboard, monitor, and manage their customers. VAC is installed on Microsoft Windows and consists of two main services:
– Veeam Availability Console Server, the core of the solution
– Veeam Availability Console UI, the web interface
In addition, VAC uses a Microsoft SQL Server database on the backend to store all its information, and it relies on Veeam Cloud Connect as the transport and tunneling technology to communicate with remote agents over the internet. Once deployed, the solution looks like this:
In my lab I already have a complete Veeam Cloud Connect environment, so I don't need to deploy it from scratch. For providers not using it yet, VAC can install the Veeam Cloud Gateway component on the same machine where VAC itself is installed, so small deployments can use an "all in one" design. For larger deployments, however, a dedicated machine for VAC is preferable.
If VAC runs on a dedicated machine, pay attention to the TCP port requirements: since Veeam Cloud Gateways are often deployed in a separate network, such as a DMZ, a firewall rule is needed to allow TCP/UDP 9999 so that VAC can communicate with the Cloud Gateways. During the beta phase of VAC, this was one of the most common installation issues faced by our beta testers.
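To rule out this problem early, you can verify that the port is reachable from the VAC server before going any further. The snippet below is a minimal sketch: the gateway hostname is a placeholder, and it only tests the TCP side (a blocked UDP 9999 would need a separate check).

```python
# Minimal TCP reachability check from the VAC server towards a Cloud Gateway.
# "gateway01.example.com" is a placeholder; replace it with your gateway address.
import socket

GATEWAY_HOST = "gateway01.example.com"  # hypothetical Cloud Gateway hostname
PORT = 9999                             # management port mentioned above (TCP side only)

def tcp_port_open(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    status = "reachable" if tcp_port_open(GATEWAY_HOST, PORT) else "blocked or closed"
    print(f"TCP {PORT} on {GATEWAY_HOST}: {status}")
```

The same check can be pointed at TCP 6180 later on, when verifying connectivity towards the Cloud Connect infrastructure described below.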

VAC Installation and first configuration

VAC installation is really simple, and the on-screen wizard can be followed from start to end. A bit of sizing may be needed for a production environment, so it's worth spending a couple of words on this topic even if this is just a lab. VAC has been designed to manage, at its maximum, 10,000 Veeam agents (Windows agents today, Linux agents in the future, or VBR servers). The backend database is responsible, as said before, for storing all the information and history of the managed agents; assuming one backup activity per day per agent, 10,000 agents will grow the database by roughly 250 MB per day. Thus, over 3 months we can expect about 25 GB of data for the VAC server (using the default retention policy settings).
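To make that estimate easier to adapt to your own numbers, here is a quick back-of-the-envelope calculation based on the figures above; the linear scaling to smaller agent counts is my own assumption, and whether you count a gigabyte as 1000 or 1024 MB explains the small difference from the ~25 GB figure.

```python
# Rough database growth estimate, based on the ~250 MB/day figure quoted above
# for 10,000 agents performing one backup per day.
MB_PER_DAY_PER_10K_AGENTS = 250
RETENTION_DAYS = 90  # roughly 3 months of history

def estimated_db_growth_gb(agents: int, days: int = RETENTION_DAYS) -> float:
    """Scale the quoted growth rate linearly to a given number of agents (assumption)."""
    mb_per_day = MB_PER_DAY_PER_10K_AGENTS * (agents / 10_000)
    return mb_per_day * days / 1024  # MB -> GB

print(f"10,000 agents over {RETENTION_DAYS} days: ~{estimated_db_growth_gb(10_000):.0f} GB")
print(f" 1,000 agents over {RETENTION_DAYS} days: ~{estimated_db_growth_gb(1_000):.1f} GB")
```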
For CPU and memory, I'd start with 4 vCPUs and 16 GB of RAM in a virtual machine, and then grow both parameters according to the monitored resource consumption; I don't have more detailed sizing information yet.
As I said, the installation is pretty simple: during the wizard, the only point of attention is the last step, the recap of the installation parameters, where you can change settings such as the database location or the default TCP ports:
As soon as the installation is completed, we can open the console via the icon created on the desktop, which is simply a link to the web interface; you can always reach it at http://hostname:1280. At the first login, you are presented with the initial configuration:
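If the browser can't reach that page, a quick check like the one below can tell whether the UI service is answering at all. This is just a sketch: the hostname is a placeholder, and it assumes the default HTTP port 1280 mentioned above was not changed during setup.

```python
# Minimal check that the VAC web UI answers on its default port.
# "vac.example.local" is a placeholder for the VAC server's hostname.
import urllib.error
import urllib.request

URL = "http://vac.example.local:1280"  # default VAC web UI port mentioned above

try:
    with urllib.request.urlopen(URL, timeout=10) as response:
        print(f"VAC UI responded with HTTP {response.getcode()}")
except urllib.error.HTTPError as exc:
    # A 401/403 or a redirect to the login page still means the UI service is up.
    print(f"VAC UI responded with HTTP {exc.code}")
except (urllib.error.URLError, OSError) as exc:
    print(f"Could not reach the VAC UI: {exc}")
```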
As I said before, VAC relies on Veeam Cloud Connect technology to connect to remote agents and Veeam installations: every communication is tunneled over TCP port 6180, the same port used by Cloud Connect. So, in order to use VAC, we need a Cloud Connect system. Since I already have a complete Cloud Connect platform in my lab, let's configure it. By clicking Step 1, we are brought to the Cloud Connect Server part of the configuration, where we can add a new server.
 
The server is correctly added to the VAC installation:
 
And the first step of the Getting Started wizard is completed:
The next step is to configure notifications, by pointing VAC to a valid SMTP server and setting the email address used to send them.
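Before relying on those notifications, it can be worth confirming that the SMTP server actually accepts mail from the VAC machine. The sketch below does exactly that; the hostname, port, and addresses are placeholders for your own values, and this check is not part of VAC itself.

```python
# Send a test message through the SMTP server that VAC will use for notifications.
# All hostnames and addresses below are placeholders.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.local"   # placeholder SMTP server
SMTP_PORT = 25                     # use 587 if your server requires submission/STARTTLS
SENDER = "vac@example.local"       # placeholder notification sender
RECIPIENT = "admin@example.local"  # placeholder recipient

msg = EmailMessage()
msg["Subject"] = "VAC notification test"
msg["From"] = SENDER
msg["To"] = RECIPIENT
msg.set_content("If you received this, the SMTP settings planned for VAC notifications work.")

with smtplib.SMTP(SMTP_HOST, SMTP_PORT, timeout=10) as server:
    # server.starttls()  # uncomment if the server requires TLS before sending
    server.send_message(msg)
    print("Test message accepted by the SMTP server.")
```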
In the next post of this series, we'll see how to personalize the Availability Console and prepare to onboard our first customers.

 

Tech: availability, configuration, console, installation, introduction, VAC, veeam
