I manage a number of vCenter instances and a lot of ESXi hosts. Some of the hosts are production, some for test and development. Sometimes an ESXi host needs to be used by a different group or temporarily moved to a new cluster and then back again afterwards.
To automate the configuration of these systems and the VMs running on them I use Ansible. For a freshly-imaged, new installation of ESXi, one of the first things I do is run an Ansible playbook that sets up the ESXi host: it starts by installing the ssh keys of the people who need to log in as root, then updates the root password.
I have ssh public keys for every user that needs root access. A short bash script combines those keys and my Ansible management public key into authorized_keys files for the ESXi hosts in each vCenter instance. In my Ansible group_vars/ directory is a file for each group of ESXi hosts, so all of the ESXi hosts in a group get the same root password and ssh keys. This also makes it easy to change root passwords and to add or remove users' ssh keys as they join or leave different groups.
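Here's a minimal sketch of what that key-combining script could look like; the key file locations and group names are hypothetical:

#!/bin/sh
# Build one authorized_keys source file per group of ESXi hosts.
# ansible-mgmt.pub is the Ansible management key; per-user keys
# live under keys/<group>/. All paths and group names are made up.
set -e
for group in ops cicd devtest; do
    cat keys/ansible-mgmt.pub keys/"${group}"/*.pub > "authkeys-${group}"
done

The output file names match what the group_vars files reference (authkeys-ops below).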
Here's a portion of a group_vars/esxi_hosts_cicd/credentials.yml file for a production CICD cluster:
# ESXI Hosts (only Ops can ssh in)
esxi_root_authorized_keys_file: authkeys-ops
esxi_username: 'root'
esxi_password: !vault |
  $ANSIBLE_VAULT;1.1;AES256
  34633832366431383630653735663739636466316262
  39363165663566323864373930386239380085373464
  32383863366463653365383533646437656664376365
  31623564336165626162616263613166643462356462
  34633832366431383630653735663739636466316262
  39363165663566323864373930386239380085373464
  32383863366463653365383533646437656664376365
  31623564336165626162616263613166643462356462
  3061
The password is encrypted using Ansible Vault.
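To generate a blob like that, ansible-vault's encrypt_string subcommand works well; for example (the plaintext password here is a placeholder):

ansible-vault encrypt_string \
    --vault-id ~/path/to/vault/private/key/file \
    --name 'esxi_password' 'NewRootPassword'

The command prints an esxi_password: !vault | block ready to paste into credentials.yml.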
In my main.yml file I call the esxi_host role for all of the hosts in the esxi_hosts inventory group. Since I use a different user to manage non-ESXi hosts, the play that calls the role tells Ansible to use the root user only when logging into ESXi hosts.
- name: Setup esxi_hosts
  gather_facts: False
  user: root
  hosts: esxi_hosts
  roles:
    - esxi_host
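For context, the inventory might group hosts along these lines (the host names are hypothetical; only the group names come from the examples above):

# inventory/hosts
[esxi_hosts_cicd]
esxi-cicd-01.example.com
esxi-cicd-02.example.com

[esxi_hosts:children]
esxi_hosts_cicd

With a layout like this, the hosts in esxi_hosts_cicd pick up group_vars/esxi_hosts_cicd/credentials.yml and are also members of the esxi_hosts group that the play targets.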
The esxi_host role has an esxi_host/tasks/main.yml task file. The two tasks that update the authorized_keys file and the root password look like this:
- name: Set the authorized ssh keys for the root user
  copy:
    src: "{{ esxi_root_authorized_keys_file }}"
    dest: /etc/ssh/keys-root/authorized_keys
    owner: root
    group: root
    mode: '0600'

- name: Set the root password for ESXI Hosts
  shell: "echo '{{ esxi_password }}' | passwd -s"
  no_log: True
The first time I run this, the root password is still whatever it was set to when the host was imaged, so I start Ansible with:
ansible-playbook main.yml \
    --vault-id ~/path/to/vault/private/key/file \
    -i inventory/ \
    --limit [comma-separated list of new esxi hosts] \
    --ask-pass \
    --ask-become-pass
This will prompt me for the current root password (the ssh password). Once I enter it, Ansible logs into each ESXi host, installs the new authorized_keys file, decrypts the vaulted password with the vault password file, and updates the root password.
After I've done this once, since the Ansible ssh key is also part of the authorized_keys file, subsequent Ansible runs just use the ssh key to log in, and I don't have to use the --ask-pass or --ask-become-pass parameters.
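A routine follow-up run is just the same command without the password prompts:

ansible-playbook main.yml \
    --vault-id ~/path/to/vault/private/key/file \
    -i inventory/ \
    --limit [comma-separated list of esxi hosts]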
This is also handy when switching a host from one cluster to another. As long as the ssh keys are installed I no longer need the current root password to update the root password.
Hope you find this useful.