This page shows how to create a user cluster that is in a different data center from the admin cluster.
Starting with Anthos clusters on VMware (GKE on-prem) version 1.9, you can create admin and user clusters that are in different vSphere data centers. This document describes how to use this public-preview feature.
Make sure that your admin workstation is upgraded to version 1.9 or later.
Step 1: Create a user cluster configuration file for the new user cluster
See Create a user cluster for more information about creating user clusters.
When you create the configuration file for the new user cluster, you must set the
vCenter.datacenter field if you want the user cluster to be in a separate data center from the admin cluster. When you include the
vCenter.datacenter field, you must also include the
masterNode.vsphere.datastore field, which is required when the admin cluster and user cluster are in separate vSphere data centers. We recommend that you create a separate datastore for each user cluster control plane in the admin cluster data center, so that the admin cluster control plane and the user cluster control planes have isolated datastore failure domains. Although you can use the admin cluster datastore for user control-plane nodes, doing so puts the user control-plane nodes and the admin cluster in the same datastore failure domain.
In this example, the admin cluster is in the ADMIN_DATACENTER data center, and the new user cluster is in the USER_DATACENTER data center:
```yaml
apiVersion: v1
kind: UserCluster
# ...
gkeOnPremVersion: 1.9.0-gke.x
# ...
vCenter:
  datacenter: USER_DATACENTER # Required
  cluster: USER_CLUSTER # Required if resourcePool is not specified.
  folder: USER_FOLDER # Optional
  resourcePool: USER_RESOURCE_POOL # Required if cluster is not specified.
  datastore: USER_DATASTORE # Required
  credentials: # Optional
    fileRef:
      path: USER_VCENTER_CREDENTIAL_PATH
      entry: vCenter
masterNode:
  vsphere:
    datastore: USER_MASTER_DATASTORE # Required
network:
  vCenter:
    networkName: USER_NETWORK # Required
```
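Before you run gkectl, it can help to sanity-check the cross-data-center fields yourself. The following Python sketch is a hypothetical preflight helper, not part of gkectl (gkectl performs the authoritative validation); it checks a parsed configuration (as a dict) for the rules described above: masterNode.vsphere.datastore must be set whenever vCenter.datacenter is present, and at least one of vCenter.cluster or vCenter.resourcePool must be specified.

```python
# Hypothetical preflight check for a parsed user cluster config (dict form).
# gkectl performs the authoritative validation; this only mirrors the
# cross-data-center rules described above.

def check_cross_datacenter_config(config: dict) -> list[str]:
    """Return a list of problems found in the config; empty means OK."""
    problems = []
    vcenter = config.get("vCenter", {})

    if vcenter.get("datacenter"):
        # A separate data center requires a datastore for the user
        # cluster control-plane nodes.
        master_ds = (config.get("masterNode", {})
                           .get("vsphere", {})
                           .get("datastore"))
        if not master_ds:
            problems.append(
                "vCenter.datacenter is set, so "
                "masterNode.vsphere.datastore is required")

    # At least one of vCenter.cluster / vCenter.resourcePool must be set.
    if not (vcenter.get("cluster") or vcenter.get("resourcePool")):
        problems.append("one of vCenter.cluster or vCenter.resourcePool "
                        "must be specified")
    return problems


if __name__ == "__main__":
    cfg = {
        "vCenter": {"datacenter": "USER_DATACENTER",
                    "cluster": "USER_CLUSTER",
                    "datastore": "USER_DATASTORE"},
        "masterNode": {},  # missing vsphere.datastore on purpose
    }
    for problem in check_cross_datacenter_config(cfg):
        print("config problem:", problem)
```

Catching a missing control-plane datastore before running gkectl saves a failed cluster-creation attempt.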
Step 2: Upload an OS image template
Run gkectl prepare to upload the OS image template to the new data center:
```shell
gkectl prepare \
    --kubeconfig ADMIN_KUBECONFIG \
    --bundle-path BUNDLE_TARBALL \
    --user-cluster-config USER_CLUSTER_CONFIG
```
Replace the following:
- ADMIN_KUBECONFIG with the path of your admin cluster kubeconfig file.
- BUNDLE_TARBALL with the path of the 1.9.0-gke.x bundle tarball.
- USER_CLUSTER_CONFIG with the path of your user cluster configuration file.
Step 3: Create the user cluster in a separate data center
Run gkectl create cluster to create the new user cluster in the separate data center. The command is the same as when the user cluster is in the same data center as the admin cluster:
```shell
gkectl create cluster \
    --kubeconfig ADMIN_KUBECONFIG \
    --config USER_CLUSTER_CONFIG
```
Service accounts for the new user cluster
A user cluster can use a different vSphere service account, specified in
vCenter.credentials, from the admin cluster. The admin cluster service account needs access only to the admin data center, and the user cluster service account needs access only to the corresponding user data center.
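As a sketch of what this separation can look like, the credential file referenced by vCenter.credentials.fileRef.path in the user cluster configuration might contain an entry like the following, where USERNAME and PASSWORD are placeholders for a vSphere account that has access only to the user data center (the exact file layout should be confirmed against your version's credential file reference):

```yaml
apiVersion: v1
kind: CredentialFile
items:
- name: vCenter
  username: "USERNAME"
  password: "PASSWORD"
```

The entry name ("vCenter" here) must match the vCenter.credentials.fileRef.entry value in the user cluster configuration file.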
Firewall rules for the new user cluster
The admin cluster and the user cluster can use networks (
networkName) in different VLANs and in different data centers. However, the following cross-VLAN communication must be allowed:
- Admin cluster nodes can access port 22 on the user cluster node IP addresses. This access is required for the user cluster control plane to create SSH tunnels to the user cluster nodes.
- User cluster nodes can access port 443 on the user cluster control plane VIP. This access is required for user nodes and Pods to communicate with the Kubernetes API server.
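These two requirements can be written down as data and checked against an export of your allowed firewall flows. The sketch below is hypothetical (the flow names and the helper are assumptions for illustration, not gkectl functionality); it models each allowed flow as a (source, destination, port) tuple and reports any required cross-VLAN flow that is missing.

```python
# Hypothetical check that a set of allowed firewall flows covers the two
# cross-VLAN requirements listed above. Flow tuples are
# (source, destination, port); the names are placeholders.

REQUIRED_FLOWS = [
    ("admin-cluster-nodes", "user-cluster-nodes", 22),      # SSH tunnels
    ("user-cluster-nodes", "user-control-plane-vip", 443),  # API server
]

def missing_flows(allowed: set) -> list:
    """Return the required flows that are absent from the allowed set."""
    return [flow for flow in REQUIRED_FLOWS if flow not in allowed]

if __name__ == "__main__":
    # Example: only the SSH-tunnel flow has been opened so far.
    allowed = {("admin-cluster-nodes", "user-cluster-nodes", 22)}
    for src, dst, port in missing_flows(allowed):
        print(f"missing firewall rule: {src} -> {dst}:{port}")
```

Running such a check after configuring the VLANs gives an early signal before cluster creation fails partway through.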