Installation

This documents the first-time setup of deployments to GKE; you should only run through these commands if you are setting up from scratch. Please see (TODO) for how to manage the cluster and deploy.

gcloud container clusters get-credentials cluster-1 --zone europe-west4-a --project rental-connect-kube

Set up an IP address first

gcloud compute addresses list
gcloud compute addresses create banking-heyrobin-co-za-ip --region europe-west4

That should now reserve an IP address. You can then add an A record at secure.konsoleh.co.za
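
To confirm the address that was reserved (for the A record), you can describe it using the same name and region as above:

gcloud compute addresses describe banking-heyrobin-co-za-ip --region europe-west4 --format='get(address)'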

Persistent storage

This is where we will store the files generated and retrieved from ABSA.

gcloud compute disks create --size=30GB --zone=europe-west4-a absa-banking-files

Get a list of available instances:

gcloud compute instances list

We then need to attach the disk to an instance

gcloud compute instances attach-disk gke-cluster-1-pool-1-46fd5b59-ln4k --disk absa-banking-files --device-name absa-banking-files

SSH into the instance

gcloud compute ssh gke-cluster-1-pool-1-46fd5b59-ln4k
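
Before formatting, double-check which block device is the newly attached disk; GCE exposes it under /dev/disk/by-id using the --device-name we set above (the /dev/sde letter below can differ per node):

# the symlink points at the actual device, e.g. /dev/sde
ls -l /dev/disk/by-id/google-absa-banking-files
lsblk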

Now we format the disk. Make sure you are formatting the correct disk!

sudo mkfs.ext4 -m 0 -F -E lazy_itable_init=0,lazy_journal_init=0,discard /dev/sde

Now we can detach the disk and mount it via our YAML file:

gcloud compute instances detach-disk gke-cluster-1-pool-1-46fd5b59-ln4k --disk absa-banking-files

spec:
  containers:
    - image: ...
      name: ...
      volumeMounts:
        - name: absa-banking-files
          mountPath: /data
          # readOnly: true
        # ...
  volumes:
    - name: absa-banking-files   # must match the volumeMounts name above
      gcePersistentDisk:
        pdName: absa-banking-files
        fsType: ext4
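
Once a pod using this spec is running, a quick way to verify the mount (assuming the container image ships df; substitute the real pod name):

kubectl get pods
kubectl exec -it <POD NAME> -- df -h /data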

Cloud SQL

gcloud sql instances create --database-version=POSTGRES_9_6 --tier=db-f1-micro --gce-zone=europe-west4-a banking-images-db

gcloud sql instances list

Create the database

gcloud sql databases create banking_production --instance=banking-images-db --charset=UTF8 --collation=en_US.UTF8

Enable the sqladmin API and create a proxy user:

gcloud services enable sqladmin.googleapis.com
gcloud sql users create proxyuser --instance=banking-images-db --password=XXXXX

Install jq (brew install jq), then export the instance connection name:

export CONNECTION_NAME="$(gcloud sql instances describe banking-images-db --format=json | jq -r '.connectionName')"
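
CONNECTION_NAME is what the Cloud SQL proxy connects to. As a rough local test (assuming the v1 cloud_sql_proxy binary is installed and using the sql-banking-user key created under Secrets below):

# proxy the instance to localhost:5432
cloud_sql_proxy -instances="$CONNECTION_NAME"=tcp:5432 -credential_file=deploy/.keys/sql-banking-user.json

# in another shell
psql "host=127.0.0.1 port=5432 user=proxyuser dbname=banking_production"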

GKE

Service Accounts

Create the application service account and grant it the roles it needs:

gcloud iam service-accounts create app-banking
export PROJECT_ID=rental-connect-kube
export DNS_WEBSITE=banking.heyrobin.co.za
export APP_USER_EMAIL="$(gcloud iam service-accounts list --format=json | jq -r '.[] | select(.email | startswith("app-banking@")) | .email')"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$APP_USER_EMAIL" --role='roles/storage.admin'
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$APP_USER_EMAIL" --role='roles/errorreporting.admin'
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$APP_USER_EMAIL" --role='roles/cloudsql.client'
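
To sanity-check which roles the account ended up with:

gcloud projects get-iam-policy $PROJECT_ID --flatten="bindings[].members" --filter="bindings.members:serviceAccount:$APP_USER_EMAIL" --format="table(bindings.role)"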

We also need an IAM service account for Cloud SQL access.

gcloud iam service-accounts create sql-banking-user
export SQL_USER_EMAIL="$(gcloud iam service-accounts list --format=json | jq -r '.[] | select(.email | startswith("sql-banking-user@")) | .email')"
gcloud projects add-iam-policy-binding $PROJECT_ID --member="serviceAccount:$SQL_USER_EMAIL" --role='roles/cloudsql.client'

Container Registry

gcloud container clusters list

Set up Docker to push images to Google Container Registry:

gcloud auth configure-docker
docker build -t banking-api -f Dockerfile.prod .
docker tag banking-api:latest gcr.io/rental-connect-kube/banking-api:latest
gcloud docker -- push gcr.io/rental-connect-kube/banking-api

Secrets

We need to create secrets from the service-account keys:

gcloud iam service-accounts keys create deploy/.keys/app-banking-user.json --iam-account $APP_USER_EMAIL
kubectl create secret generic app-banking-user-credentials --from-file=keyfile=deploy/.keys/app-banking-user.json
gcloud iam service-accounts keys create deploy/.keys/sql-banking-user.json --iam-account $SQL_USER_EMAIL
kubectl create secret generic cloudsql-instance-credentials --from-file=credentials.json=deploy/.keys/sql-banking-user.json
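
You can confirm the secrets exist (without printing the key material):

kubectl get secrets
kubectl describe secret app-banking-user-credentials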

Set up the SQL password as a Secret:

kubectl create -n banking secret generic cloudsql-db-credentials --from-literal=username=proxyuser --from-literal=password=XXXX

Create a secret for netup SSH access:

kubectl create -n banking secret generic netup-ssh-key --from-file=../.ssh/id_rsa

Jobs

Run the migration:

kubectl apply -f deploy/k8s/jobs/job-migrate.yml
kubectl get jobs
kubectl get pods

kubectl logs pods/<POD NAME> -c banking-images-migrate
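
If you prefer to block until the migration has finished (the timeout here is arbitrary):

kubectl wait --for=condition=complete job/banking-images-db-migrate --timeout=300s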

Delete the job:

kubectl delete job banking-images-db-migrate

Services

Let's bring up the web service:

kubectl apply -f deploy/k8s

kubectl port-forward banking-images-web-79c559bf87-47sk4 3000

Then browse to http://127.0.0.1:3000/documentation

RabbitMQ

gcloud iam service-accounts create rabbitmq-banking-user
export MQ_USER_EMAIL="$(gcloud iam service-accounts list --format=json | jq -r '.[] | select(.email | startswith("rabbitmq-banking-user@")) | .email')"
gcloud iam service-accounts keys create deploy/.keys/rabbitmq-banking-user.json --iam-account $MQ_USER_EMAIL
kubectl create secret generic rabbitmq-instance-credentials --from-file=credentials.json=deploy/.keys/rabbitmq-banking-user.json
kubectl create secret generic rabbitmq-credentials --from-literal=username=admin --from-literal=password=XXXXX

To enable the RabbitMQ management UI:

kubectl exec -n banking -it rabbitmq-0 -- rabbitmq-plugins enable rabbitmq_management

Now bring it up:

kubectl apply -f deploy/k8s/rabbitmq

Access it locally:

kubectl port-forward -n banking rabbitmq-0 15672

To retrieve the username/password you can run

kubectl get -n banking secret rabbitmq-credentials -o yaml

and then decode the value:

echo <ENCODED PASSWORD> | base64 -D
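
Alternatively, pull and decode the password in one go using jsonpath:

kubectl get -n banking secret rabbitmq-credentials -o jsonpath='{.data.password}' | base64 -D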

To get a shell in the web pod:

kubectl exec -n banking -ti banking-images-web-0 -- bash

KONG

To connect to KONG you will need to install kong-dashboard first.

npm install -g kong-dashboard

kubectl port-forward -n kong kong-rc-5cc6c587b8-lk9r9 8001
kong-dashboard start --kong-url http://127.0.0.1:8001
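
With the port-forward still running, you can also confirm the admin API is reachable directly:

curl http://127.0.0.1:8001/status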

Move DB from pg_dump to CloudSQL

pg_dump -h 127.0.0.1 -d banking -w -U database_backup --no-owner --format=plain --no-acl | sed -E 's/(DROP|CREATE|COMMENT ON) EXTENSION/-- \1 EXTENSION/g' > banking.sql

gzip banking.sql

Make sure the dump contains 'USE banking_production;', otherwise the import will not work.

Create a bucket on Google Cloud Storage and upload the SQL file (see https://cloud.google.com/sql/docs/postgres/import-export/importing).
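
A sketch of the bucket creation and upload, assuming the bucket name used in the ACL and import commands below and the europe-west4 location:

gsutil mb -l europe-west4 gs://sql-banking-castle-one
gsutil cp banking.sql gs://sql-banking-castle-one/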

gcloud sql instances describe banking-images-db
gsutil acl ch -u ioiqucjcjzezpob4bgg5uc5cpu@speckle-umbrella-pg-7.iam.gserviceaccount.com:W gs://sql-banking-castle-one
gsutil acl ch -u ioiqucjcjzezpob4bgg5uc5cpu@speckle-umbrella-pg-7.iam.gserviceaccount.com:R gs://sql-banking-castle-one/banking.sql
gcloud sql import sql banking-images-db gs://sql-banking-castle-one/banking.sql --database=banking_production

Alternatively, use the console. Make sure you uncomment the extensions in the dump file (they were commented out by the sed command above).