ChatGPT, the Kube Administrator

Posted by Will Cushen on Monday, January 16, 2023

ChatGPT, the hype

For two years running, the year’s most talked-about story in tech has arrived just as most of us are caught up in the end-of-year wind-down, with Black Friday and the festive season serving as the ultimate distractions. In 2021 it was the Log4j scare, a vulnerability exposed in a widely used Java-based logging utility, and in 2022 it was what many are calling the most significant leap forward in natural language processing since Google’s search engine.

Introduced to the world by OpenAI on November 30 last year, ChatGPT has prompted no shortage of bewilderment with its ability to provide human-quality answers to quite literally anything that’s thrown at it.

ChatGPT is a large language model (LLM) built on GPT-3.5, a fine-tuned successor to the third-generation Generative Pre-trained Transformer (GPT-3), which according to Stanford University “has 175 billion parameters and was trained on 570 gigabytes of text. For comparison, its predecessor, GPT-2, was over 100 times smaller at 1.5 billion parameters.”

As we seemingly enter yet another new frontier in AI, the emergence of ChatGPT has once again raised the age-old question: “will this thing take my job?”

Certainly, questions are already being asked about where it might supersede human aptitude, not least what it means for the future of AI in education. A school in New York has recently banned the tool out of fear that it could write A+ essays on the American Civil War, and Australian universities are, remarkably, considering a retreat to ‘pen and paper’ exams.

Conversely, there are many out there taking a glass-half-full approach and speaking to the ways in which ChatGPT can aid certain vocations, such as software development, as explained here.

Unsurprisingly, ChatGPT is advancing at a scary pace. At the time of writing, OpenAI is reportedly on the verge of releasing GPT-4 (in Q1 of 2023). The rumour mill’s been in overdrive, but reports are hinting at a bump up to 1 trillion parameters. Mind-boggling.

So what does it know about Kube?

In this blog, I too am going to put ChatGPT to the test and see how much it can help (replace) me in my day gig managing Kubernetes clusters.

We’ll throw it some Kube 101 queries and then go on to gauge whether it can assist with more complex topics such as service mesh (Istio).

Watch the video below for the interrogation 😈

A couple of things to note here:

  1. I ran the same script of four prompts approximately a dozen times, and there was some variation in the responses to most of them - the exception being the ask to scale a deployment up to two replicas (i.e. something very straightforward; see the one-liner just after this list).
  2. It wasn’t captured in the recorded video demo, but on other occasions when I asked ChatGPT to deploy to an actual cluster, it reaffirmed its constraints stating, “I’m sorry, but I am a text-based AI and do not have the ability to interact with external systems." So as expected, it cannot literally deploy a workload for you.
  3. With respect to Service Mesh, I chopped and changed between referring to Istio and OpenShift Service Mesh in my testing, and only on one occasion did I receive direction about creating a Gateway and VirtualService, which are necessary for getting traffic into the mesh from outside.
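
For reference, the ‘very straightforward’ scale-up mentioned in point 1 boils down to a single command (assuming a deployment named nginx, as used later in this post):

$ oc scale deployment/nginx --replicas=2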

On that last point: OpenShift Service Mesh is actually based on an opinionated distribution of Istio called Maistra, and it differs from generic Istio in that, rather than automatically injecting the required Envoy sidecar into all deployments within labelled projects, Maistra relies on the presence of the sidecar.istio.io/inject annotation AND the project being listed in the ServiceMeshMemberRoll.
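
To make that concrete, enrolling a project in the mesh looks roughly like the following - a minimal sketch assuming the Service Mesh Control Plane lives in the istio-system namespace and the target project is the my-mesh-enabled-namespace used later in this post:

oc apply -f - <<EOF
apiVersion: maistra.io/v1
kind: ServiceMeshMemberRoll
metadata:
  name: default
  namespace: istio-system
spec:
  members:
  # Projects listed here are managed by the Service Mesh Control Plane
  - my-mesh-enabled-namespace
EOF

Workloads in that project then only receive the Envoy sidecar if their pod template carries the sidecar.istio.io/inject: "true" annotation, as you’ll see in the Deployment further down.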

Deploying an Istio microservice - according to ChatGPT

So with all that said, I combined the best responses from my numerous prompts to ChatGPT and applied the YAML it provided to the cluster.

Given my line of questioning, I’m going to give it the benefit of the doubt and assume it believed I had already installed OpenShift Service Mesh and enabled a namespace to be managed by the Service Mesh Control Plane.

As you saw in the video, ChatGPT rendered a Deployment manifest that I took and applied to the cluster.
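
The exact manifest varied from run to run, so what follows isn’t ChatGPT’s verbatim output - it’s a representative reconstruction of the kind of Deployment it produced, with the Maistra injection annotation discussed earlier added to the pod template:

oc apply -f - <<EOF
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx
  namespace: my-mesh-enabled-namespace
spec:
  replicas: 1
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
      annotations:
        # Tells Maistra/OpenShift Service Mesh to inject the Envoy sidecar
        sidecar.istio.io/inject: "true"
    spec:
      containers:
      - name: nginx
        image: nginx:latest
        ports:
        - containerPort: 80
EOF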

From there I exposed the deployment.

$ oc expose deployment/nginx --port=80
service/nginx exposed

I bound a highly privileged Security Context Constraint (SCC) to the default ServiceAccount in order to make this particular deployment run on OpenShift. I would like to stress that this should NEVER be done in production.
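
For completeness, the binding amounts to something like this - I haven’t named the exact SCC above, so treat this as illustrative (anyuid is often enough for the stock nginx image; privileged, shown here, is the truly heavy-handed option):

$ oc adm policy add-scc-to-user privileged -z default -n my-mesh-enabled-namespace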

…And then I went on to create the Gateway and VirtualService (which ChatGPT told me to create only after some continual nudging on my part!)

oc apply -f - <<EOF
apiVersion: networking.istio.io/v1alpha3
kind: Gateway
metadata:
  name: nginx-gateway
  namespace: my-mesh-enabled-namespace
spec:
  selector:
    istio: ingressgateway # use istio default controller
  servers:
  - port:
      number: 80
      name: http
      protocol: HTTP
    hosts:
    - "*"
EOF
oc apply -f - <<EOF
apiVersion: networking.istio.io/v1alpha3
kind: VirtualService
metadata:
  name: nginx-vs
  namespace: my-mesh-enabled-namespace
spec:
  hosts:
  - "*"
  gateways:
  - nginx-gateway
  http:
  - match:
    - uri:
        prefix: /
    route:
    - destination:
        host: nginx
        port:
          number: 80
EOF

I’m prepared to give ChatGPT a pass mark despite the fact that it did not instruct me to retrieve the GATEWAY_URL needed to access NGINX’s welcome page. For others, though, the jury is still out on its competence around Istio.

$ export GATEWAY_URL=$(oc -n istio-system get route istio-ingressgateway -o jsonpath='{.spec.host}')

Retrieving the resulting URL and heading over to a browser gives me NGINX’s welcome page… 🎉
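
If you’d rather verify from the terminal, a quick curl against the same URL does the trick (my own addition - not something ChatGPT suggested):

$ curl -s http://$GATEWAY_URL/ | grep -i "<title>"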

Wrap Up

In keeping with the theme of this blog post (as well as finding an opportunity to lazily outsource), I thought I’d hand over the reins of this conclusion to ChatGPT and see what it could come up with.

And so, I fed it the following prompt:

Provide me with a conclusion to a blog post that discusses the use of ChatGPT to develop Kubernetes manifests and provide guidance on how to deploy to a Kubernetes cluster.

“In conclusion, ChatGPT can be a useful tool for generating Kubernetes manifests and providing guidance on how to deploy to a Kubernetes cluster. It is able to quickly generate manifest files and provide recommendations based on user input, making it a time-saving and convenient tool for those working with Kubernetes. However, it is important to note that ChatGPT is not a replacement for in-depth knowledge of Kubernetes and should be used as a supplement to, rather than a substitute for, manual configuration and understanding of the system. Overall, ChatGPT can be a valuable tool for those looking to streamline their Kubernetes workflow and deploy applications efficiently."

I like this answer for two reasons. First, it perfectly sums up the function ChatGPT serves for software developers and infrastructure engineers alike - a time-saving tool that makes it easier for folks to get their job done and focus on more important (read: innovative) tasks. Second, it reaffirms the role we humans still play: don’t take its output as gospel, and always, always thoroughly review and test any manifests or deployment instructions generated by the tool before implementing them in a production environment. In short, our jobs are still safe tomorrow 😌

Put simply, though, ChatGPT is an amazing playground to tinker in, and I hope you dive right in to see what it can do for you, both personally and professionally!

