OpenShift uses a modified version of the ELK stack known as EFK (Elasticsearch, Fluentd and Kibana) to provide log aggregation. And yes, you guessed it, the setup is made easy with OpenShift and takes just one command line switch. This time it's Bogor's turn. This lab is another in the OpenShift MiniLabs series.
Objective
Let's get you up and running with log aggregation using the EFK stack, with a meaningful example, in just a few minutes.
Setup
Initial Attempt
This tutorial assumes you have completed the OpenShift MiniLabs installation procedure; refresh your environment before continuing. Adding the --logging parameter to the launch command will set up the EFK infrastructure when using a version of the oc tools at v1.4 or later. If you have been using v1.3 or earlier for these MiniLabs, back up your profiles, download and install the v1.4 distribution, and then continue. You can revert to v1.3 later by switching the oc tool in your PATH.
$ mv ~/.oc ~/.oc-v1.3
$ oc version
$ cd ~/containersascode
$ ./oc-cluster-wrapper/oc-cluster down
$ ./oc-cluster-wrapper/oc-cluster up --logging=true containersascode
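If you later need to revert to the v1.3 client, a minimal sketch, assuming you kept the old binary at ~/bin/oc-v1.3 (adjust the path to wherever you saved it):
$ # Put the v1.3 binary ahead of v1.4 on the PATH for this shell session
$ export PATH=~/bin/oc-v1.3:$PATH
$ oc version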
Instructions
Verify Logging Services
The --logging switch sets up the EFK stack in a project called logging. You can inspect this via the Console using credentials admin/admin, or via the CLI instructions below.
$ oc login -u system:admin
$ oc project logging
$ oc get pods
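The logging pods can take a few minutes to reach the Running state. A couple of optional checks, assuming the standard logging deployment:
$ # Watch the pods until the Elasticsearch, Fluentd and Kibana pods are Running
$ oc get pods -n logging -w
$ # List the routes; note the kibana hostname used in the next step
$ oc get routes -n logging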
Inspect the Kibana Console
Log in to the Console using credentials admin/admin and inspect the Kibana service at: https://kibana-logging.127.0.0.1.xip.io.
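You can also do a quick reachability check from the command line; a hedged sketch (the exact status code depends on how the proxy in front of Kibana responds to an unauthenticated request, typically a 200 or a redirect to the login page):
$ # -k skips certificate validation for the cluster's self-signed certs
$ curl -sk -o /dev/null -w "%{http_code}\n" https://kibana-logging.127.0.0.1.xip.io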
Generate Traffic
We need some container workload traffic to generate log activity. Let's do this by completing the A/B Deployment MiniLab. Once the two apps are ready to accept requests, launch a browser, point it at http://ab-cotd.127.0.0.1.xip.io/, and step through the items, rating and saving a few of them for application version (A). Do the same using another browser so as to ensure you see the other version of the application (B), and again rate and save a few of the items. If you inspect the logs for either of these A/B apps you should see a JSON-like string embedded in the log output.
$ oc login -u developer -p developer
$ oc project cotd
$ oc get pods
$ oc logs -f $PODID_OF_cotd1
$ oc logs -f $PODID_OF_cotd2
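If you would rather script the traffic than click through the UI, a minimal sketch using curl (a plain GET will not rate or save items, but it is enough to produce access-log entries):
$ # Fire 20 requests at the A/B route to generate log records
$ for i in $(seq 1 20); do curl -s http://ab-cotd.127.0.0.1.xip.io/ > /dev/null; done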
Verify Success
Point your browser at the Kibana console and log in using credentials developer/developer. Ensure that "project.cotd.NNNN" is selected from the drop-down list at the top left-hand corner. The search field will probably show a "*" to begin with, so all log records will be displayed. Enter a search string such as message:"client_ip" in its place and then click the magnifying glass icon to refresh. You can also manipulate the time range to ensure you catch the log entries you expect to see.
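Kibana accepts Lucene query syntax, so you can refine searches beyond a single term. A few illustrative queries; field names such as kubernetes.pod_name depend on the fluentd configuration shipped with your OpenShift version, so treat these as a sketch:
message:"client_ip"                        # entries whose message contains client_ip
kubernetes.pod_name:cotd*                  # entries from pods whose name starts with cotd
message:"client_ip" AND NOT message:"GET"  # combine terms with boolean operators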
Optional
Note that EFK offers many powerful capabilities not described here. For example, if you click the "^" beneath the histogram in the Console you can inspect the raw JSON request/response data. The request can be used as the payload for a curl command to capture raw JSON response log entries for subsequent data mining and analysis.
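A hedged sketch of such a curl command: the /elasticsearch proxy path, the project.cotd.* index pattern and bearer-token authorisation are assumptions here and may differ on your cluster; the request body should be the one copied from Kibana's request panel.
$ # Replay the Kibana request body against Elasticsearch via the Kibana route
$ curl -sk -H "Authorization: Bearer $(oc whoami -t)" \
    -H "Content-Type: application/json" \
    -XPOST "https://kibana-logging.127.0.0.1.xip.io/elasticsearch/project.cotd.*/_search" \
    -d '{"query":{"match":{"message":"client_ip"}}}' | python -m json.tool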
Trivia
Read all about the background details of installing log aggregation for yourself at: https://docs.openshift.com/enterprise/3.1/install_config/aggregate_logging.html or https://access.redhat.com/documentation/en/openshift-enterprise/3.0/paged/administrator-guide/chapter-5-aggregating-container-logs.
Google Fluentd, Kibana and Elasticsearch for a wealth of information on using these specific tools, including how to form queries and searches, starting at: https://www.elastic.co/guide/en/elasticsearch/reference/5.0/index.html.