Easily Set Up Elasticsearch and Kibana Locally Using Docker-Compose

Elasticsearch is a robust and reliable technology that can handle and analyze large volumes of data efficiently. Complementing Elasticsearch, Kibana provides a visual interface where you can create, manage, and visualize your data interactively. Together, they form a powerful duo to help you make sense of your data, get insights, and troubleshoot issues quickly.

In this guide, I’ll walk you through a simplified setup of Elasticsearch and Kibana on your local machine using Docker Compose. This method abstracts much of the cumbersome configuration and setup, letting you dive straight into data exploration and management.

Prerequisites:

  • Ensure Docker and Docker Compose are installed on your machine. If not, follow Docker's official installation documentation.

Step 1: Create the Docker-Compose File

Start by creating a file named docker-compose.yml in a new directory and copy the following content into it. The file configures a single-node deployment, enables X-Pack security, and sets the username and password that Kibana will use to authenticate against Elasticsearch.

version: "3.7"
services:
  elasticsearch:
    image: elasticsearch:7.17.10
    restart: always
    ports:
      - 9200:9200
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=true
      - ELASTIC_PASSWORD=<YOUR_PASSWORD>
  kibana:
    # keep the Kibana version identical to the Elasticsearch version
    image: kibana:7.17.10
    restart: always
    ports:
      - 5601:5601
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
      - XPACK_MONITORING_ENABLED=true
      - XPACK_MONITORING_COLLECTION_ENABLED=true
      - XPACK_SECURITY_ENABLED=true
      - ELASTICSEARCH_USERNAME=elastic
      - ELASTICSEARCH_PASSWORD=<YOUR_PASSWORD>
    depends_on:
      - elasticsearch

This configuration sets up a single-node Elasticsearch cluster with security features enabled and a Kibana instance connected to it. Replace <YOUR_PASSWORD> with a strong password of your choice, using the same value in both services.
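If you'd rather not keep a plaintext password in the compose file, Docker Compose can interpolate it from the shell environment or from a .env file next to docker-compose.yml. A sketch of that variant, assuming a variable named ES_PASSWORD (the variable name is just an example):

```yaml
# .env file in the same directory (keep it out of version control):
#   ES_PASSWORD=<YOUR_PASSWORD>
#
# Then reference the variable in both services instead of the literal value:
environment:
  - ELASTIC_PASSWORD=${ES_PASSWORD}        # elasticsearch service
# ...
  - ELASTICSEARCH_PASSWORD=${ES_PASSWORD}  # kibana service
```

Compose reads the .env file automatically, so no extra flags are needed when you start the stack.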

Step 2: Launch the Containers

Navigate to the directory containing the docker-compose.yml file and run the following command:

docker-compose up

On newer Docker installations, Compose ships as a plugin and the equivalent command is docker compose up (no hyphen).
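Once the containers are running (the first start can take a few minutes while images are pulled), you can verify the Elasticsearch node from another terminal. A quick check, assuming the password you set in the compose file:

```shell
# the root endpoint returns basic node info if authentication succeeds
curl -u elastic:<YOUR_PASSWORD> http://localhost:9200

# the cluster health endpoint reports a status of green, yellow, or red
curl -u elastic:<YOUR_PASSWORD> "http://localhost:9200/_cluster/health?pretty"
```

An HTTP 401 response here usually means the password doesn't match the one in docker-compose.yml.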

Step 3: Access Kibana

Open a browser and head to http://localhost:5601/. Log in with the username elastic and the password you set in the compose file.

Step 4: Execute and Debug Elasticsearch Queries

With Kibana operational, execute API calls against your local Elasticsearch instance from the Dev Tools console: http://localhost:5601/app/dev_tools#/console.
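To get started in the console, you can index a document and immediately search for it. The snippet below uses Console syntax; the index name logs and the field names are just examples:

```
# index a sample document into a hypothetical "logs" index
POST /logs/_doc
{
  "message": "disk usage above threshold",
  "level": "warning"
}

# search it back with a match query
GET /logs/_search
{
  "query": {
    "match": { "message": "disk" }
  }
}
```

The console autocompletes API paths and request bodies, which makes it a convenient place to iterate on queries before moving them into application code.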


This setup provides a seamless, secure local environment to explore Elasticsearch queries and Kibana visualizations, enabling you to debug and refine your search queries effectively. Using Docker Compose abstracts away much of the manual configuration for nodes, security, and networking, providing a straightforward way to delve into the powerful features of Elasticsearch and Kibana.

About the author

Muaaz
