docker-elasticsearch-logstash-kibana

Containerized Elastic Stack (Elasticsearch, Logstash, and Kibana) with Docker Compose.

🐳 Welcome to the Elastic Stack Docker images 🐳

This repository contains the source for building immutable Docker images for the Elastic Stack. The images are published on Docker Hub and can be used as base images for running the Elastic Stack in a containerized environment.

Requirements

πŸ› οΈ Configuration

The Elastic Stack services can be configured using environment variables in the .env file. The following variables are available:

  • STACK_VERSION: The version of the Elastic Stack to use. The default value is 8.14.3.
  • ELASTICSEARCH_HTTP_PORT: The port on which Elasticsearch listens for incoming connections. The default port is 9200.
  • ELASTIC_PASSWORD: The password for the elastic user. This password is used to authenticate with Elasticsearch and Kibana.
  • KIBANA_PORT: The port on which Kibana listens for incoming connections. The default port is 5601.
  • LOGSTASH_PORT: The port on which Logstash listens for incoming connections. The default port is 5044.
  • ELASTICSEARCH_JAVA_OPTS: The Java options for Elasticsearch. The default value is -Xmx1g -Xms1g.
  • LS_JAVA_OPTS: The Java options for Logstash. The default value is -Xmx1g -Xms1g.

πŸ›©οΈ Quick Start

To get started, clone this repository to your local machine:

git clone git@github.com:tanhongit/docker-elasticsearch-logstash-kibana.git

Change the directory to the cloned repository:

cd docker-elasticsearch-logstash-kibana

Create a .env file from the .env.example file:

cp .env.example .env

Change the value of the ELASTIC_PASSWORD and any other variables in the .env file as needed.

Start the Elastic Stack services:

docker-compose up -d

Access the Kibana web interface by navigating to http://localhost:5601 in a web browser. (Use your custom KIBANA_PORT if you have changed it in the .env file.)
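If you change KIBANA_PORT often, a small shell helper (a sketch, not part of this repository) can derive the Kibana URL from a .env-style file, falling back to the default port 5601:

```shell
#!/bin/sh
# Hypothetical helper: resolve the Kibana URL from a .env-style file.
# Falls back to port 5601 when KIBANA_PORT is not set in that file.
kibana_url() {
  env_file="$1"
  port=""
  if [ -f "$env_file" ]; then
    # Take the last KIBANA_PORT= line so later overrides win.
    port=$(grep -E '^KIBANA_PORT=' "$env_file" | tail -n 1 | cut -d= -f2)
  fi
  echo "http://localhost:${port:-5601}"
}

kibana_url .env
```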

💻 Start on Mac with arm64

To run the images as amd64, set the default platform to linux/amd64:

export DOCKER_DEFAULT_PLATFORM=linux/amd64

πŸ‚ Usage

Import Data into Elasticsearch using Logstash

This example shows how to import data from a CSV file into Elasticsearch using Logstash:

  1. Create a logstash.conf file with the following content:
input {
  file {
    path => "/usr/share/logstash/data/employees.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["id", "name", "code", "salary"]
  }

  mutate {
    convert => {
      "id" => "integer"
      "name" => "string"
      "code" => "integer"
      "salary" => "float"
    }
  }
}

output {
  elasticsearch {
    hosts => "${ELASTIC_HOSTS}"
    user => "elastic"
    password => "${ELASTIC_PASSWORD}"
    index => "employees"
  }
  stdout { codec => rubydebug }
}

Note:

  • The logstash.conf file reads data from the employees.csv file and imports it into Elasticsearch.
  • The employees.csv file should be placed in the logstash/data directory.
  • The ELASTIC_HOSTS and ELASTIC_PASSWORD environment variables are used to connect to Elasticsearch.
  • The employees index is created in Elasticsearch.
  • The rubydebug codec is used to output the data to the console.
  • The sincedb_path is set to /dev/null to avoid saving the state of the file.
  2. Create an employees.csv file with the following content:
id,name,code,salary
1,Alice,1001,50000
2,Bob,1002,60000
3,Charlie,1003,70000
4,Dave,1004,80000
5,Eve,1005,90000
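The sample file above can be created from the shell. This sketch writes it into logstash/data/, the directory mounted into the Logstash container:

```shell
#!/bin/sh
# Sketch: write the sample CSV into logstash/data/, the path that the
# Logstash pipeline reads from (/usr/share/logstash/data inside the container).
mkdir -p logstash/data
cat > logstash/data/employees.csv <<'EOF'
id,name,code,salary
1,Alice,1001,50000
2,Bob,1002,60000
3,Charlie,1003,70000
4,Dave,1004,80000
5,Eve,1005,90000
EOF
```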
  3. Update docker-compose.yml to include the Logstash service:
  logstash:
    build:
      context: logstash
      args:
        STACK_VERSION: ${STACK_VERSION:-8.14.3}
    container_name: "${COMPOSE_PROJECT_NAME}-logstash"
    environment:
      NODE_NAME: "logstash"
      LS_JAVA_OPTS: "${LS_JAVA_OPTS}"
      ELASTIC_USERNAME: "elastic"
      ELASTIC_PASSWORD: "${ELASTIC_PASSWORD}"
      ELASTIC_HOSTS: "http://elasticsearch:9200"
    volumes:
      - ./logstash/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./logstash/logstash.yml:/usr/share/logstash/config/logstash.yml
      - ./logstash/data/employees.csv:/usr/share/logstash/data/employees.csv # Add this line
      ...
  4. Start the Logstash service:
docker-compose up -d logstash
  5. Access the Elasticsearch API:

You can access the Elasticsearch API using curl or tools like Postman. Here are some examples:

curl -X GET "localhost:9200/employees/_search?pretty"

Note:

  • Replace employees with the name of the index you want to query.
  • Change the port number if you have modified the ELASTICSEARCH_HTTP_PORT in the .env file.

❤️‍🔥 Check this branch to see the full example: [feat/import-csv-with-logstash/docker-compose] ❤️‍🔥
