Graphite and Grafana in Docker

https://github.com/VAdamec/docker-composer-market/tree/master/graphite-grafana
 docker-compose up
...
 docker-compose down
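
To try it locally, a minimal sequence might look like this (assuming the compose file lives in the graphite-grafana directory of that repository):

# Clone the market repository and start the Graphite + Grafana stack in the background
git clone https://github.com/VAdamec/docker-composer-market
cd docker-composer-market/graphite-grafana
docker-compose up -d
# ...and tear it down again when finished
docker-compose down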

Graphite + Carbon-cache

  • 8888 – the Graphite web interface (admin/admin); a quick reachability check is shown below
  • 2003 – the carbon-cache line receiver (the standard Graphite plaintext protocol)
  • 2004 – the carbon-cache pickle receiver
  • 7002 – the carbon-cache query port (used by the web interface)
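
A quick reachability check for the two most commonly used ports (host and port mapping are assumptions taken from the list above):

# Graphite web UI should answer on 8888 (login admin/admin)
curl -sI http://localhost:8888/ | head -n 1
# carbon-cache line receiver should accept TCP connections on 2003
nc -z localhost 2003 && echo "carbon line receiver is up"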

Grafana

  • 3000 – the Grafana web interface (admin/admin)
  • add a new data source (via the UI, or via the HTTP API as sketched below):
    • Type: Graphite
    • URL: http://172.23.0.10:80
    • Access: proxy
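
Instead of clicking through the UI, the data source can also be created via Grafana's HTTP API; a sketch, assuming Grafana listens on port 3000 with the default admin/admin credentials:

# Create the Graphite data source in proxy access mode
curl -s -u admin:admin -H 'Content-Type: application/json' \
  -X POST http://localhost:3000/api/datasources \
  -d '{"name":"Graphite","type":"graphite","url":"http://172.23.0.10:80","access":"proxy"}'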

Send some fake stats to Graphite

# Send one random value (1-10) per loop iteration to the carbon line receiver on 2003
while true
do
  echo "local.random.diceroll $(((RANDOM % 10) + 1)) $(date +%s)" | nc -c localhost 2003
done
# stop with Ctrl-C
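
After the loop has run for a while, the metric can be read back through the Graphite render API; the host and port are assumptions based on the port list above:

# Fetch the last 10 minutes of the test metric as JSON and pretty-print it with jq
curl -s "http://localhost:8888/render?target=local.random.diceroll&from=-10min&format=json" | jq .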

Create a new dashboard with a graph from the new data source

  • local.random.diceroll

ElasticSearch v5 + Kibana


X-Pack


Repository structure overview

  • set of README files:
    • README – this document
    • README_api_usage – basic handling of ELK via curl
    • README_mapping – ELK mapping example
    • README_query_cheatsheet – examples of query language
    • README_watcher – how to setup simple alerting
    • README_workshop – labs + basic terms
    • README_geoip – info about GeoIP coordinates handling
    • README_s3 – experimental backup to S3 bucket simulated via Riak CS
  • Terraform – only for the workshop; spins up several servers with Docker preinstalled (see the README in that directory)

ELK Stack components

Fluentd

  • 1x Fluentd – NEEDS TO BE RUN FIRST, see how to run this stack below
    • logs container output (back to ELK)
    • index: platform*

ElasticSearch

  • 3x server (data/client/master roles) – you can start just one server (elasticsearchdataone) if you lack HW resources, or limit resources via Docker CPU/Mem quotas; see comments in common-services.yml (a quick health check is shown after this list)
  • x-pack installed
  • exposed ports:
    • 920[1-3] / 930[1-3]
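
With X-Pack installed the cluster requires authentication; a quick health check against the first exposed HTTP port might look like this (elastic/changeme is the X-Pack 5.x default and may have been changed):

$ curl -s -u elastic:changeme 'http://localhost:9201/_cluster/health?pretty'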

Kibana

Logstash

  • used for easy sample data upload (see the netcat example below)
  • exposed ports:
    • 5000 – JSON filter
    • 5001 – raw, no filters
  • you can use the .raw field for not_analyzed data
  • index: logstash*
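
Feeding test data is just a matter of piping lines to the exposed ports with netcat; the payloads below are made-up examples:

# JSON event through the json-filter input on 5000
$ echo '{"message":"hello from netcat","level":"info"}' | nc -c localhost 5000
# unparsed line through the raw input on 5001
$ echo 'raw sample line' | nc -c localhost 5001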

Riak CS

  • used for AWS S3 simulation (see the client example below)
  • exposed ports:
    • 8080 – API
  • not logged via Fluentd to ELK (API key created during start)
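
Any S3-compatible client can then be pointed at port 8080; the sketch below uses the AWS CLI with placeholder credentials (the real key pair is the one generated when the stack starts):

# placeholder credentials – replace with the key pair created during start
$ export AWS_ACCESS_KEY_ID=REPLACE_ME
$ export AWS_SECRET_ACCESS_KEY=REPLACE_ME
# list buckets through the Riak CS S3 API
$ aws s3 ls --endpoint-url http://localhost:8080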

Mattermost

  • Mattermost server – runs outside the demo stack as a simple container with open access and an incoming webhook already created (a test call is sketched below)
  • the server IP is set in .env
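
A quick way to test the webhook from the shell; the port (8065, the Mattermost default), the hook ID and MATTERMOST_IP are placeholders here:

$ curl -s -X POST -H 'Content-Type: application/json' \
    -d '{"text":"Test alert from the ELK demo stack"}' \
    "http://MATTERMOST_IP:8065/hooks/REPLACE_WITH_HOOK_ID"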

Stack handling

Start stack

  • Start the stack with the ./_start script (do not use docker-compose up directly, as there are some prerequisites)
  • This short script prepares temporary data volumes for the ES servers and starts the fluentd container first
  • Download the git repo:
$ git clone https://github.com/VAdamec/elk-stack-v5-xpack 
$ cd elk-stack-v5-xpack
$ ./_start

Stop stack

  • just stops the containers; to remove networks/artefacts use the docker-compose command (see the example below)
$ ./_stop
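
To also remove the networks and volumes created by the stack, the plain compose command can be used afterwards (a sketch; -v also drops the data volumes):

$ docker-compose down -v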

Used tools

You can use the logstash or kibana containers (everything is mounted as /code/) or install the tools on your own system (macOS/Linux).

  • netcat – for feeding Logstash
  • jq – for pretty-printing output
  • curl – for working with Elasticsearch from the shell
  • curator
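
As a small example of combining curl and jq, listing the indices mentioned above (logstash*, platform*) as JSON; the credentials are the X-Pack defaults and may differ:

$ curl -s -u elastic:changeme 'http://localhost:9201/_cat/indices?format=json' | jq .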

Not covered

 
