Elasticsearch reference

Tools
– ElasticSearch Head for Chrome (Chrome Web Store)
– Postman (link)
– Insomnia (link)
– elasticdump – nodejs (link)

Monitoring
– ps_mem.py – monitor real memory utilization (github link)
ps -eo size,pid,user,command --sort -size \
  | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }' \
  | cut -d "" -f2 | cut -d "-" -f1 | head -n 40
from: here
– netdata, also runnable in Docker (link)
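
For Elasticsearch-specific memory numbers, the _cat APIs are usually enough; a minimal check, assuming a node answering on localhost:9200 (adjust the host to your cluster):

# JVM heap and OS RAM usage per node
curl -s 'http://localhost:9200/_cat/nodes?v&h=name,heap.percent,ram.percent,node.role'

# Fielddata memory per node and field
curl -s 'http://localhost:9200/_cat/fielddata?v'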

System tuning
sysctl -w vm.max_map_count=262144
sysctl -w vm.swappiness=0

Verify
sysctl vm.max_map_count
sysctl vm.swappiness
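
Both settings reset on reboot; a sketch for persisting them, assuming a distro that reads /etc/sysctl.d (the file name is arbitrary):

cat <<'EOF' > /etc/sysctl.d/99-elasticsearch.conf
vm.max_map_count=262144
vm.swappiness=0
EOF
sysctl --system    # reload every sysctl configuration file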

Reference
https://stefanprodan.com/2016/elasticsearch-cluster-with-docker/

Memory tuning
https://qbox.io/blog/memory-considerations-in-elasticsearch-deployment
https://plumbr.io/handbook/gc-tuning-in-practice
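
The gist of both links: pin the JVM heap to a fixed size (Xms = Xmx) so it never resizes, cap it at roughly half of RAM (and below ~32 GB for compressed oops), and leave the rest to the OS page cache. A hedged sketch with illustrative values for an 8 GB host running the official image:

# Pin the heap to 4 GB on an 8 GB host (values are illustrative)
docker run -d --name elastic \
  -e ES_JAVA_OPTS="-Xms4g -Xmx4g" \
  -e discovery.type=single-node \
  -p 9200:9200 \
  docker.elastic.co/elasticsearch/elasticsearch:7.1.1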

Stuck shards
https://thoughts.t37.net/how-to-fix-your-elasticsearch-cluster-stuck-in-initializing-shards-mode-ce196e20ba95
https://www.datadoghq.com/blog/elasticsearch-unassigned-shards/
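
Quick diagnosis commands, assuming the cluster answers on localhost:9200 (all three are standard ES APIs; adjust the host):

# List shards that are not STARTED
curl -s 'http://localhost:9200/_cat/shards?v' | grep -vw STARTED

# Ask the cluster why a shard is unassigned
curl -s 'http://localhost:9200/_cluster/allocation/explain?pretty'

# Retry allocations that gave up after repeated failures (from the Datadog article)
curl -s -XPOST 'http://localhost:9200/_cluster/reroute?retry_failed=true'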

elasticdump (link)

# Back up an index's mapping and data to files:
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=/data/my_index_mapping.json \
    --type=mapping
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=/data/my_index.json \
    --type=data

# Back up an index to a gzipped file using stdout:
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=$ \
    | gzip > /data/my_index.json.gz
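
Restore runs the same commands with input and output swapped; a sketch assuming a local target cluster (host and paths are illustrative):

# Restore the mapping first, then the data
elasticdump \
    --input=/data/my_index_mapping.json \
    --output=http://localhost:9200/my_index \
    --type=mapping
elasticdump \
    --input=/data/my_index.json \
    --output=http://localhost:9200/my_index \
    --type=data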

Export Elasticsearch to CSV (link)

docker pull nimmis/java-centos:oracle-8-jdk
wget https://artifacts.elastic.co/downloads/logstash/logstash-7.1.1.tar.gz
tar zxf logstash-7.1.1.tar.gz
ln -s logstash-7.1.1 logstash
docker run -ti -d --name logstash -v `pwd`/logstash:/home/logstash nimmis/java-centos:oracle-8-jdk
docker exec logstash /home/logstash/bin/logstash-plugin install logstash-input-elasticsearch
docker exec logstash /home/logstash/bin/logstash-plugin install logstash-output-csv
Put this into `pwd`/logstash/export-csv.conf
input {
  elasticsearch {
    hosts => "elastic:9200"
    index => "datafeed"
    query => '
    {
      "query": {
        "match_all": {}
      }
    }
    '
  }
}

filter {
  mutate {
    # Coerce numeric fields to floats before writing the CSV
    convert => {
      "lat" => "float"
      "lon" => "float"
      "weight" => "float"
    }
  }
}

output {
  csv {
    # Elasticsearch field names to export
    fields => ["field1", "field2", "field3", "field4", "field5"]
    # Path where the CSV output is written
    path => "/home/logstash/exported-data.csv"
  }
}
docker exec logstash /home/logstash/bin/logstash -f /home/logstash/export-csv.conf
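
Since the container mounts `pwd`/logstash at /home/logstash, the CSV lands on the host in the same directory:

head -n 5 "$(pwd)/logstash/exported-data.csv"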
