Testing disk performance with fio

Disk performance is often the main bottleneck in high-traffic servers and databases. Humio's instance-sizing guide provides a good way to measure disk read/write bandwidth using fio; the test approximates how Humio will read and write, so adjust the parameters to match your workload. Read more: https://docs.humio.com/cluster-management/infrastructure/instance-sizing/ sudo fio --filename=/data/fio-test.tmp --filesize=1Gi --bs=256K --rw=read […]
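A sketch of the fio invocations with the web-mangled "–" dashes restored to "--". The write pass and --direct=1 (bypass the page cache so you measure the disk, not RAM) are my additions, not from the excerpt; adjust --filename and --filesize for your data disk before running.

```shell
# Common flags shared by the read and write passes (paths/sizes from the excerpt)
FIO_COMMON="--filename=/data/fio-test.tmp --filesize=1Gi --bs=256K --direct=1"
# Print the commands rather than running them, since fio needs root and a real disk:
echo "read test:  sudo fio --name=seq-read  $FIO_COMMON --rw=read"
echo "write test: sudo fio --name=seq-write $FIO_COMMON --rw=write"
```

Run the read pass first so the write pass does not have to create the test file; delete /data/fio-test.tmp afterwards.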

elasticsearch on docker

# docker pull elasticsearch:6.8.5 SCRIPT=`realpath $0` BASE=`dirname $SCRIPT` mkdir -p $BASE/esdata1 docker run -p 9200:9200 --name elasticsearch -v $BASE/esdata1:/usr/share/elasticsearch/data \ -e "http.host=" \ -e "cluster.name=elasticlogging" \ -e "node.name=esnode1" \ -e "node.master=true" […]
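A runnable completion of the truncated docker run above, wrapped in a function. The http.host value (blank in the excerpt), discovery.type=single-node, and the 512m heap are my assumptions; the data directory, port, and cluster/node names come from the excerpt.

```shell
# Resolve the script's directory; fall back to $PWD when $0 is not a file path
BASE=$(cd "$(dirname "$0")" 2>/dev/null && pwd)
BASE=${BASE:-$PWD}

start_es() {
  mkdir -p "$BASE/esdata1"
  docker run -d -p 9200:9200 --name elasticsearch \
    -v "$BASE/esdata1:/usr/share/elasticsearch/data" \
    -e "http.host=0.0.0.0" \
    -e "cluster.name=elasticlogging" \
    -e "node.name=esnode1" \
    -e "node.master=true" \
    -e "discovery.type=single-node" \
    -e "ES_JAVA_OPTS=-Xms512m -Xmx512m" \
    elasticsearch:6.8.5
}
```

Call `start_es` after `docker pull elasticsearch:6.8.5`; check it with `curl localhost:9200`.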

All about docker

Docker installation (Ubuntu) sudo apt update sudo apt -y install apt-transport-https ca-certificates curl software-properties-common curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add - sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu bionic stable" sudo apt update apt-cache policy docker-ce sudo apt -y install docker-ce sudo systemctl status docker Add into .bashrc: alias drm="docker rm" alias dps="docker ps -a" alias dmi="docker images" function da […]
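The .bashrc snippet from the excerpt, unpacked onto separate lines. The `da` function is cut off at "[…]", so the shell-into-container body below is a guess, not the author's original.

```shell
# Docker shortcuts for ~/.bashrc
alias drm="docker rm"
alias dps="docker ps -a"
alias dmi="docker images"
# Hypothetical body for the truncated "function da": open a shell in a running container
da() { docker exec -it "$1" /bin/bash; }
```

Usage: `dps` to find a container, then `da <container-name>` to get a shell inside it.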

elasticsearch reference

Tools – head for Chrome (ElasticSearch Head – Chrome Web Store) – Postman (link) – Insomnia (link) – elasticdump – nodejs (link) Monitoring – ps_mem.py – monitor real memory utilization (github link) – ps -eo size,pid,user,command --sort=-size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } […]
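The ps/awk memory report from the excerpt with its HTML-mangled pieces repaired ("x< =NF" is awk's "x<=NF", "–sort" is "--sort"). It prints each process's size in MB followed by its command line, largest first. Linux-only: the "size" column comes from GNU ps.

```shell
# Per-process memory report, sorted largest first; capture then show the top 5
MEM_REPORT=$(ps -eo size,pid,user,command --sort=-size | \
  awk '{ hr=$1/1024; printf("%13.2f Mb ", hr); for (x=4; x<=NF; x++) printf("%s ", $x); print "" }')
echo "$MEM_REPORT" | head -n 5
```

Note that "size" is virtual allocation, not resident memory; for real usage the ps_mem.py tool listed above is the better measure.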

mysqldump script (per tables)

#!/bin/bash # This is a PER-TABLE backup: each table is backed up individually. ## To enable passwordless mysqldump, put your password in /etc/mysql/[mysqld|percona|maria].conf.d/client.conf # [client] # user="" # password="" THEDB="mydbname" # THE DATABASE OUTPUT="/home/backup/mysqldump" DATE=`date +%Y%m%d` OPTS="--max_allowed_packet=512M " tables=`mysql -e "use $THEDB; show tables;" | tr -d "| " | grep -v -E "^Tables_in_" […]
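A sketch completing the per-table loop the excerpt cuts off at the `tables=` line. The database name, output directory, and OPTS mirror the variables above; the gzip'd filename pattern is my choice, not necessarily the author's.

```shell
#!/bin/bash
THEDB="mydbname"                   # THE DATABASE
OUTPUT="/home/backup/mysqldump"
DATE=$(date +%Y%m%d)
OPTS="--max_allowed_packet=512M"

# "show tables" output minus the "Tables_in_<db>" header row
list_tables() {
  mysql -e "use $THEDB; show tables;" | tr -d "| " | grep -v -E "^Tables_in_"
}

# Dump each table to its own gzip'd file, e.g. mydbname.users.20240101.sql.gz
backup_all() {
  mkdir -p "$OUTPUT"
  for t in $(list_tables); do
    mysqldump $OPTS "$THEDB" "$t" | gzip > "$OUTPUT/${THEDB}.${t}.${DATE}.sql.gz"
  done
}

# Run only when the mysql client is actually installed
command -v mysql >/dev/null 2>&1 && backup_all || echo "mysql client not found; skipping"
```

Per-table dumps make restoring a single table trivial (`zcat file.sql.gz | mysql mydbname`), at the cost of losing a single consistent snapshot across tables.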