Docker – moving a container to another server

I have a running container that has gone through edits and changes, and I need to move it to a new server.


First, commit the container to an image (ref)

docker commit <containerid>  myimages/lamp:v1.1

You can see the list of images that you have with “docker images.”

Save to a file(ref)

sudo docker save -o <imagefile.tar> <imageid>

Transfer the imagefile.tar to the new server.

Load it back on the new server (ref)

docker load -i <path to image tar file>
docker tag <Image-ID> myimages/lamp:v1.1

Run the container again on the new server.
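The whole move can be sketched end to end. The container ID (abc123), hostname (newserver), and port mapping below are placeholders, not values from this setup:

```shell
# on the old server: freeze the container's filesystem into an image
docker commit abc123 myimages/lamp:v1.1        # abc123 = your container ID

# export the image and copy it over (newserver is a placeholder host)
docker save -o lamp-v1.1.tar myimages/lamp:v1.1
scp lamp-v1.1.tar root@newserver:/root/

# on the new server: load the image and start a container from it
docker load -i /root/lamp-v1.1.tar
docker run -d --name lamp -p 80:80 myimages/lamp:v1.1
```

Note that `docker commit` only captures the container's filesystem; data in volumes is not included and must be copied separately.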

Afterlogic webmail for cpanel

Afterlogic offers an interesting new look for cPanel's webmail. Guides available at

cd /root/
# (download webmail-panel-installer.tar.gz from Afterlogic first)
tar -xzvf ./webmail-panel-installer.tar.gz
cd ./webmail-panel-installer
chmod a+x ./installer
./installer -t lite -a install

Elasticsearch reference

– head for Chrome (ElasticSearch Head – Chrome Web Store)
– Postman (link)
– Insomnia (link)
– elasticdump – nodejs (link)

– monitor real memory utilization (github link)

ps -eo size,pid,user,command --sort -size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }' | head -n 40
– netdata, dockerable too – (link)

System tuning
sysctl -w vm.max_map_count=262144
sysctl -w vm.swappiness=0

sysctl vm.max_map_count
sysctl vm.swappiness
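These `sysctl -w` settings are lost on reboot. To persist them, add the keys to a drop-in file and reload; the filename `99-elasticsearch.conf` below is an arbitrary choice, not a requirement:

```shell
# persist the settings across reboots (file name is an assumption)
cat >> /etc/sysctl.d/99-elasticsearch.conf <<'EOF'
vm.max_map_count = 262144
vm.swappiness = 0
EOF

# re-read all sysctl configuration files
sysctl --system
```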


Memory tuning

Stuck shards

elasticdump (link)

# Backup index mapping and data to files:
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=/data/my_index_mapping.json \
    --type=mapping
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=/data/my_index.json \
    --type=data

# Backup an index to a gzip using stdout:
elasticdump \
    --input=http://production.es.com:9200/my_index \
    --output=$ \
    | gzip > /data/my_index.json.gz
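Restoring is the same command with input and output reversed, per the elasticdump README; the target host and index name here are placeholders:

```shell
# import the mapping first, then the data, into the target cluster
elasticdump \
    --input=/data/my_index_mapping.json \
    --output=http://staging.es.com:9200/my_index \
    --type=mapping
elasticdump \
    --input=/data/my_index.json \
    --output=http://staging.es.com:9200/my_index \
    --type=data
```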

Export Elasticsearch to CSV (link)

docker pull nimmis/java-centos:oracle-8-jdk
tar zxf logstash-7.1.1.tar.gz
ln -s logstash-7.1.1 logstash
docker run -ti -d --name logstash -v `pwd`/logstash:/home/logstash nimmis/java-centos:oracle-8-jdk
docker exec logstash /home/logstash/bin/logstash-plugin install logstash-input-elasticsearch
docker exec logstash /home/logstash/bin/logstash-plugin install logstash-output-csv
Put this into `pwd`/logstash/export-csv.conf
input {
  elasticsearch {
    hosts => "elastic:9200"
    index => "datafeed"
    query => '{ "query": { "match_all": {} } }'
  }
}

filter {
  mutate {
    convert => {
      "lat" => "float"
      "lon" => "float"
      "weight" => "float"
    }
  }
}

output {
  csv {
    # elastic field names to export
    fields => ["field1", "field2", "field3", "field4", "field5"]
    # This is the path where we store the output.
    path => "/home/logstash/exported-data.csv"
  }
}

Then run it inside the container:

docker exec logstash /home/logstash/bin/logstash -f /home/logstash/export-csv.conf

mysqldump script (per table)


# This is a PER TABLE backup. Each table will be backed up individually.

## To enable passwordless mysqldump, put your credentials in
## /etc/mysql/[mysqld|percona|maria].conf.d/client.conf
# [client]
# user=""
# password=""

# Placeholders - set these before running:
THEDB="yourdatabase"
OUTPUT="/backup"

DATE=`date +%Y%m%d`
OPTS="--max_allowed_packet=512M "

mkdir -p $OUTPUT/$THEDB-$DATE

tables=`mysql -e "use $THEDB; show tables;" | tr -d "| " | grep -v -E "^Tables_in_"`

for table in $tables; do
  STAGEDDATE=`date +%Y%m%d`
  echo "Dumping table : " $table
  mysqldump -a $OPTS $THEDB $table > $OUTPUT/$THEDB-$DATE/$STAGEDDATE-$table.sql
done

echo "Backup done at " $OUTPUT/$THEDB-$DATE
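To restore, feed the per-table dumps back in. The values of THEDB, OUTPUT, and DATE below are placeholders matching the backup script's variables:

```shell
# restore every per-table dump for a given day
THEDB="yourdatabase"
OUTPUT="/backup"
DATE=20240101

for f in $OUTPUT/$THEDB-$DATE/*.sql; do
  echo "Restoring $f"
  mysql $THEDB < "$f"
done
```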

Web server tuning (apache and nginx)

Key points

  • enable http2 & change mpm prefork to event
  • php-fpm


Enable http2

On Apache, switching from prefork to event –

sudo add-apt-repository ppa:ondrej/apache2
sudo apt update
sudo apt upgrade
sudo apt install php7.0-fpm 
sudo a2enmod proxy_fcgi setenvif
sudo a2enconf php7.0-fpm 
sudo a2dismod php7.0 
sudo a2dismod mpm_prefork 
sudo a2enmod mpm_event 
sudo service apache2 restart
sudo service php7.0-fpm restart

Add this in <VirtualHost>… </VirtualHost> for an individual site, or in the apache2.conf file for global settings.

Protocols h2 h2c http/1.1

sudo a2enmod http2
sudo service apache2 restart
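A quick way to confirm both changes took effect; the URL is a placeholder, and `--http2` requires a curl built with HTTP/2 support:

```shell
# should list mpm_event_module, not mpm_prefork_module
apachectl -M 2>/dev/null | grep mpm

# the first response line should read HTTP/2 200
curl -sI --http2 https://example.com/ | head -n 1
```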

Nginx – Need to compile nginx with http2 module
./configure --with-compat --add-dynamic-module=../ModSecurity-nginx --with-http_ssl_module --with-stream_ssl_module --prefix=/etc/nginx --with-http_v2_module

** mod_security for nginx, follow this :


This post is under development; new content will be added in the future.