mod_fcgid: HTTP request length xxxxx (so far) exceeds MaxRequestLen (131072)

This error message appeared today, and it is related to mod_fcgid.

As defined here, the default value for MaxRequestLen is 131072 bytes, which is quite low for most implementations. I would prefer to use a higher value.

For cPanel users, the configuration lies in this file.

You need to add an additional line, which sets the limit to 2 MB:
MaxRequestLen 2097152

Restart Apache, and you're done.
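For context, here is a minimal sketch of what the directive looks like in place, wrapped in an optional IfModule guard; the exact file it belongs in varies by setup:

```apache
# Raise mod_fcgid's request body limit from the 128 KB default to 2 MB.
<IfModule mod_fcgid.c>
    MaxRequestLen 2097152
</IfModule>
```

Note that mod_fcgid 2.3.6 and later renamed the directive to FcgidMaxRequestLen, so use that name on newer builds.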

Slow log – monitoring your application's health through the slow log

The concept lies in logging slow-performing applications or scripts, to find out which tasks are the hardest to complete on time. From there, you will know which tasks take too much time (and most probably resources) and need to be optimized, to make sure they don't stop the other tasks from completing in a timely manner.

MySQL implements a slow log, which records queries that take too long to complete. MySQL has a feature where you can set global configuration on the fly, and it stays in effect until the next MySQL restart. If you want to make it permanent, it needs to be placed in the my.cnf configuration file.

To enable the slow log:

SET GLOBAL slow_query_log = 1;
SET GLOBAL long_query_time = 10;
SET GLOBAL slow_query_log_file = 'path';

slow_query_log (1 = enabled, 0 = disabled)
long_query_time (the threshold, in seconds, beyond which a query is considered slow)
log_queries_not_using_indexes (if turned ON, queries not using indexes are also logged as slow)
slow_query_log_file (the path to the log file)

If you want to make the settings permanent, place them in the my.cnf file:
log-queries-not-using-indexes = 1
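A fuller my.cnf sketch, mirroring the SET GLOBAL lines above, might look like this. The log path here is a placeholder, and the variable names follow MySQL 5.1+; older versions used log-slow-queries instead:

```ini
[mysqld]
slow_query_log = 1
long_query_time = 10
slow_query_log_file = /var/log/mysql/slow.log
log-queries-not-using-indexes = 1
```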

For more information, please refer to the official MySQL documentation on the slow query log.

Besides MySQL, your web server is a crucial part of processing your web application as well, and for that you might want to consider this Apache module, modlogslow, to log scripts which take a long time to complete.

It is an Apache module, so you need to compile it using apxs. Add some configuration to load the module file, configure how it behaves, and you're ready to go. The documentation provided in the archive, and on the Google Code page, is enough to get it running.

The configuration below is enough to enable the module. Find httpd.conf, and add these few lines to enable it:
LoadModule log_slow_module modules/
LogSlowEnabled On
LogSlowLongRequestTime 8000
LogSlowFileName /usr/local/apache/logs/slow_log

Hope this helps you in monitoring your server, and in getting to know who is sick and needs attention.

Caching, practical caching.

Having managed a few sites, and a few servers before that, most of my time online was spent studying socializing, information security, server administration (security, optimization), and others. 'Others' might contribute a bigger portion, but the point is that one part of server administration is optimization: making software work better, in our case.

One major part of optimization is caching. One good example of caching is a reverse proxy, which sits in front of your server and watches all requests passing through. Static content is cached on the first request, and the following requests are served from the cache. Previously, Squid was a good option for this. Then nginx came along, and it has proven itself a better candidate, with a small memory footprint and the capability to handle static content and a lot of requests. Remi from WebFaction has done a benchmark where nginx served static content at 10k requests per second. Cool!

In the last few months I had read about Varnish, but never looked into it much, until a few weeks ago when someone came out with a case study on how Varnish helps them with static content caching. I have done some tests on Varnish, and it is not that easy to deploy in your environment in a way that best utilizes its capability. It needs to be tuned to your application, because you have to tell it what to cache and what not to. On top of that, Varnish is sensitive to cookies, so you need to manage all the cookies as well. In an environment with unpredictable application deployments, such as a hosting company, it would not be as effective as a dedicated configuration for a single application. That's the compromise you have to make. However, it is a good option to have, at least to reduce the requests hitting your server.

A Content Delivery Network, or CDN, implements caching as well for all the static content it serves. The whole CDN idea is about caching: retrieving the same content from the nearest node, and reducing the load on the main server. There are a few CDN options nowadays. Cloudflare provides an easy interface to start using a CDN: you just need to change the NS records of your domain, and you can start using many more features offered by Cloudflare, such as DDoS protection and an application firewall. Other options are MaxCDN and Aflexi. Aflexi offers CDN software, for anyone to start offering a CDN service to their clients. You can get one from Exabytes, which offers a CDN powered by Aflexi.

Talking about application-side caching: if you're using WordPress, there are plugins which do caching, such as WP-Cache and WP Super Cache. I personally prefer WP-Cache, which worked for me the last time I tried it. One thing to note: WP-Cache caches the whole page generated by WordPress and keeps it for a pre-determined duration, set in the configuration section. Besides that, Jeff Starr has written an article on how to make WordPress faster, basically by setting some internal variables, which skips database queries for certain information by hardcoding it in the wp-config.php file.

For example, defining blog and site URL:
define('WP_HOME', ''); // blog url
define('WP_SITEURL', ''); // site url

Hardcode the stylesheet and template paths:
define('TEMPLATEPATH', '/absolute/path/to/wp-content/themes/H5');
define('STYLESHEETPATH', '/absolute/path/to/wp-content/themes/H5');

And define the encryption keys for internal data in WordPress. You can generate them from the secret-key service.
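For reference, the key definitions in wp-config.php look like the sketch below. The values shown are the WordPress placeholders, to be replaced with output from the secret-key generator (the exact set of key names varies slightly between WordPress versions):

```php
// Placeholder values: replace each with a unique phrase from the
// WordPress secret-key generator.
define('AUTH_KEY',        'put your unique phrase here');
define('SECURE_AUTH_KEY', 'put your unique phrase here');
define('LOGGED_IN_KEY',   'put your unique phrase here');
define('NONCE_KEY',       'put your unique phrase here');
```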

If you are a programmer, consider using memcached. Memcached is a lightweight in-memory object caching server, which can store object data from your application to be retrieved again faster. It was developed by LiveJournal, and harnesses their web operation to this day. Details of the LiveJournal setup are given in the article Distributed Caching with Memcached, by Brad Fitzpatrick. The article describes technically how memcached works, and how it scales when implemented site-wide. You will get the idea of why the same web server can host memcached as well, possibly more than one instance of memcached, and how your application makes use of the whole memcached cluster. In LiveJournal's case, they have 28 instances of memcached running, holding 30GB of popular data.
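To illustrate the idea, here is a minimal sketch of the cache-aside pattern an application would use against memcached. A plain Python dict stands in for the memcached server (a real deployment would use a client library speaking the memcached protocol), and all the names here are made up for the example:

```python
import time

# Hypothetical stand-in for a memcached client: a dict plus expiry times.
# The get/set-with-TTL interface mirrors the style of real clients.
class FakeMemcache:
    def __init__(self):
        self._store = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.time() >= expires_at:
            del self._store[key]  # entry expired, drop it
            return None
        return value

    def set(self, key, value, ttl=300):
        self._store[key] = (value, time.time() + ttl)

def expensive_query(user_id):
    # Placeholder for a slow database query.
    return {"id": user_id, "name": "user%d" % user_id}

cache = FakeMemcache()

def get_user(user_id):
    # Cache-aside: try the cache first, fall back to the database,
    # then populate the cache for the next request.
    key = "user:%d" % user_id
    user = cache.get(key)
    if user is None:
        user = expensive_query(user_id)
        cache.set(key, user, ttl=60)
    return user

first = get_user(42)   # miss: hits the "database"
second = get_user(42)  # hit: served from the cache
```

The point of the pattern is that any web server in the cluster can answer from any memcached node, so popular data is computed once and shared.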

One more thing: install this PHP script to monitor your memcached cluster. Written by Harun Yayli, the script is password protected, and lets you view information from each configured memcached instance: hit rate, miss rate, uptime, version and data size.

ssh’ing shorter with ssh alias

This is a necessary thing to do if you have access to many servers, with too many configurations to remember and type: the user, hostname and port number. You can shorten all this with an ssh alias.

An ssh alias allows you to connect to your remote machine with just a word:

# ssh mymachine
or
# scp file.txt mymachine:~/

Isn't that awesome? You can configure all your hosts in a config file, located at ~/.ssh/config for the current user.
Below is a sample configuration:

Host mymachine
HostName [ip address or hostname]
User [username]
Port [port number]

You can add an alias for another host using the same syntax as above, in the same configuration file. If you're connecting on the default port, which is 22, you can just omit the "Port" configuration line.

If you apply the trick of ssh'ing into your remote server without a password, you can get in with even less typing, but be careful with your private key, as it is the key into your server.
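If you do use key-based login, the alias can also point at a specific private key. The host, user and paths below are example values, not anything from a real setup:

```
Host mymachine
HostName 192.0.2.10
User admin
IdentityFile ~/.ssh/mymachine_key
```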

Please refer here, as this is my source.

JailBreakMe and vulnerability waiting to be exploited

iPhone users jailbreak their iPhones to circumvent restrictions in the original iOS firmware, and to allow more control over the phone, with regards to software installation and so on. I have a friend who jailbroke his phone, and it looked tedious. I don't really know, because I don't have an iPhone.

Jailbreaking has become much easier now. You just need to browse to the site, make some clicks, and your iPhone is jailbroken. What JailBreakMe did is a kind of drive-by exploitation, a technique malware writers use to infect machines with malware/viruses by exploiting a vulnerability in the browser, in this case triggered by viewing a PDF file. JailBreakMe used exploit code for a vulnerability in the PDF reader on iDevices to jailbreak the device. @comex found the vulnerability and used it in JailBreakMe. And actually, at the time this article was written, the PDF files could still be found publicly.

Whatever your purpose for jailbreaking your iPhone, I don't care. But the issue arises because it is really EXPLOITING a vulnerability in your iPhone. Practically, not just the iPhone, but the iPad and iPod too. It means your iDevice might still have this exploitable vulnerability. What does exploitable mean? It means someone can execute malicious code (some people might understand this as "hacking into" your iDevice) to do anything they want, without your authorization.

Apple has announced that the patch will be available in the next update, which leaves your iPhone hackable or exploitable in the meantime. Jailbroken iDevices can download an additional patch for the vulnerability, called PDF patch 2. But a virgin iDevice remains vulnerable until the next update.

The PDF file used is available for public access. There are rumors that someone has started writing malicious code to exploit this vulnerability, and it is on its way now.

I found a few articles which give an insight into security on mobile devices, currently pointing at the few who hold the mobile device market share: Security Showdown: Android vs. iOS, and, by ThreatPost, New iPhone Jailbreak Makes Short Work of World's 'Most Secure' OS.

Additional note:
iOS Hardening Configuration Guide, by the Australian Government, Department of Defence

Phishing email. Do you collect them?

Do you receive something like this? Check your spam folder; you might have missed it, thanks to the effective spam filter of your provider.

For a simple analysis, you can try to identify where it comes from. The mail header will tell you which email server it came from. In some cases, the mail server will include the IP address of the user who sent the email as well. Go have a look; the picture below shows you how to get the mail header in Gmail. For other mail clients, you can refer to this site.

My computer is infected. So what?

I bet most of us still have no idea of the rising internet threats nowadays. It seems ironic: as everyone goes online and relies on the internet for their business, they still tend to become victims, and become part of the dark side of the internet without knowing it.

One scenario we might think about: my computer has been infected with a virus, but I can still send my email as usual. So, what's the deal? Well, there's a lot more about the virus that you don't know. We'll discuss just two issues here: banking trojans like Zeus, and botnets.

Zeus trojan

Zeus is a banking-centric trojan, which steals banking information by keylogging. There are quite a number of infections nowadays, and thousands of variants. The Zeus package is sold for anyone to run a botnet. The package comes with a virus builder, custom made for your configuration. So your banking information might not be received by just one person: anyone could steal it. The data is submitted to a Command and Control (CnC) server specified by the creator.

Picture showing what kind of information is being stolen

Protect your google account with account recovery option

I just got bugged by Google when they asked me to log in again. There was a new notification about entering our mobile number, as another way to verify and claim our account.
Google Recovery Option

It is a good way to verify our account, and to claim your account if you happen to lose it due to a lost password. There might be a concern about your personal data and personal information being held by Google, but by using their service, we're compromising our privacy to Google already.

However, how do they verify that I entered my correct mobile number? No verification is made for the phone number. Anyhow, it's a good step by Google to limit incidents of account loss. Other services like Facebook also have mechanisms to mitigate these issues, like notifying you if someone else accesses your account from another location, which seems to be unlikely behavior for the real owner.

I'm glad to see more internet services taking this issue seriously, and doing something about it. As they grow larger, they still seem to care for that small portion of their users who have this kind of problem.

Fine tuning apache and mysql for performance and security

I've gone through the process of migrating a server, with the help of the provided migration scripts, and some hand-written scripts to help synchronize the latest data and fix permissions. All went well, and it is working fine now. However, all the services were installed with defaults, and not optimized for our needs.

There are a few aspects of tuning to be done, relative to your resources: available memory, hard disk I/O, and some other considerations. Surely you want the best performance, with a high level of security. At some point, there are aspects you have to compromise on to get the best in others.


There are a few configuration directives that would need a change, for example HostnameLookups. It was turned On here by default. While on, it adds latency to each request, since the client IP is resolved to a hostname before the request is completed. You can disable it by replacing it with

HostnameLookups Off

DirectoryIndex negotiation is an option for you to determine which file is loaded by default in any document root. Avoid using wildcards, and enter the options explicitly in the httpd.conf file, ordered by priority.
For example:

DirectoryIndex index.php index.htm index.html index.shtml index.phtml Default.htm Default.html

Apache 2.0 is equipped with Multi-Processing Modules (MPMs), which handle Apache connections, handle requests, and fork child processes for the requests. There are a few options to choose from, and the most common are worker and prefork. Each has its own advantages and disadvantages. Make sure you read through the documentation to understand each option, and how it would help your Apache process/request handling.

In my case, I'm using prefork:

<IfModule prefork.c>
StartServers 15
MinSpareServers 10
MaxSpareServers 25
MaxClients 255
MaxRequestsPerChild 10000
</IfModule>

If you are using cPanel, you can edit this in WHM Control Panel -> Service Configuration -> Apache Configuration -> Global Configuration.
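When picking MaxClients, a common rule of thumb is to divide the memory you can spare for Apache by the average size of one child process, so the box never swaps under full load. The numbers below are example assumptions, not measurements from any real server:

```python
# Rough sizing rule of thumb for prefork MaxClients: leave headroom for
# MySQL and the OS, then divide what's left by the average resident size
# of one Apache child. All values here are illustrative assumptions.
total_ram_mb = 4096        # total RAM on the box
reserved_mb = 1024         # headroom for MySQL, cron jobs, the OS, etc.
apache_child_mb = 12       # average resident size of one httpd child

max_clients = (total_ram_mb - reserved_mb) // apache_child_mb
print(max_clients)  # -> 256
```

Measure your actual child size (for example with ps or top under load) before settling on a value; an overcommitted MaxClients pushes the server into swap, which is far worse than queueing requests.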
