We have been seeing high load on one of our web servers lately (Apache 2.0.59 on Linux), so I started thinking about how to analyze it in a bit more detail. We run multiple virtual hosts (around 60), which makes it hard to see which URLs are being requested with plain tcpdump, so I looked around for an HTTP-aware sniffer. My search ended with “urlsnarf” from the “dsniff” package.
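To illustrate the problem, this is roughly the kind of raw capture I mean (the interface name eth0 is just an example):

    # Dump the ASCII payload of all port-80 traffic; with ~60 virtual
    # hosts the requests from every site are interleaved, so picking
    # out which URLs are hot is tedious.
    tcpdump -i eth0 -A -s 0 'tcp port 80'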
The last change to dsniff was in 2000, and it depends on libnet and libnids. It took me a couple of tries (and a minor patch) to get it to compile, but once done I was happy with the result. urlsnarf simply sniffs HTTP traffic on one or more ports and writes its output in the Common Log Format, making it easy to see which URLs are requested.
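A minimal invocation looks something like this (eth0 and the log file name are my choices, not anything urlsnarf mandates):

    # Sniff HTTP on eth0, skip reverse DNS lookups (-n), and save the
    # Common Log Format output; by default urlsnarf watches ports
    # 80, 8080 and 3128.
    urlsnarf -n -i eth0 > sniffed-access.log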
I could then see which URLs were the most frequently requested and continue on to the awstats logs that we keep for each virtual host. From there it was quite easy to find a couple of pages with very large images that were requested very frequently, and we could take relevant actions to reduce the HTTP load.
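Since the output is Common Log Format, a quick shell pipeline is enough to rank the URLs; this sketch assumes the standard CLF field layout, where the seventh whitespace-separated field is the requested URL (urlsnarf includes the virtual host in it, which is exactly what we need here):

    # Count and list the 20 most requested URLs from the sniffed log.
    awk '{print $7}' sniffed-access.log | sort | uniq -c | sort -rn | head -20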