Search engine crawlers are increasing my system load


Last Modified: Oct 30, 2013, 2:40 pm
Because a search engine like Google needs to crawl your website to index its content, a site with a lot of data can put a high load on your system if the crawl is done in a short amount of time.

By creating a robots.txt file in your public_html folder, you can instruct these crawlers to slow down.

A sample robots.txt might look like this:

User-agent: *
Crawl-delay: 300

This tells all compliant crawlers to wait 300 seconds between successive requests. Note that not every crawler honors the directive: Googlebot, for example, ignores Crawl-delay, and its crawl rate must be adjusted through Google's webmaster tools instead.
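If only specific bots are causing the load, you can slow them down individually instead of delaying every crawler. The user-agent names below are just examples; check your Apache access logs to see which bots are actually hitting your site:

User-agent: Bingbot
Crawl-delay: 10

User-agent: *
Crawl-delay: 5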

Without it, a crawler might make multiple requests per second, increasing your system load.
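To confirm the file is live and parses the way a well-behaved crawler would read it, you can check it with Python's standard urllib.robotparser module. This is a minimal sketch; the domain below is a placeholder for your own:

import urllib.robotparser

# Placeholder URL: substitute your own domain here
robots_url = "http://www.example.com/robots.txt"

parser = urllib.robotparser.RobotFileParser()
parser.set_url(robots_url)
parser.read()  # fetches and parses the live robots.txt

# crawl_delay() returns the Crawl-delay for the given user agent,
# or None if the directive is missing
print("Crawl-delay for all agents:", parser.crawl_delay("*"))
print("Allowed to fetch /:", parser.can_fetch("*", "/"))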
 
Related Helpfiles
How to track which site is using the Apache processes and causing load
