Since a search engine like Google needs to crawl your website to determine what to index, a site with a lot of data can place a high load on your system if the crawl is done in a short amount of time.
Without any guidance, a crawler might make multiple requests per second, increasing your system load. By creating a robots.txt file in your public_html folder, you can instruct these crawlers to slow down. A sample robots.txt might look like this:
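The example below asks all crawlers to wait 10 seconds between requests. Note that Crawl-delay is a non-standard directive: some crawlers such as Bingbot and Yandex honor it, while Googlebot ignores it (Google's crawl rate is managed through Search Console instead). The 10-second value is just an illustration; adjust it to suit your server.

```
User-agent: *
Crawl-delay: 10
```

Place this file at public_html/robots.txt so it is served at http://yourdomain.com/robots.txt, which is where crawlers look for it.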
© 2018 JBMC Software, Suite 173 3-11 Bellerose Drive, St Albert, AB T8N 1P7 Canada. Mon-Fri 9AM-5PM MST