== Typical Reasons for not allowing Robots ==
* While indexing a dynamic site, robots can put extra strain on the server, causing slow responses or, in some cases, pegging the CPU at 100%.
* Some content is intentionally shielded from search engines to help shape how a website's resources are presented in search results. For example, an organization that has put many PDFs online may not want them to turn up in search results (a minimal robots.txt sketch follows this list).
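As an illustration of the second point, a site can ask crawlers to skip particular paths or file types through its robots.txt file. The snippet below is a minimal sketch, not a definitive configuration: the paths are hypothetical, and the "*" and "$" wildcard syntax is an extension honored by major crawlers such as Googlebot rather than part of the original robots.txt standard.

<pre>
# Hypothetical robots.txt at the site root (paths are illustrative).
User-agent: *

# Keep crawlers away from an expensive dynamic endpoint.
Disallow: /search/

# Keep PDFs out of search results.
# The "*" and "$" wildcards are extensions supported by major crawlers,
# not part of the original robots.txt standard.
Disallow: /*.pdf$
</pre>

Note that robots.txt is advisory: well-behaved crawlers honor it, but it does not prevent access to the files themselves.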
== Throttling ==