SERPtimizer is an SEO tool that analyzes websites and checks them for errors. Your website was visited by SERPtimizerBot for one of the following reasons:
- Checking external links that lead to your site.
- Analyzing search result pages for a specific keyword.
- Searching for links to other sites.
- One of our customers set up your website as a project, and it is being analyzed against SEO criteria.
Before accessing a page, we first read the robots.txt file. There you can block individual pages, or deny access entirely to specific crawlers or to all of them.
How to block SERPtimizerBot
To block our bot from any further access to your site, add the following entry to your robots.txt file:
User-agent: SERPtimizerBot
Disallow: /
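If you want to verify the rule, Python's standard library ships a robots.txt parser. The following is a minimal sketch, not part of SERPtimizer; https://example.com stands in for your own domain.

from urllib import robotparser

# The rule from above, inlined so the check runs offline.
robots_txt = """\
User-agent: SERPtimizerBot
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("SERPtimizerBot", "https://example.com/"))  # False: our bot is blocked
print(parser.can_fetch("SomeOtherBot", "https://example.com/"))    # True: other bots are unaffected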
Block pages and subdirectories for all bots
If you only want to protect individual pages or subdirectories from access, you can list them as shown below. The * wildcard in the User-agent line applies the rules to all bots.
User-agent: *
Disallow: /imprint.html
Disallow: /private/
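You can check the wildcard rules the same way: under them, every user agent is denied the listed paths but nothing else. Again a sketch, with https://example.com as a placeholder.

from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /imprint.html
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

for agent in ("SERPtimizerBot", "AnyOtherBot"):
    print(parser.can_fetch(agent, "https://example.com/imprint.html"))    # False: listed page
    print(parser.can_fetch(agent, "https://example.com/private/a.html"))  # False: inside /private/
    print(parser.can_fetch(agent, "https://example.com/index.html"))      # True: everything else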
Reduce access by setting a wait time
Another way to reduce the load on your web server is to specify a crawl delay. With the following example, our bot sends at most one request every 0.5 seconds.
User-agent: SERPtimizerBot
Crawl-delay: 0.5
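As a sketch of how a polite crawler honors this directive, Python's urllib.robotparser exposes the value via crawl_delay(). One caveat: the standard-library parser only accepts whole-second integer values, so the sketch below uses 1; a fractional value such as 0.5 would be ignored by this particular parser even though crawlers like ours accept it.

import time
from urllib import robotparser

robots_txt = """\
User-agent: SERPtimizerBot
Crawl-delay: 1
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# crawl_delay() returns the delay for the given agent, or None if none is set.
delay = parser.crawl_delay("SERPtimizerBot") or 0

for url in ("https://example.com/a.html", "https://example.com/b.html"):
    # A real fetch(url) would happen here; sleeping spaces out consecutive requests.
    time.sleep(delay)
    print(f"fetched {url} after waiting {delay}s")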