Semrush is a powerful tool that lets you see how your website performs in traffic and keywords. However, Semrush can also be a pain for website owners, because its bots crawl your site and can generate a lot of traffic, quickly eating up your bandwidth and slowing down your site.
Fortunately, there are a few ways, listed below, to block the Semrush bots.
How to block SemrushBot from your website
If you’ve seen a lot of strange traffic coming from semrush.com, you may want to block SemrushBot. Here’s how to block the Semrush crawler and keep it from eating into your site’s bandwidth.
Use robots.txt to block SemrushBot
Add the code below to your robots.txt file. This file tells crawlers which parts of your site they should not visit, and you can use it to exclude SemrushBot from crawling your site.
User-agent: SemrushBot
Disallow: /
Download a Robots.TXT template that will block the SEMRush Bot.
Test the SEMRush Bot Block
After adding the Disallow rule to your robots.txt file, it’s always a good idea to ensure it was applied properly.
You can test this by going to your website and adding /robots.txt to the URL.
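Beyond eyeballing the file in your browser, you can also check the rule programmatically. Below is a minimal sketch using Python’s standard urllib.robotparser module; example.com is a placeholder for your own domain, and the parser is fed the same two lines shown above:

```python
from urllib import robotparser

# The same rule added to robots.txt above
rules = [
    "User-agent: SemrushBot",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# SemrushBot is disallowed from every URL on the site...
print(rp.can_fetch("SemrushBot", "https://example.com/any-page"))  # False
# ...while other crawlers are unaffected by this rule
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))   # True
```

To test the live file instead of a hard-coded list, you could call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` before checking `can_fetch`.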
How to find and edit your robots.txt file
The robots.txt file is located in the root directory of your website. You can download it with an FTP/SFTP client such as Cyberduck, open it in any text editor, and make the necessary changes. Be sure to save the file after making any changes, and upload it back to your server if necessary. With a few simple edits, you can control how search engine bots crawl your website and help ensure that only the pages you want indexed appear in the search results.
In WordPress, you can create and edit the robots.txt file using the Yoast SEO plugin: click on Yoast SEO >> Tools >> File Editor >> robots.txt.
* If you do not have a robots.txt file, you can generate one here.
* You must place a robots.txt file on each subdomain if you have subdomains.
How to tell SemrushBot to slow down crawling your website
Adding the code below to your robots.txt file tells the bot to wait 60 seconds between requests, i.e. to crawl at most one URL per minute.
User-agent: SemrushBot
Crawl-delay: 60
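As a quick sanity check on how that directive is read, here is a small sketch using Python’s standard urllib.robotparser module (the 60 matches the Crawl-delay value above; Googlebot is used only as an example of an agent with no rule):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: SemrushBot",
    "Crawl-delay: 60",
])

# The parser reports the delay, in seconds, that applies to SemrushBot
print(rp.crawl_delay("SemrushBot"))  # 60
# No delay is defined for crawlers that don't match the group
print(rp.crawl_delay("Googlebot"))   # None
```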
How to block specific Semrush crawlers
To block SiteAuditBot from crawling your site for the Site Audit tool (technical SEO audit):
User-agent: SiteAuditBot
Disallow: /
To block SemrushBot from crawling your site for the Backlink Audit tool:
User-agent: SemrushBot-BA
Disallow: /
To block SemrushBot from crawling your site for the On Page SEO Checker tool and similar tools:
User-agent: SemrushBot-SI
Disallow: /
To block SemrushBot from checking URLs on your site for the SEO Writing Assistant (SWA) tool:
User-agent: SemrushBot-SWA
Disallow: /
To block SemrushBot from crawling your site for the Content Analyzer and Post Tracking tools:
User-agent: SemrushBot-CT
Disallow: /
To block SemrushBot from crawling your site for Brand Monitoring:
User-agent: SemrushBot-BM
Disallow: /
To block SplitSignalBot from crawling your site for the SplitSignal tool:
User-agent: SplitSignalBot
Disallow: /
To block SemrushBot-COUB from crawling your site for the Content Outline Builder tool:
User-agent: SemrushBot-COUB
Disallow: /
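The per-tool rules above can be combined in a single robots.txt file, with each User-agent group separated by a blank line. The sketch below (again using Python’s standard urllib.robotparser, with example.com as a placeholder) blocks just the Site Audit and Backlink Audit crawlers and confirms that the main SemrushBot and other crawlers are left alone. Note that robotparser’s user-agent matching is a simplification of how real crawlers match groups:

```python
from urllib import robotparser

# Two per-tool groups combined into one robots.txt:
# block the Site Audit and Backlink Audit crawlers only.
rules = [
    "User-agent: SiteAuditBot",
    "Disallow: /",
    "",
    "User-agent: SemrushBot-BA",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Only the two listed crawlers are blocked
for agent in ("SiteAuditBot", "SemrushBot-BA", "SemrushBot", "Googlebot"):
    print(agent, rp.can_fetch(agent, "https://example.com/"))
```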
Now that you understand how to block the Semrush bots and manage crawler traffic, it’s time to put these techniques into practice. Hiring a professional SEO agency can help take your website to the next level and improve your online presence. Are you ready to get started?
What is SemrushBot?
Semrush is a powerful tool that helps businesses improve their online visibility. It offers a suite of tools that let businesses track their organic search rankings, analyze their website traffic, and research their competitors. One notable part of the platform is SemrushBot, the automated crawler that gathers the data behind these tools. Through the tools it powers, businesses can track specific keywords, websites, or even competitor movements, which helps them stay ahead of the curve and make sure they are always visible in the search engines. Overall, SemrushBot is what keeps Semrush’s data accurate and up to date.
Why does SemrushBot crawl your website?
SemrushBot is a web crawler that collects data about websites. This data is then used to provide information to Semrush users, such as website traffic statistics and keyword rankings. SemrushBot also helps to improve the accuracy of Semrush’s database by identifying new and updated websites. In addition, the data collected by SemrushBot is used to create customized reports for individual users. These reports can include information about specific keywords, website traffic, and more. By using SemrushBot, Semrush is able to provide its users with accurate and up-to-date information about the online world.
What does Semrush do with the data SemrushBot collects?
* Public backlink search engine index, maintained as a dedicated tool called Backlink Analytics (a webgraph of links)
* Site Audit tool, which analyzes on-page SEO, technical, and usability issues
* Backlink Audit tool, which helps discover and clean up potentially dangerous backlinks in your profile
* Link Building tool, which helps you find prospects, reach out to them, and monitor your newly acquired backlinks
* SEO Writing Assistant tool, to check if a URL is accessible
* Brand Monitoring tool, to index and search for articles
* Content Analyzer and Post Tracking tools reports
* On Page SEO Checker and SEO Content Template tools reports
* Topic Research tool reports
* SplitSignal tool, to create SEO A/B tests on your website
* Content Outline Builder tool reports
How often does Semrush check your robots.txt file?
Semrush re-checks your robots.txt file, and processes its contents, after one hour or after 100 requests.
What bot does Semrush use to crawl?
Semrush uses a custom-built crawler, SemrushBot, to crawl websites. The bot was designed specifically for Semrush’s tools and has several features, such as the tool-specific user agents listed above, that make it well suited to the task. Overall, SemrushBot is a powerful and efficient crawler that helps make Semrush the comprehensive SEO tool it is.
Published on: 2022-08-20
Updated on: 2022-09-07