Applied on: Windows hosting accounts

Search engine bots, crawlers, and spiders help your site get discovered by users. However, some of them are very aggressive and consume a lot of server bandwidth, and some bots and spiders are outright malicious and try to extract sensitive data. In this article, I will give you a step-by-step guide on how to block malicious bots and spiders in web.config.

1. Open the web.config file of your site or ASP.NET application.

2. Look for the <security> tag, which lives inside <system.webServer>:
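A minimal sketch of the surrounding structure (the element names follow the standard IIS configuration schema; your existing web.config may already contain some of these sections):

<configuration>
  <system.webServer>
    <security>
      <!-- request filtering rules will go here -->
    </security>
  </system.webServer>
</configuration>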

3. Inside this tag, we are going to apply a Request Filtering rule. Open and close a <requestFiltering> and a <filteringRules> tag, like this:
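The nesting looks like the following (a sketch; only <requestFiltering> and <filteringRules> are new at this step):

<security>
  <requestFiltering>
    <filteringRules>
      <!-- the rule from the next step goes here -->
    </filteringRules>
  </requestFiltering>
</security>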

4. Now, let’s make a new filtering rule. For this example, we will block YandexBot. We will name the rule, scan the User-Agent request header, and define the bot’s user-agent string within the <add string> entry under <denyStrings>.

<filteringRule name="BlockSearchEngines" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <clear />
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <appliesTo>
    <clear />
  </appliesTo>
  <denyStrings>
    <clear />
    <add string="YandexBot" />
  </denyStrings>
</filteringRule>

Note: You can block additional bots by adding more <add string> entries to <denyStrings>, or by creating another filtering rule, as in the sketch below.
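For instance, a single rule that denies several user-agent strings at once (the extra bot names here are only illustrative examples; substitute the user agents you actually want to block):

<filteringRule name="BlockSearchEngines" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <clear />
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <appliesTo>
    <clear />
  </appliesTo>
  <denyStrings>
    <clear />
    <add string="YandexBot" />
    <add string="AhrefsBot" />
    <add string="SemrushBot" />
  </denyStrings>
</filteringRule>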

5. Save your changes and that's it! The automated bot will be filtered and won't waste useful server resources; IIS answers the blocked requests with a 404 status by default.

This concludes how to block malicious bots and spiders in web.config.