# Begin robots.txt file

User-agent: Amazonbot
Disallow: /

User-agent: MJ12Bot
Disallow: /

User-agent: PetalBot
Disallow: /

User-agent: AspiegelBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: MauiBot
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Zoominfobot
Disallow: /

User-agent: MojeekBot
Disallow: /

User-agent: coccocbot
Disallow: /

User-agent: SeznamBot
Disallow: /

User-agent: AwarioRssBot
User-agent: AwarioSmartBot
Disallow: /

User-agent: DataForSeoBot
Disallow: /

User-agent: serpstatbot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: *
Disallow: /*/ctl/ # Googlebot permits *
Disallow: /admin/
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_GlobalResources/
Disallow: /bin/
Disallow: /Components/
Disallow: /Config/
Disallow: /contest/
Disallow: /controls/
Disallow: /Documentation/
Disallow: /HttpModules/
Disallow: /Install/
Disallow: /Providers/
Disallow: /Activity-Feed/userId/ # Do not index user profiles

# End of robots.txt file