Mirror of https://github.com/discourse/discourse.git, synced 2025-05-28 13:51:18 +08:00
FEATURE: allow for setting crawl delay per user agent
Also moves Bing to a default crawl delay, so no more than one request every 5 seconds is allowed.

New site settings:
- "slow_down_crawler_user_agents" - list of crawler user agents that will be slowed down
- "slow_down_crawler_rate" - how many seconds a crawler should wait between requests

Not enforced server side yet.
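For context, a crawl-delay rule in robots.txt looks like the fragment below. The agent name and delay value are illustrative examples, not values from the commit; the `Crawl-delay` directive is honored by Bing and some other crawlers, though it is not part of the original robots.txt standard.

```text
User-agent: bingbot
Crawl-delay: 5
```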
@@ -5,6 +5,12 @@ class RobotsTxtController < ApplicationController
   def index
     if SiteSetting.allow_index_in_robots_txt
       path = :index
+      @crawler_delayed_agents = []
+
+      SiteSetting.slow_down_crawler_user_agents.split('|').each do |agent|
+        @crawler_delayed_agents << [agent, SiteSetting.slow_down_crawler_rate]
+      end
+
       if SiteSetting.whitelisted_crawler_user_agents.present?
         @allowed_user_agents = SiteSetting.whitelisted_crawler_user_agents.split('|')
         @disallowed_user_agents = ['*']
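The diff above can be sketched as a standalone Ruby snippet. This is a hedged illustration, not Discourse code: the plain local variables stand in for the `SiteSetting` values named in the commit, and `render_crawl_delay_rules` is a hypothetical helper showing how a robots.txt template might render the `[agent, delay]` pairs the controller builds.

```ruby
# Example values standing in for the site settings from the commit:
# SiteSetting.slow_down_crawler_user_agents and SiteSetting.slow_down_crawler_rate.
slow_down_crawler_user_agents = 'bingbot|YandexBot'
slow_down_crawler_rate = 5 # seconds a crawler should wait between requests

# Mirror of the controller logic: split the pipe-delimited setting and
# pair each agent with the configured delay.
crawler_delayed_agents = slow_down_crawler_user_agents.split('|').map do |agent|
  [agent, slow_down_crawler_rate]
end

# Hypothetical template helper: emit one Crawl-delay rule per slowed-down agent.
def render_crawl_delay_rules(pairs)
  pairs.map { |agent, delay| "User-agent: #{agent}\nCrawl-delay: #{delay}" }.join("\n\n")
end

# Prints a Crawl-delay rule block for each agent in the list.
puts render_crawl_delay_rules(crawler_delayed_agents)
```

Note that, as the commit message says, these rules are advisory only at this point: nothing on the server side enforces the delay.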