Mirror of https://github.com/discourse/discourse.git (synced 2025-05-24 03:36:18 +08:00)
FEATURE: let Google index pages so it can remove them
Google insists on crawling pages so it can figure out whether they can be removed from the index. See: https://support.google.com/webmasters/answer/6332384?hl=en

This change ensures that we have special behavior for Googlebot: we allow it to crawl these pages via robots.txt, but block the actual indexing via the X-Robots-Tag response header.
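The idea can be sketched in a few lines of Ruby. This is a hypothetical illustration, not Discourse's actual implementation: the `ROBOTS_TXT` content and the `with_noindex` helper are invented names, but they show the two halves of the technique the commit describes (permit crawling in robots.txt, deny indexing with `X-Robots-Tag`):

```ruby
# Hypothetical sketch (not Discourse's code): let Googlebot crawl pages
# via robots.txt, but block indexing with the X-Robots-Tag header.

# robots.txt body: Googlebot is explicitly allowed to crawl everything,
# while other crawlers are still blocked from user profile pages.
ROBOTS_TXT = <<~TXT
  User-agent: googlebot
  Allow: /

  User-agent: *
  Disallow: /u/
TXT

# Helper that attaches the header telling Google not to index the page,
# even though crawling it is permitted.
def with_noindex(headers)
  headers.merge("X-Robots-Tag" => "noindex")
end

headers = with_noindex("Content-Type" => "text/html")
puts headers["X-Robots-Tag"] # prints "noindex"
```

Because Google only honors a removal request after re-crawling the page, the robots.txt `Allow` is what lets the `noindex` header actually be seen.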
@@ -132,6 +132,7 @@ RSpec.describe RobotsTxtController do
       get '/robots.txt'

       expect(response.body).to_not include("Disallow: /u/")
+      expect(response.body).to include("User-agent: googlebot\nAllow")
     end

     it "returns overridden robots.txt if the file is overridden" do