FEATURE: let Google index pages so it can remove them

Google insists on crawling pages so it can figure out whether they
can be removed from the index: if robots.txt blocks a page, Googlebot
never re-fetches it and never sees the noindex signal, so the stale
entry is never dropped.

see: https://support.google.com/webmasters/answer/6332384?hl=en

This change ensures that we have special behavior for Googlebot
where we allow crawling, but block the actual indexing via the
X-Robots-Tag response header.
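
A minimal sketch of the two halves of that behavior, assuming a Rails
app shaped roughly like Discourse (the controller bodies and the /u/
path are illustrative, not the commit's actual code):

    # Sketch only, not Discourse's actual implementation.
    # Half one: robots.txt explicitly allows googlebot to crawl pages
    # that other crawlers are told to skip.
    class RobotsTxtController < ApplicationController
      def index
        render plain: <<~ROBOTS
          User-agent: googlebot
          Allow: /

          User-agent: *
          Disallow: /u/
        ROBOTS
      end
    end

    # Half two: the pages themselves send X-Robots-Tag, so the fetch
    # Googlebot is now allowed to make ends with "do not index this".
    class UsersController < ApplicationController
      before_action :block_indexing

      private

      def block_indexing
        # Googlebot only honors noindex if it is allowed to fetch the
        # page; that is exactly why robots.txt must not block it.
        response.headers['X-Robots-Tag'] = 'noindex'
      end
    end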
commit bb4e8899c4
parent 4a74f18e95
Author: Sam Saffron
Date: 2020-05-11 12:14:21 +10:00

4 changed files with 18 additions and 2 deletions


@@ -132,6 +132,7 @@ RSpec.describe RobotsTxtController do
       get '/robots.txt'
       expect(response.body).to_not include("Disallow: /u/")
+      expect(response.body).to include("User-agent: googlebot\nAllow")
     end

     it "returns overridden robots.txt if the file is overridden" do