FEATURE: let Google index pages so it can remove them

Google insists on being able to crawl pages so it can figure out
whether they can be removed from the index.

see: https://support.google.com/webmasters/answer/6332384?hl=en

This change ensures that we have special behavior for Googlebot
where we allow crawling, but block the actual indexing via the
X-Robots-Tag response header.
Author: Sam Saffron
Date: 2020-05-11 12:14:21 +10:00
Parent: 4a74f18e95
Commit: bb4e8899c4
4 changed files with 18 additions and 2 deletions
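The Googlebot-specific behavior described above can be sketched as a minimal Rack middleware. This is an illustrative sketch, not Discourse's actual implementation: the class name `NoIndexForGooglebot` and the bare user-agent check are assumptions for the example.

```ruby
# Illustrative sketch (not the actual Discourse code): a Rack middleware
# that lets Googlebot crawl a page normally but tells it not to index
# the page via the X-Robots-Tag response header.
class NoIndexForGooglebot
  def initialize(app)
    @app = app
  end

  def call(env)
    status, headers, body = @app.call(env)
    user_agent = env['HTTP_USER_AGENT'].to_s
    if user_agent.include?('Googlebot')
      # The crawl succeeds (no robots.txt block), so Google can see
      # this header and act on the noindex directive.
      headers['X-Robots-Tag'] = 'noindex, nofollow'
    end
    [status, headers, body]
  end
end
```

The key point is that a robots.txt disallow would prevent Google from ever fetching the page, so it would never see the noindex; serving the header on a crawlable response is what lets the page be dropped from the index.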


@@ -1853,7 +1853,7 @@ RSpec.describe TopicsController do
       get "/t/#{topic.slug}/#{topic.id}.json"
-      expect(response.headers['X-Robots-Tag']).to eq('noindex')
+      expect(response.headers['X-Robots-Tag']).to eq('noindex, nofollow')
     end

     it "doesn't store an incoming link when there's no referer" do