Mirror of https://github.com/discourse/discourse.git, synced 2025-05-28 13:51:18 +08:00
FEATURE: let Google index pages so it can remove them
Google insists on indexing pages so it can figure out whether they can be removed from the index; see https://support.google.com/webmasters/answer/6332384?hl=en. This change ensures that we have special behavior for Googlebot: we allow it to crawl these pages, but block actual indexing via the X-Robots-Tag header.
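As a rough sketch of the behavior described above (not Discourse's actual implementation), a Rails controller could stop excluding Googlebot and instead send the robots header on every response it serves to that crawler. The callback name, user-agent check, and header value below are assumptions made for illustration:

```ruby
# Hypothetical sketch, NOT Discourse's actual code: let Googlebot fetch the
# page instead of blocking it in robots.txt, but tell it not to index the
# content via the X-Robots-Tag response header.
class ApplicationController < ActionController::Base
  before_action :block_indexing_for_googlebot

  private

  # The user-agent regex, method name, and header value are assumptions
  # for illustration only.
  def block_indexing_for_googlebot
    return unless request.user_agent.to_s.match?(/Googlebot/i)

    response.headers["X-Robots-Tag"] = "noindex, nofollow"
  end
end
```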
```diff
@@ -1853,7 +1853,7 @@ RSpec.describe TopicsController do
       get "/t/#{topic.slug}/#{topic.id}.json"

-      expect(response.headers['X-Robots-Tag']).to eq('noindex')
+      expect(response.headers['X-Robots-Tag']).to eq('noindex, nofollow')
     end

     it "doesn't store an incoming link when there's no referer" do
```
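The surrounding test setup is not shown in the hunk above. As a purely hypothetical request-spec sketch of how the Googlebot behavior from the commit message could be asserted, with only the header name and value taken from the diff (the user-agent string, route, and expectations are assumptions):

```ruby
# Hypothetical spec sketch; the Googlebot user agent and setup are assumed,
# not taken from Discourse's test suite.
it "sends a noindex, nofollow robots header to Googlebot" do
  get "/t/#{topic.slug}/#{topic.id}.json",
      headers: { "User-Agent" => "Googlebot/2.1 (+http://www.google.com/bot.html)" }

  expect(response.status).to eq(200)
  expect(response.headers["X-Robots-Tag"]).to eq("noindex, nofollow")
end
```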