From 11e2ff87a16565fc21eeb9c4b7904da1c5e612f0 Mon Sep 17 00:00:00 2001
From: Stephen Smoogen
Date: Tue, 19 Mar 2019 19:49:16 +0000
Subject: [PATCH] [proxies/robots.txt] Force the proxy to use a local
 robots.txt

The various OpenShift tools get hit by crawlers but do not serve a
robots.txt. This seems to be because the balancer code proxies those
requests straight back to the nodes. This change forces the proxy's
local robots.txt to always be honored.
---
 roles/httpd/website/templates/robots.conf | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/roles/httpd/website/templates/robots.conf b/roles/httpd/website/templates/robots.conf
index f442128e2e..a59eb685e4 100644
--- a/roles/httpd/website/templates/robots.conf
+++ b/roles/httpd/website/templates/robots.conf
@@ -1 +1,7 @@
+## Make sure that we don't skip this because we proxy pass it to a
+## slow backend
+
+ProxyPass /robots.txt !
+
+Alias /robots.txt /srv/web/{{site_name}}-robots.txt
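
---

Note (not part of the patch): an Apache `ProxyPass` exclusion only takes effect if it appears before the general `ProxyPass` rule that would otherwise match the request, since mapped `ProxyPass` directives are evaluated in configuration order. A minimal sketch of the surrounding vhost, where the hostname, backend URL, and file path are assumptions for illustration, not taken from the repository:

```apache
# Hypothetical vhost showing why the exclusion works; names and
# URLs here are placeholders, not values from the patch.
<VirtualHost *:80>
    ServerName apps.example.org

    # The exclusion must come first: "!" stops proxying for any
    # request matching /robots.txt.
    ProxyPass /robots.txt !

    # Serve robots.txt from local disk instead.
    Alias /robots.txt /srv/web/example-robots.txt

    # Everything else is still balanced back to the nodes.
    ProxyPass / http://backend.internal:8080/
    ProxyPassReverse / http://backend.internal:8080/
</VirtualHost>
```

With this ordering, crawlers requesting `/robots.txt` receive the local file; without the exclusion, the request would be passed through to the backend nodes, which is the behavior the commit message describes fixing.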