[proxies/robots.txt] Force the proxy to serve a local robots.txt

The various openshift tools get hit by crawlers but do not serve
a robots.txt of their own. This seems to be due to the balancer code
used to pass requests back to the nodes. This change forces the proxy's
robots.txt to be honored always.
This commit is contained in:
Stephen Smoogen 2019-03-19 19:49:16 +00:00
parent 05892b26a8
commit 11e2ff87a1


@@ -1 +1,7 @@
## Make sure that we don't skip this because we proxy pass it to a
## slow backend
<Location "/robots.txt">
ProxyPass !
</Location>
Alias /robots.txt /srv/web/{{site_name}}-robots.txt
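For context, a minimal sketch of how this pattern sits inside a full proxy virtual host. The hostname, balancer name, and backend addresses below are illustrative assumptions, not taken from the actual config; only the `<Location>`/`ProxyPass !`/`Alias` lines reflect the change above:

```apache
<VirtualHost *:80>
    # Hypothetical hostname, for illustration only
    ServerName app.example.org

    ## Serve robots.txt locally: "ProxyPass !" exempts this path from
    ## any broader ProxyPass rules, so the Alias can map it to a file
    ## on the proxy host instead of the backend.
    <Location "/robots.txt">
        ProxyPass !
    </Location>
    Alias /robots.txt /srv/web/example-robots.txt

    ## Everything else still goes to the backend nodes
    ## (balancer members are assumptions).
    <Proxy "balancer://appnodes">
        BalancerMember "http://10.0.0.1:8443"
        BalancerMember "http://10.0.0.2:8443"
    </Proxy>
    ProxyPass        "/" "balancer://appnodes/"
    ProxyPassReverse "/" "balancer://appnodes/"
</VirtualHost>
```

The key point is that `ProxyPass !` inside a `<Location>` marks that path as "do not proxy", which takes precedence over the catch-all `ProxyPass "/"`, letting the proxy answer for robots.txt directly from disk.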