[proxies/robots.txt] Make it so that we force the proxy to use a local robots.txt
The various OpenShift tools get hit by crawlers and do not serve a robots.txt. This seems to be because of the balancer configuration that forwards requests back to the nodes. This change forces the proxy's robots.txt to always be honored.
parent 05892b26a8
commit 11e2ff87a1
1 changed file with 6 additions and 0 deletions
@@ -1 +1,7 @@
+## Make sure that we don't skip this because we proxy pass it to a
+## slow backend
+<Location "/robots.txt">
+    ProxyPass !
+</Location>
+
 Alias /robots.txt /srv/web/{{site_name}}-robots.txt
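
For context, here is a minimal sketch of how this exclusion fits into a larger proxy vhost. The hostname, balancer name, node addresses, and the expanded site_name value are all hypothetical stand-ins, not taken from the actual infrastructure configuration. The point it illustrates: without the "ProxyPass !" exclusion, a catch-all ProxyPass would forward /robots.txt to the backend and the Alias would never get a chance to serve the local file.

    # Hedged sketch only: all names and paths below are hypothetical.
    <VirtualHost *:80>
        ServerName apps.example.org

        # Pool of backend application nodes.
        <Proxy "balancer://appnodes">
            BalancerMember "http://node01.example.org:8080"
            BalancerMember "http://node02.example.org:8080"
        </Proxy>

        # Exclude /robots.txt from proxying so the Alias below can
        # answer it locally instead of the backend.
        <Location "/robots.txt">
            ProxyPass !
        </Location>
        Alias /robots.txt /srv/web/example-robots.txt

        # Everything else is still balanced to the app nodes.
        ProxyPass        / balancer://appnodes/
        ProxyPassReverse / balancer://appnodes/
    </VirtualHost>

With this in place, a request for /robots.txt is answered directly by the proxy; that can be confirmed by fetching the URL and checking that the local file's contents come back rather than whatever the nodes would have returned.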