Mirror of https://github.com/zedeus/nitter.git, synced 2024-11-14 20:51:25 +00:00
Block search engines via robots.txt (#631)
Prevents instances from being rate-limited due to senseless crawling by search engines. Since there is no reason to index Nitter instances, simply block all robots. Notably, this does *not* affect link previews (e.g. in various chat software).
parent 778c6c64cb
commit c543a1df8c
1 changed file with 2 additions and 0 deletions

public/robots.txt (new file) | 2 ++
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
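
The two added lines above tell every standards-compliant crawler that no path on the instance may be crawled. As an illustrative check (not part of this commit), the same rules can be fed to Python's standard urllib.robotparser; the instance URL and paths below are hypothetical placeholders:

from urllib import robotparser

# Parse the same two rules this commit adds to public/robots.txt
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every compliant crawler is disallowed from every path on the instance
print(rp.can_fetch("Googlebot", "https://nitter.example/jack/status/20"))  # False
print(rp.can_fetch("*", "https://nitter.example/search?q=test"))           # False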