# Managing robots.txt
{! backend/administration/CLI_tasks/general_cli_task_info.include !}
## Generate a new robots.txt file and add it to the static directory
The `robots.txt` that ships by default is permissive. It allows well-behaved search engines to index all of your instance's URIs.
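For reference, a permissive policy leaves the `Disallow` directive empty, which tells crawlers that nothing is off limits. A generic sketch of such a file (not necessarily byte-identical to the one Pleroma ships):

```
User-Agent: *
Disallow:
```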
If you want to generate a restrictive `robots.txt`, you can run the following mix task. The generated `robots.txt` will be written to your instance's static directory.
=== "OTP"
```sh
./bin/pleroma_ctl robots_txt disallow_all
```
=== "From Source"
```sh
mix pleroma.robots_txt disallow_all
```
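The file lands in your static directory (on OTP installs this is typically `/var/lib/pleroma/static`; otherwise it is whatever `static_dir` is set to under `:pleroma, :instance`). A fully restrictive, disallow-all `robots.txt` is expected to contain a single wildcard rule along these lines, though the exact output may differ:

```
User-Agent: *
Disallow: /
```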