This time, I will show how to set up robots.txt for a Django site hosted on PythonAnywhere.
By placing a robots.txt file, you can tell crawlers such as Google's which pages you do and do not want them to crawl.
[robots.txt] How to set it up in Django on PythonAnywhere.
Create robots.txt in your templates directory.
User-agent: *
Allow: /
Disallow: /admin/
Sitemap: https://sohtani.pythonanywhere.com/sitemap.xml
The following line means that the rules apply to all crawlers.
User-agent: *
The following line denies crawlers access to the admin page.
Disallow: /admin/
The following line tells crawlers where to find the sitemap.
Sitemap: https://sohtani.pythonanywhere.com/sitemap.xml
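Note that TemplateView can only render robots.txt if the directory you created it in is on Django's template search path. If you used a project-level templates directory, a minimal settings.py sketch looks like the following (this assumes the modern Django default where BASE_DIR is a pathlib.Path; apps keep finding their own templates/ folders through APP_DIRS):

# settings.py (excerpt) -- assumes robots.txt lives in <project root>/templates/robots.txt
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        # Add the project-level templates directory to the search path:
        'DIRS': [BASE_DIR / 'templates'],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
            ],
        },
    },
]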
Add the following to the urls.py in your application directory.
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path('robots.txt', TemplateView.as_view(template_name='robots.txt', content_type='text/plain')),
]
Set the path to robots.txt in TemplateView and you're done.
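To confirm the route actually works, you can request /robots.txt with Django's test client. A minimal sketch, assuming the URL pattern above is wired into the URLconf your project uses:

# tests.py -- run with `python manage.py test`
from django.test import TestCase


class RobotsTxtTests(TestCase):
    def test_robots_txt_is_served_as_plain_text(self):
        response = self.client.get('/robots.txt')
        # The view should respond successfully with the plain-text content type we configured.
        self.assertEqual(response.status_code, 200)
        self.assertIn('text/plain', response['Content-Type'])
        # The rendered template should contain the rules we wrote.
        self.assertIn('User-agent: *', response.content.decode())

Once the web app is reloaded on PythonAnywhere, opening /robots.txt on your site in a browser should also show the file.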
In the previous article, I introduced how to configure sitemap.xml, and this time I showed how to set up robots.txt.
Even with this, it can still take a significant amount of time for newly written articles to be indexed.
In that case, you can request indexing directly in Google Search Console.
You can read how in the article below.