Sitemap of Django site not getting crawled after successful submission
I have a redesigned Django site at the same URL as the previous one. Two weeks after launch, Google is still showing the old URLs, which now return 404 errors.
I added the site to Google's webmaster tools, and with the help of robots.txt the site was crawled; some of the new URLs started appearing in search results. But since I submitted a sitemap to the tool, no crawling has happened according to the crawl stats and crawl errors reports. The sitemap tested fine and submitted around 500 pages, but none have been indexed so far. I don't know where I am going wrong.
Please guide me.
Solution 1:[1]
Have you pinged Google after the changes? The Django docs have a well-explained section on this.
You could even implement it so that Google is pinged each time a change relevant to the sitemap happens, for example in the model's save() method:
from django.contrib.sitemaps import ping_google

class Entry(models.Model):
    # ...

    def save(self, force_insert=False, force_update=False):
        super(Entry, self).save(force_insert, force_update)
        try:
            ping_google()
        except Exception:
            # Bare 'except' because we could get a variety
            # of HTTP-related exceptions.
            pass
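As an alternative to pinging on every save, Django also ships a ping_google management command you can run by hand or from a cron job. A minimal sketch, assuming django.contrib.sitemaps is in INSTALLED_APPS and your sitemap is served at /sitemap.xml (note that newer Django releases have dropped this feature since Google retired the sitemap ping endpoint):

    # Ping Google once, pointing it at the sitemap URL.
    # The path argument is optional if the sitemap is registered
    # under the default 'django.contrib.sitemaps.views.sitemap' URL.
    python manage.py ping_google /sitemap.xml

Running this once after a large content change is usually enough; pinging on every save can be wasteful on write-heavy models.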
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | gdvalderrama |
