Well-established blog, new posts now being indexed very late
-
I have an established blog that we update on a daily basis. In the past, when I published a new post, it would get indexed within a minute or so.
But for the past month or so, it has been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt.
This is the current robots.txt file:
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz
The site has tons of backlinks. I'm just wondering if something is wrong with the robots.txt file or if it could be something else.
-
A robots.txt file is designed to block content completely, not to slow it down. Normally, if your robots.txt file were a factor, your content would not appear in SERPs at all.
It is possible for content to appear in SERPs even though it is blocked by robots.txt, if it is linked from other sources. Since this is new content, that is less likely to be the case unless you are sharing links immediately and Google is seeing those links within the time frame you described.
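As a quick sanity check, a sketch like the following can show whether the wildcard rules above would even match a new post's URL path. The sample paths are hypothetical, and real crawlers apply fuller precedence rules (longest match, Allow overriding Disallow), so this only approximates Google's matching:

import re

def robots_pattern_to_regex(pattern):
    # Google-style matching: '*' matches any run of characters and a
    # trailing '$' anchors the end of the URL; everything else is literal.
    anchored = pattern.endswith("$")
    body = re.escape(pattern.rstrip("$")).replace(r"\*", ".*")
    return re.compile("^" + body + ("$" if anchored else ""))

# A few of the Disallow patterns from the file above.
disallow = ["/wp-admin", "/feed", "/category", "/*.php$", "/*?*", "/*?"]

def blocked(path):
    return any(robots_pattern_to_regex(p).match(path) for p in disallow)

# Hypothetical post URLs: a pretty-permalink path is not matched,
# but the same post reached via a query string is.
print(blocked("/2013/05/new-post/"))  # False
print(blocked("/?p=12345"))           # True

If your new posts use pretty permalinks, none of the rules above should match them, which supports the point that robots.txt would block them outright rather than merely delay them.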
The first place I would look is your sitemap, or whatever tool is used to inform Google that you have new content. When you publish a new blog article, your software should ping Google to announce the new content; that is where any investigation should begin. The next step is to check your server logs to see how long it takes Google to respond to the ping. If it takes them 12 hours, then there is nothing further you can do about it.
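For the log check, a rough sketch along these lines will list each Googlebot fetch of the sitemap so you can compare its timestamps against your publish times. The log filename and the combined log format are assumptions; adjust them to your server:

import re

# Matches the timestamp and request path of a combined-format log entry,
# e.g.: 66.249.x.x - - [12/May/2013:09:15:02 +0000] "GET /sitemap.xml.gz ..."
entry = re.compile(r'\[([^\]]+)\] "GET (\S+)')

with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = entry.search(line)
        if match and "sitemap" in match.group(2):
            print(match.group(1), match.group(2))  # timestamp, requested path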
I would be interested in a lot more detail. How many articles have you confirmed as being affected by this issue? Exactly how did you confirm the issue?
As a side note, your robots.txt file is bloated and doesn't adhere to any standards I have seen. How exactly was it created? Did someone go in and make manual modifications to the file?
-
Are you using FeedBurner? Has the feed publishing service gotten out of sync? You can re-sync it under the Troubleshootize section.
-
Yes, it's a WordPress site, and I have always had the All in One SEO plugin enabled.
-
Do you use the WordPress platform? If so, do you use an SEO plugin? Different plugins can affect the index time.
-
Could you revert robots.txt to a previous "working" version from when your site was getting indexed more quickly?
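If reverting is not an option, a deliberately minimal WordPress robots.txt along these lines is a reasonable baseline to test against. This is a sketch only; the sitemap URL is the placeholder from the question, and anything you genuinely need blocked should be added back one rule at a time:

# Minimal ruleset while diagnosing the indexing delay
User-agent: *
Disallow: /wp-admin/
Allow: /wp-content/uploads/

Sitemap: http://www.domainname.com/sitemap.xml.gz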
Related Questions
-
Some Issues with My Blog
I am facing an issue with my blog https://digitalmedialine.com/blog/. Some pages are not ranking in Google yet. Can anyone help me figure out how to get those posts ranking to improve my traffic? Thanks in advance.
Technical SEO | qwaswd0
-
New SEO manager needs help! Currently, only about 15% of our live sitemap (a ~4 million URL e-commerce site) is actually indexed in Google. What are sitemap best practices for big sites with a lot of changing content?
In Google Search Console: 4,218,017 URLs submitted, 402,035 URLs indexed. What is the best way to troubleshoot? What is the best guidance for sitemap indexation of large sites with a lot of changing content? view?usp=sharing
Technical SEO | Hamish_TM1
-
No Index PDFs
Our products have about 4 PDFs apiece, which really inflates our indexed pages. I was wondering if I could add a noindex to each PDF's URL? All of the files are on a file server, so they are embedded via links on our product pages. I know I could add a nofollow attribute, but I was wondering if anyone knew whether noindex would work the same way, or if that is even possible. Thanks!
Technical SEO | MonicaOConnor0
-
Problems with too many indexed pages
A client of ours has not been able to rank very well for the last few years. They are a big brand in our country, have more than 100 offline stores, and have plenty of inbound links. Our main issue has been that they have too many indexed pages. Before we started, they had around 750,000 pages in the Google index. After a bit of work we got it down to 400,000-450,000. During our latest push, we used the robots meta tag with "noindex, nofollow" on all pages we wanted out of the index, along with a canonical to the correct URL - nothing was done in robots.txt to block crawlers from the pages we wanted out. Our aim is to get it down to roughly 5,000+ pages; they just passed 5,000 products + 100 categories. I added this about 10 days ago, but nothing has happened yet. Is there anything I can do to speed up the process of getting all the pages out of the index? The site is vita.no if you want to have a look!
Technical SEO | Inevo0
-
No index on subdomains
Hi, We have a subdomain that is appearing in the search results - I want to hide this as it looks really bad. If I were to add the noindex tag to the subdomain's URLs, would this affect the whole domain or just that subdomain? The main domain is vitally important - it is just that subdomain I need to hide. Many thanks
Technical SEO | Creditsafe0
-
Should I index or noindex a contact page
I'm wondering if I should noindex the contact page. I'm doing SEO for a website and am just wondering whether noindexing the contact page would help or hurt SEO for that website.
Technical SEO | aronwp0
-
Local Keywords Not Ranking Well in a Geographic Location (but Ranking Very Well Outside of That Geographic Location)
Has anyone experienced, in the last few months, an issue where a website that once ranked well for 'local' terms in Google stopped ranking well for those terms, but saw a ranking decrease only within the geographic location contained in those keywords? For example only, some 'root' keywords could be: Chicago dentist, Chicago dentists, dentist Chicago, dentists Chicago.

What happens is that when a searcher searches from within the geographic area of Chicago, IL, the target website no longer ranks on the 1st page for these types of keyword phrases, though it used to rank in the top 3. However, if someone searches for the same keyword phrases from another city outside of Chicago, or sets a custom location (such as Illinois, or even Milwaukee, WI) in their Google search, the target website appears to have normal (high) 1st-page rankings for these types of terms.

My own theory: at first I thought it was a Penguin-related issue, but the client's rankings overall haven't appeared to have been affected on the date(s) of Penguin updates. Authority Labs and Raven Tools (which uses Authority Labs data) did not detect any ranking decrease and still report all the local keyword rankings as high on the 1st page of Google. However, when the client goes to check their own rankings (as they are within the affected geographic area), they are nowhere to be found on the 1st page. :S

After some digging, I found that one of the company's Google Places listings (the main office listing) had changed to an 'unsupported' status in Google Maps. So now I am thinking that this phenomenon is due to other listings appearing in search results for the same location. For example, in this case, an individual dentist's Google Places listing (for a dentist who works within the dental office) is being displayed instead of the actual dental office's listing. Also, the dentist's name on the Google Places listing is being swapped out by Google with the name of the dental office, but if you click through to the Google Places listing, it shows the name of the individual dentist.

Anyone encounter a similar issue, or have any other theories besides the Google Places issue?
Technical SEO | OrionGroup0
-
Page not being indexed
Hi all, On our site we have a lot of bookmaker reviews, and we are ranking pretty well for most bookmaker names as keywords; however, a single bookmaker seems to have been shunned by Google. For the search "betsafe" in Denmark, this page does not appear among the top 50: http://www.betxpert.com/bookmakere/betsafe. All of our other review pages rank in the top 10-20 for the bookmaker name as a keyword. What can you do if Google has "banned" a page? Best regards, Rasmus
Technical SEO | rasmusbang0