Recovering from Blocked Pages Debacle
-
Hi, per this thread: http://www.seomoz.org/q/800-000-pages-blocked-by-robots

We had a huge number of pages blocked by robots.txt by some dynamic file that must have integrated with our CMS somehow. In just a few weeks hundreds of thousands of pages were "blocked." This number is now going down, but instead of by the hundreds of thousands, it is going down by the hundreds, and very sloooooowwwwllly. So, we really need to speed up this process.

We have our sitemap we will re-submit, but I have a few questions related to it: Previously the sitemap had the <lastmod> tag set to the original date of each page, and all of these pages have been changed since then. Is there any harm in doing a mass change of the <lastmod> field? It would be an accurate reflection, but I don't want it to be caught by some spam catcher. The easy thing to do would be to just set that date to now, but then they would all have the same date.

Any other tips on how to get these pages "unblocked" faster?

Thanks!
Craig
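For reference, the mass update itself is easy to script so that each entry keeps its own accurate date instead of one identical stamp. A minimal sketch in Python (the URLs, dates, and the per-page date lookup are placeholders; in practice the dates would come from your CMS):

```python
# Sketch: bulk-update <lastmod> in a sitemap so each URL gets its real
# modification date (mod_dates is a hypothetical lookup built from the CMS).
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

def update_lastmod(sitemap_xml: str, mod_dates: dict) -> str:
    """Replace each <lastmod> with the page's actual last-modified date."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text
        lastmod = url.find(f"{{{NS}}}lastmod")
        if lastmod is not None and loc in mod_dates:
            lastmod.text = mod_dates[loc]
    return ET.tostring(root, encoding="unicode")

# Placeholder sitemap with one stale entry
sitemap = (
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>http://example.com/page1</loc><lastmod>2010-01-01</lastmod></url>'
    '</urlset>'
)
print(update_lastmod(sitemap, {"http://example.com/page1": "2012-06-15"}))
```

Dates in W3C format (YYYY-MM-DD) are what the sitemap protocol expects, and varied, accurate dates avoid the all-same-date pattern Craig is worried about.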
-
Hey Dan,
I am actually not so concerned about the pages being indexed. I don't really think they were ever de-indexed. Unless I am wrong, I think they were de-ranked.
I know others have said that when they "disallowed" large portions of their sites, their pages dropped in the rankings, and did not necessarily disappear. This is more what I want to see recovery from.
Thanks!
Craig
-
Craig
D'you have Screaming Frog? The BEST way to make sure you're all set is to run a crawl with Screaming Frog. By default it will respect robots.txt and not crawl anything being blocked. Set the user agent to Googlebot.
If it crawls all the pages you want it to just fine, then you are all set!
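If you want to double-check without a desktop crawler, Python's stdlib robots.txt parser can spot-check individual URLs against your rules. A minimal sketch (the rules and URLs below are made up for illustration):

```python
# Sketch: verify whether a given user agent may fetch a URL under a
# robots.txt ruleset, using only the standard library.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# In practice you'd use rp.set_url(".../robots.txt"); rp.read()
rp.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "http://example.com/page.html"))  # True
print(rp.can_fetch("Googlebot", "http://example.com/private/x"))  # False
```

Feeding it your real robots.txt and a list of the URLs that Webmaster Tools reports as blocked would tell you quickly whether the blocks are still live or just stale report data.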
-Dan
-
Thanks for jumping in, Dan. The number of blocked pages, over a month later, is still way up there. It really has barely gone down. As of today it is at 904,000.
So, we still wait and hope that:
A. That many pages aren't actually blocked (whatever "blocked" actually means).
B. The rate at which that number falls will begin to increase.
Thanks for your answer!
Craig
-
Hey There
I see this question is a bit old ... are you still having these issues? If so, when you say "going down," do you mean according to the numbers showing in Webmaster Tools?
I do know that quite often there can be a delay in the data in Webmaster Tools (especially the indexation report which you may be referring to).
I don't think there's any harm in updating the dates to reflect the most recent version of the page, so long as they are accurate.
Let me know if that helps or if you're all set.
-Dan
Related Questions
-
AMP page development issue
Hi everyone, I'm currently developing an AMP version of my website. It validates with no errors; however, my <a name="blah"></a>some blah anchor does not work for AMP. Any ideas?
Technical SEO | livingphilosophy0
-
Recovering from disaster
Short Question: What's the best way to get Google to re-index duplicate URLs?

Long Story: We have a long-established (1997) website with a proprietary CMS. We never paid much attention to SEO (other than creating a sitemap) until four months ago. After learning some, we started modifying the engine to present a better site to Google (proper HTTP codes, consistent URLs to eliminate duplicates; we had something like 15,000 duplicates; etc.). Things went great for three and a half months and we reached the first page on Google for our main keyword (a very, very competitive keyword). Before the SEO we were getting around 25,000 impressions and 3,000 clicks on Google. After our SEO efforts, we reached 70,000 daily impressions and more than 7,000 daily clicks.

On Aug 30th, 2014, one of our programmers committed a change to the live server by mistake. This small change effectively changed every article's URL by adding either a dash at its end, or a dash and a keyword ('-test-keyword', literally). Nobody noticed anything until two days later, as the site worked perfectly for humans. The result of this small code change is that within five days our site practically disappeared from Google's results pages, except when one searched for our site's name. Our rank dropped from 8 and 10 to 80 and 100 for our main keywords.

We reverted the change as soon as we noticed the problem, but during those two days Google's bots went on a binge, crawling five times the usual number of pages per day. We've been trying to recover and nothing seems to be working so far. Google's bots aren't crawling the repaired URLs to get the 301 headers back to the original URLs, and we still have over 2,300 duplicates as reported by Webmaster Tools. Our Google impressions and clicks dropped to way below what we had before we did any SEO, down to 5,000 impressions and 1,200 clicks (inclusive of our direct domain name search).

During the last 15 days (after we fixed the problem), our duplicate count went from a maximum of 3,200, down to 1,200, then back up to 2,300 without any changes on our end. We've redone our sitemap and resubmitted it on day 3.

So, what do we do? Do we go through the URLs with the 'fetch as Google' function? (That's a bit tedious for 2,300 URLs.) Or do we wait for the bots to come around whenever they feel like it? If we do this, should we submit the bad URL, have Google fetch it, get the redirect, follow it, and then submit the followed URL to the index? Or is there a better solution that I'm unaware of?

Second question: Is this something to be expected when something like this happens, knowing that our inbound links rarely link to the actual articles?
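For anyone scripting the cleanup: since the bug only ever appended "-" or the literal "-test-keyword", the bad-to-good 301 map can be generated rather than built by hand. A minimal sketch in Python (the URLs are placeholders; the suffix list is taken from the description above):

```python
# Sketch: derive each polluted URL's original form and emit a CSV
# redirect map that the server (or a bulk-submit tool) can consume.
import csv, io

def canonical_url(bad_url: str) -> str:
    """Strip the accidentally appended suffix, longest match first."""
    for suffix in ("-test-keyword", "-"):
        if bad_url.endswith(suffix):
            return bad_url[: -len(suffix)]
    return bad_url

# Placeholder list; in practice, export the duplicate URLs from
# Webmaster Tools or your server logs.
bad_urls = [
    "http://example.com/articles/some-story-test-keyword",
    "http://example.com/articles/another-story-",
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["bad_url", "redirect_to"])
for u in bad_urls:
    writer.writerow([u, canonical_url(u)])
print(buf.getvalue())
```

With the map in hand, the server can answer every bad URL with a 301 to its canonical form, so whenever the bots do come back they get the redirect immediately.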
Technical SEO | Lazeez0
-
Duplicate pages on wordpress
I am doing SEO on a site which is running on WP, and it has duplicates of all pages and categories on domain.com/site/ However, as it got crawled, I saw that all domain.com/ pages have rel=canonical set to the main page (does that mean something?). The thing is, I will fix the permalink structure, and I think WP automatically redirects if it is changed from /?page_id= to /%category%/%postname%/ or /%postname%/ Is there something I'm missing? The second problem is a forum. After a crawl it found over 5k errors and over 5k warnings. Those are: duplicate page content; duplicate page title; overly-dynamic URLs; missing meta description; title element too long. All those come from domain.com/forum/ (fortunately, there are no domain.com/site/forum duplicates). What could be an easy solution to this?
Technical SEO | OVJ0
-
Duplicate Page Content
Hi, I just had my site crawled by the seomoz robot and it came back with some errors. Basically it seems the categories and dates are not crawling directly. I'm an SEO newbie here. Below is a capture of the video of what I am talking about. Any ideas on how to fix this?
Technical SEO | mcardenal0
-
Pages not being indexed
Hi Moz community! We have a client for whom some of their pages are not ranking at all, although they do seem to be indexed by Google. They are in the real estate sector and this is an example of one: http://www.myhome.ie/residential/brochure/102-iveagh-gardens-crumlin-dublin-12/2289087 In the example above if you search for "102 iveagh gardens crumlin" on Google then they do not rank for that exact URL above - it's a similar one. And this page has been live for quite some time. Anyone got any thoughts on what might be at play here? Kind regards. Gavin
Technical SEO | IrishTimes0
-
How to Find all the Pages Index by Google?
I'm planning on moving my online store, http://www.filtrationmontreal.com/ to a new platform, http://www.corecommerce.com/ To reduce the SEO impact, I want to 301-redirect all the pages indexed by Google to the new pages I will create on the new platform. I will keep the same domain name, but all the URLs will be customized on the new platform for better SEO. Also, is there a way or tool to create a CSV file of those indexed pages? Can Webmaster Tools help? You can read my question about this subject here: http://www.seomoz.org/q/impacts-on-moving-online-store-to-new-platform Thank you, BigBlaze
Technical SEO | BigBlaze2050
-
I am trying to correct an error report of duplicate page content. However, in over 100 blogs I am unable to find the page which contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page of duplicate content. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | | wianno1680 -
Duplicate Page Content Lists the same page twice?
When checking my crawl diagnostics this morning I see that I have the error "duplicate page content." It lists the exact same URL twice, though, and I don't understand how to fix this. It's also listed under "duplicate page title." Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110 Personal Assistant | Virtual Assistant | Charlotte, NC http://charlottepersonalassistant.com/110 Does this have anything to do with a 301 redirect here? Why does it have http:// twice? Thanks all! | http://www.charlottepersonalassistant.com/ | http://http://charlottepersonalassistant.com/ |
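For what it's worth, the doubled scheme in the second URL can be normalized before comparing or redirecting. A minimal sketch, assuming the malformation is just a repeated "http://" prefix (the URL is taken from the report above):

```python
# Sketch: collapse an accidentally doubled "http://" prefix so the two
# crawl-report entries can be recognized as the same page.
def normalize_scheme(url: str) -> str:
    while url.startswith("http://http://"):
        url = url[len("http://"):]
    return url

print(normalize_scheme("http://http://charlottepersonalassistant.com/110"))
# http://charlottepersonalassistant.com/110
```

The real fix, though, is finding where the malformed link is generated (often a redirect rule or template that prepends "http://" to a value that already contains it) and correcting it at the source.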
Technical SEO | eidna220