I want to put 65,000 product pages on NOINDEX, FOLLOW all at once! Would Google mind?
-
Or should we do this step by step, i.e.:
13,000 pages on noindex
13,000 pages on noindex
13,000 pages on noindex
13,000 pages on noindex
13,000 pages on noindex
Total: 65,000 pages
-
I think you would be fine doing it all at once. The gradual roll-out thing Matt Cutts was referring to had to do with adding pages, not removing them.
You could also use the URL removal tool to get them taken out of the index faster once you have added the noindex,follow tag. You can remove an entire directory that way in one fell swoop as long as you don't need any other pages in that directory to be indexed either.
This is a smart move for a lot of eCommerce sites that drop-ship products and use manufacturer- or distributor-supplied product descriptions at a scale that makes rewriting them all impractical. In such cases I optimize the best-performing products, get the rest out of the index, and rely heavily on category pages to bring in search traffic. It isn't the ideal situation, but it's better than carrying a Panda penalty.
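For reference, the directive being discussed here is a single tag in the `<head>` of each product page (the same directive can also be sent as an `X-Robots-Tag: noindex, follow` HTTP response header if that is easier to apply in bulk at the server level):

```html
<!-- In the <head> of every product page to be dropped from the index.
     "noindex" removes the page from Google's index; "follow" still lets
     its links pass signals to the rest of the site. -->
<meta name="robots" content="noindex,follow">
```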
-
Always keep in mind that Google will not simply re-crawl all those pages right away.
It can take weeks/months for Google to crawl each page and change its indexing.
I have personally done this for over 1.5 million pages all in one go on my site and had no issues.
Staggering the rollout will not help you control the speed at which Googlebot looks at them.
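Since the recrawl can drag on for weeks, it is worth spot-checking that the tag actually went live before waiting on Google. A minimal sketch using only the Python standard library (the sample markup is hypothetical; in practice you would fetch each URL and feed its HTML to the checker):

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())


def is_noindexed(html: str) -> bool:
    """True if the page carries a robots meta tag containing 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)


# Spot-check a sample page:
sample = '<html><head><meta name="robots" content="noindex,follow"></head><body></body></html>'
print(is_noindexed(sample))  # → True
```

Running this against a sample of the 65,000 URLs after deployment confirms the tag is actually in place, independent of how fast Googlebot gets around to recrawling.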
-
That should be fine. I am glad you said noindex,follow and not noindex,nofollow.
Remus posted a link to a Matt Cutts video, but that one is talking about adding pages in great numbers.
Hi Wesley,
It's better if you do it in stages. A more gradual roll-out is also recommended by Matt Cutts. His example covers 200K pages, but 65K is not so far off, so I think the approach should be the same.
Check his video, it will help you understand what's better:
Should I add an archive of hundreds of thousands of pages all at once?
I hope this helps, good luck!