To noindex or not to noindex
-
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules. Basically, only highly ranked websites are allowed to be indexed - all other URLs are tagged as noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure that this is a good strategy, so we are asking: what should a website with a lot of similar content do?
- Don't noindex anything - let Google decide what's worth indexing and not.
- Noindex most content, but allow some popular pages to be indexed. This is our current approach. If you recommend this one, we would like to know what we can do to improve it.
- Noindex all the similar content. In our case, only let overview pages, blog posts, etc. with unique content be indexed.
Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy?
References:
https://zh.greatfire.org
https://en.greatfire.org
https://www.google.com/search?q=site%3Agreatfire.org
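For context, the rule-based noindex function described above is typically implemented as a robots meta tag in each page's `<head>`. A minimal sketch (the actual markup greatfire.org serves may differ):

```html
<!-- On pages that should stay out of the index, e.g. /www.imdb.com -->
<!-- "noindex, follow" keeps the page out of search results but lets
     crawlers follow its links and pass link equity onward -->
<meta name="robots" content="noindex, follow">

<!-- Popular pages (e.g. /facebook.com) simply omit the tag,
     or explicitly allow indexing: -->
<meta name="robots" content="index, follow">
```

Presumably the tag is emitted conditionally, based on the looked-up site's popularity rank.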
-
1. yes - if you noindex all but 20 pages, those 20 pages would get a boost in rankings. You would end up losing the long-tail searches from those other thousands of pages - so you'll need to do some cost/benefit analysis on that.
2. you'll need to do a cost/benefit analysis on this one. Are most of the visitors to your site searching in Chinese or English? Are your search terms mainly in Chinese or mainly in English? Are your Chinese-speaking visitors more likely to want to visit the zh. subdomain?
You could publish 20 to 50 pages on each subdomain, and then focus on doing some link building. If you have strong rankings across those 40 to 100 pages, then you could start adding more pages slowly over time.
-
No, there is no need to include the noindex tag. Adding a canonical tag tells Google which pages are the original ones the search engine should index and crawl, so all pages other than the category pages will be handled automatically.
-
Hi Moosa. Thanks very much for your reply and great suggestions. If I add canonical tags on each URL page referencing the category page where it belongs, should I also add noindex tags on it? Should all URL pages then have noindex tags, with only category pages allowed to be indexed?
-
Thanks for suggestions. I have some follow-up questions. Would really like to know what you think about the following:
- The "page rank will get shared to all of the pages that you have across your site". In general, does this mean that if I add noindex tags to all but a few pages, they will be ranked much higher? Currently thousands of pages are indexed. Is it correct to say that if only say 20 pages were indexed that would greatly improve their ranking?
- The zh and en versions of the website have different templates and most of the text content is also translated (with the main exception of old blog posts). We could add noindex on all of the zh website or all except the main pages. Would you recommend that?
-
Ok, I might sound completely stupid here, as I have never come across this case before, but here is my hypothesis…
While searching for a keyword or URL, you add another field (maybe a checkbox) that represents the category of the search.
So, once the new URL is generated, it will come under the specific category automatically.
Customize the category pages so that they look different from each other.
Index the category pages and add a canonical tag on any newly generated URL pointing to its category page. For example, if a new page is generated like www.yourwebsite.com/movies/ice-age-3/, this page should have a canonical tag pointing to http://www.yourwebsite.com/movies/
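As a sketch, the canonical tag on the generated page in that (hypothetical) example would look like this:

```html
<!-- In the <head> of http://www.yourwebsite.com/movies/ice-age-3/ -->
<link rel="canonical" href="http://www.yourwebsite.com/movies/">
```

Worth noting: a cross-page canonical like this is a hint that the two URLs are duplicates, and Google generally only honors it when the pages really are near-identical.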
Why?
Creating category pages will allow more unique pages to get indexed in the SERPs without the duplicate content issue. Adding a canonical tag on all other URLs will tell Google that the category pages are the real pages it should consider.
This might help you earn more search traffic from Google.
(This is my assumption of what I think should work!)
-
Creating a page every time someone performs a search could probably spiral out of control pretty quickly. If you have a certain amount of 'page rank', based on all of the backlinks you have, that page rank will get shared to all of the pages that you have across your site.
One way you could more naturally control what gets indexed is by what you link to from your home page. For instance, if you track the most-blocked big sites, as well as the most-blocked keywords, and keep those pages one link away from your homepage, you could expect those to get indexed naturally when your site is spidered.
As you get more links from other sites, and your trust from the search engines and page rank grows, you should be able to support more pages getting indexed across your site.
There is the issue of your site's content potentially being regarded as 'thin content', since many of the pages appear nearly identical from page to page.
One question I had - I saw your site hosts both Chinese-language words and English-language words, and checks whether those words are being filtered. Perhaps it would make more sense to only show the words in Chinese characters on the zh. subdomain, and the English words on the en. subdomain? Just a thought. Is there any difference between the zh and en subdomains, aside from the language of the template?
Really interesting website.
Related Questions
-
Conditional Noindex for Dynamic Listing Pages?
Hi, We have dynamic listing pages that are sometimes populated and sometimes not populated. They are clinical trial results pages for disease types, some of which don't always have trials open. This means that sometimes the CMS produces a blank page -- pages that are then flagged as thin content. We're considering implementing a conditional noindex -- where the page is indexed only if there are results. However, I'm concerned that this will be confusing to Google and send a negative ranking signal. Any advice would be super helpful. Thanks!
Intermediate & Advanced SEO | yaelslater
-
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence of what is better to use for pages with thin content, yet important pages to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages, which are pages that would not generate relevant search traffic. The question goes: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt one has Google's crawling focus on just the important pages that are indexed, and that may give ranking a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments" etc., but the above question is the important question here.
Intermediate & Advanced SEO | khi5
-
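The two approaches contrasted in the question above can be sketched as follows (the `/products/` path is hypothetical):

```text
# Option A: robots.txt - blocks crawling entirely. Google never fetches
# the pages, so no link equity flows through them, and blocked URLs can
# still appear in results as bare URLs if linked externally.
User-agent: *
Disallow: /products/
```

```html
<!-- Option B: meta tag on each thin page - the page is still crawled
     (costing crawl budget) but kept out of the index; its links are
     followed and can pass equity onward -->
<meta name="robots" content="noindex, follow">
```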
Robots.txt vs noindex
I recently started working on a site that has thousands of member pages that are currently robots.txt'd out. Most pages of the site have 1 to 6 links to these member pages, accumulating into what I regard as something of a link juice cul-de-sac. The pages themselves have little to no unique content or other relevant search play, and for other reasons we still want them kept out of search. Wouldn't it be better to "noindex, follow" these pages and remove the robots.txt block from this URL type? At least that way Google could crawl these pages and pass the link juice on to still other pages vs flushing it into a black hole. BTW, the site is currently dealing with a hit from Panda 4.0 last month. Thanks! Best... Darcy
Intermediate & Advanced SEO | 94501
-
Should we include a canonical or noindex on our m. (mobile) pages?
According to https://developers.google.com/webmasters/smartphone-sites/details, we should include a canonical back to our desktop version of the URL, but what if that desktop URL is noindexed? Should the m. version be noindexed as well? Or is it fine to leave it as a canonical?
Intermediate & Advanced SEO | nicole.healthline
-
NoIndexing Massive Pages all at once: Good or bad?
If you have a site with a few thousand high-quality and authoritative pages, and tens of thousands of search results and tag pages with thin content, and you noindex,follow the thin content pages all at once, will Google see this as a good or bad thing? I am only trying to do what Google's guidelines suggest, but since I have so many pages indexed on my site, will throwing the noindex tag on ~80% of thin content pages negatively impact my site?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
Sitemap contains Meta NOINDEX pages - Good or bad?
Hi, Our sitemap is created by our e-commerce software - Magento - We are probably going to make a lot of products Meta No Index for the moment, until all the content has been corrected on them - but by default, as they are enabled, they will appear in Sitemap. So, the question is: "Should pages that are Meta NOINDEX be listed in a sitemap"? Does it matter? thanks!
Intermediate & Advanced SEO | bjs2010
-
Why do my https pages index while noindexed?
I have some tag pages on one of my sites that I meta noindexed. This worked for the http version, which they are canonical'd to, but now the https:// version is indexing. The https version is both noindexed and has a canonical to the http version, but they still show up! I even have WordPress set up to redirect all https to http! For some reason these pages are STILL showing in the SERPs, though. Any experience or advice would be greatly appreciated. Example page: https://www.michaelpadway.com/tag/insurance-coverage/ Thanks all!
Intermediate & Advanced SEO | MarloSchneider
-
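For what it's worth, a minimal sketch of an Apache .htaccess rule that forces all https requests back to http (the asker's actual server setup is unknown; this assumes mod_rewrite is enabled):

```text
RewriteEngine On
# If the request came in over HTTPS, 301-redirect to the http:// version
RewriteCond %{HTTPS} on
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Even with a redirect in place, Google may need to recrawl the https URLs before dropping them from the index, so the canonical plus redirect combination can take time to clear them out.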
Does "Noindex" lead to Loss of Link Equity?
Our company has two websites with about 8,000 duplicate articles between them. Yep, 8,000 articles were posted on both sites over the past few years. This is the definition of cross-domain duplicate content. Plan A is to set all of the articles to "noindex,follow" on the site that we care less about (site B). We are not redirecting since we want to keep the content on that site for on-site traffic to discover. If we do set them to "noindex," my concern is that we'll lose massive amounts of link equity acquired over time...and thus lose domain authority...thus overall site rankability. Does Google treat pages changed to "noindex" the same as 404 pages? If so, then I imagine we would lose massive link equity. Plan B is to just wait it out since we're migrating site B to site A in 6-9 months, and hope that our more important site (site A) doesn't get a Panda penalty in the meantime. Thoughts on the better plan?
Intermediate & Advanced SEO | M_D_Golden_Peak