Canonical & robots NOINDEX used on the same page?
-
I have a real estate company: www.company.com with approximately 400 agents.
When an agent gets hired we allow them to pick a URL which we then register and manage. For example: www.AGENT1.com
We then take this agent domain and 301 redirect it to a subdomain of our main site. For example:

Agent1.com 301's to agent1.company.com

We have each page on the agent subdomain canonicalized back to the corresponding page on www.company.com. For example: agent1.company.com canonicals to www.company.com.

What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results.
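For anyone following along, the setup described above might look something like this (a sketch only, assuming Apache with mod_rewrite serving the agent's vanity domain; the domain names are the placeholders from the question):

```apache
# On agent1.com: 301 the vanity domain to the agent's subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?agent1\.com$ [NC]
RewriteRule ^(.*)$ https://agent1.company.com/$1 [R=301,L]
```

And the canonical on each subdomain page:

```html
<!-- In the <head> of agent1.company.com/category1 -->
<link rel="canonical" href="https://www.company.com/category1" />
```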
My theory is that Google crawled the subdomain first, indexed the page, and then later Google crawled the main URL. At that point in time, the two pages actually looked quite different from one another so Google did not recognize/honor the canonical. For example:
Agent1.company.com/category1 gets crawled on day 1
Company.com/category1 gets crawled 5 days later

The content (recently listed properties for sale) on these category pages changes every day. If Google crawled the pages (both the subdomain and the main domain) on the same day, the content on the subdomain and the main domain would look identical. If the URLs are crawled on different days, the content will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site "fixed".
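For reference, the kind of blanket block described here is just a robots.txt served at the root of each subdomain (a sketch, assuming the goal was to keep all crawlers off the entire subdomain):

```
# https://agent1.company.com/robots.txt
User-agent: *
Disallow: /
```

One caveat worth keeping in mind: a robots.txt disallow only blocks crawling. It does not remove URLs that are already indexed, and it prevents bots from ever seeing the canonical or noindex tags on those pages.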
We have seen a small decrease in organic traffic from Google to our main site since blocking the crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%.
After a couple of months, we have now got our main site mostly "fixed" and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is basically being wasted with the robots.txt block in place on the subdomains.
Here is my question:

If we put a robots NOINDEX meta tag on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version?

Basically, I want the link juice from the subdomains to pass to our main site, but I do not want those pages competing with our main site for a spot in the search results.
Another thought I had was to place the NOINDEX tag only on the category pages (the ones that seem to change every day) and leave it off the product pages (property detail pages, which rarely ever change).
Thank you in advance for any insight.
-
Not putting a canonical on NOFOLLOW pages makes all the sense in the world, because if a page has NOFOLLOW, then bots WILL NOT FOLLOW ANY links on it, including the canonical link.
**P.S.** If your question has been answered, please mark it as answered to avoid cluttering the forum.
-
Thanks, Charles, for taking a moment to comment. I guess I was just a little unsure about using the canonical on the same page that contains a robots NOINDEX tag. I read that Google suggests not placing a canonical on pages with NOINDEX, NOFOLLOW, but could not find anything on using a canonical with simply the NOINDEX tag.
Thanks again!
-
Hello, my friend.
If we put a robots NOINDEX meta tag on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version?
Yes, link juice will flow, no problem, and the agent pages won't be indexed, correct. Make sure to remove the blocking rules from robots.txt, though; otherwise Google cannot crawl the pages and will never see the noindex or canonical tags.
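As a minimal sketch, the `<head>` of each subdomain page would then carry both tags. Note the "noindex, follow" directive: the "follow" part keeps the page's links crawlable, which is the concern Charles raised about NOFOLLOW. (The URL is just the placeholder from the question.)

```html
<head>
  <meta name="robots" content="noindex, follow" />
  <link rel="canonical" href="https://www.company.com/category1" />
</head>
```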
Another thought I had was to place the NOIndex tag only on the category pages (the ones that seem to change every day) and leave it off the product (property detail pages, pages that rarely ever change).
The answer to this question depends on whether you want those category pages to rank. If not at all, go ahead and noindex-tag them.
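With ~400 agent subdomains, it is also worth verifying that the tags actually made it onto every template. A quick stdlib-only Python sketch (a hypothetical helper, not part of any answer above) that checks one page's HTML for a noindex directive and its canonical target:

```python
from html.parser import HTMLParser


class RobotsCanonicalParser(HTMLParser):
    """Collect the meta robots directives and canonical URL from a page's HTML."""

    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = a.get("content", "")
        elif tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")


def audit(html):
    """Return (is_noindexed, canonical_url) for one page's HTML."""
    parser = RobotsCanonicalParser()
    parser.feed(html)
    noindex = parser.robots is not None and "noindex" in parser.robots.lower()
    return noindex, parser.canonical
```

Feed it the HTML of each subdomain page (fetched however you like) and confirm the page is noindexed and canonicals to the right www.company.com URL.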