Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies. More details here.
How to combine 2 pages (same domain) that rank for same keyword?
-
Hi Mozzers,
A quick question. In the last few months I have noticed that for a number of keywords I am having 2 different pages on my domain show up in the SERP. Always right next to each other (for example, position #7 and #8 or #3 and #4). So in the SERP it looks something like:
1. www.mycompetition1.com
2. www.mycompetition2.com
3. www.mywebsite.com/page1.html
4. www.mywebsite.com/page2.html
5. www.mycompetition3.com
Now, I actually need both pages since the content on both pages is different - but on the same topic. Both pages have links to them, but page1.html always tends to have more. So, what is the best practice to tell Google that I only want 1 page to rank? Of course, the idea is that by combining the SEO Juice of both pages, I can push my way up to position 2 or 1.
Does anybody have any experience in this? Any advice is much appreciated.
-
Hi there,
Realistically, the rel="canonical" tag should be used for duplicates, yes. How "duplicated" a page is, is subjective: a page with 50% of the same content as another page is probably going to count as a duplicate as far as Google is concerned... exactly where Google draws the line on acceptable duplication isn't something any of us really knows.
For pages where the content is totally different besides the header and footer, you technically shouldn't use canonicalisation. However, experiments have shown that Google honours the tag, even if the pages aren't duplicates. Dr. Pete did an experiment when the tag came out (admittedly a few years ago) where he showed that you could radically reduce the number of pages Google had indexed for a site by canonicalising everything to the home page. I personally had a client do this by accident a couple of years ago, and sure enough, their number of indexed pages dropped very quickly, along with all the rankings those pages had. As an ecommerce site that was ranking for clothing terms, this was very very bad. It took about six weeks to get those rankings back again after we fixed the tags, and the tags were fixed within about five days (should have been quicker but our urgent request went into a dev queue).
So the answer would be that Google seems to honour the tag no matter the content of the pages, but I am pretty sure that if you asked a Googler, they'd tell you that it should only be used for dupes or near-dupes.
-
Hi Jane,
Thanks for the advice. One question: I was under the impression that the rel="canonical" tag was for two pages with the same content, to let Google know that the page it points to is the original and should be the one to rank. Do you have any experience using it between two pages that have totally different content (minus the header and footer)?
Thanks again.
-
If you are happy for the second page to still exist but not rank, you should use the canonical tag to point the second page to the first one. This will lend the first page the majority of the strength of the second page and perhaps improve its authority and ranking as a result. However, the second page will no longer be indexed because the canonical tag tells Google: "ignore this page over here; it should be considered the same as the canonical version, here."
Again, this can benefit the first page, but it does mean that the second page will no longer rank at all. Only do this if you are okay with that scenario.
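For reference, the tag itself is a single line in the `<head>` of the page you want to stop ranking; the URLs below are the hypothetical ones from the question, not recommendations:

```html
<!-- Placed in the <head> of www.mywebsite.com/page2.html (the page that should stop ranking).
     URLs are the illustrative ones from the question. -->
<link rel="canonical" href="https://www.mywebsite.com/page1.html" />
```

Remember it is a hint, not a directive: Google usually honours it, but is free to ignore it.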
Cheers,
Jane
-
I'm afraid that there isn't a perfect solution, but there are various options to consider.
1.) The only way to "combine the SEO juice of both pages" is to 301 redirect one of the pages to the other (and add the content from the old page to the remaining one). However, this means that the second page will no longer exist for your website visitors (coming from organic search or not).
2.) You can use a rel=canonical tag pointing from the secondary page to the preferred one to encourage Google to list only the preferred page in search results. Alternatively, you could use a noindex meta tag or a robots.txt block to keep the page out of search results (the meta tag is the preferred option: a robots.txt block stops Google from crawling the page at all, so it would never even see a noindex or canonical tag on it). However, none of these will "combine the SEO juice."
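As a sketch of option 1: on an Apache server (an assumption; Nginx and other servers have equivalents), the 301 can be a one-line rule in the site's .htaccess, using the hypothetical URLs from the question:

```apache
# .htaccess for www.mywebsite.com (assumes Apache with mod_alias enabled)
# Permanently send visitors and link equity from page2 to page1
Redirect 301 /page2.html /page1.html
```

After merging page2's content into page1, this passes most of page2's link equity to page1, which is the "combining" the question asks about.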
Assuming that it is crucial that the second page still exist on your website, I would probably not do anything. You appear twice in the first page of results -- great! Why mess with that? I would just focus on doing all the good SEO best practices and earning more links to those two pages to push them higher over time. (Of course, if I knew your exact situation, I would probably have additional suggestions.)
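For completeness, the noindex meta tag mentioned in option 2 above is also a one-liner, placed in the `<head>` of the page you want kept out of results (shown here with the question's hypothetical page):

```html
<!-- In the <head> of www.mywebsite.com/page2.html -->
<meta name="robots" content="noindex, follow">
```

The "follow" value lets Google keep crawling the page's outgoing links; a robots.txt block, by contrast, would stop Google from fetching the page and seeing this tag at all.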