Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Two URLs for the same page
-
Hi, on our site we have two separate URLs for a page with the same content. So, for example, 'www.domain.co.uk/stuff' and 'www.domain.co.uk/things/stuff' both have the same content on the page.
We currently rank high in search for 'www.domain.co.uk/things/stuff' for our targeted keyword, but there are numerous links on the site to www.domain.co.uk/stuff and potentially inbound links to this page as well. Ideally we want just the www.domain.co.uk/things/stuff URL to be present on the site. What would be the best course of action to take?
Would a simple canonical tag on the '/stuff' URL pointing to the '/things/stuff' page be wise? If we were to scrap the '/stuff' URL totally, redirect it to the '/things/stuff' URL and change all our on-site links, would this be beneficial and not harm our current ranking for '/things/stuff'?
We only want one URL for this page for numerous reasons (e.g. it's easier to track in Analytics), but I'm a bit cautious that changing the page that doesn't rank may have an effect on the page that does rank!
Thanks.
-
Hello Julian,
If you follow my advice above you should be fine.
-
Thank you for the long and detailed answer; there's some great advice there.
Basically both URLs have the right keywords in them; it's just that the URL was changed a while back, so both still remain on the site. The newer URL is the one that ranks high on Google; the old one doesn't appear at all. There is no need for the old one - it serves no purpose that the new one doesn't. So surely getting rid of the old one won't affect the new one's ranking?
I see you said I should have full rankings back within 3-6 weeks, but surely there would be no reason why the URL that currently ranks high would lose any ranking?
Thanks again.
-
I'm going to weigh in here with a slightly different opinion. I wouldn't just go with whichever one ranks best, because I think he can do this without long-term damage to rankings, and it would be best to go with whichever one he wants from a usability/branding perspective, barring any major technological issues/costs.
Though he didn't say why, he did say "Ideally we want just the www.domain.co.uk/things/stuff URL to be present on the site..." and I'm going to assume they have reasons for this.
In that case, I'd follow this course of action:
#1 Apply a rel="canonical" tag to both pages and reference the /things/stuff URL as the canonical. Make this an absolute URL (i.e. include the full http://www.domain.co.uk prefix). A short markup and redirect sketch follows the notes below.
#2 While waiting for search engines to see this tag, go ahead and begin updating all internal links that point to /stuff/* so they point to /things/stuff* instead. You may need to do some URL rewrites (e.g. with mod_rewrite) to change the URLs used within the system. The point here is to change everything you can instead of relying on the redirects as a band-aid for a problem you can mostly fix.
#2.5 Do not change the links in the XML sitemap yet. You want search engines to have a crawl path to the old URLs for a while longer, so they can find their way back to the page and see the redirect faster than they would by relying on their database of URLs to randomly crawl.
#3 Because there may be external links you do not have the ability to update, apply the 301 redirect from /stuff/* URLs to the counterpart /things/stuff* URLs.
#3.5 Resubmit the old XML sitemap. Google may reject it because of the redirects, but it does usually spark a fresh crawl of the site.
#4 Update the XML sitemap and submit with the new URLs.
#5 Monitor closely. Keep an eye on new 404 errors, as you may have to add additional redirects that fell through the cracks. Crawl the site with Screaming Frog, looking for redirect loops, redirect chains, 301s that could be updated to link directly to the destination, 404 errors, 500 errors, non-canonical URLs... Keep an eye on rankings and traffic from search. If all went well you should have full rankings back within 3-6 weeks. If you do not have it back by 6 weeks you may have a technical issue to deal with that is out of the norm, in my experience. At that point I'd start taking a close look at log files with Splunk.
Note: In this case I would NOT use Google's "URL Removal Tool", as it could possibly stop the value of some external links pointing at the URL you're removing from passing over to the new URL via the redirect. The 301 and the fact that you are updating all internal links (and external links you have direct control of) to the new URL should get the old one out of the index in due time.
Note: This advice is for moving from one directory to another with the exact same page and structure on the same domain. There are important differences between that and moving to a new domain, or redirecting to content that isn't an exact replica of that on the original URL.
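If it helps, here is a rough sketch of what #1 and #3 might look like on a typical Apache setup. The domain and paths are just the placeholders from the question, and the .htaccess rule assumes Apache with mod_rewrite enabled - adjust for your own server and CMS.

```html
<!-- Step #1: place this in the <head> of BOTH /stuff and /things/stuff,
     pointing at the /things/stuff version with an absolute URL -->
<link rel="canonical" href="http://www.domain.co.uk/things/stuff" />
```

```apache
# Step #3: .htaccess sketch (placeholder paths; assumes mod_rewrite is enabled)
# 301-redirect /stuff and anything beneath it to the matching path
# under /things/stuff
RewriteEngine On
RewriteRule ^stuff(/.*)?$ /things/stuff$1 [R=301,L]
```

Once the redirect is live, re-crawl the old URLs (Screaming Frog or a quick manual check) to confirm each one answers with a single 301 straight to its /things/stuff counterpart rather than a chain.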
-
Since you have links pointing at them both, I would just 301 redirect the lower-ranking one to the higher-ranking one.
The one that ranks better would be the one I would keep. Sometimes redirects and URL changes can take a while for search engines to find, even if you use Fetch as Google.
-
Hi,
It wouldn't harm the page, no. Having said that, for site navigation purposes it might be a bit confusing having 301 redirects all over the place instead of the tag. It may help, but there is never a guarantee. Essentially the canonical tag works the same as a 301 for link juice, so you can always give that a go first, and if nothing happens then 301 it - but it's up to you.
It comes down to the user: would a 301 benefit the user, or would it add to page load times or get confusing? If it's a permanent site redesign, go for it though.
A 301 is just telling search engines "this (the page) is the new home/location of the page you're looking for", and they will update their records to reflect it - a bit like when you move house and tell the postman you've moved.
Good luck!
-
Thanks for your reply Chris. The thing is, I don't need/want the page that isn't ranking in Google anymore; it serves no purpose other than to confuse things when looking at Analytics! If I were to do a redirect from the page that doesn't rank to the page that does, that wouldn't harm the page that does rank, would it?
The page that doesn't rank is linked to from the main navigation, but the page that does rank isn't! Would I be right in thinking a redirect may actually help the page that does rank, even more?
Thanks again.
-
Hi there,
The canonical sounds perfect! Personally I tend to put it on the page closer to the homepage, but it's preference really - logically, make the page that's stronger to start with the "original". No need to scrap anything; the tag will let you keep your layout but give the SEO benefit to just one page.
In short, the canonical is a perfect match for your needs!
More info - https://support.google.com/webmasters/answer/139066?hl=en