Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Why do I have 2 different URLs for the same page - is this good practice?
-
Hi Guys

My father is currently using a programmer to build his new site. Knowing a little about SEO, I was a little suspicious of the work carried out. **Anyone with good programming and SEO knowledge, please offer your advice!**

This page, http://www.thewoodgalleries.co.uk/gallery-range-wood-flooring/ (which is soon to be http://www.thewoodgalleries.co.uk/engineered-wood/), has a number of different products. The products on this particular page have been built into colour categories like this:

http://www.thewoodgalleries.co.uk/engineered-wood/lights-greys
http://www.thewoodgalleries.co.uk/engineered-wood/beiges
http://www.thewoodgalleries.co.uk/engineered-wood/browns
http://www.thewoodgalleries.co.uk/engineered-wood/darks-blacks

This is fine. Eventually, when we add to our selection of woods, we'll easily segment each product into "colour categories" for users to navigate to. My question is: why do I have 2 different URLs for the same page, and is this good practice? Please see below.

Visible URL: http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/

Below is the permalink shown in WordPress for this page:

Permalink: http://www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/

The WordPress snippet shows the same permalink URL:

Cipressa | Engineered Brown Wood | The Wood Galleries
www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/
Buy Cipressa Engineered Brown Wood, available at The Wood Galleries, London. Provides an Exceptional Foundation for Elegant Décor, Extravagant ..
If this is completely OK and has no negative search impact, then I'm happy. If not, what should I advise my programmer to do?
Your help would be very much appreciated.
Regards
Faye
-
The site has been in progress for months now. During this time the company has developed some outstanding suppliers, and subsequently more products have been added. Because of this we had to rethink the website structure by adding product categories. This has allowed us to implement cleaner URLs, which are better for SEO, and to categorise our products, which will provide a better user experience for our customers. It also provides a platform for adding more products in the future.
I really appreciate your help, Linda. I wanted to ask questions on here before getting in touch with my programmer.
This has confirmed my concerns as to "why" this has happened.
Thank you.
-
Agreed, no one should do 301 redirects for no reason. As I asked earlier, why does the other URL exist? If this is all being set up new, it should only be using the new, well-organized path. (Unless there are multiple paths that one can go through to arrive at that page and the developer wants them all to resolve to one, clean URL.) I think your best bet would be to just ask your programmer why.
-
Hi Linda
I'm aware of the purposes of 301s, but the question is why we have this at all. This is a company that has yet to begin advertising or trading, so what purpose does it serve here?
Sure, if this page had link juice pointing to it, then a 301 would of course be required. But for a new start-up with completely unique pages, I'm not so sure my programmer is implementing best practice.
-
There is a URL: http://www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/ and it's been around for a while, maybe has some links, built up some authority.
Now the organization of the site is being improved and the better URL: http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/ is to be used for the content that is on that page.
How does the authority of the old page get to the new page? With a 301 redirect. If this is not done, when someone goes to the old URL (maybe it's linked from somewhere, maybe they have a bookmark), they get a 404 error. When Google looks at links that go to the old URL, it wants to credit the links to that page, but that page does not exist anymore as far as Google knows. Google does not know there is a new page with the same information.
For you, the page is that one post in WordPress (or wherever) and that stays the same; you are just renaming it. For Google, those two URLs are different pages, and in order to tell Google that the one has become the other, you need to 301 redirect it.
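As a concrete sketch of what Linda is describing (assuming an Apache server, which WordPress sites commonly run on), the redirect would be a single rule in the site's .htaccess file, using the URLs from this thread:

```apache
# Permanently (301) redirect the old product URL to the new, cleaner one.
# Browsers, bookmarks, and Googlebot are all sent to the new address,
# and search engines transfer the old page's link credit to it.
Redirect 301 /engineered-wood/browns-engineered-wood/cipressa/ http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/
```

WordPress redirect plugins do essentially the same thing without editing server files by hand.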
-
Hi O2C
Forgive me but I disagree.
Canonical – Hey, (most) Search Engines: I have multiple versions of this page (or content), please only index this version. I'll keep the others available for people to see, but don't include them in your index and please pass credit to my preferred page.
We do not have multiple versions of the same page, just one unique page for each product, hence my "redirect" concern.
-
I would definitely add rel canonical tags to the website pages to let Google know which is the original page as Robert has suggested.
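For reference, a canonical tag is just one line in the page's `<head>`. A hypothetical sketch using the preferred URL from this thread:

```html
<!-- Placed on any alternate/duplicate version of the page, this points
     search engines at the preferred URL to index and credit. -->
<link rel="canonical" href="http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/" />
```

Most WordPress SEO plugins can add this automatically, so it usually doesn't require hand-editing templates.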
-
Hi Linda
Thanks for your input regarding this.
The only URL we require is http://www.thewoodgalleries.co.uk/engineered-wood/browns/cipressa/. This is based on our organisation of each product by "type" - "colour" - "brand name".
http://www.thewoodgalleries.co.uk/engineered-wood/lights-greys/
http://www.thewoodgalleries.co.uk/engineered-wood/beiges/
http://www.thewoodgalleries.co.uk/engineered-wood/browns/
http://www.thewoodgalleries.co.uk/engineered-wood/darks-blacks
We have the same for other products, http://www.thewoodgalleries.co.uk/parquet-reclaimed/lights-greys/ and http://www.thewoodgalleries.co.uk/prefinished-wood/lights-greys/ - and so on.
All we require is for each URL to be changed accordingly, not redirected with a 301. As far as I'm aware, this page is not needed and is not part of our structure, but it still exists as a 301. Permalink: http://www.thewoodgalleries.co.uk/engineered-wood/browns-engineered-wood/cipressa/ ??
-
The URL that is being redirected to is the cleaner URL and also seems to make sense organizationally, which is why I would have gone with that structure from the start.
The question here is why does the other URL exist? Was the older site using that format? In that case, the new programmer is setting up a better organized structure for your father and doing the appropriate redirects, which is a good thing. The new URL will not be an incorrect URL, it will be the correct URL for that page.
-
Is a 301 redirect really necessary for a site such as this? I would like to know whether what has been implemented is good practice.
I also do not want to "advertise" what is effectively an incorrect URL (however similar), as this will be seen in the search engines.
Another possible downside of the 301 is that it can sometimes take a while for the search engines to attribute the new page with the search authority of the original page.
It seems to me a 301 redirect is not "best practice" for a new site with 70 individual, unique products?
-
Hi Faye,
It looks like the 2nd URL you provided is already 301 redirecting to the first URL, so you shouldn't have to worry about it.
Hope that helps!
-
That permalink already 301 redirects to the visible URL in your example and wouldn't cause duplicate content. Sometimes in order to show a nicer-looking link people will use aliases. I do not know why in this case the two structures are needed since it seems the visible link could handle the categories, but then I do not know what all the complexities are—you could ask the programmer why.
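If you want to verify this yourself rather than take anyone's word for it, a small script can request a URL without following redirects and inspect the status code and Location header. This is a generic sketch (the helper names are made up for illustration), not a feature of any tool mentioned in this thread:

```python
import urllib.request
import urllib.error
from typing import Optional, Tuple


def fetch_redirect(url: str) -> Tuple[int, Optional[str]]:
    """Request a URL without following redirects; return (status, Location header)."""
    class _NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # returning None stops urllib from following the redirect

    opener = urllib.request.build_opener(_NoRedirect)
    try:
        with opener.open(url) as resp:
            return resp.status, None  # no redirect happened
    except urllib.error.HTTPError as err:
        # With redirects suppressed, a 3xx response surfaces as an HTTPError
        return err.code, err.headers.get("Location")


def is_permanent_redirect(status: int, location: Optional[str], expected: str) -> bool:
    """True if the response is a permanent redirect (301/308) to the expected URL."""
    return status in (301, 308) and location == expected
```

For the URLs in this thread, you would call `fetch_redirect` on the browns-engineered-wood permalink and expect a 301 status with a Location header pointing at the /browns/ URL.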
-
To answer your question: No, it's not okay. Duplicate content is to be avoided. Ask your programmer to 301 redirect one of these pages to the other, which automatically redirects both users and Googlebot, or at least to add a canonical URL tag that defines which page is the "original" or "canonical" version of the content.
See more at Google Support.
As to why it is happening: it's difficult to be precise, but I would venture that one of the URLs is a category "page" that gets created automatically from your site's category structure, and the other is a real, editable "page".