URLs: Removing duplicate pages using a URL parameter?
-
I've been working on removing duplicate content from our website. There are tons of pages created per size, but the content is the same.
The solution was to create a page with 90% static content and 10% dynamic content that changes depending on the selected size. Users can select the size from a dropdown box.
So instead of 10 URLs, I now have one URL.
- Users can access a specific size by appending a parameter to the end of the URL (?f=size1, ?f=size2). (Strictly speaking, these are query parameters rather than anchors.)
For example:
Old URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size2
- www.example.com/product-alpha-size3
- www.example.com/product-alpha-size4
- www.example.com/product-alpha-size5
New URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size1?f=size2
- www.example.com/product-alpha-size1?f=size3
- www.example.com/product-alpha-size1?f=size4
- www.example.com/product-alpha-size1?f=size5
Do search engines read the parameter or drop it? Will the link juice be transferred to just www.example.com/product-alpha-size1?
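One point of confusion in the question is worth untangling: `?f=size2` is a query parameter, which is part of the URL that browsers and crawlers send to the server, while `#size2` would be a true anchor (fragment), which never leaves the browser. A quick sketch with the standard `URL` API, using the example URLs from this question:

```javascript
// A query parameter is part of the request a browser (or crawler) sends to
// the server; a fragment/anchor is resolved entirely client-side.
const withParam = new URL('http://www.example.com/product-alpha-size1?f=size2');
const withAnchor = new URL('http://www.example.com/product-alpha-size1#size2');

console.log(withParam.search);   // '?f=size2'  -> sent to the server, crawlable
console.log(withParam.pathname); // '/product-alpha-size1'
console.log(withAnchor.hash);    // '#size2'    -> stripped before the request is made
```

So search engines do see `?f=size2` and can treat each parameterized URL as a distinct page, which is exactly why the canonicalization advice below matters.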
-
Thanks Everett,
- rel="canonical" is in place, so that's covered
- The URLs with the parameter are only accessible if you want to link directly to a particular size. If you are on the default page and switch sizes from the dropdown, the URL does not change.
- I have left Webmaster Tools to decide what should or shouldn't be crawled. The parameter has been listed there, though.
-
Cyto,
The Google Webmaster Tools parameter handling, in my opinion, is often best left up to Google. In other words, I rarely change it. Instead, I try to fix the issue itself. In your case, here is what I would advise:
Instead of using a parameter in the URL, use cookies or hidden divs to change the content on the page to the selected size. Have a look at most major online retailers: you can select a size or color from the dropdown and it never changes the URL.
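A minimal sketch of that approach, assuming the page carries every size variant as a hidden div (the function name, the `SIZES` list, and the `#variant-*`/`#size-select` ids are hypothetical, not from this thread):

```javascript
// Each size variant lives in the page as a <div>; selecting a size shows one
// and hides the rest. The URL never changes, so crawlers see a single page.
function visibleSections(sizes, selected) {
  // Returns a map of size -> CSS display value for each variant div.
  const display = {};
  for (const size of sizes) {
    display[size] = size === selected ? 'block' : 'none';
  }
  return display;
}

// Browser wiring (commented out so the logic above stays runnable anywhere):
// const SIZES = ['size1', 'size2', 'size3', 'size4', 'size5'];
// document.querySelector('#size-select').addEventListener('change', (e) => {
//   const display = visibleSections(SIZES, e.target.value);
//   for (const [size, value] of Object.entries(display)) {
//     document.querySelector(`#variant-${size}`).style.display = value;
//   }
// });
```

Note that content hidden this way is still in the HTML, so all size information remains on the one indexed page.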
If this is not possible, I recommend the following:
Ensure the rel="canonical" tag on all of those pages references the canonical version (e.g. /product-alpha-size1), which will consolidate link-related metrics like PageRank onto that one page.
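For reference, here is the tag each parameterized variant would emit, sketched as the string a page template might produce (the helper function is hypothetical; the URL is from the example above):

```javascript
// Every variant (?f=size2, ?f=size3, ...) emits the same canonical target,
// so link-related metrics consolidate onto the one URL.
function canonicalTag(canonicalUrl) {
  return `<link rel="canonical" href="${canonicalUrl}" />`;
}

console.log(canonicalTag('http://www.example.com/product-alpha-size1'));
// <link rel="canonical" href="http://www.example.com/product-alpha-size1" />
```

The key point is that the href is identical on all the variants, including the parameter-free default page itself.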
-
Please say YES
-
Thank you Celilcan2,
- I'll set it up as 'yes' and it 'narrows' the page
- What is the benefit of doing this, though? Will Google ignore everything after the parameter and focus its value on just the single URL?
-
Go to Google Webmaster Tools:
- On the Dashboard, under Crawl, click URL Parameters.
- Next to the parameter you want, click Edit. (If the parameter isn’t listed, click Add parameter. Note that this tool is case sensitive, so be sure to type your parameter exactly as it appears in your URL.)
- If the parameter doesn't affect the content displayed to the user, select **No ...** in the **Does this parameter change...** list, and then click Save. If the parameter does affect the display of content, click **Yes: Changes, reorders, or narrows page content**, and then select how you want Google to crawl URLs with this parameter.