How to remove hundreds of duplicate pages
-
Hi - while I was checking duplicate links, I found hundreds of duplicate pages:
-
pages with 'undefined' after the domain name and before the sub-page URL
-
pages with /%5C%22/ after the domain name and before the sub-page URL
-
pages duplicated due to pagination limits
It's a Joomla site - http://www.mycarhelpline.com
Any suggestions - shall we:
-
use a 301 redirect
-
leave these as they are
-
And what should we do with the pagination pages (shall we create a unique title tag and meta description for every pagination page)?
thanks
-
-
Okay, I took a look at the plugin Ben recommended, and took another look at the SH404SEF one. The free one Ben recommended (http://extensions.joomla.org/extensions/site-management/sef/1063) looks like it can help with some duplicate content, but what I recommend is getting SH404SEF here: http://anything-digital.com/sh404sef/features.html. It allows you to set up canonical tags and also gives you the option to add rel="next" to your paginated pages, which is one of your problem areas.
One thing I noticed, though, is that it specifically states it "automatically adds canonical tags to non-html pages" - so it will apply them automatically to Joomla's default PDF view, etc. While this is helpful, it may not fully solve your duplicate page issue with the "undefined" and "/%5C%22/" URLs.
It does, however, state that it "removes duplicate URLs" - how it identifies and removes these, I am not sure. You may want to try it out because it is useful for other optimization tasks, or contact the developer for more information.
If the tool doesn't recognize and remove the duplicate pages caused by /undefined/ and "/%5C%22/", then you should disallow crawling of those paths in your robots.txt file. While you are in your robots.txt file, you should also remove the /images/ disallow line - Joomla adds that in by default, but you want those images to be crawled.
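A minimal sketch of what those robots.txt changes could look like, assuming the duplicate URLs start with the two prefixes described (verify the exact patterns against your SEOmoz report before using anything like this):

```text
# Hypothetical robots.txt - adjust to the exact duplicate
# URL patterns your crawl report shows.
User-agent: *
Disallow: /undefined/
Disallow: /%5C%22/

# Note: Joomla's default "Disallow: /images/" line is
# intentionally omitted here so images can be crawled.
```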
Because a lot of these pages have already been crawled, you should 301 each duplicate page to its matching page. This sounds like it will be a long process - it may be aided by the sh404sef plugin, but I'm not sure.
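If the duplicates follow a consistent prefix pattern, the 301s could be handled in bulk with a couple of .htaccess rewrite rules rather than page by page. This is only a sketch assuming Apache with mod_rewrite and the URL patterns described above; test it on a staging copy first:

```apache
# Hypothetical .htaccess rules (Apache mod_rewrite) to 301 the
# duplicate URLs back to their matching clean pages.
RewriteEngine On

# /undefined/some-page -> /some-page
RewriteRule ^undefined/(.+)$ /$1 [R=301,L]

# The %5C%22 prefix decodes to \" - match the decoded form:
RewriteRule ^\\"(.+)$ /$1 [R=301,L]
```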
I just want to also add that I am in no way affiliated with any of these plugins.
-
The only way to solve the duplication error you are getting is to make the URLs distinct. Googlebot comes to your site and looks at the URLs, and if they are not distinct it may not index them very well. I understand your site is showing up fine in the SERPs, so this may be one of those items you place at a lower priority until later.
I think R.May knows Joomla, so I'll defer to him on how to accomplish this, but it may be worth making the adjustment. You may find that making your page URLs more distinct will actually improve your current SERPs. Just a thought.
Other than that, if your site isn't hurting and the only thing you are concerned about is the report in SEOmoz, then I would move on and just make a mental note of it for later.
-
Hi Ben - changing the URLs isn't really required, as the site is getting good SERP results. However, what we're looking for is to resolve the duplication issue as a safeguard against any future problems.
-
Hi - thanks for replying
-
For the dynamic URLs - yes, this was missed at the initial setup, but for now it isn't really required, as the pages are getting indexed well in the SERPs.
-
For pagination - we need this in our used car section, discount section, and news section, where multiple pages are created. Shall we create a separate title and meta description for every pagination page? Is that ideally required?
http://www.mycarhelpline.com/index.php?option=com_usedcar&view=category&Itemid=3
The 'undefined' and /%5C%22/ strings are showing up in the SEOmoz report on almost every page of the site (except the home page): the dynamic URL after the domain name is preceded by one of these two strings.
How do we get this corrected? We want to prevent this duplication and avoid taking a hit in the future, even if things are going well now.
-
-
I'm not a Joomla expert, but to make your URLs search-engine friendly you are going to need to add an extension like this. That will allow you to create more distinct URLs that will no longer be considered "duplicate".
-
Joomla has so many duplicate content issues that you have to know Joomla really well to avoid most of them. The biggest issue is that you didn't enable SEF URLs from the start and left the default index.php?option=com URLs on most pages, which stuffs your URLs full of ugly parameters.
You can still enable this in your global options with a quick edit to .htaccess - but it will change all of your current URLs and you will need to 301 all of them, so that isn't a great option unless you are really suffering. And if you are using Joomla 1.6 or under, this is a time-consuming, nasty process. It is also unlikely to get rid of any existing duplicate pages, but it may make finding and dealing with them easier.
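For reference, the "quick edit" usually amounts to renaming the htaccess.txt file Joomla ships with to .htaccess and enabling the SEF options in Global Configuration. The core rewrite rules in that file look roughly like this (a sketch, not a verbatim copy of Joomla's stock file):

```apache
# Sketch of the key rules from Joomla's shipped htaccess.txt -
# rename that file to .htaccess, then enable "Search Engine
# Friendly URLs" and "Use URL rewriting" in Global Configuration.
RewriteEngine On

# If the request is not a real file or directory,
# hand it to Joomla's router via index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php [L]
```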
I don't see the specific examples you posted, though - where are you seeing "undefined" and "%5C%22/"?
You should implement rel="canonical" on the correct version of each page. I recommend SH404SEF, a Joomla plugin that makes this process easier, but it isn't free. I don't know of a good free plugin that does this, and Joomla's templates make doing it manually difficult.
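For illustration, the canonical tag itself is just one line in the <head> of each duplicate or variant page, pointing at the one URL you want indexed (the URL below is hypothetical):

```html
<!-- In the <head> of every duplicate/variant page;
     the href is illustrative, not a real site URL -->
<link rel="canonical" href="http://www.mycarhelpline.com/used-cars" />
```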
Looking at it quickly, I also didn't notice any articles that were paginated, but you should try to follow the rel="next" and rel="prev" convention for paginated pages. This is likely something you will have to edit your Joomla core files to do.
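The pagination tags follow the same pattern: each page in a series links to its neighbors from its <head>. A sketch for the middle page of a hypothetical three-page series (URLs are illustrative):

```html
<!-- On page 2 of a hypothetical 3-page paginated series -->
<link rel="prev" href="http://www.mycarhelpline.com/news?page=1" />
<link rel="next" href="http://www.mycarhelpline.com/news?page=3" />
```

The first page of a series carries only rel="next", and the last page only rel="prev".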