Is it okay with Google if my subcategories can be accessed from two different paths?
-
My website URL is abcd.com. One of my category URLs is abcd.com/mobile.aspx, which contains 5 subcategories:
1) Samsung Mobile 2) Nokia Mobile 3) Sony Mobile 4) HTC Mobile 5) Blackberry Mobile
Now if I go into the HTC Mobile subcategory, i.e. abcd.com/htcmobile.aspx, I will see all the products related to HTC Mobile.
But below all the products I will also find the other subcategories, that is Samsung Mobile, Nokia Mobile, Sony Mobile and Blackberry Mobile.
So what I want to ask is: is this okay? Will Google count these categories as duplicates, given that I can access all 4 categories (Samsung, Nokia, Sony and Blackberry) from both 1) abcd.com/mobile.aspx and 2) abcd.com/htcmobile.aspx?
Thanks!
Dev
-
Hi Dev,
It's really going to depend on how much of the content is duplicated. From what I've seen, Google isn't very good at chunking pages up YET. They're good at spotting entire pages duplicated (e.g. press releases or articles syndicated across multiple sites), and pages on your site that have the majority of the content the same. But I don't think you're going to run into trouble with a page that has a number of sections, each of which is an entire page on its own.
Where you MIGHT run into trouble is with Panda and thin content. If the content you have for each of the manufacturers is very light, i.e. just a few sentences and an image or two, then those pages might be seen as thin content. While I don't think you have to hit the magic 2000 word mark on every page to avoid being seen as thin content, you certainly are going to want more than 100 words. And, if those manufacturer pages are important search targets for competitive terms--well, then, you probably WILL want those pages to contain somewhere near 2000 words each.
In THAT case, you'll probably want to change the content on the all-manufacturers page, and instead just put a short excerpt for each manufacturer there, along with some sort of "learn more" link to the single manufacturer page.
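As a rough sketch of that excerpt-plus-link structure (the URLs come from the question above; the markup and wording are only illustrative):

```html
<!-- abcd.com/mobile.aspx: one short teaser per manufacturer instead of the full product grids -->
<section>
  <h2>HTC Mobile</h2>
  <p>A two- or three-sentence excerpt unique to this page...</p>
  <a href="http://abcd.com/htcmobile.aspx">Learn more about HTC Mobile</a>
</section>
<!-- repeat for Samsung, Nokia, Sony and Blackberry -->
```

That keeps mobile.aspx useful for visitors and internal linking while leaving the full product content unique to each manufacturer page.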
-
Hi Dev,
If you have the same content on several pages, then yes, it's duplicate content.
For example: if abcd.com/htcmobile.aspx shows, below all of its products, the other subcategories (Samsung, Nokia, etc.) with all of their products, descriptions, etc., that's duplication, because the other pages will have the same content just in a different order.
If you are talking about plain links to the other categories below the products on abcd.com/htcmobile.aspx, that's just navigation to other subcategories, so it is not duplication.
Regards,
-
Any input please?
Related Questions
-
Our client's Magento 2 site has lots of obsolete categories. Advice on SEO best practice for setting server-level redirects so I can delete them?
Our client's Magento website has been running for at least a decade, so it has a lot of old legacy categories for brands they no longer carry. We're looking to trim down the number of unnecessary URL redirects in Magento, so my question is: is there an SEO-efficient way to set up permanent redirects at the server level (nginx) that Google will crawl, allowing us at some point to delete the categories and the Magento URL redirects? And if this is good practice, can you then delete the server redirects at some point once Google has marked them as permanent?
Technical SEO | | Breemcc0 -
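For what it's worth, server-level 301s in nginx can be as simple as the sketch below (the paths are made-up examples, not the site's real categories):

```nginx
# Redirect obsolete brand categories before the request ever reaches Magento.
# Map each dead URL to its closest surviving category (or the homepage as a last resort).
location = /old-brand.html        { return 301 /current-brands.html; }
location = /discontinued-cat.html { return 301 /; }
```

On deleting the redirects later: Google's public guidance has been to leave permanent redirects in place for at least a year so the signals fully transfer, so it is safer to keep the server rules long after the Magento entries are gone.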
Google Not Indexing Pages (Wordpress)
Hello, recently I started noticing that Google is not indexing our new pages or our new blog posts. We are simply getting a "Discovered - Currently Not Indexed" message on all new pages. When I click "Request Indexing" it takes a few days, but eventually the page does get indexed and appears on Google. This is very strange, as our website has been around since the late 90s and the new content is neither duplicate nor low quality. We started noticing this happening around February. We also do not have many pages, maybe 500 maximum. I have looked at all the obvious answers (allowing indexing, etc.), but just can't seem to pinpoint a reason. Has anyone had this happen recently? It is getting very annoying having to manually request indexing for every page, and it makes me think there may be some underlying issue with the website that should be fixed.
Technical SEO | | Hasanovic1 -
How can I change the page title of page two (artigos/page/2.html) in each category?
I have some categories and photo galleries that have more than one page (e.g. http://www.buffetdomicilio.com/category/artigos and http://www.buffetdomicilio.com/category/artigos/page/2). I think I must change the title and description, but I don't know how. I would like to know how I can change the title of each of them without ending up with duplicate titles and descriptions. Thank you!
Technical SEO | | otimizador20130 -
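A common fix for paginated archives like this is simply to include the page number in the title and meta description from page 2 onward, so no two pages share them. A hedged sketch (the site name and wording are invented for illustration):

```html
<!-- /category/artigos/ -->
<title>Artigos | Buffet Domicilio</title>
<!-- /category/artigos/page/2/ -->
<title>Artigos - Página 2 | Buffet Domicilio</title>
```

Most WordPress SEO plugins (Yoast, for example) can do this automatically via a page-number variable in the archive title template, without editing each page by hand.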
Google having trouble accessing my site
Hi, Google is having problems accessing my site. Each day it is bringing up Access Denied errors, and when I checked what this means I found the following:
Access denied errors: In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons: Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: you can get around this by removing this requirement for the user-agent Googlebot.) Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories; test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file (the Google user-agent is Googlebot). The Fetch as Google tool helps you understand exactly how your site appears to Googlebot, which can be very useful when troubleshooting problems with your site's content or discoverability in search results. Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
Now I have contacted my hosting company, who said there is no problem, but they told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ . I have read it, and as far as I can see my file is set up right, as listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Technical SEO | | ClaireH-1848860 -
About google Disavow tool
My website has been attacked with spammy links, so should I use the Google Disavow tool to remove those links? And I have a question: when I use Google Disavow to remove backlinks but have not yet removed them from the webpages where my links are placed, will Google index those backlinks again, or never?
Technical SEO | | magician0 -
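For reference, the disavow file itself is just a plain-text list you upload in Search Console: lines starting with # are comments, domain: lines cover a whole domain, and bare URLs cover single pages. The domains below are placeholders, not real spam sources:

```text
# Spammy links discovered in the attack
domain:spam-network-example.com
domain:another-spam-example.net
http://blog-example.com/comment-spam-page.html
```

Note that disavowing does not remove the links from the pages that host them; it only asks Google to ignore them when evaluating your site, so Google may still crawl and index those external pages.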
Google Places Question......
Hi guys, I am working with a photographer. They do not have a studio; they shoot on location. However, I noticed many photographers within their industry have their home address listed in Google Places, and they too shoot on location. My client doesn't want their home address listed, so I wondered what options there would be. Do you think renting a mail-forwarding address would suffice?
Technical SEO | | RankStealer0 -
Same page reachable from different locations via slightly different URLs: is this a negative SEO practice?
Hi, recently we made a change in our website's link-generation logic, and now I can reach the same page from different pages with slightly different URLs, like this: http://www.showme.com/sh/?h=wlZJNya&by=Featured_ShowMe and http://www.showme.com/sh/?h=wlZJNya&by=Topic. Just wondering, is this a bad practice and should we avoid it? Thank you, Karen
Technical SEO | | showme0 -
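One common way to handle parameter variants like these is a rel=canonical tag pointing every version at a single preferred URL. A sketch using the URLs from the question (it assumes the page also resolves without the by parameter; if not, pick one variant as the canonical):

```html
<!-- emitted in the <head> of both the ?by=Featured_ShowMe and ?by=Topic versions -->
<link rel="canonical" href="http://www.showme.com/sh/?h=wlZJNya" />
```

With this in place, Google consolidates the variants' signals onto the canonical URL instead of treating them as separate duplicate pages.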
Checkout on different domain
Is it a bad SEO move to have your checkout process on a separate domain instead of the main domain for an ecommerce site? There is no real content on the checkout pages, and they are completely new pages that are not indexed in the search engines. Due to the backend architecture it is impossible for us to have them on the same domain. An example is this page: http://www.printingforless.com/2/Brochure-Printing.html One option we've discussed is to not pass PageRank on to the checkout domain by iframing all of the links to it. We could also move the checkout process to a subdomain instead of a new domain. Please ignore concerns about visitor security and conversion rate. Thanks!
Technical SEO | | PrintingForLess.com0