Is it okay with Google if my sub-categories can be accessed from two different paths?
-
My website URL is abcd.com. One of my category URLs is abcd.com/mobile.aspx, which contains 5 sub-categories:
1) Samsung Mobile 2) Nokia Mobile 3) Sony Mobile 4) HTC Mobile 5) Blackberry Mobile
Now if I go into the HTC Mobile sub-category, i.e. abcd.com/htcmobile.aspx, I will see all the products related to HTC Mobile.
But below all the products I will also find the other sub-categories, that is Samsung Mobile, Nokia Mobile, Sony Mobile, and Blackberry Mobile.
So I want to ask: is this okay? Will Google count these categories as duplicates, given that all 4 categories (Samsung, Nokia, Sony, and Blackberry) can be accessed from both 1) abcd.com/mobile.aspx and 2) abcd.com/htcmobile.aspx?
Thanks!
Dev
-
Hi Dev,
It's really going to depend on how much of the content is duplicated. From what I've seen, Google isn't very good at chunking pages up YET. They're good at spotting entire pages duplicated (e.g. press releases or articles syndicated across multiple sites), and pages on your site that have the majority of the content the same. But I don't think you're going to run into trouble with a page that has a number of sections, each of which is an entire page on its own.
Where you MIGHT run into trouble is with Panda and thin content. If the content you have for each of the manufacturers is very light, i.e. just a few sentences and an image or two, then those pages might be seen as thin content. While I don't think you have to hit the magic 2000 word mark on every page to avoid being seen as thin content, you certainly are going to want more than 100 words. And, if those manufacturer pages are important search targets for competitive terms--well, then, you probably WILL want those pages to contain somewhere near 2000 words each.
In THAT case, you'll probably want to change the content on the all-manufacturers page, and instead just put a short excerpt for each manufacturer there, along with some sort of "learn more" link to the single manufacturer page.
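A restructured all-manufacturers page along those lines might look something like this (a minimal HTML sketch; the URLs, class names, and copy are hypothetical, not taken from the actual site):

```html
<!-- abcd.com/mobile.aspx: short excerpts only; the full content
     lives on each single-manufacturer page -->
<div class="manufacturer-excerpt">
  <h2><a href="/htcmobile.aspx">HTC Mobile</a></h2>
  <p>A short, unique one- or two-sentence summary of the HTC range.</p>
  <a href="/htcmobile.aspx">Learn more about HTC mobiles</a>
</div>
<!-- repeat one excerpt block per manufacturer
     (Samsung, Nokia, Sony, Blackberry) -->
```

That way each manufacturer page remains the single full copy of its content, and the overview page only introduces it.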
-
Hi Dev,
If you have the same content on several pages, then yes, it's duplicate content.
For example: if abcd.com/htcmobile.aspx shows, below all its products, the other sub-categories (Samsung, Nokia, etc.) with all their products, descriptions, etc., that's duplication, because the other pages will have the same content, just in a different order.
If, on the other hand, what appears below all the products on abcd.com/htcmobile.aspx is just links to the other categories (i.e. navigation to the other sub-categories), that is not duplicate content.
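The links-only version described above would be something like this (hypothetical markup; only the anchor text appears on the HTC page, not the other categories' products or descriptions):

```html
<!-- bottom of abcd.com/htcmobile.aspx: navigation links only,
     no duplicated product listings -->
<nav class="subcategory-links">
  <a href="/samsungmobile.aspx">Samsung Mobile</a>
  <a href="/nokiamobile.aspx">Nokia Mobile</a>
  <a href="/sonymobile.aspx">Sony Mobile</a>
  <a href="/blackberrymobile.aspx">Blackberry Mobile</a>
</nav>
```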
Regards,
-
Any input please?