What would you do if a site's entire content is on a subdomain?
-
Scenario:
- There is a website, mydomain.com. It is a new domain with about 300 inbound links (some pointing to the product pages and categories), but it does have some high-trust links
- The website has categories a, b, c, etc., but they all live on a subdomain, so instead of mydomain.com/categoryA/productname the entire site's structure looks like subdomain.mydomain.com/categoryA/productname
- Would you go to the effort of 301-redirecting the subdomain URLs to the correct structure of mydomain.com/categoryA/productname, or would you leave it as it is?
I'm just interested in the extent of the issues this could cause in the future, and whether it's worth resolving sooner rather than later.
-
Thanks for the suggestions, appreciated!
-
Yes, exactly that structure.
-
That is the best answer; you are right. Maybe I would first contact the sites that link to mine.
-
Here is what I would (probably) do:
If there is just one subdomain, I would 301 redirect it to the root domain.
If there are multiple subdomains, I would redirect each of them to a folder on the root domain.
I would then try to get the inbound links changed to point directly at the proper content in the proper folder.
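The subdomain-to-root redirect described above can be sketched in a few lines of Apache mod_rewrite config; this is illustrative only, using the placeholder names from the question ("subdomain" stands in for whatever the real subdomain is), and assumes an Apache server where .htaccess rules are honored:

```apache
# In the .htaccess (or vhost config) that serves subdomain.mydomain.com:
RewriteEngine On
# Match requests arriving on the subdomain host...
RewriteCond %{HTTP_HOST} ^subdomain\.mydomain\.com$ [NC]
# ...and permanently redirect them to the same path on the root domain,
# e.g. subdomain.mydomain.com/categoryA/productname
#   -> mydomain.com/categoryA/productname
RewriteRule ^(.*)$ https://mydomain.com/$1 [R=301,L]
```

With multiple subdomains, you would repeat the pair of lines per subdomain, pointing each at its own folder on the root domain.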
-
Great question Kerry,
A big factor is how many subdomains you actually have. If it's not above 50, I think interlinking the content well should be enough. There are sites that use lots of subdomains and are successful (look at craigslist.org, for example). If you have many more subdomains, I would investigate the links further: how many of them actually point at subdomains? Because if you redirect, those links will lose roughly 10-15% of the link juice they currently provide. You can estimate the trade-off from that ratio.
I hope that gives you some points to consider; unfortunately, a decision like the one you're facing is never easy. I hope you will be able to solve the problem.
-
Hi there,
So just to make sure, the structure would be as follows:
shop.mydomain.com/categoryA/productname
shop.mydomain.com/categoryB/productname1
???
Related Questions
-
Google has deindexed a page it thinks is set to 'noindex', but is in fact still set to 'index'
A page on our WordPress powered website has had an error message thrown up in GSC to say it is included in the sitemap but set to 'noindex'. The page has also been removed from Google's search results. Page is https://www.onlinemortgageadvisor.co.uk/bad-credit-mortgages/how-to-get-a-mortgage-with-bad-credit/ Looking at the page code, plus using Screaming Frog and Ahrefs crawlers, the page is very clearly still set to 'index'. The SEO plugin we use has not been changed to 'noindex' the page. I have asked for it to be reindexed via GSC but I'm concerned why Google thinks this page was asked to be noindexed. Can anyone help with this one? Has anyone seen this before, been hit with this recently, got any advice...?
Technical SEO | d.bird
-
'duplicate content' on several different pages
Hi, I have a website with 6 pages identified as 'duplicate content' because they are very similar. These pages look alike because they share the same template, but each shows a few pictures of its own product category, so they are not 'exactly' the same. Is there any way to indicate to Google that the content is not duplicated? I guess it's been marked as duplicate because the code is 90% or more the same across the 6 pages. I've been reviewing the 'canonical' method, but I think it is not appropriate here as the content is not the same. Any advice (other than adding more content)?
Technical SEO | jcobo
-
URL with query string being indexed over its parent page?
I noticed earlier this week that this page - https://www.ihasco.co.uk/courses/detail/bomb-threats-and-suspicious-packages?channel=care was being indexed instead of this page - https://www.ihasco.co.uk/courses/detail/bomb-threats-and-suspicious-packages for its various keywords We have rel=canonical tags correctly set up and all internal links to these pages with query strings are nofollow, so why is this page being indexed? Any help would be appreciated 🙂
Technical SEO | iHasco
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.inlinear\.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/$1 [R=301,L]
1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my front page (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should likewise link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it as just "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)?
Thanks for all replies! 🙂
Technical SEO | inlinear
Holger
-
Additional product information: the product's sales page or a blog post?
I want to go in-depth about different customizations for custom caps, which is one of the products we offer. I just don't know whether it would be better, from an SEO perspective, to expand the caps sales page we already have or to write a blog post to give the site another valuable indexed page. From a user standpoint, I don't think it matters as much, because if I do it the blog way, I can just put a link on the sales page saying, "Want more customizations? Visit our blog post." Any opinions?
Technical SEO | UnderRugSwept
-
What's the max number of links you should ever have on a page?
Our homepage has a few hundred links, and our index pages (pages that link to our spintext pages) have about 900 links on them with no content. Our SEO guy said we have to keep the links under 1,000, but I wanted to see what you guys think.
Technical SEO | upper2bits
-
Ecommerce site with currency selectors giving dupe content?
Hi everyone,
One of my ecommerce sites uses BigCommerce. They have a feature where you can add different currency buttons to change the currency the customer shops in. This is great because if people from the UK visit our site, they can switch to their own currency rather than US dollars. It just adds a variable to the end of the URL string to change the currency. However, in my webmaster tools I noticed that I think I am getting a bunch of duplicate content. For example, it reports duplicate title tags for the following:
domainname/pages/my-cool-widget.html
domainname/pages/my-cool-widget.html?setCurrencyId=1
domainname/pages/my-cool-widget.html?setCurrencyId=2
domainname/pages/my-cool-widget.html?setCurrencyId=3
domainname/pages/my-cool-widget.html?setCurrencyId=4
I thought about adding rel="nofollow", but unfortunately I don't have access to this file to edit the code. Any suggestions?
Technical SEO | BeachDude
-
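For reference (this isn't from the thread itself), the standard fix for parameter duplicates like these is a rel="canonical" tag rather than nofollow. A minimal sketch, using the placeholder "domainname" from the question, and assuming template access that the asker may not actually have on a hosted BigCommerce plan:

```html
<!-- Illustrative only: placed in the <head> of the product-page template,
     every ?setCurrencyId= variant then declares the clean URL as the
     canonical version, so the variants stop competing as duplicates. -->
<link rel="canonical" href="https://domainname/pages/my-cool-widget.html">
```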
URLs for news content
We have made modifications to the URL structure for a particular client who publishes news articles in various niche industries. In line with SEO best practice we removed the article ID from the URL. An example is below:
http://www.website.com/news/123/news-article-title
http://www.website.com/news/read/news-article-title
Since this was done we have noticed a decline in traffic volumes (we have not yet assessed the impact on the number of pages indexed). Google has suggested that we need to include unique numerical IDs somewhere in the URL to aid spidering. Firstly, is this the policy for news submissions? Secondly (if the previous answer is yes), is this to overcome the obvious issue of the velocity and trend-based nature of news submissions resulting in false duplicate URL/title-tag violations? Thirdly, do you have any advice on the way to go? Thanks
P.S. One final one (you can count this as two question credits if required): is it possible to check the volume of pages indexed at various points in the past? I.e., if you think the number of pages being indexed may have declined, is there any way of confirming this after the event? Thanks again!
Technical SEO | mccormackmorrison
Neil