What would you do if a site's entire content is on a subdomain?
-
Scenario:
- There is a website called mydomain.com. It is a new domain with about 300 inbound links (some pointing to the product pages and categories), including some high-trust links.
- The website has categories a, b, c, etc., but they all live on a subdomain, so instead of being mydomain.com/categoryA/productname the entire site's structure looks like subdomain.mydomain.com/categoryA/productname.
- Would you go to the effort of 301ing the subdomain URLs to the correct URL structure of mydomain.com/categoryA/productname, or would you leave it as it is?
Just interested in the extent of the issues this could cause in the future, and whether it is worth resolving sooner rather than later.
-
Thanks for suggestions, appreciated!
-
Yes, exactly that structure.
-
That is the best answer, you are right. Maybe I would first contact the sites that link to mine.
-
Here is what I would (probably) do...
If there is just one subdomain, I would 301 redirect it to the root domain.
If there are multiple subdomains, I would redirect each of them to a folder.
I would then try to get the inbound links changed so they point directly at the proper content in the proper folder.
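A minimal sketch of the single-subdomain case in .htaccess (assuming Apache with mod_rewrite; "subdomain" and the paths are the placeholders from the question, not real names):

```apache
# Placed in the subdomain's document root. Every request to
# subdomain.mydomain.com/categoryA/productname is permanently (301)
# redirected to the matching path on the root domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ https://mydomain.com/$1 [R=301,L]
```

For the multiple-subdomain case, you would add one RewriteCond/RewriteRule pair per subdomain, mapping each to its destination folder on the root domain.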
-
Great question Kerry,
A key factor is how many subdomains you actually have. If it's not above 50, I think interlinking the content well should be enough. There are sites that use lots of subdomains and are successful (look at craigslist.org, for example). If you have a lot more subdomains, I would investigate the links further: how many of them actually point to subdomains? If you 301 redirect, those links will lose roughly 10-15% of the link juice they currently pass, so you can estimate the likely fallout before deciding.
Hope I could give you some points to consider; regretfully, a decision like the one you are facing is never easy. I hope you will be able to solve the problem.
-
Hi there,
So just to make sure. The structure would be as follows:
shop.mydomain.com/categoryA/productname
shop.mydomain.com/categoryB/productname1
???
Related Questions
-
Do I submit a sitemap for a highly dynamic site or not? If so, what's the best way to go about doing it?
I do SEO for an online boutique marketplace. I've been here for about 4 weeks and no one has done their SEO (they've been around for about 5 years), so there's lots to do. A big concern is whether or not to submit a sitemap, and if I do submit one, what's the best way to go about it.
Technical SEO | | Jane.com0 -
How bad is it to have duplicate content across http:// and https:// versions of the site?
A lot of pages on our website are currently indexed on both their http:// and https:// URLs. I realise that this is a duplicate content problem, but how major an issue is this in practice? Also, am I right in saying that the best solution would be to use rel canonical tags to highlight the https pages as the canonical versions?
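For what it's worth, the rel canonical approach can be sketched like this (the URL is a placeholder, not the site's real domain):

```html
<!-- In the <head> of every page: both the http:// and https:// copies
     of a page point at the same https:// canonical URL. -->
<link rel="canonical" href="https://www.example.com/some-page/" />
```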
Technical SEO | | RG_SEO0 -
How does Google deal with licensed content that appears on both the vendor's and the client's websites? Will Google penalize the client's site for this?
One of my clients bought licensed content from a top vendor in the health industry. The same content is on the vendor's website and on my client's site, but on my client's site there is a link back to the vendor which makes it clear to anyone that this is licensed content bought from that vendor. My client paid for top-quality content from the best source in the industry, but at the same time it is also published on the vendor's website. Will Google penalize my client's website for this? The niche is HEALTH.
Technical SEO | | sourabhrana1 -
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since it comes with both of the above-mentioned tools. I know about DeepCrawl.com also, but this one is paid, and it would be very expensive for this number of pages and websites (5 million urls is $1750 per month; I could get a better deal on multiple websites, but this obviously does not make sense to me, it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best, and most time-efficient, way to work on something like this? Are there any other options? Thanks.
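If a free tool falls short at this scale, extracting the title tag itself is simple enough to script. A minimal stdlib-only sketch (no crawling or fetching shown; you would feed it the HTML you download for each URL):

```python
from html.parser import HTMLParser


class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element."""

    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = None

    def handle_starttag(self, tag, attrs):
        # Only capture the first <title> encountered.
        if tag == "title" and self.title is None:
            self._in_title = True
            self.title = ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def extract_title(html):
    """Return the page's <title> text, or None if there is no title."""
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip() if parser.title else None
```

You would pair this with whatever fetching approach fits your setup (e.g. a queue of URLs downloaded with urllib or an async client), writing one title per URL to disk.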
Technical SEO | | blrs120 -
Sitemap issue? 404s & 503s are regenerating?
I am using the WordPress SEO plugin by Yoast to generate a sitemap on http://www.atozqualityfencing.com. Last month, I had an associate create redirects for over 200 404 errors. She did this via the .htaccess file. Today, there are the same number of 404s, along with a number of 503 errors. This new WordPress website was built in a subdirectory and made live by adding some code to the .htaccess file in order to direct browsers to the content we wanted live. In other words, the content actually resides in a subdirectory titled "newsite" but is served on the main URL. Can you tell me why we are getting these 404 & 503 errors? I have no idea where to begin looking.
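For reference, one-off 301s in .htaccess look like the lines below (assuming Apache; the paths here are made-up placeholders, not the site's real URLs), and a rule that no longer matches anything, or conflicts with the subdirectory rewrite, is a common reason redirects appear to "stop working":

```apache
# .htaccess in the site root. Each retired URL gets its own permanent
# redirect; these must come before the subdirectory rewrite rules.
Redirect 301 /old-gallery/ http://www.atozqualityfencing.com/gallery/
Redirect 301 /old-contact.html http://www.atozqualityfencing.com/contact/
```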
Technical SEO | | JanetJ0 -
Is a newly created page's PageRank 1?
Hey I just want to know,
Technical SEO | | atakala
If I create a web page, would the PageRank of that page be 1?1
How to fix duplicate content errors with Go Daddy Site
I have a friend who uses a free GoDaddy template for his business website. I ran his site through Moz Crawl Diagnostics, and wow, 395 errors, mostly duplicate content and duplicate page titles. I dug further and found the site was doing this: URL: www.businessname.com/page1.php and the duplicate: businessname.com/page1.php. Essentially, the duplicate is missing the www, and it does this two hundred times. How do I explain to him what is happening?
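A common fix, sketched below under the assumption that Apache with mod_rewrite is available on the host (GoDaddy setups vary), is to 301 the bare domain to the www version so only one hostname serves each page:

```apache
# .htaccess: canonicalize the hostname with a single permanent redirect,
# collapsing businessname.com/... onto www.businessname.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^businessname\.com$ [NC]
RewriteRule ^(.*)$ http://www.businessname.com/$1 [R=301,L]
```

One rule like this fixes all two hundred duplicates at once, since every non-www URL is redirected regardless of path.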
Technical SEO | | cschwartzel0 -
Why are my URLs changing?
My rankings suddenly dropped, and when trying to understand why, I realized that nearly all images in Google's cached version of my site were missing. On the actual site they appear, but in the cached version they don't. I noticed that most of the images had a ?6b5830 at the end of the URL, and these were the images that were not showing. I am hoping that I have found the reason for the drop in rankings: maybe since Google cannot see a lot of the content, it decided not to rank it as well (particularly since it seems to happen on thousands of pages). This is a cached version of my site. I am using the following plugins that might be causing it: Yoast's SEO plugin and W3 Total Cache. Does anyone know what is causing ?6b5830 to be added to the end of most of my URLs? Could this be the reason for the ranking drop? Thanks in advance!
Technical SEO | | JillB20130