Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Can subdomains avoid spam penalizations?
-
Hello everyone,
I have a basic question for which I couldn't find a definitive answer.
Let's say I have my main website with URL:
And I have a related affiliate website with URL:
The affiliate site contains completely different content from the main website, and the two hostnames have two different IP addresses.
Are those considered two completely separate domains by Google? Can bad links pointing to affiliates.mywebsite.com affect www.mywebsite.com in any way?
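As a quick sanity check before worrying about how Google sees them, you can verify whether two hostnames really do resolve to different addresses with a DNS lookup. A minimal sketch in Python (the hostnames in the commented example are the placeholders from the question, not real sites):

```python
import socket

def resolve(hostname: str) -> str:
    """Return the IPv4 address the hostname currently resolves to."""
    return socket.gethostbyname(hostname)

def on_different_ips(host_a: str, host_b: str) -> bool:
    """True if the two hostnames resolve to different IPv4 addresses."""
    return resolve(host_a) != resolve(host_b)

# Example (needs network access):
# on_different_ips("www.mywebsite.com", "affiliates.mywebsite.com")
```

Note that a shared IP is only one of several signals; two hosts on different IPs can still be obviously related through DNS, WHOIS, and the domain name itself.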
Thanks in advance for any answer to my inquiry!
-
Sure, I understand, that makes sense. Thank you for your help!
-
Hi Fabrizo,
As Joshua Belland answered above, you will need to be careful with how you plan it out.
The IP and DNS should be on different servers.
Also be careful about how you interlink the two sites.
Regards,
Vijay
-
Sorry guys, I wasn't clear enough with my first question above; it was actually too generic.
To cut to the chase, I am talking about our main website:
www.virtualsheetmusic.com (IP 66.29.153.48)
and our affiliate website which is:
affiliates.virtualsheetmusic.com (IP 66.29.153.50)
They have two different IPs, but they are on the same server and the same network, so of course they are on the same IP block.
And I'd like to know to what extent the activity/status of one site can affect the other. From what you are asking, I guess they could affect each other to some extent; I mean, Google could recognize that they are part of the same "network" and associate them anyway... right?
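Side note: whether two addresses sit in the same block is easy to verify programmatically. A small sketch using Python's standard ipaddress module, with the two IPs quoted above (a /24 here stands in for the old "class C" block discussed below):

```python
import ipaddress

def same_block(ip_a: str, ip_b: str, prefix: int = 24) -> bool:
    """True if both addresses fall inside the same /prefix network."""
    net = ipaddress.ip_network(f"{ip_a}/{prefix}", strict=False)
    return ipaddress.ip_address(ip_b) in net

# The two IPs from this thread share a /24:
print(same_block("66.29.153.48", "66.29.153.50"))  # True
```

Passing a larger prefix (e.g. `prefix=8`) checks the coarser "class A" style grouping instead.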
-
Are these subdomain properties on different class A IP blocks or different class C IP blocks?
I think this all depends. If the IP addresses are in the same neighborhood, or on the same subnets as each other, then I would say yes. But beyond that, you have to think about several other footprints to look for:
- Are the nameservers the same?
- Are these ip addresses assigned to different regions?
- Are you interlinking these web properties?
- Even the fact that the subdomain is still associated with the domain makes me nervous, if only because that is easy for Google to track. If you think about how many other data points they use to find footprints in their algorithm, I don't see why this wouldn't be one of them.
I would be careful, with RankBrain continuously evolving and with how much turbulence there has been in the SERPs lately. Personally, my small PBN is entirely on separate class A IPs, with custom nameservers and different hosts, and I only put premium content on it. It's not great for quick affiliate gigs, but it certainly helps sustain long-term growth.
-
Hi Fabrizo,
Yes, they would be treated as different entities. As a precaution, I would recommend that the servers' IPs be geographically far apart and not from the same IP block.
Thanks,
Vijay
-
Thank you Vijay for your extensive answer, but as I wrote above, each subdomain has its own separate IP address. So, if each subdomain has its own IP address, are they treated as two completely different websites?
-
Hi Fabrizo,
A subdomain is treated as a different entity; however, since it comes from the same IP, it's risky to create backlinks between the main site and the subdomain. Let me try to answer your question with an example where we experimented with linking a subdomain and a main site; it should help you understand how Google treats them as different entities.
We had a client who ran a donation campaign for his project from his subdomain and used the main domain for commercial purposes.
He was linking both domains with reciprocal links, sending traffic to the donation subdomain from the main site and vice versa. The results were shocking: the donation site was ranking far better than the main website, even on commercial keywords. We did a deeper analysis and found that the donation site was outperforming the main website in terms of high-authority contextual backlinks. After some time, the main site's organic traffic and rankings started dipping further; we analysed the situation and concluded that the reciprocal linking was the source of the problem.
We had to make a choice: either remove the reciprocal backlinks or test the subdomain on a separate IP. First, we removed the reciprocal links (even though the client wasn't easily convinced), just to prove to him that the subdomain links were causing the problem. The results were good: the main site recovered its rankings and traffic (we also implemented our planned off-page work for both sites).
This helped us conclude that the same IP plus a subdomain was an issue, but we weren't sure whether moving to another server would help (not just a new IP; we had made clear to the hosting company that we wanted a separate location for the server IP). We shifted the IP first and then watched the results: the donation site steadily improved on donation-related keywords and dipped on commercial keywords, while the main website slowly crept up in the rankings for commercial keywords (they were medium-to-high-competition keywords).
We made it clear to the client that this time the links would not be reciprocal, and that he had to decide his priority: which site gets the dofollow links and which gets nofollow. The client wanted dofollow backlinks from the donation site to the main site, so we created those. This further improved the commercial website's rankings; we are still running the websites in the same mode and the results are good.
I hope this answers your query and helps you make a decision. If you have further questions, please feel free to respond and ask.
Regards,
Vijay