Domains and subdomains
-
When I started a campaign for my website, I got the message:
"We have detected that the domain www.vamospaella.com and the domain vamospaella.com both respond to web requests and do not redirect. Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here."
I wasn't sure whether I had said it was a subdomain when in fact it was a domain (or the other way round), so I started another campaign for the same website using the other option and the message didn't come up.
However, I still don't understand what you meant by this and whether it's an issue. When I search for my website in Google, it shows as vamospaella.com, whereas other websites come up as www. followed by their domain name.
If it is a problem, is it to do with my hosting package and how it's set up or is it to do with my local site on my computer? I did ring my web host, 1&1, but they said they couldn't see a problem. Please can you let me know how I can resolve this as my ranking is still quite low in Google and I'm not sure why. If it is because of "twin domains", then will Google see my content as duplicated and keep me low in their rankings?
I'm new to SEO and not a website novice, so please answer in lay terms!
Thanks
Melissa
-
Hey Brent,
This has really perplexed me, but out of the blue, as I started to think about it, I may have come to a conclusion. I am running my content in inline iframes. Could this be the reason why I am getting the above errors? My site is http://vortexcleaning.com. Any help here is appreciated, and thanks in advance.
-
Melissa,
Brent tells you exactly what is happening on your site.
The only thing I have to add is that you are also causing yourself some problems.
Your HOME link points to this:
http://vamospaella.com/index.php
So that means you are telling all the search engines and visitors that http://vamospaella.com/ is not your home page, but http://vamospaella.com/index.php is.
So you need to stop doing that. Make sure your internal links only point to one URL.
(as well as putting in place the 301 redirects that ensure your site responds in only one consistent way.)
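As a sketch of those 301s, assuming the site runs on Apache with mod_rewrite available (worth confirming with 1&1, since this depends on the hosting setup), an .htaccess file in the web root along these lines would make both fixes:

```apache
RewriteEngine On

# 1. Collapse the "twin" domains: send www.vamospaella.com to vamospaella.com
RewriteCond %{HTTP_HOST} ^www\.vamospaella\.com$ [NC]
RewriteRule ^(.*)$ http://vamospaella.com/$1 [R=301,L]

# 2. Send direct requests for /index.php to the root URL.
#    The THE_REQUEST condition matches only the URL the browser actually
#    asked for, which avoids a redirect loop if a CMS internally rewrites / to index.php.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
RewriteRule ^index\.php$ http://vamospaella.com/ [R=301,L]
```

Either domain version can be the canonical one; the important part is picking one and 301-redirecting the other, which is exactly what the campaign warning is asking for.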
-
Yes, this is a problem, since you are able to access the home page of your website via two different URLs, with and without www. This issue can be resolved with canonicalization.
More information about Canonicalization: http://www.seomoz.org/learn-seo/canonicalization
From the article:
SEO Best Practice
For SEOs, canonicalization refers to individual web pages that can be loaded from multiple URLs. This is a problem because when multiple pages have the same content but different URLs, links that are intended to go to the same page get split up among multiple URLs. This means that the popularity of the pages gets split up. Unfortunately for web developers, this happens far too often because the default settings for web servers create this problem. The following lists show the most common canonicalization errors that can be produced when using the default settings on the two most common web servers:
Apache web server:
- http://www.example.com/
- http://www.example.com/index.html
- http://example.com/
- http://example.com/index.html
Microsoft Internet Information Services (IIS):
- http://www.example.com/
- http://www.example.com/default.asp (or .aspx depending on the version)
- http://example.com/
- http://example.com/default.asp (or .aspx)
- or any combination with different capitalization.
Each of these URLs spreads out the value of inbound links to the homepage. This means that if inbound links point to these various URLs, the major search engines credit each URL separately, not in a combined manner.
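For the IIS case listed above, a sketch of the equivalent 301 (assuming Microsoft's URL Rewrite module is installed; the rule name and the choice of www as the canonical host are illustrative) in web.config:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Permanently redirect the bare domain to the www version -->
        <rule name="Canonical host" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```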
Related Questions
-
Reason for a reduced Domain Authority?
Hi,
I have a website whose Domain Authority was 10, but today it has dropped to 5. I wanted to know the reason for this change. No backlinks to my site have disappeared (if anything, I have gained more), yet my Domain Authority is now 5. Can you guide me? My website, to troubleshoot: bonianservice.com. Thank you very much.
Technical SEO | | 5mizo
-
Change theme or domain first?
I bought a new domain and I want to use it instead of my current domain for my website. I also want to change my theme. Which should I do first? At the moment, I have my new domain forwarded to my current domain.
Technical SEO | | tfuentez
-
Domain forwarding or redirects for SEO?
Hi all! A client of mine owns several top-level domains which are not in use; let's call them example.nu, example.de, example.net, and so on. The current website is example.com.
When checking the technical status of the unused domains, I realized that all but one are forwarded (via DNS) to example.com, and only one has a 301 redirect. Should I redirect all of them by means of a 301, or let them stay forwarded? Very few of the domains have any other sites linking to them. Any thoughts would be really appreciated! Jesper
Technical SEO | | JHultqvist
-
Why Are Some Pages On A New Domain Not Being Indexed?
Background: A company I am working with recently consolidated content from several existing domains into one new domain. Each of the old domains focused on a vertical, and each had a number of product pages and a number of blog pages; these are now in directories on the new domain. For example, what was www.verticaldomainone.com/products/productname is now www.newdomain.com/verticalone/products/productname, and the blog posts have moved from www.verticaldomaintwo.com/blog/blogpost to www.newdomain.com/verticaltwo/blog/blogpost. Many of those pages used to rank in the SERPs, but they now do not.
Investigation so far: Looking at Search Console's crawl stats, most of the product pages and blog posts do not appear to be being indexed. This is confirmed by using the site: search modifier, which only returns a couple of products and a couple of blog posts in each vertical. Those pages are not the same as the pages with backlinks pointing directly at them. I've investigated the obvious points without success so far:
- There are a couple of issues with 301s that I am working with them to rectify, but I have checked all pages on the old site and most redirects are in place and working
- There is currently no HTML or XML sitemap for the new site (this will be put in place soon), but I don't think this is an issue, since a few products are being indexed and appearing in SERPs
- Search Console is returning no crawl errors, manual penalties, or anything else adverse
- Every product page is linked to from the /course page for the relevant vertical through a followed link
- None of the pages have a noindex tag on them, and the robots.txt allows all crawlers to access all pages
One thing to note is that the site is built using React, so all content is rendered from within app.js. However, this does not appear to affect pages higher up the navigation tree, like the /vertical/products pages or the home page.
So the question is: "Why might product and blog pages not be indexed on the new domain when they were previously, and what can I do about it?"
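Since the site is React-rendered, one quick self-serve check (a hypothetical helper, not something from the thread) is whether the raw HTML a non-JavaScript client receives contains any visible content at all, or only an empty mount point that depends on app.js executing:

```python
import re

def looks_like_empty_react_shell(html: str) -> bool:
    """Heuristic: after stripping scripts and tags, does the <body> contain
    any visible text? If not, crawlers that don't execute JavaScript see
    an empty shell, which can explain missing pages in the index."""
    body = re.search(r"<body[^>]*>(.*?)</body>", html, re.S | re.I)
    if not body:
        return True
    inner = re.sub(r"<script\b.*?</script>", "", body.group(1), flags=re.S | re.I)
    text = re.sub(r"<[^>]+>", " ", inner)
    return not text.strip()

# An SPA shell vs. a server-rendered page with real content:
shell = '<html><body><div id="root"></div><script src="app.js"></script></body></html>'
rendered = "<html><body><h1>Blue Widget</h1><p>Product copy here.</p></body></html>"
print(looks_like_empty_react_shell(shell))     # True
print(looks_like_empty_react_shell(rendered))  # False
```

If the deep product and blog URLs come back as empty shells while the home and /vertical/products pages do not, rendering would be a plausible suspect.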
Technical SEO | | BenjaminMorel
-
One server, two domains - robots.txt allow for one domain but not other?
Hello, I would like to create a single server with two domains pointing to it. Ex: domain1.com -> myserver.com/, domain2.com -> myserver.com/subfolder. The goal is to create two separate sites on one server. I would like the second domain (/subfolder) to be fully indexed / SEO-friendly and have its robots.txt file allow search bots to crawl. However, the first domain (server root) I would like to keep non-indexed, with its robots.txt file disallowing any bots / indexing. Does anyone have any suggestions for the best way to tackle this one? Thanks!
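A sketch of one common approach, assuming Apache with mod_rewrite and illustrative file names: since robots.txt is evaluated per host, the server can hand out a different robots.txt body depending on which domain was requested:

```apache
RewriteEngine On

# Requests for domain1.com's robots.txt get a disallow-all file
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-disallow.txt [L]

# Requests for domain2.com's robots.txt get the normal, crawlable file
RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.com$ [NC]
RewriteRule ^robots\.txt$ /subfolder/robots-allow.txt [L]
```

Keep in mind that Disallow only stops crawling; if the root-domain URLs must stay out of the index entirely, a noindex signal (meta robots tag or X-Robots-Tag header) on those pages is the stronger tool.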
Technical SEO | | Dave100
-
Domain hacked and redirected to another domain
Two weeks ago my home page, plus some other pages, had a 301 redirect to a cloned domain for about a week (due to a hack). The original pages were then de-indexed, and the new bad domain was indexed, in effect stealing my rankings. Then the 301 was removed/cleaned from my domain, and the bad domain was fully de-indexed via a request I made in WMT (this was a week ago). Then my pages came back into the index, but without any ranking power (as if they were just in the supplemental index). It's been like this for a week now and the algorithms have not been able to correct it. So how do I get this damage undone or corrected? Can someone at Google reverse/cancel the 301 ranking transfer, since the algorithms don't seem able to? I have the option to do a "Change of Address" in WMT from the bad domain to my domain, but I don't think this would work properly, because it says I also need to place a 301 on the bad domain back to mine. Would a change of address still work without the 301? Please advise on what to do in order to get my rankings back to where they were.
Technical SEO | | Dantek
-
Localized domains and duplicate content
Hey guys, in my company we are launching a new website, and there's an issue that has been bothering me for a while. I'm sure you guys can help me out. I already have a website; let's say ABC.com. I'm preparing a localized version of that website for the UK, so we'll launch ABC.co.uk. Basically the websites are going to be exactly the same, with the exception of the homepage; they have a slightly different proposition. Using GeoIP, I will redirect the UK traffic to ABC.co.uk, and the rest of the traffic will still visit the .com website. Might Google penalize this? The site itself will be almost the same apart from the homepage. This may count as duplicate content, even though I'm geo-targeting different regions so they will never overlap. Thanks in advance for your advice.
Technical SEO | | fabrizzio
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing a robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead9
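For what it's worth, robots.txt is fetched and evaluated per host, so a disallow-all file served only on staging.domain.com cannot block www.domain.com. If the staging subdomain has its own Apache virtual host (an assumption about the setup; file paths are illustrative), a sketch that blocks it without touching any pages:

```apache
# Inside the staging.domain.com <VirtualHost> only:

# Serve a staging-specific disallow-all robots.txt (requires mod_alias)
Alias /robots.txt /var/www/staging/robots-staging.txt

# Belt and braces: send a noindex header on every response.
# This is equivalent to a meta robots tag but needs no page edits.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

The www virtual host keeps its own robots.txt and headers, so it is unaffected.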