ccTLDs vs. folders
-
My company is looking at expanding internationally; we currently have subdomains for the UK and Canada. I'm making recommendations on improving SEO, and one part I'm struggling with is weighing the benefits of ccTLDs against using folders.
I know the basic argument about Google recognizing ccTLDs as geo-specific, so they get priority. But I'd like to know HOW much priority they get. We have unique keywords and a pretty strong domain; is a ccTLD so much better that it would be worth going that route rather than creating folders within our current domain?
Thanks,
Jacob
-
Hi Jacob,
Use subfolders. Remember to use the hreflang tag, including the country code.
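A minimal sketch of what those hreflang annotations look like for locale subfolders on a single domain. The domain `yoursite.com` and the locale list are placeholders for your own setup; each page version must list all alternates (including itself) plus an `x-default` fallback.

```python
# Placeholder locale map: hreflang code -> base URL of that locale's subfolder.
LOCALES = {
    "en-us": "https://yoursite.com/",      # default / US version
    "en-gb": "https://yoursite.com/uk/",
    "en-ca": "https://yoursite.com/ca/",
}

def hreflang_tags(path=""):
    """Return the <link> tags every locale version of a page should carry.

    Each version lists ALL alternates, including itself, plus an
    x-default pointing at the fallback version.
    """
    tags = [
        f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />'
        for code, base in LOCALES.items()
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="https://yoursite.com/{path}" />'
    )
    return "\n".join(tags)

print(hreflang_tags("pricing"))
```

The same set of tags goes into the `<head>` of the US, UK, and Canadian versions of the page, so each version points back at all the others.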
If you have the ccTLD domains, redirect them to the subfolder.
For example: if you have yoursite.co.uk, point it to yoursite.com/uk/. Also, remember to add every subfolder to Google Search Console (formerly Google Webmaster Tools) as its own property and declare, for each one, the country it is intended for.
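A sketch of that ccTLD-to-subfolder mapping, assuming the hostnames above; the actual 301 would live in your web server or CDN config, but the rewrite logic is the same, and the path and query string should always be preserved.

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder hostnames: each ccTLD maps to its subfolder on the main domain.
CCTLD_TO_FOLDER = {
    "yoursite.co.uk": "/uk",
    "yoursite.ca": "/ca",
}

def redirect_target(url):
    """Return the yoursite.com URL a ccTLD request should 301 to,
    preserving the original path and query string."""
    parts = urlsplit(url)
    folder = CCTLD_TO_FOLDER.get(parts.hostname)
    if folder is None:
        return None  # not one of our ccTLDs; no redirect
    return urlunsplit(
        ("https", "yoursite.com", folder + parts.path, parts.query, "")
    )

print(redirect_target("http://yoursite.co.uk/pricing?ref=ad"))
# -> https://yoursite.com/uk/pricing?ref=ad
```

Redirecting page-for-page like this (rather than sending everything to the subfolder's home page) keeps any existing links to the ccTLD pages pointing at their equivalent content.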
Hope it helps.
GR. -
There definitely is a benefit to keeping all of your content on one domain (using folders) and building up the overall Domain Authority of a single domain/site.
When it comes to deciding whether or not to go with a ccTLD, consider your users/visitors first. How will they interact with the site? Will they trust it more if it's on a ccTLD for their country? If so, that trust will ultimately be better for your business.
Another consideration is that you'll be creating an entirely new site on a ccTLD. You'll be starting fresh and will need links and time to get it to rank and bring traffic to where you need it to be. Then there's the issue of content: you'll need unique content for the new site. If you can afford the time and effort involved in building a completely new site, and it makes sense for your users, then I would consider the ccTLD route.