TLDs vs ccTLDs?
-
*Was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came.
So the question is about best practices for TLDs vs ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, there is sometimes overlap, as the same content shows up on both ccTLDs.
What is best practice here? It doesn't look like my ccTLDs are getting any juice from the .com. Should I just combine my ccTLDs into my TLD as subdomains? Will I see any benefits?
Thanks
V
-
Thanks, Jane, that's a much better answer/example!
-
Hi again,
Sorry it has taken a few days to get back to you. I replied in the other thread about ccTLDs versus using one site. Some additional info: in general, you will have an easier time using the subfolder structure recommended in the other question (again, as long as there are no factors which make it important to have country-specific domains). The Singapore and Indonesian sections of the website will naturally inherit authority because they sit on the strong .com; merely being linked to by the .com, as your separate ccTLDs are now, isn't enough to give them such a large boost.
Apple uses this strategy for internationalisation: http://www.apple.com/uk/ for the UK, http://www.apple.com/nz/ for New Zealand, http://www.apple.com/sg/ for Singapore and so forth.
On the other hand, BlackBerry uses subdomains: http://uk.blackberry.com/ and http://sg.blackberry.com/.
Amazon obviously uses ccTLDs.
All of these domains are hellishly strong in their own right; traditionally, the thinking has been that it's best to use one site, as Apple does, if you are not a mammoth already. However, you can make the other options work with good link development. I think in your case, one domain is something to seriously consider.
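If you do consolidate, the usual mechanics are to 301-redirect every ccTLD URL to its equivalent path under the .com, preserving paths and query strings so existing links keep resolving. A minimal sketch of that mapping logic, using placeholder hostnames rather than the asker's real domains:

```python
from urllib.parse import urlsplit, urlunsplit

# Map each country-code host to its new subfolder on the consolidated .com.
# Hostnames here are placeholders, not the asker's actual domains.
CCTLD_TO_FOLDER = {
    "www.example.co.id": "/id",
    "www.example.com.sg": "/sg",
}

def redirect_target(old_url: str) -> str:
    """Return the 301 target on the consolidated .com for a ccTLD URL,
    keeping the original path and query string intact."""
    parts = urlsplit(old_url)
    folder = CCTLD_TO_FOLDER[parts.netloc]
    return urlunsplit(("https", "www.example.com",
                       folder + parts.path, parts.query, ""))
```

For example, `redirect_target("http://www.example.com.sg/products?page=2")` yields `https://www.example.com/sg/products?page=2`; the per-URL mapping (rather than a blanket redirect to the homepage) is what lets the new subfolders inherit the link equity the old ccTLD pages had earned.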
Related Questions
-
HTTP vs. HTTPS - duplicate content
Hi, I have recently come across a new issue on our site, where HTTPS and HTTP titles are showing as duplicates. I read https://moz.com/community/q/duplicate-content-and-http-and-https; however, since HTTPS is now a ranking factor, blocking it can't be a good thing, can it? We aren't in a position to roll out HTTPS everywhere, so what would be the best thing to do next? I thought about implementing canonicals? Thank you
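The canonical route the asker mentions is the usual stopgap: both the HTTP and HTTPS copies of a page point their canonical at the single version you can fully support, so only that one gets indexed. A sketch with a placeholder URL:

```html
<!-- Served in the <head> of BOTH the http:// and https:// copies of the page. -->
<!-- example.com is a placeholder; the href should be the one protocol version -->
<!-- you can support site-wide, so duplicates consolidate onto it.             -->
<link rel="canonical" href="http://www.example.com/page" />
```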
Intermediate & Advanced SEO | BeckyKey
-
Desktop vs. Mobile Results
When googling "wedding invitations" on www.google.ca in my own geo market of Toronto, my site, www.stephita.com, shows up in different SERP positions on desktop (Chrome & IE) vs. mobile (iPad, iPhone, Android, etc.). On the desktop SERP, I show up in position 6/7, which is a relatively new position; for the past 3 weeks I was on page 2, and after a bunch of SEO fixes I've managed to propel my site back to page 1! On the mobile SERP, I only show up in position 1/2 on PAGE 2 😞 As I mentioned above, I did a bunch of SEO fixes that I think were related to the Panda/Penguin algos, so I'm wondering why my MOBILE SERP has NOT improved along the way. What should I be looking at to fix this 5-6 position differential? Thanks all!
Intermediate & Advanced SEO | TysonWong
-
508 compliance vs good SEO re: Image alt tags
I'm currently in a debate with our 508 compliance team over the use of alt tags on images. For SEO, it is best practice to use alt tags so that readers can tell what the image represents. However, they are arguing that these images should NOT have alt text, as it adds nothing for a disability screen reader: the image text would be repetitive with the text on the page. I feel they are taking the "decorative" image concept in 508 compliance too far. Its intention is for images such as bullets that truly are decorative in nature and add no benefit to the reader. What are the community's thoughts on this? Have you ever run into a scenario where 508 compliance threatened to ruin SEO? Usually the two play nicely.
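One way to reconcile the two camps, sketched here with placeholder filenames: images that carry meaning get descriptive alt text (good for SEO and for screen-reader users), while genuinely decorative images get an empty alt attribute, which screen readers skip without the element failing validation:

```html
<!-- Informative image: alt text describes what the image conveys -->
<img src="/img/team-photo.jpg" alt="The support team at the 2014 conference" />

<!-- Truly decorative image (divider, bullet): empty alt so screen readers skip it -->
<img src="/img/divider.png" alt="" />
```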
Intermediate & Advanced SEO | jpfleiderer
-
Pagination Question: Google's 'rel=prev & rel=next' vs JavaScript Refresh
We currently have all content on one URL and use # plus a JavaScript refresh to paginate pages, and we are wondering whether we will see an improvement in traffic if we transition to Google's recommended pagination. Has anyone gone through a similar transition? What was the result? Did you see an improvement in traffic?
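For reference, Google's recommended pagination markup at the time this was asked puts rel="prev"/rel="next" links in the `<head>` of each page in the series, with each page on its own crawlable URL; for page 2 it would look something like this (URLs are placeholders):

```html
<!-- In the <head> of http://www.example.com/results?page=2 -->
<link rel="prev" href="http://www.example.com/results?page=1" />
<link rel="next" href="http://www.example.com/results?page=3" />
```

The practical difference from the #/JavaScript approach is that each page of results becomes a distinct URL Google can crawl and index, rather than one URL whose content changes client-side.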
Intermediate & Advanced SEO | nicole.healthline
-
Mobile SEO vs. normal SEO?
Hi everyone, I wanted to ask your opinion on mobile SEO. Do we already have two different indices, one for mobile and one for desktop? Except for a few mobile listings, I don't see a difference yet. If yes, do I need to do special mobile SEO for my site, or is it enough to have e.g. a responsive web design which detects the device and shows a different page? Are there any other extra mobile SEO measures that should be considered? I know of the mobile sitemap and directories, but is there anything else? Best regards
Intermediate & Advanced SEO | CrazySEO
-
Sitewide vs. Homepage Links for a Network of Sites
I wanted to sitewide-link a few sites together, as they are sort of in the same network of ownership, and wanted some advice:
1x PR1
2x PR2
2x PR3
Would it be best to just get homepage links before the footer (the links will be within a paragraph of text), or just sitewide-link them in the footer with a heading of "Our Shopping Network"?
Intermediate & Advanced SEO | upick-162391
-
Site Architecture: Cross Linking vs. Siloing
I'm curious to know what other mozzers think about silos... Can we first all agree that a flat site architecture is best practice? Relevant pages should be grouped together. Shorter, broader, and (usually) therefore higher-volume keywords should be towards the top of each category. Navigation should flow from general to specific. Agreed? As Google says on page 10 of their SEO Starter Guide, "you should think about how visitors will go from a general page (your root page) to a page containing more specific content." OK, we all agree so far, right? Great! Enter my question: Bruce Clay (among others) seems to recommend siloing as a best practice, while Richard Baxter (and many others @ SEOmoz) seem to view silos as a problem. Me? I've practiced (relevant) internal cross linking, and have intentionally avoided siloing in almost all cases. What about you? Is there a time and place to use silos? If so, when and where? If not, how do we rectify the seemingly huge differences of opinion between expert folks such as Baxter and Clay?
Intermediate & Advanced SEO | DonnieCooper
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that (a) we're blocking the flow of link juice, and (b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to (b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts?
Kurus
Intermediate & Advanced SEO | kurus
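The kind of restriction described above would look something like this in robots.txt (paths and parameter names are hypothetical; Googlebot honours the `*` wildcard). Note the trade-off the question raises: disallowed URLs are not crawled, so any link juice flowing through them, and any content depth behind them, is invisible to Google:

```text
User-agent: *
# Block parameter-based variants of search results (sort order, pagination)
Disallow: /*?*sort=
Disallow: /*?*page=
```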