Factors that affect Google.com vs .ca
-
Though my company is based in Canada, we have a .com URL, we're hosted on servers in the U.S., and most of our customers are in the U.S. Our marketing efforts are focused on the U.S. Heck, we even drop the "u" in "colour" and "favour"!
Nonetheless, we rank very well on Google.ca and rather poorly on Google.com.
One hypothesis is that we have more backlinks from .ca domains than from .com, but I don't believe that to be true. For sure, the highest-quality links we have come from .coms like NYTimes.com.
Any suggestions on how we can improve the .com rankings, other than keeping on with the link building?
-
Thanks for letting us know how things worked out, Aspirant.
Andy
-
Final verdict:
I took the plunge. Even though our product is geography-agnostic, I changed our Webmaster Tools setting to "U.S."
Sure enough, we immediately saw some improvements in the google.COM rankings. There wasn't much of an impact on .CA, and any loss there was definitely made up for by the new .COM traffic.
I'll be doing a deeper dive into the data later.
Thanks everyone.
-
Hey Rob,
I have a bit of experience with this - I had a Canadian-based site that wanted to target the States. We were ranking well on .CA and not so well on .COM. I actually did this in WMT for a site - set geo-targeting to USA - and after a week or so started noticing a huge jump on .COM for a lot of keywords. What was great was that the rankings on .CA stayed consistent.
The only drop I noticed was in the .CA (Canada Only) searches. These completely dropped off the map, but normal searches on google.ca were fine. I don't know if this will always happen, but this is my experience.
-
I had exactly the same thing with a Spanish site of mine (.es). For a long time I was first on google.com but nowhere to be found on google.es. Everybody kept telling me that this was because I had a lot of .com links and none were .es. But as time passed, without any link changes, the keywords ranked well on google.es too. So could it be that some countries are just a few months behind?
-
I have noticed that getting links from the appropriate TLD extension really determines where you rank in each country's Google SERPs.
You can search for sites related to yours on a specific TLD by putting inurl:.com in Google along with your keywords.
The same thing works for all other extensions.
This makes finding .edu link opportunities a breeze, for example:
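A few hypothetical queries of this form (the keywords are placeholders, not from the thread):

```
wheelchair trays inurl:.com        (restrict results to .com URLs)
student housing inurl:.ca          (find .ca pages on your topic)
scholarship resources inurl:.edu   (surface .edu link opportunities)
```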
Besides link building, you will want to make sure that in Webmaster Tools you have set your targeted country to the country you want to rank best in. For example, I have a site about college students that I've set to target the US, since Canada mostly calls post-secondary education "university" or "college" depending on the institution, so the audience there is split much more.
Hope this helps.
-
Sorry, I meant David Mihm -- oops!
-
I suspect having the settings in WMT set to the USA "might" hurt your performance in other areas; however, the small company website (which gets 90% of its business from the USA) I mentioned in my prior response has the setting set to USA, and it ranks #3 for its main search term on both .ca and .com. Having claimed a Local Places account might also be an issue. I'd suggest you contact either Todd Mihm (http://www.davidmihm.com/blog) or Mike Blumenthal (http://blumenthals.com/blog) for an answer to that question.
-
Thanks for the answer. A couple of questions come to mind:
Won't setting our Google Webmaster Tools target to the United States hurt our performance in other parts of the world? So far I've made a point of ensuring that Webmaster Tools lists us as not geo-specific ("Target users in: unlisted", on the Site Configuration > Settings screen).
Also (on the advice of another SEO advisor) we verified our Google Places location, so is there a risk of sending mixed signals to Google and getting hurt by that?
-
The competition is usually stronger in the USA (.com) arena than in Canada (.ca). I have a little company site (with little work done in the way of SEO) that ranks #3 on both .ca and .com for "wheelchair trays". You may want to adjust your settings in Google Webmaster Tools to ensure your site is set to United States rather than Canada. As David Kauzlaric has mentioned, you will definitely benefit from having more links from US-based sites - I'd focus on that as a first step.
-
Still no breakthroughs on this issue. Our performance keeps improving on .ca and .com, which is obviously good, but our ranking on .com is always very, very far behind our .ca performance.
It's still a mystery to me, given that most of the inbound links are from U.S.-based, .com websites.
The only answer that works in my mind is that .ca uses a different algorithm. But I'm still very interested in hearing other thoughts!
Thanks,
Rob
-
Hi Rob,
Have you seen any changes with your rankings on Google.ca and Google.com? Do you have any other questions or comments you can add to help others that may be in a similar situation?
Here's hoping you got to enjoy two long weekends in a row from both countries!
-
Agree.
We did a link building campaign for a German website (.de) and most of the links were from .com websites. The site started to rank very well on google.com, while google.de saw only minor impact. It's clear that the links should be from the same country zone if you want to rank in that particular area.
You should focus on links from .com domains - and that should be easier than building links from .ca.
You should also get a Google Maps account with your US location - if you have one. That alone should bring up your results in the US.
-
It's a pretty well-known fact that non-US versions of Google do not use the same algorithm and are therefore "behind". This could be a case where you are employing methods that were effective a couple of years ago and still work well on .CA, but not as well on .COM.
The biggest thing you can do is work on high-quality content and build links. Remember, linking is somewhere around 70% of the algorithm alone. Work on getting more authoritative .COM links from sites like NYT, USAToday, etc.
Also, if a good portion of your links are from .CA, that very well could affect it too!
Related Questions
-
Google Pagination Changes
With Google recently coming out and saying they're basically ignoring paginated pages, I'm considering the link structure of our new, soon-to-launch ecommerce site (moving from an old site to a new one with identical URL structure, less a few 404s). Currently our new site shows 20 products per page, but with this change by Google, any products on pages 2, 3 and so on will suffer, because Google treats each one as an entirely separate page as opposed to an extension of the first. The way I see it, I have one option: show every product in each category on page 1. I have Lazy Load installed on our new website, so it will only load the screen a user can see and load more products as they scroll down - but how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page load score because it doesn't know the Lazy Load is in place? Or will it know and account for it? Is there anything I'm missing?
Intermediate & Advanced SEO | moon-boots
-
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these docs public to get them indexed and provide a link back to our state pages? In theory it sounds like a great link building strategy... TIA!
Intermediate & Advanced SEO | LindsayE
-
Are backlinks the most important factor in SEO?
I have had an agency state that "backlinks are the most important factor in SEO". That is how they are justifying their strategy of approaching bloggers. I believe there are a lot more factors than that, including target market definition, keyword identification, and building content based on these factors. What's everyone's thoughts?
Intermediate & Advanced SEO | AndySalmons
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, those results are randomized or sliced/diced in different and unique ways, and they're updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).
Noindex disadvantages:
- Difficult to implement. The vehicle details pages are served via Ajax, so there is no page head to hold a meta robots tag; the solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on querystring variables (similar to a solution described on Stack Overflow). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex tag if it is blocked by robots.txt.
Hash (#) URL advantages:
- By using hash URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: the crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links that were getting robots.txt-disallowed pages indexed are gone.
- Accomplishes the same thing as nofollowing these links, but without looking like PageRank sculpting (?).
- Does not require complex Apache stuff.
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution". We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be as if these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed - it could easily get stuck/lost, it seems like a waste of resources, and in some shadowy way bad for SEO. My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like internal links structured this way.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
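For reference, minimal sketches of the first two mechanisms weighed above. The path and querystring parameter are hypothetical placeholders, not taken from the actual plugin:

```
# robots.txt - blocks crawling of a hypothetical details path (but, as noted
# above, does not by itself keep already-linked URLs out of the index)
User-agent: *
Disallow: /vehicle-details/
```

```apache
# Apache 2.4+ (mod_headers) - the X-Robots-Tag approach: send a noindex
# header when the querystring marks a vehicle details request
<If "%{QUERY_STRING} =~ /vehicle_id=/">
    Header set X-Robots-Tag "noindex, nofollow"
</If>
```

As the question notes, the second approach only works if the same URLs are not also blocked in robots.txt, since the headers of a blocked URL are never fetched.
-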
Does having a ? on the end of your URL affect your SEO?
I have some redirects that were done with a "?" at the end of the URL to include Google tracking codes (i.e., you click on an AdWords link and the Google code follows the redirected link). When there is no code to follow, the link just appears as "filename.html?". Will that affect us negatively SEO-wise? Thank you.
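One common way to keep such parameterized URLs from competing with the clean version (offered as a hedged illustration, not something the thread confirms as the fix) is a canonical tag; the domain and filename here are placeholders:

```html
<!-- Served on filename.html? (with or without tracking parameters), this
     tells Google to consolidate ranking signals onto the clean URL -->
<link rel="canonical" href="https://www.example.com/filename.html" />
```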
Intermediate & Advanced SEO | RoxBrock
-
Google and Product Description Tabs
How does Google process a product page with description tabs? For example, let's say the product page has a tab for Overview, Specifications, What's in the Box, and so on. Wouldn't that content be better served in one main product description tab with the tab names used as headings (h tags) or highlighted paragraph separators? Or does all that content get crawled as a single page regardless of the tabs?
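As a rough illustration of the second scenario: if every tab panel is present in the initial HTML and the tabs merely toggle visibility, a crawler sees all of it as one page. This markup is hypothetical, not the poster's actual page:

```html
<!-- All panels exist in the DOM; tabs only show/hide them with CSS/JS -->
<div class="product-tabs">
  <section id="overview">Overview copy goes here.</section>
  <section id="specifications">Specifications go here.</section>
  <section id="whats-in-the-box">Box contents go here.</section>
</div>
```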
Intermediate & Advanced SEO | AWCthreads
-
Indexed Pages in Google, How do I find Out?
Is there a way to get a list of pages that Google has indexed? Is there some software that can do this? I do not have access to Webmaster Tools, so I'm hoping there is another way to do this. It would be great if I could also see whether an indexed page is a 404 or other error. Thanks for your help - sorry if it's a basic question 😞
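For what it's worth, one rough check that requires no Webmaster Tools access is Google's site: operator (example.com is a placeholder); the counts and listings it returns are approximate:

```
site:example.com          (pages Google has indexed for the whole domain)
site:example.com/blog/    (narrowed to a single section)
```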
Intermediate & Advanced SEO | JohnPeters
-
Help - .ie vs .co.uk in google uk
We have a website that for years has attracted a high level of organic searches and had a very high level of links. It has the .ie extension (Ireland) and did very well when competing in its niche market on google.co.uk. We have the same domain name in .co.uk format and basically redirected traffic to it when people typed in .co.uk instead. Since the latest Panda update, we have noticed that the number of organic visits has dropped to a quarter of what it was, and it is continuing to go down. We have also noticed that the .ie version is no longer listed in Google and has been replaced by .co.uk. As we've never exchanged or submitted links for the .co.uk domain, the only links indexed in Google point to the .ie version. Is there any way I can get Google to re-index the site using the .ie domain rather than the .co.uk domain? I am hemorrhaging sales now and becoming a much more withdrawn person by the day!!!
PS - the .co.uk domain is set up as a domain alias in Plesk, with both the .ie and .co.uk domain DNS pointing to the same IP address.
Kind Regards,
Steve
Intermediate & Advanced SEO | rufo
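For context, when two hostnames serve identical content like this, the usual consolidation is a host-level 301 to whichever domain you want Google to keep. A minimal .htaccess sketch, assuming Apache with mod_rewrite and placeholder domains:

```apache
# Hypothetical: 301-redirect every .co.uk request to its .ie equivalent
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.example.ie/$1 [R=301,L]
```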