Google.ca vs Google.com Ranking
-
I have a site that I would like to rank highly for particular keywords in Google.ca searches, and I don't particularly care about Google.com (it's a Canadian service). I have logged into Google Webmaster Tools and targeted Canada. Currently my site is ranking on the third page for my desired keywords on Google.com, but is on the 20th page for Google.ca. Previously this change happened quite quickly -- within 4 weeks -- but it doesn't seem to be taking here (12 weeks out and counting). My optimization seems to be fine, since I'm ranking well on Google.com; I'm not sure why it's not translating to Google.ca.
Any help or thoughts would be appreciated.
-
Hi! We're going through some of the older unanswered questions and seeing if people still have questions or if they've gone ahead and implemented something and have any lessons to share with us. Can you give an update, or mark your question as answered?
Thanks!
-
Hi Elisse,
There are a few items that can speed up this process:
-
Where is your site hosted? Hosting in Canada may help.
-
Are you building any links to it? And if so, what kind of links are you getting? I find that getting links from domains within the same country helps.
Or this might be a Google data center issue; I find that not all data centers sync up simultaneously, and updating some content or meta tags will usually help speed that up.
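On the hosting question above, you can check where a server actually sits in a few lines. Here's a minimal sketch, assuming the free ip-api.com geolocation endpoint and a placeholder domain:

```python
import socket

import requests

DOMAIN = "example.ca"  # placeholder; use your own domain

# Resolve the domain to its server IP address.
ip = socket.gethostbyname(DOMAIN)

# Look up the IP's country via the free ip-api.com endpoint
# (assumes its JSON fields: "status", "country", "countryCode").
data = requests.get(f"http://ip-api.com/json/{ip}", timeout=10).json()

if data.get("status") == "success":
    print(f"{DOMAIN} resolves to {ip}, hosted in {data['country']} ({data['countryCode']})")
else:
    print(f"Lookup failed for {ip}: {data.get('message')}")
```

One caveat: if the site sits behind a CDN, this reports the CDN edge location rather than the origin server.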
-
I've a few questions:
- Where is the physical location of the hosting of the website?
- Where are the links to the site coming from?
- Is the site looking to rank locally (within Canada)? Is it listed in local directories?
- Are you talking about Canada in the content?
Thanks,
Rob
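To spot-check the last two of Rob's questions without waiting on a crawl, a quick sketch along these lines will show whether Canada actually appears in the page's key on-page elements (the URL is a placeholder; it assumes the `requests` and BeautifulSoup libraries):

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.ca/"  # placeholder; use the page you want to rank

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

title = soup.title.string if soup.title and soup.title.string else ""
desc = soup.find("meta", attrs={"name": "description"})
description = desc.get("content", "") if desc else ""
body_text = soup.get_text(" ").lower()

print("Title:", title)
print("Meta description:", description)
# Crude relevance check: how often do Canadian terms appear on the page?
for term in ("canada", "canadian", "toronto", "vancouver"):
    print(f"'{term}' appears {body_text.count(term)} time(s) in the page text")
```
-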
Related Questions
-
What should I do if the same content ranks twice or more on Google?
I have a Bangla SEO-related blog where I have written articles like "Domain Selection", "SEO Tools", "MOZ", etc. All the articles are written in Bengali. I have used WordPress tags for every post, and I have submitted an XML sitemap generated by Yoast SEO. However, I kept categories set to "noindex". I know duplicate content is a major problem for SEO. After I published my content, Google ranked it on the first page, but my fear is that most of the content appears twice or more: the same keywords are ranked by the post, the WordPress post tag, and the archive. Now I am afraid of a penalty. Please suggest what I should do.
Intermediate & Advanced SEO | AccessTechBD
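If the tag and archive pages really are competing with the posts, the first thing to confirm is which of them are actually indexable. Here's a minimal sketch; the archive URLs are hypothetical stand-ins for your own, and it assumes the `requests` and BeautifulSoup libraries:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical archive URLs; substitute the tag, category, and date
# archives that are competing with your posts in the SERPs.
urls = [
    "https://example.com/tag/seo-tools/",
    "https://example.com/category/seo/",
    "https://example.com/2023/05/",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    directives = robots.get("content", "").lower() if robots else "(no robots meta tag)"
    verdict = "kept out of the index" if "noindex" in directives else "INDEXABLE"
    print(f"{url}: {directives} -> {verdict}")
```

Any archive that comes back INDEXABLE is a candidate for a noindex setting (Yoast's taxonomy options can apply this site-wide). Ranking the same keyword from a post, a tag, and an archive is usually cannibalisation rather than a penalty, but it is still worth tidying.
-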
Prioritise a page in Google / why is a well-optimised page not ranking
Hello, I'm new to Moz Forums and was wondering if anyone out there could help with a query. My client has an ecommerce site selling a range of pet products, most of which have multiple items in the range for different size animals, i.e.:
[Product name] for small dog
[Product name] for medium dog
[Product name] for large dog
[Product name] for extra large dog
I've got some really great rankings (top 3) for many keyword searches such as '[product name] for dogs' and '[product name]'. But these rankings are for individual product pages, meaning the user is taken to a small dog product page when they might have a large dog, or vice versa. I felt it would be better for users (and for conversions and bounce rates) if there was a group page showing all products in the range, at which I could target the keywords '[product name]' and '[product name] for dogs'. The page would link through to the individual product pages. I created some group pages in autumn last year to trial this and, although they are well optimised (a score of 98 on Moz's optimisation tool), they are not ranking well. They are indexed, but way down the SERPs. The same group page format has been used for the PPC campaign, and the difference to the retention/conversion of visitors is significant. Why are my group pages not ranking? Is it because my client's site already has good rankings for the target term, and Google does not want to show another page of the site and muddy the results? Is there a way to prioritise the group page in Google's eyes, or bring it to Google's attention? Any suggestions/advice welcome. Thanks in advance, Laura
Intermediate & Advanced SEO | LauraSorrelle
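One lever you control directly is internal linking: if every product page in the range links prominently to the group page, that's a strong hint to Google about which URL should rank for the head term. Here's a minimal sketch to check whether those links exist yet; all URLs are hypothetical placeholders, and it assumes the `requests` and BeautifulSoup libraries:

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

GROUP_PAGE = "https://example.com/product-name/"  # hypothetical group page URL

# Hypothetical individual product page URLs.
product_pages = [
    "https://example.com/product-name-small-dog/",
    "https://example.com/product-name-medium-dog/",
    "https://example.com/product-name-large-dog/",
    "https://example.com/product-name-extra-large-dog/",
]

for url in product_pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Count anchors on the product page that resolve to the group page.
    links_to_group = [a["href"] for a in soup.find_all("a", href=True)
                      if urljoin(url, a["href"]).rstrip("/") == GROUP_PAGE.rstrip("/")]
    print(f"{url}: {len(links_to_group)} link(s) to the group page")
```

If the product pages don't link up to the group page (breadcrumbs are a natural place), adding those links is usually the first step before anything more drastic.
-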
One page ranking for all keywords while other targeted pages are not ranking
Hi everyone, I am fairly new to SEO but have a basic understanding. I have a page with a lot of content on it (including brand names, product types and relevant info) ranking for quite a few keywords. This is cool, except that I have pages dedicated to each specific keyword that are not ranking, even though the more specific pages have a lot of relevant text on them too. E.g.:
TYRES page - ranks first for "tyres" and okay for many tyre keywords, including "truck tyres"
TRUCK TYRES page - not ranking for "truck tyres"
Further on, I then have pages not ranking all that well for more specific keywords when they should. E.g.:
HONDA TRUCK TYRES page - full of product listings with no actual text - not ranking for "honda truck tyres"
ABC HONDA TRUCK TYRE page - not ranking for the "abc honda truck tyre" keyword
These pages don't have a lot of content on them, as essentially every single tyre is the same except for the name, but they do have text. So sometimes these terms don't rank at all, and sometimes the first TYRES page ranks for them. I have done the basic on-page SEO for all these pages (hopefully properly), including meta descriptions, meta titles, H1s, H2s, using keywords in the text, alt text on images where possible, etc. According to Moz they are optimised at 90%+. Link building is difficult as they are product listings, so other sites don't really link to these pages. Has anyone got ideas on why the top TYRES page might be so successful and outranking the more specific pages? Any ideas on how I can get the other pages ranking higher, as they are more relevant to the search term? We are looking into a website redesign/overhaul, so any advice on how I can prevent this from happening on what is essentially a new site would be great too. Thanks!
Intermediate & Advanced SEO | JDadd
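One frequent cause of this pattern is that the specific pages' titles and H1s don't differentiate themselves enough from the big TYRES page. A side-by-side audit makes the overlap easy to see; here's a minimal sketch with hypothetical URLs, assuming the `requests` and BeautifulSoup libraries:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical URLs standing in for the pages described above.
pages = [
    "https://example.com/tyres/",
    "https://example.com/truck-tyres/",
    "https://example.com/honda-truck-tyres/",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else "(none)"
    h1 = soup.h1.get_text(strip=True) if soup.h1 else "(none)"
    print(f"{url}\n  title: {title}\n  h1:    {h1}")
```
-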
Scraped content ranking above the original source content in Google.
I need insights on how "scraped" content (an exact copy-pasted version) ranks above the original content in Google. Four original, in-depth articles published by my client (an online publisher) were republished by another company (which happens to be briefly mentioned in all four of those articles). We reckon the articles were republished at least a day or two after the originals were published (the exact gap is not known). We find that all four of the copied articles rank at the top of Google search results, whereas the original content, i.e. my client's website, does not show up even in the top 50 or 60 results. We have looked at numerous factors such as Domain Authority, Page Authority, inbound links to both the original source and the URLs of the copied pages, social metrics, etc. All of the metrics, as shown by tools like Moz, are better for the source website than for the re-publisher. We have also compared results in different geographies to see if any geographical bias was affecting results (our client's website is hosted in the UK and the re-publisher is from another country), but we found the same results. We are also not aware of any manual actions taken against our client's website (at least based on messages in Search Console). Are there any other factors that could explain this serious anomaly, which seems to be a disincentive for somebody creating highly relevant original content? We recognise that our client has the option to submit a 'Scraper Content' report to Google, but we are less keen to go down that route and more keen to understand why this problem could arise in the first place. Please suggest.
Intermediate & Advanced SEO | ontarget-media
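One extra diagnostic worth running: get an independent record of which version appeared online first. The Wayback Machine's CDX API returns the earliest capture it holds for a URL. A sketch with hypothetical URLs follows; note that first-capture date is only a rough proxy for publish date, since the Archive may have crawled either page late:

```python
import requests

# Hypothetical URLs: the original article and the republished copy.
urls = {
    "original": "https://client-site.example/article",
    "copy": "https://republisher.example/article",
}

for label, url in urls.items():
    # Wayback CDX API: limit=1 returns the earliest capture on record.
    rows = requests.get(
        "http://web.archive.org/cdx/search/cdx",
        params={"url": url, "output": "json", "limit": "1"},
        timeout=30,
    ).json()
    if len(rows) > 1:  # the first row is a column header
        ts = rows[1][1]  # e.g. "20230114093025"
        print(f"{label}: first archived {ts[:4]}-{ts[4:6]}-{ts[6:8]}")
    else:
        print(f"{label}: no snapshot on record")
```
-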
Wrong page being ranked
Hi there, This seems a bit of a strange one. I have a particular keyword I am trying to rank for. All internal links with the appropriate anchor text point to the page I want to rank for this keyword, and all external links point to that page as well. However, Google is ranking another page on my website for this keyword, and the bizarre thing is that the page being ranked is a .PDF. I am really not sure what else to do to give Google the hint that it is ranking the wrong page. Any ideas? Kind regards, Paul
Intermediate & Advanced SEO | Paul78
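A common fix here is to keep the PDF out of the index with an `X-Robots-Tag: noindex` HTTP header, since a PDF can't carry an HTML robots meta tag; once it drops out, the HTML page is left to rank. Here's a quick sketch to check what the PDF currently sends, using a placeholder URL:

```python
import requests

PDF_URL = "https://example.com/document.pdf"  # placeholder; your ranking PDF

resp = requests.head(PDF_URL, allow_redirects=True, timeout=10)

# PDFs can't carry an HTML robots meta tag, so any index directive
# has to arrive via this HTTP response header.
tag = resp.headers.get("X-Robots-Tag")
print("X-Robots-Tag:", tag or "(not set -> the PDF is indexable)")
```

If the header isn't set, it can be added in the server config (on Apache, for example, a FilesMatch rule for .pdf files with mod_headers). A `Link: <url>; rel="canonical"` HTTP header from the PDF to the HTML page is the alternative if you'd rather consolidate the PDF's rankings than drop them.
-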
Why did my rankings drop?
Hi all, In July I started to re-energise my link building efforts by getting a proper campaign together to build links. Despite building about 20 new links, my traffic has actually fallen. Here's a breakdown of what happened:
1. In late June I noticed my toolbar PageRank was up at about PR4 which, despite only being a small part of the algorithm, was nice to see.
2. In early July I started my link building campaign by putting together a massive list of potential link partners, using Open Site Explorer on my competitors' sites.
3. Because I'm a bit pressed for time, I decided to go for the easier links first. I sorted my link list by Domain Authority and started to list on high-DA directories used by my competitors. I listed on about 20 of these directories. I also livened up an old links page I'd previously hidden from the search engines, because I was planning to do a bit of link exchanging too.
A few days after I started building links from these directories, I noticed my traffic start to drop off gradually. I also noticed the toolbar PR go down to PR3. I decided to stop at 20 submissions because it looked like this was affecting traffic. I also removed the links page I'd livened up, which produced a temporary improvement in traffic, but it's since gone on to get a bit worse. Traffic is now down by about 10% compared to when I started buying directory submissions. I must add that during this period we have also been taking on new clients which, as a real estate listing site, means we put loads of content on our site for each client. That content is also on the client's own website and on other competitors' sites, so there would be lots of content that appears elsewhere on the net. I'm not really sure which of the two has caused the problem, and not really sure how to progress. Do I remove the links on the directories? Do I wait for this newly added content to bed down so that fresh content can take its place in our results? Any help would be appreciated.
Intermediate & Advanced SEO | Mulith
-
Getting a site to rank in both google.com and google.co.uk
I have a client who runs a yacht delivery company. He gets business from the US and the UK but, due to the nature of his business, he isn't really based anywhere except in the middle of the ocean somewhere! His site is hosted in the US and it's a .com, and I haven't set any geographical targeting in Webmaster Tools either. We're starting to get some rankings in Google US, but very little in Google UK. It's a small site anyway, and he'd prefer not to have too much content on the site saying he's UK-based, as he's not really based anywhere. Any ideas on how best to approach this?
Intermediate & Advanced SEO | PerchDigital
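When one .com needs to reach two English-speaking markets, the standard signal is hreflang annotations pointing en-us and en-gb visitors at the appropriate URLs (which does require at least lightweight regional variants of the key pages). Before going that route, it's worth seeing what the site declares today; a minimal sketch with a placeholder URL, assuming the `requests` and BeautifulSoup libraries:

```python
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"  # placeholder; the page to audit

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

# hreflang alternates are the standard language/region targeting signal,
# e.g. en-us and en-gb variants of the same page.
alternates = soup.find_all("link", rel="alternate", hreflang=True)
if alternates:
    for link in alternates:
        print(f"{link['hreflang']}: {link.get('href')}")
else:
    print("No hreflang annotations found on this page")
```
-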
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
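Before loosening anything, it's worth testing exactly which URL patterns the current rules block. Python's standard-library robots.txt parser makes this easy to do offline; the rules and URLs below are hypothetical stand-ins for your own:

```python
from urllib import robotparser

# Hypothetical rules standing in for the parameter restrictions described
# above. Note: the stdlib parser does plain prefix matching; it does not
# understand the * wildcards that Googlebot itself supports.
rules = """\
User-agent: *
Disallow: /search?page=
Disallow: /search?sort=
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

test_urls = [
    "https://example.com/search?q=widgets",   # page 1 of results
    "https://example.com/search?page=2",      # pagination variant
    "https://example.com/search?sort=price",  # sort-order variant
]

for url in test_urls:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict:7} {url}")
```

On the link juice concern specifically: a robots.txt-blocked page can't be crawled, so any links on it are invisible to Google. The common alternative discussed in posts like those linked above is a 'noindex, follow' robots meta tag on the deep result pages, which keeps thin pages out of the index while still letting equity flow through them.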