GWT - International Targeting
-
What effect does selecting a country in the Country Targeting section of GWT have?
For example, if I select the UK, will this boost rankings on google.co.uk and decrease them on google.com, etc.?
If we are based in the UK but our customer base is worldwide, should we select nothing?
-
By providing Google Webmaster Tools with your international targeting information, you are helping Google decide whether your website should appear, and how it should appear, in local results for a given location. It only affects results for geographically related queries, in which a user limits the scope of a search to a certain country; it will not affect your appearance in search results that are not geographically bounded. Hope that helps.
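For the worldwide case raised in the question, a common complement to leaving country targeting unset is hreflang annotations, which tell Google which URL serves which language or region without restricting the whole site to one country. This is not part of the original answer, and the example.com URLs and directory structure below are hypothetical; a minimal sketch:

```html
<!-- Hypothetical example: a UK company serving a worldwide audience.
     Each variant lists every alternate, plus an x-default fallback
     for users who match none of the listed locales. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```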
Related Questions
-
I am losing 1 point of DA per month; what could it be? I have noticed I lost 50K (out of 300K) internal links after a website update; could it be related to that?
Technical SEO | albertoalchieriefficio
GWT False Reporting or GoogleBot has weird crawling ability?
Hi, I hope someone can help me. I have launched a new website and am trying hard to make everything perfect. I have been using Google Webmaster Tools (GWT) to ensure everything is as it should be, but the crawl errors being reported do not match my site. I mark them as fixed, then check again the next day, and it reports the same or similar errors.
Example: http://www.mydomain.com/category/article/ (this would be a correct structure for the site). GWT reports http://www.mydomain.com/category/article/category/article/ as a 404 (it does not exist, never has and never will). I have been to the pages listed as linking to this URL and they do not contain links in this manner. I have checked the page source code, and all links from the given pages have the correct structure, so I have been unable to replicate this crawl. This happens across most of the site; I have a few hundred pages all ending in a trailing slash, and most pages of the site are reported in this manner, making it look like I have close to 1,000 404 errors that I cannot replicate using many different methods.
The site is using a htaccess file with redirects and a rewrite condition. The following condition forces the trailing slash on folders:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !\.(html|shtml)$
RewriteCond %{REQUEST_URI} !(.*)/$
RewriteRule ^(.*)$ /$1/ [L,R=301]
Then we are using redirects in this manner:
Redirect 301 /article.html http://www.domain.com/article/
In addition to the above, we had a development site (http://dev.slimandsave.co.uk) whilst I was building the new site, which had been spidered without my knowledge until it was too late. So when I put the site live I left the development domain in place and redirected it like so:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteRule ^ - [E=protossl]
RewriteCond %{HTTPS} on
RewriteRule ^ - [E=protossl:s]
RewriteRule ^ http%{ENV:protossl}://www.domain.com%{REQUEST_URI} [L,R=301]
</IfModule>
Is there anything that I have done that would cause this type of redirect 'loop'? Any help greatly appreciated.
Technical SEO | baldnut
Tool to search relative vs absolute internal links
I'm preparing for a site migration from a .co.uk to a .com, and I want to ensure all internal links are updated to point to the new primary domain. What tool can I use to check internal links? Some are relative and others are absolute, and I need to update them all to relative.
Technical SEO | Lindsay_D
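Desktop crawlers can export link lists for this, but the relative-vs-absolute split can also be checked with a short script. A sketch using only the Python standard library; the example URLs are hypothetical, and an href is treated as absolute when it carries a scheme (protocol-relative //host/path links would need an extra netloc check):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkClassifier(HTMLParser):
    """Collects <a href> values, split into absolute and relative links."""

    def __init__(self):
        super().__init__()
        self.absolute = []
        self.relative = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                # A scheme (http/https) marks the link as absolute.
                if urlparse(value).scheme:
                    self.absolute.append(value)
                else:
                    self.relative.append(value)

p = LinkClassifier()
p.feed('<a href="http://www.example.co.uk/about/">About</a> <a href="/contact/">Contact</a>')
print(p.absolute)  # ['http://www.example.co.uk/about/']
print(p.relative)  # ['/contact/']
```

Run over saved page sources (or crawler output), this yields the list of absolute links that would need rewriting for the migration.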
Is there a need to have different GWT accounts?
Hi, in your opinion and practice, do you think it is necessary to avoid putting too many of the websites you optimize in the same GWT account? Could this give Google a signal that there is a strong relation between these websites?
Technical SEO | vladokan
Target term hits a glass ceiling despite A grade
Greetings from 13 degrees C Wetherby, UK 🙂 I've hit a roadblock in my attempts to get a target term onto page one. Below is a URL pointing to a graph illustrating the situation; the target term is on the graph (I'm reluctant to put it in here in case this page comes up):
http://i216.photobucket.com/albums/cc53/zymurgy_bucket/glass-ceiling-office-to-let.jpg
This is what I've done to date for the page http://www.sandersonweatherall.co.uk/office-to-let-leeds/:
1. Ensured the markup follows SEO best practice
2. Internally linked to the page via a scrolling footer
3. Shortened the URL
4. Requested that social media efforts point links to the page
5. Requested additional content
But I wonder... is the reason for hitting a glass ceiling now down to lack of content (i.e. just one page), or is there a deeper issue of an indexing roadblock? Any insights welcome 🙂
Technical SEO | Nightwing
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools, and I was wondering if anyone who uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it gives the following options:
Which URLs with this parameter should Googlebot crawl?
Let Googlebot decide (default)
Every URL (the page content changes for each value)
Only URLs with value (may hide content from Googlebot)
No URLs
I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
Technical SEO | Marketing.SCG
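Beyond the GWT parameter tool, a technique some site owners use for layered-navigation filters is blocking the parameterized URLs in robots.txt. This is not from the original thread, and the parameter names below are assumed examples based on the ones mentioned in the question; verify them against your site's actual filter URLs before using anything like this:

```
# Hypothetical sketch: keep Googlebot out of filtered listing URLs.
# Googlebot supports * wildcards in robots.txt patterns.
User-agent: *
Disallow: /*?price=
Disallow: /*&price=
Disallow: /*?manufacturer=
Disallow: /*&manufacturer=
```

Note the trade-off: a robots.txt disallow stops crawling entirely, whereas the parameter tool's "No URLs" setting still lets Google know the URLs exist.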
Internal Links not Crawled by Open Site Explorer
Can someone please tell me why www.hotelelgreco.gr has only 2 internal links in OSE, despite the fact that the text content has a plethora of them? Thanks in advance.
Technical SEO | socrateskirtsios
Online Marketing exec seeking internal SEO advice - Large org. Many verticals. Where to start?
Hello! I'm an Online Marketeer who is taking on some SEO responsibility internally (after months of researching; information overload) and will be working with an external agency. I'll get to the point here. The difficulty I have is in getting my head around approaching our SEO strategy. I have what I would class as a beginner-to-intermediate understanding of SEO and the tools and techniques available for assistance (many great articles are available on this forum and others). I feel that there are some real opportunities for our company, which has already invested in SEO previously. The difficulty is working out how to approach the whole campaign. We are a recruitment consultancy with many vertical markets; the sheer number of keywords could be staggering. For some short insights, our homepage is PR 4, Alexa 338,361, PA 56, mozRank 4.61. I've started by using the SEOmoz Pro tool to identify problems with our site's optimisation and improve them. On reflection our site is doing OK, but we really need to show some significant results in search rankings, increases in traffic, and ultimately conversions. I'm a keen user of Google Analytics and love the SEOmoz tool (a selection of the tools I will use to measure results). Are there any basic frameworks for devising a strategy that could help point me in the right direction? E.g.:
Keyword research
Identify keywords
Optimise site for keywords
Set up a campaign to measure keyword SERPs
Perform internal & external SEO techniques
Wait and see what happens, and continuously improve
I appreciate there may be no simple answer to this project. If you would like any more information about who we are, please let me know by comment. Best regards,
Sam
Technical SEO | ARMofficial