Does using Google Loader's ClientLocation API to serve different content based on region hurt SEO?
-
Is there a better way to do what I'm trying to do?
-
I think people panic about cloaking/dynamic content too much, to be honest.
It would be easy to go overboard and set alarm bells ringing, but if you have a dynamic area on a well-structured and balanced page I can't see it being an issue.
Caveat: I can't think of a direct comparison to something I have worked on in terms of serving content geographically. However, I've done similar things based on countless other criteria and not felt it harmed anything.
-
I am actually not really using this for SEO ranking purposes, although that might not be a bad side-effect.
I am using it to serve different content to different geographic locations, e.g. displaying the correct regional sales manager for each location.
Do you think that placing location-based dynamic content on the homepage might give Googlebot a false impression of cloaking? That wouldn't be too good.
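For what it's worth, the regional-sales-manager lookup described above can be sketched roughly as below. This is a hypothetical example: it assumes an object shaped like what Google Loader exposes as `google.loader.ClientLocation` (which is `null` when the visitor's IP can't be resolved, and otherwise has an `address` with a `country_code`), and the country-to-manager mapping is invented for illustration:

```javascript
// Hypothetical mapping for illustration only.
const MANAGERS_BY_COUNTRY = {
  GB: { name: "UK Sales Manager", email: "uk-sales@example.com" },
  US: { name: "US Sales Manager", email: "us-sales@example.com" },
};

const DEFAULT_MANAGER = { name: "Global Sales", email: "sales@example.com" };

// Pick the regional sales manager for a visitor, falling back to a
// default when geolocation is unavailable or unrecognised.
function managerFor(location) {
  // google.loader.ClientLocation is null when the visitor's IP cannot
  // be resolved, so always keep a fallback.
  if (!location || !location.address) return DEFAULT_MANAGER;
  const code = (location.address.country_code || "").toUpperCase();
  return MANAGERS_BY_COUNTRY[code] || DEFAULT_MANAGER;
}
```

In the page this would run after the loader has initialised and swap the manager's details into an otherwise fully rendered page, so a crawler with no usable location still sees sensible default content.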
-
Tricky this, as it depends on how it is being used.
Plenty of sites include dynamic content that differs for different users. This can be for a number of legitimate reasons, including serving different geographic content. If you are targeting general (non-geographic) terms and every version of that page serves those phrases well, there should be no harm.
However, if the aim is to rank for [keyword placename] type searches and use the geographic targeting to do that, then that is unlikely to work. If that were the aim, you would probably be better served by having distinct pages for those locations and using the ClientLocation API to direct users towards the most relevant one for them.
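The "distinct pages plus client-side redirection" approach above might look something like this sketch, where each location has its own individually crawlable URL and the script only nudges human visitors towards the right one. The URLs and the field shape (mirroring `google.loader.ClientLocation`) are assumptions for illustration:

```javascript
// Each region gets its own real, linkable, indexable page.
// These URLs are hypothetical.
const REGION_PAGES = {
  GB: "/contact/uk/",
  US: "/contact/us/",
};

// Return the most relevant regional page for a visitor; unknown or
// unresolved locations stay on the generic page.
function regionPageFor(location, fallback = "/contact/") {
  const code =
    location && location.address && location.address.country_code
      ? location.address.country_code.toUpperCase()
      : "";
  return REGION_PAGES[code] || fallback;
}

// In the browser this would drive the suggestion, e.g.:
//   window.location.assign(regionPageFor(google.loader.ClientLocation));
```

Because every regional page is a normal URL reachable through ordinary links, Googlebot can crawl and rank each one for its own [keyword placename] terms; the script is just a convenience layer for visitors.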