Serving different content based on IP location
-
I have a city-centric website. For the sake of simplicity, say I only have two cities -- City A and City B.
Depending on a user's IP address, they will be served either City A or City B. Users can change their location through JavaScript on the page, but there is no cross-linking between cities. By this, I mean that unless you can execute JavaScript, there is no way to get from City A to City B.
My concern is this: Googlebot comes to my site, and we serve it City A. How does City B get discovered if Googlebot doesn't execute JavaScript?
We have an XML sitemap plus plenty of backlinks to City B. Is this sufficient?
Should I provide a static link to City B (and vice versa) on the homepage for crawling purposes?
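To illustrate the static-link idea, here's a minimal sketch of serving a geo-default city while still emitting plain `<a href>` links to every city for crawlers. The IP-to-city mapping, URL slugs, and function names are illustrative assumptions, not the poster's actual implementation:

```python
# Hypothetical sketch: serve a default city by IP, but always render
# static, crawlable links to every city. Mapping and URLs are made up.

CITY_URLS = {
    "city-a": "/city-a/",
    "city-b": "/city-b/",
}

def city_for_ip(ip: str) -> str:
    """Toy geo lookup -- a real site would query a GeoIP database."""
    return "city-a" if ip.startswith("10.") else "city-b"

def render_page(ip: str) -> str:
    city = city_for_ip(ip)
    # JavaScript may swap the visible city client-side, but the plain
    # anchor tags below exist in the raw HTML, so a non-JS crawler can
    # reach every city regardless of which default it was served.
    links = "".join(
        f'<a href="{url}">{slug}</a>' for slug, url in CITY_URLS.items()
    )
    return f"<main>Content for {city}</main><nav>{links}</nav>"
```

The point of the sketch: whichever city the visitor's IP resolves to, both city URLs appear as ordinary links in the markup, which is what makes them discoverable without JavaScript.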
-
Adding to Daniel's comment, I'd say the big difference is the "...through our faceted search" part. It's important to have both the XML sitemap entries and a crawl path. An XML sitemap may be enough to get the pages indexed, but they won't inherit any internal link juice; that comes through your internal links. Somewhere, there needs to be a link that Google can crawl to reach the other cities.
The direct backlinks will help, and should get you indexed and possibly ranking, but you're still losing the authority from the domain as a whole that you'd inherit via internal links. The upshot is that you'll lose ranking power.
-
I do the exact same thing (local business pages based on visitor IP), except on our site you can also change your location based on the search terms you enter.
We also allow anyone to browse any state/city results through our faceted search, and we have XML sitemap entries for each state/category landing page, which then link down to city-level searches.
We have seen no problem with Google indexing our site (currently almost 500,000 pages indexed).
As long as you don't actively hide content that doesn't pertain to the requesting IP, and you provide some way for Google to find it, you should be OK.
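As a rough sketch of the "XML sitemap entries for each landing page" approach, the snippet below builds a standard sitemaps.org `urlset` from a list of landing-page URLs. The URLs are placeholders, not Daniel's actual site structure:

```python
# Hedged sketch: generate sitemap XML for state/category landing pages.
# URLs below are placeholder examples only.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemaps.org-compliant <urlset> document as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

landing_pages = [
    "https://www.example.com/state-a/category-1/",
    "https://www.example.com/state-b/category-1/",
]
sitemap_xml = build_sitemap(landing_pages)
```

Combined with crawlable faceted-search links on those landing pages, this gives Google both discovery routes the answers above recommend: sitemap entries and an internal crawl path.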