Is it better to have URLs of internal pages that are geo-targeted or point geo-targeted links to the homepage?
-
For example:
Having geo-targeted internal pages, with geo-targeted links pointing to each of those URLs
or
Having no geo-targeted internal pages, and simply pointing geo-targeted links at the homepage
Eventually the site will be a national campaign, so I am concerned about having so many geo-targeted internal pages.
Thanks in advance!
-
Actually I created a separate thread for this question. I figured I would keep it organized.
-
I'm not sure what you mean - are you asking about keyword variations for ecommerce?
-
Great, thanks for the detailed explanation! Just one last scenario: what if it's an e-commerce site and you can't have dedicated location pages, only product pages?
-
Duplicate content: If you're doing each location page correctly, you'll be minimizing the duplicate content by creating unique content on each page. If it's all duplicate content, that's a strike against how you implemented it, not the strategy of having multiple location pages.
Dedicated Page vs. Homepage: If you're trying to target "nyc real estate" you're absolutely going to need a page dedicated to that keyword. Your homepage will suffice only if NYC is the only market you're targeting. If you're targeting beyond that, you'll need a dedicated page, bottom line. It's simply too competitive a term to try to rank for any other way.
Anchor Text: Don't worry about anchor text so much. Build branded links to your homepage, and get whatever links you can to those subpages, both with optimized anchor text and without. Subpages are great for getting exact and partial match anchor text because it's often the best way to describe that page, which isn't the case with the homepage.
Basic Keyword Variations: Regarding variations such as "nyc real estate", "real estate nyc", and "real estate new york city", Google is pretty good at filtering through simple variations like that. Pick the one that gets the most exact-match traffic in the AdWords Keyword Tool, then include the other variations on the page in a non-keyword-stuffing manner.
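As an aside, the "pick one primary variation, weave the rest into the copy" idea above can be sketched in code. This is a minimal, hypothetical Python example; the site name, city, keywords, and templates are all illustrative assumptions, not anything from this thread:

```python
# Hypothetical sketch: build a unique title and H1 for one geo-targeted page,
# leading with the highest-volume keyword variation and working the other
# variations into readable body copy (once each) instead of stuffing them.

def build_location_page(city, primary_kw, other_variations):
    """Return illustrative on-page elements for one geo-targeted page."""
    title = f"{primary_kw.title()} | Example Realty"  # lead with the top variation
    h1 = primary_kw.title()
    # Secondary variations go into natural body copy, not the title tag.
    body_intro = (
        f"Looking for {primary_kw}? Our {city} team also covers "
        + ", ".join(other_variations) + "."
    )
    return {"title": title, "h1": h1, "body_intro": body_intro}

page = build_location_page(
    city="New York City",
    primary_kw="nyc real estate",  # chosen via keyword research, per the advice above
    other_variations=["real estate nyc", "real estate new york city"],
)
print(page["title"])  # → Nyc Real Estate | Example Realty
```

The point of the sketch is only the separation of concerns: one primary variation owns the title and H1, while the remaining variations appear once in body text.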
-
Hey Kane,
Thanks for the response! This question is closely related to another question I posted, and the response I left for another member there bears directly on what you're referring to here. If you could check out my response there and give me your thoughts, I would greatly appreciate it.
-
If you're targeting one location, you can probably work with the homepage.
More than one location? I'd use specific pages for each location.
Regarding too many pages - I'd argue that you only have too many if they have duplicate content. If each geo-targeted page has content that is unique to that city or area, I think you're in the clear. That said, if it's implemented in a way that looks spammy, then you have too many...
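That "you only have too many pages if they're duplicates" test can even be roughed out in code. A minimal sketch, assuming you already have each location page's body text in hand; the cities, copy, and 0.8 similarity threshold below are illustrative assumptions, using Python's standard-library difflib to flag near-duplicate pairs:

```python
# Hypothetical sketch: flag pairs of geo-targeted pages whose body copy is
# so similar they'd read as duplicate content. Thresholds and sample texts
# are illustrative, not Moz guidance.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.8):
    """Return city pairs whose text similarity ratio meets the threshold."""
    flagged = []
    for (city_a, text_a), (city_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((city_a, city_b))
    return flagged

pages = {
    "austin": "We sell homes in Austin. Great schools and live music downtown.",
    "dallas": "We sell homes in Dallas. Great schools and live music downtown.",
    "boston": "Historic brownstones near the harbor, walkable neighborhoods.",
}
print(near_duplicates(pages))  # → [('austin', 'dallas')]
```

Here the Austin and Dallas pages differ by only the city name, so they get flagged, while the genuinely unique Boston copy passes; that is exactly the distinction the answer above is drawing.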
-
Why not both? If you're creating a valuable resource, that is.