Which is better for Local & National coupons --1000s of Indexed Pages per City or only a Few?
-
Not sure where this belongs..
I am developing a coupons site that lists local and national coupons (think Valpak + RetailMeNot), eventually in all major cities, and am VERY concerned about how many internal pages to let Google 'follow' for indexing, as the count can exceed 10,000 per city.
Is there a way to determine the optimal approach for internal paging/indexing BEFORE I actually launch the site? It is about ready except for this darned URL question, which seems critical. For example, can I research search terms to determine which pages are most worthy of having their own indexed page? I'm somewhat of a newbie, so please put the answer in simple terms. I'm one person with limited funds, and I need to find the cheapest way to get the best organic results in each city that I cover.
Is there a generic answer? One SEO firm told me the more variety the better. Another told me that simple is better, and to use content on the simple pages to get variety. I'm so confused that I decided to consult the experts here!
Here's the site concept:
**FOR EACH CITY:**
User inputs a location: the main city only (e.g., Houston), one of 40 city regions (suburb, etc.), a zip code, a zip-street combo, or a GPS lookup. A mileage range is defaulted or chosen by the user.
After the search area is determined, the user chooses one of 6 types of coupon searches:
1. Online shopping with national coupon codes: a choice of 16 categories (electronics, health, clothes, etc.) and 100 subcategories (computers, skin care products, men's shirts). These are national offers for chains like Kohls, which do not use the user's location at all.
2. Local in-store shopping coupons: the same 16 categories and 100 subcategories used for online shopping in #1 (a mom & pop shoe store or a local chain offer). Results fall within the user's chosen location and range.
3. Local restaurant coupons: about 60 subcategories (pizza, fast food, sandwiches). Results are again within the user's chosen location and range.
4. Local services coupons: 8 categories (auto repair, activities, etc.) and around 200 subcategories (brakes, miniature golf, etc.). Results within the user's chosen location and range.
5. Local groceries. This is one page per main city with coupons.com grocery coupons, listing the main grocery stores in the city. This page does not break down by sub-region, zip code, etc.
6. Local weekly ad circulars. This is one page per main city that displays about 50 main national stores located in that city.
So, what is the best way to handle the URLs indexed for the dynamic searches by location, coupon type, category/subcategory, and business page? The combinations of potential URLs to index are nearly unlimited.
Does the user's location matter when he searches for one thing (restaurants) but not for another (Kohls)? If so, how do I know this? Should I tailor indexed URLs to that knowledge? Is there an advantage to having a URL for national companies tied to each main city: shopping/Kohls vs. shopping/Kohls/Houston, or even shopping/Kohls/Houston-suburb?
Again, I'm talking about 'follow' links for indexing. I realize I can have Google index just a few main categories and subcategories rather than all of them, or a few city regions but not all of them, while still having internal pages for everything.
Is it better to have 10,000 URLs like coupon-type/city-region/subcategory, just one for the main city (main-city/all-coupons), or something in between? You get the gist. I don't know how to begin answering these kinds of questions, yet they seem critical to the design of the site.
The competition: sites like Valpak, MoneyMailer, and LocalSaver seem to favor the 'more is better' approach, with coupons/zipcode/category or coupons/bizname/zipcode. But a site like 8coupons.com appears to have no indexing for categories or subcategories at all! They have city-subregion/coupons and individual businesses at bizname/city-subregion, but as far as I can see no city/category or city-subregion/category. And a very popular coupon site in my city has only maincity/coupons, maincity/a-few-categories, and maincity/bizname/coupons.
Sorry this is so long, but it seems very complicated to me and I wanted to make the issue as clear as possible. Thanks, couponguy
-
Great! I just sent you an email.
-
Hi,
Sure, I can do some analysis for you - I work solely as a freelance consultant right now. If you're keen, just send me an email (jane.copland@gmail.com). I can do a competitive analysis audit for the main competitors, which could be of use!
Cheers,
Jane
-
Thanks. Not knowing this well, are you for hire to check out some competitors if I give you some names? I can't afford to mess this up (over 5,000 hours of programming have gone into this). I know I should learn more, but I'm spread thin.
-
Hi,
If consolidation is an option, I'd certainly consider it. What I'd be curious about is the indexation and page count of your most successful competitors. I have not worked with a coupon site personally, and I must admit that the 8,000-page number per town does concern me... however, what I'd do is run a ScreamingFrog crawl (http://www.screamingfrog.co.uk/seo-spider/ - you will need to pay for the premium account at $99 to remove the 500 URL limit) for a look at competitors' websites. This will show you the response codes, canonical tags, directives, etc. that others are using. I am not a fan of the idea that if your competitors are doing it, you should do it too, but this will give you a good idea of what is working for sites that manage to rank well for both smaller terms ([jiffy lube coupon post falls]) and big terms ([kohls coupons]).
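If you want to spot-check a single competitor page without a full crawl, the same information a crawler reports can be pulled from the page's HTML. This is a minimal sketch using Python's standard-library parser; the sample HTML below is made up for illustration, not taken from any real site.

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects the rel=canonical URL and meta robots directive from a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content")

# Hypothetical competitor page markup:
sample = """
<html><head>
<link rel="canonical" href="http://www.example.com/coupons/printable/Jiffy-Lube/92048">
<meta name="robots" content="index,follow">
</head><body>...</body></html>
"""

parser = CanonicalParser()
parser.feed(sample)
print(parser.canonical)  # http://www.example.com/coupons/printable/Jiffy-Lube/92048
print(parser.robots)     # index,follow
```

Checking a handful of deep pages this way (which URL the canonical points to, and whether the page is noindexed) tells you most of what matters about a competitor's indexation strategy.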
I would say that 1,000 is preferable to 8,000 if structured properly, but I'd be really keen to know what the rest of your field in vouchers / coupons looks like from an indexed / live URLs perspective.
-
Thank you. 8,000 pages per city won't hurt me? That's perhaps my biggest concern. My structure right now has all those pages, but I want to make sure that's the best way to go. Alternatively, I could probably reduce the number to 1,000 or so by combining subcategories into 'grouped' subcategories (e.g., all plumbers, carpenters, and contractors go under 'home-repairs'). Is 1,000 better than 8,000?
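The grouping idea above amounts to a many-to-one mapping from subcategories to indexable pages. As a rough sketch (all names here are invented, not from the actual site):

```python
# Hypothetical mapping: collapse fine-grained subcategories into broader
# grouped pages so fewer URLs per city are exposed for indexing.
SUBCATEGORY_GROUPS = {
    "plumbers": "home-repairs",
    "carpenters": "home-repairs",
    "contractors": "home-repairs",
    "brakes": "auto-repair",
    "oil-change": "auto-repair",
}

def indexed_path(city_region: str, subcategory: str) -> str:
    """Return the grouped URL path a subcategory page should canonicalize to."""
    group = SUBCATEGORY_GROUPS.get(subcategory, subcategory)
    return f"/coupons/{city_region}/{group}"

print(indexed_path("houston-heights", "plumbers"))
# /coupons/houston-heights/home-repairs
```

With a table like this, plumbers, carpenters, and contractors all resolve to one indexed page per region, so 200 subcategories might collapse to a few dozen grouped pages.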
-
Hi,
It really is complicated - I would definitely say that you do not need to think about building links to 8,000+ pages - the well-ranked competitors won't have good links to the majority of their internal pages, but they'll engage in good marketing that brings in authority to the home page and similar high-level pages on the site. Then they'll link well, with good site structure, down through the categories. They'll also (for the most part) avoid duplication issues with canonical tags, although as you point out, some duplication within sites like this is to be expected. Due to these sites' pages being indexed and often ranking well, we have to assume that Google understands the nature of coupon sites, although you still need to be careful of site hygiene and keep a close eye on your Webmaster Tools account for crawl errors, etc.
-
Thanks Jane,
This is, it seems, complicated, so I appreciate your taking the time to check into it.
Very good advice regarding avoiding duplication. Yet in the Olive Garden example, location IS important, so if I decide to go that route I need to be sure there is content unique to each location (maybe nearby offers, for example).
If there are 40 regions in a city and 200 subcategories, that's potentially 8,000 indexed pages without even listing businesses, so even a simple structure like you mention could yield a huge number of internal pages. I question the value of trying to build backlinks to 8,000 pages, and I worry about losing 'juice' from the home page if I do so. (I've read that PageRank is now a very low-weight ranking factor, so maybe I need not worry about the juice issue at all - your thoughts?)
-
Hi there,
The danger you face in creating tens of thousands of URLs / pages for everything on the site and allowing those pages to be indexed is that it is almost certain that these pages will essentially duplicate each other. A coupon page for deals at Olive Garden in Phoenix will not be different, besides one or two words, from a page about Olive Garden in Seattle.
This isn't stopping most of the competitors mentioned: Valpak is allowing these pages to be indexed, although I am not sure of their reach with these pages in terms of search engine performance. Users access a page like this one: http://www.valpak.com/coupons/printable/Jiffy-Lube/92048?addressId=1689472&offerId=1581320 from the main Spokane, WA page, but this URL contains a canonical tag that cuts off the ?addressId= section of the URL, leaving http://www.valpak.com/coupons/printable/Jiffy-Lube/92048. This URL is indexed: https://www.google.co.uk/search?q=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&oq=http%3A%2F%2Fwww.valpak.com%2Fcoupons%2Fprintable%2FJiffy-Lube%2F92048&aqs=chrome..69i58j69i60j69i57.895j0j4&sourceid=chrome&espv=210&es_sm=119&ie=UTF-8 and I get it ranking sixth in google.com for [jiffy lube coupons post falls] (not the web's most competitive phrase, but an indicator that the site is indexed and able to rank well for "deep" pages).
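The pattern Valpak appears to use, stripping the `?addressId=...` query string so that every way of reaching an offer canonicalizes to one URL, can be sketched with the standard library (a simplification of whatever their actual logic is):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Drop the query string and fragment so address/session parameters
    don't create duplicate indexable URLs for the same offer."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

accessed = ("http://www.valpak.com/coupons/printable/Jiffy-Lube/92048"
            "?addressId=1689472&offerId=1581320")
print(canonical_url(accessed))
# http://www.valpak.com/coupons/printable/Jiffy-Lube/92048
```

The resulting URL is what goes in the page's `rel=canonical` tag, so however many parameterized variants exist, Google is told to index only the clean one.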
MoneyMailer's pages are badly optimised - not even a descriptive title tag here: http://www.moneymailer.com/coupons/online/seattle/wa/dining/855894?pageNum=1 but the page is still indexed. That page doesn't rank for related terms, as far as I can see.
Regarding location, several are allowing URLs that do not denote location to load, with canonical tags pointing to location-based URLs, e.g. http://www.localsaver.com/98102/Real_Estate_Agents/Windermere_Eastlake/BDSP-12576652/931434.html is accessed from the Seattle, WA page but its canonical tag points to http://www.localsaver.com/WA/Seattle/Windermere_Eastlake/BDSP-12576652/931434.html
I would imagine that location is pretty key, especially given the nature of the search queries you're wanting to target, e.g. people who want a coupon for a restaurant local to themselves. If people want to walk into a specific store or restaurant with a coupon, they will note the area. Where you will see people leave the area out is when they expect to buy online, or the product is more generic than a specific store, e.g. shoes. Many sites seem to employ a combination, but those focusing on location are keeping it simple and mentioning coupons available at specific stores.
I would look at placing content in a structure that avoids duplication but keeps the site relatively simple, like coupons/region/category. You are seeing a lot of variation because there are multiple valid ways to go about this.
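A coupons/region/category structure is simple enough to generate mechanically. As a minimal illustration (the slug rules here are an assumption, not a recommendation of any specific library):

```python
import re

def slugify(text: str) -> str:
    """Lowercase the text and replace runs of punctuation/spaces with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def coupon_path(region: str, category: str) -> str:
    """Build the indexable path for a region + category page."""
    return f"/coupons/{slugify(region)}/{slugify(category)}"

print(coupon_path("Post Falls", "Auto Repair"))
# /coupons/post-falls/auto-repair
```

Keeping one deterministic path per region/category pair, and canonicalizing everything else to it, is what keeps this kind of structure from ballooning into duplicate URLs.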