Multi-location business: should I 301 redirect duplicate location pages, or nofollow them instead?
-
Hello All,
I have an eCommerce site and we operate out of multiple locations. We currently have individual location pages for these locations against each of our many categories. However, on the flip side, this creates a lot of duplicate content.
All of our location pages, whether unique or duplicated, have a unique title tag, H1, H2, and NAP, and they all bring in the city name. The duplicated content brings in the city name as well.
We have been going through our categories and writing unique content for our most popular locations to help us rank in local search.
Currently I've been setting up 301 redirects for the location pages in the categories with duplicated content, pointing them back to the parent category page.
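For anyone following along, redirects like the ones described here are typically a one-liner each in Apache (the paths below are made-up examples, not this site's actual URLs):

```apache
# Permanently redirect individual duplicate location pages back to the
# parent category page (example paths, not the site's real URLs)
Redirect 301 /carpet-cleaners/smalltown /carpet-cleaners/
Redirect 301 /carpet-cleaners/anothertown /carpet-cleaners/
```

A blanket mod_rewrite rule could handle a whole category at once, but it would also catch the locations that do have unique content, so per-URL rules are safer in this situation.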
I am wondering whether the growing number of 301s will do more harm than having many duplicate location pages.
I am sure my site has been affected by the Panda algorithm penalty (over the duplicate content issues), as a couple of years ago this didn't matter and we ranked top 3 for pretty much every location, but now we are ranking between 8th and 20th depending on the keyword.
An alternative, I thought, might be to put nofollow tags on those location pages with duplicate content instead of 301 redirecting them.
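Worth noting while weighing this option: a page-level "nofollow tag" is set with the robots meta tag, and nofollow on its own only stops link equity flowing through the page's links; it is noindex that actually keeps a duplicate page out of Google's index. A sketch of the two directives:

```html
<!-- In the <head> of a duplicate location page; pick ONE of these -->
<meta name="robots" content="nofollow">          <!-- links not followed; page can still be indexed -->
<meta name="robots" content="noindex, follow">   <!-- page dropped from the index; links still followed -->
```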
What do you think?
It's not economically viable to write unique content for every location in every category; it would not only take years but would cost us far too much money. Our site is currently approx. 10,000 pages.
Any thoughts on this are greatly appreciated.
Thanks,
Pete
-
Hello Monica,
Many thanks for your response.
We currently don't have any user-generated content on these pages. It is something we are looking at doing, but being a tool rental business, reviews are something I have found the industry (my competitors etc.) has previously stayed away from. The thinking is that most people who leave a review about a rental tend to leave a negative one, as opposed to saying the carpet cleaner they rented was ace!
We are currently spending a lot of money on the location content writing; it's just that with over 100 categories and so many locations, the cost of doing this for everything would be way beyond our budget. As you said, we have picked our most valuable locations depending on category (where we rent more of a certain product) and have written unique content for those pages.
If we rel=canonical the remaining location pages with the duplicate content, is this not going to cause problems? The main location varies category to category, and these duplicate location pages still have their own NAP etc., so in effect I would be pointing one branch page to another branch page at a different location which gets more rentals. Would the location pages with the duplicate content still be affected by Panda?
There's nothing really on the web I can find about this specific problem, but I am sure there must be loads of multi-location businesses who have done something similar to us.
I will read up on your suggestion and probably try it out on a few of these pages to see what happens in the rankings.
Many thanks
Pete
-
This is a tough question. My first thought is: do you have any user-generated content on these pages? Is it possible to get some reviews on these location pages?
Secondly, I know that it takes time and money, but there is nothing more important for successful SEO than **uniquely valuable content.** If you are expecting success with duplicate, thin, or missing content, you will not get it. If I could only pick one thing to spend money and time on, it would be content writing. I would start with your most valuable pages and get some expertly written, valuable copy on them ASAP.
As for the location pages themselves, I would recommend using rel=canonical tags instead of 301 redirecting the pages, and I would pick your main location as the authoritative page. Having unique copy on the location pages, like user-generated comments and reviews, is really the best way to solve your duplicate content issue. The canonical tags will help you in the interim, but the best way to solve the problem is with unique content.
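For reference, a canonical tag of this kind goes in the `<head>` of each duplicate location page and points at the page chosen as authoritative (the URL below is a made-up example):

```html
<!-- On every duplicate location page under this category -->
<link rel="canonical" href="https://www.example.com/carpet-cleaners/" />
```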
-
Related Questions
-
Redirecting to HTTPS for web design companies
Hello, We redirected our website from http to https about 3 months ago and noticed a drop in rankings after about 2 weeks. Unfortunately, our rankings have not yet recovered. Can anyone recommend a solution? The website is https://www.web3.ca/ Do we have to build a lot of new links to https if our existing links, for the most part, point to http? Also, could the switch affect anchor text links? For example, if we have a link for "web design" but the link points to http instead of https, would that link have less value now? Thanks, Anton Vasiliv
-
Can I have multiple GeoShape Schema for one page on one domain?
Hi Mozers, I'm working on some schema for a client of mine, but while doing the research on GeoShapes with my developer, we came across a potential issue with this particular markup. My client is a B2C business operating in numerous places across the UK. I want to use the circle property from GeoShape to draw out multiple circles across the UK, but am I able to do this? From looking at some other websites, most seem to have just one GeoShape. Can I have multiple on the same page and the same domain? Thanks! Virginia
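For what it's worth, schema.org allows properties like areaServed to take an array, so multiple GeoShape circles in one block are structurally valid markup; each circle value is "latitude longitude radius-in-metres". The business name, coordinates, and radii below are made-up examples:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Ltd",
  "areaServed": [
    { "@type": "GeoShape", "circle": "51.5074 -0.1278 30000" },
    { "@type": "GeoShape", "circle": "53.4808 -2.2426 30000" }
  ]
}
```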
-
areaServed JSON-LD schema markup for a local business that targets national tourism
If there is a local business that thrives on ranking nationally for people searching for their services in that location, do you target the business's actual service areas or target nationally? For instance, a hotel in Denver, Colorado. Would the areaServed markup be: "areaServed":[{"@type":"State","name":"Colorado"},{"@type":"City","name":"Denver"}] Or "areaServed":"USA" The "geographic area where a service or offered item is provided" would be Denver, Colorado. But we would be looking to target all people nationally looking to travel to Denver, Colorado. Or would it be best to target it all, like: "areaServed":[{"@type":"State","name":"Colorado"},{"@type":"City","name":"Denver"},"USA"]
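The third option sketched above is valid JSON-LD as long as each array entry is well-formed; keeping every entry as a typed object, rather than mixing in the bare "USA" string, is the least ambiguous form. A hedged sketch (the hotel name is a placeholder):

```json
{
  "@context": "https://schema.org",
  "@type": "Hotel",
  "name": "Example Hotel Denver",
  "areaServed": [
    { "@type": "City", "name": "Denver" },
    { "@type": "State", "name": "Colorado" },
    { "@type": "Country", "name": "USA" }
  ]
}
```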
-
What is the SEO effect of schema subtype deprecation? Do I really have to update the subtype if there isn't a suitable alternative?
Could someone please elaborate on the SEO effect of schema subtype deprecation? Does it even matter? The Local business properties section of developers.google.com says to: Define each local business location as a LocalBusiness type. Use the most specific LocalBusiness sub-type possible; for example, Restaurant, DaySpa, HealthClub, and so on. Unfortunately, the ProfessionalService page of schema.org states that ProfessionalService has been deprecated and many of my clients don't fit anywhere else (or if they do it's not a LocalBusiness subtype). I find it inconvenient to have to modify my different clients' JSON-LD from LocalBusiness to ProfessionalService back to LocalBusiness. I'm not saying this happens every day but how does one keep up with it all? I'm really trying to take advantage of the numerous types, attributes, etc., in structured data but I feel the more I implement, the harder it will be to update later (true of many things, of course). I do feel this is important and that a better workflow could be the answer. If you have something that works for you, please let us know. If you think it's not important tell us why not? (Why Google is wrong) I understand there is always a better use of our time, but I'd like to limit the discussion to solving this Google/Schema.org deprecation issue specifically.
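One way to make these deprecations cheap to absorb is a workflow fix rather than a schema.org answer: generate the JSON-LD from a single mapping so a subtype change is a one-line edit instead of a hunt through every client's templates. A minimal Python sketch (client names and subtype choices are invented for illustration):

```python
import json

# Keep each client's schema.org subtype in ONE place, so a deprecation means
# editing this mapping rather than every page template.
# (Client names and subtype choices here are invented for illustration.)
CLIENT_SUBTYPES = {
    "acme-plumbing": "Plumber",
    "smith-legal": "Attorney",
    "generic-consultancy": "LocalBusiness",  # fallback when no subtype fits
}

def local_business_jsonld(client_id: str, name: str, telephone: str) -> str:
    """Render a minimal LocalBusiness JSON-LD snippet for one client."""
    data = {
        "@context": "https://schema.org",
        "@type": CLIENT_SUBTYPES.get(client_id, "LocalBusiness"),
        "name": name,
        "telephone": telephone,
    }
    return '<script type="application/ld+json">' + json.dumps(data) + "</script>"

print(local_business_jsonld("acme-plumbing", "Acme Plumbing", "+1-555-0100"))
```

The point is only that the `@type` string should live in exactly one place; the same idea works in whatever templating system the pages already use.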
-
Pages ranking outside of sales area
Hi there Moz Community, I work with a client (a car dealership) that mostly serves an area within 50-100 miles of their location. A previous SEO company built a bunch of comparison pages on their website (e.g. 2016 Acura ILX vs. Mercedes-Benz C300). These pages perform well in their backyard in terms of engagement metrics like bounce rate, session duration, etc. However, they pull in traffic from all over the country, and from other countries as well. Because there isn't really much opportunity to sell a car to someone across the country who could easily buy it at their local dealership, anyone from outside their primary marketing area typically bounces. So it drags down their overall site metrics, plus all of the metrics for these pages. I imagine searchers from outside their primary sales area see the dealership's location and say "whoah, that's far and not what I'm looking for." I tried localizing the pages by putting the city name in the title tags, meta descriptions, and content, but that doesn't seem to be getting rid of this traffic from areas too far away to sell a car to. My worry is that the high bounce rates, low time on site, and general irrelevancy of these pages to someone far away will affect them negatively. So, short of trying to localize the content on the page or just deleting these pages altogether, I'm not quite sure where to go from here. Do you think that having these high-bouncing pages will hurt them? Any suggestions would be welcomed. Thanks!
-
Best way to remove spammy landing pages?
Hey Mozzers, We recently took over a website for a new client of ours and discovered that their previous webmaster had been using a WordPress plugin to generate 5,000+ mostly duplicated local landing pages. The pages are set up more or less as "Best (service) provided in (city)" I checked Google Webmaster Tools and it looks like Google is ignoring most of these spammy pages already (about 30 pages out of nearly 6,000 are indexed), but it's not reporting any manual webspam actions. Should we just delete the landing pages all at once or phase them out a few (hundred) at a time? Even though the landing pages are mostly garbage, I worry that lopping off over 95% of a site's pages in one fell swoop could have other significant consequences. Thanks!
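On the mechanics of the removal: rather than deleting in batches, one commonly used option is to serve 410 Gone for the whole landing-page pattern, which tells Google the URLs are intentionally gone and tends to get them dropped from the index faster than a 404. A sketch, assuming the plugin's URLs follow a guessable pattern (the pattern below is invented):

```apache
# Return 410 Gone for the auto-generated "best-<service>-in-<city>" pages
# (the URL pattern is a guess; adjust it to the plugin's real format)
RewriteEngine On
RewriteRule ^best-[^/]+-in-[^/]+/?$ - [G,L]
```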
-
Is this an example of bad doorway pages or perfectly fine and helping users?
I'm asking because I want to do something similar. http://bit.ly/1puGXJu Imagine hundreds of pages like this, with the city names switched out. Since the inventory is different on each page (different cities carry different inventory), would these not be considered doorway pages, and will Google probably be fine with them?
-
International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
I have a client with an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have currently only launched the Australian website and are not yet open to the rest of the world: https://au.domain.com/ Google is not indexing the Australian website or pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US 'coming soon' page is being properly indexed. So, I would like to know the best way to set up a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies etc.). I have seen this great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the best method for redirection, except at about 10:20 where it tells me that what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some use slightly different examples... I need US visitors to see the US coming-soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects, and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help. Appreciate your answers. Cheers, Lincoln
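One piece of this that is safe to state generally: whatever redirect mechanism is used, hreflang annotations in each page's `<head>` (or in the sitemap) give Google an explicit map of the locale versions, so the AU pages can be discovered and indexed even when crawls from US IPs are served the US page. The URLs below follow the subdomain pattern from the question; the `us.` subdomain is an assumption:

```html
<!-- On every version of the page, list all locale alternates plus a default -->
<link rel="alternate" hreflang="en-au" href="https://au.domain.com/" />
<link rel="alternate" hreflang="en-us" href="https://us.domain.com/" />
<link rel="alternate" hreflang="x-default" href="https://domain.com/" />
```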