To block with robots.txt or canonicalize?
-
I'm working with an apartment community group that has a large number of communities across the US. I'm running into duplicate content issues: each community has pages such as "amenities" or "community-programs" that are nearly identical (if not exactly identical) across all communities.
I'm wondering if there are any thoughts on the best way to tackle this. The two scenarios I came up with so far are:
Is it better for me to select the community page with the most authority and put a canonical tag on all the other community pages, pointing to that authoritative page?
or
Should I just remove the directory altogether via robots.txt to help keep the site lean and prevent low-quality content from impacting the site from a Panda perspective?
Is there an alternative I'm missing?
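For reference, the second option above would look something like the following in robots.txt. The paths are hypothetical stand-ins for the directories mentioned; note that Disallow prevents crawling but does not necessarily remove already-indexed URLs from search results:

```text
# Hypothetical robots.txt sketch: block the near-duplicate directories site-wide
User-agent: *
Disallow: /amenities/
Disallow: /community-programs/
```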
-
I think the canonical idea is better than blocking the pages altogether. Depending on how the site is laid out, you could try to make each page more specific to the location being discussed: add header tags with the location information, and add that info to the page title and meta description as well. If it's not too time-consuming, I'd try to make those pages more unique, especially since you might be getting searches based on a location. Location-specific pages may help in that regard.
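To make that suggestion concrete, here is a minimal sketch of what the head of a location-specific community page could contain; the domain, paths, and copy are all hypothetical:

```html
<!-- Hypothetical head of a /springfield/amenities page: a self-referencing
     canonical plus a location-specific title and meta description -->
<link rel="canonical" href="https://www.example.com/springfield/amenities" />
<title>Amenities at Example Apartments in Springfield, IL</title>
<meta name="description"
      content="Explore the pool, fitness center, and resident programs at our Springfield, IL community." />
```

If you instead went the consolidation route from the question, the canonical href would point at the single authoritative page rather than at the page itself.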
Related Questions
-
Can't work out robots.txt issue.
Hi, I'm getting crawl errors saying Moz isn't able to access my robots.txt file, but it looks completely fine to me. Any chance anyone can help me understand what the issue might be? www.equip4gyms.co
Moz Pro | | brenmcc10 -
Meta Robots query
Hi guys, I was ranking really well on my home page for certain keywords, but those rankings have all dropped pretty dramatically over the last 3-4 weeks. I think the issue is the configuration of the Yoast SEO WordPress plugin. In March (when my rankings were strong) my crawl test showed the top data in the attached image, and in May (now that the rankings have dropped severely) it shows the bottom data. I don't fully understand canonical and meta robots tags, so I am hoping someone can shed some light on the following points. 1. Will the change result in my loss of rankings? 2. How can I put it back to how it was in March? PS. I haven't had any Google penalties. Thanks, Joshua
Moz Pro | | RocketStats RfTar0 -
Duplicate Content, Canonicalization may not work in our scenario.
I'm new to SEO (so please excuse the lack of terminology) and will be taking over our company's inbound marketing completely. I previously just did data analysis and managed our PPC campaigns in Google and Bing/Yahoo; now I get all three. Yippee! But I digress.
Before I get started here, I did read http://moz.com/community/q/new-client-wants-to-keep-duplicate-content-targeting-different-cities?sort=most_helpful, and I found both of the answers there helpful, but indirect for my scenario.
I'm conducting our company's first real SEO audit (thanks, Moz, for the guide), and duplicate content is going to be our number one problem to tackle. Our company's website was designed back in 2009 with the file structure /city-name/product-name. The problem is that we are open in over 50 cities now (and headed to 100 fast), and we are starting to amass duplicate content: five products (and expanding), times the locations... you get it.
My questions: How should I deal with this? The pages are almost identical, except that each lists different information for its product depending on the location. However, for one of our products, Moz's own tools (Pro) did not find all the duplicate content, though they did find some (I'm assuming that's because those pages have different course options and addresses: a different address at the very bottom of the body and different course options in the right sidebar). The duplicate content for the other four products was found and marked extensively.
If I choose to use canonicalization to link all the pages to one main page, I believe that would pass all the link juice to that one page, but we would no longer show in a Google search for the other cities, e.g. "washington DC example product name". Correct me if I'm wrong here.
Should I worry about the product whose duplicate content was only marked four times out of fifty cities? I feel as if this question answers itself, but I still would like someone who knows more than me to shed some light on the issue. The other four products are not going to be a problem, as they are only offered online, but they still follow the same file structure, with /online in place of /city-name. These will be canonicalized together under the /online location.
One last thing I will mention: having the city name in the URL gives us a nice advantage (I think) when people search for products in the cities where we offer them (correct me again). If that's not the case, I believe I could talk our team into restructuring the files (if you think that's our best option).
Some things you need to know about our site: we use a cookie for the location. Once you land on a page that has a location tied to it, the cookie is updated and saved. If the location does not exist, you are redirected to a page to choose a location. I'm pretty sure this can cause some SEO issues too, but once again I'm not sure.
I know this is a wall of text, but I cannot tell you enough how appreciative I am in advance for your informative answers. Thanks a million, Trenton
Moz Pro | | PM_Academy0 -
Linking C Blocks - SEOMoz says it's a good thing?
In the competitive analysis, one competitor has more Linking C Blocks, and SEOmoz puts a tick by it, almost as if that's a better thing. Surely a site with the same administrative relationship is not going to help you as much from a linking point of view.
Moz Pro | | sanchez19600 -
Rogerbot Ignoring Robots.txt?
Hi guys, We're trying to block Rogerbot from spending 8,000-9,000 of our 10,000 pages per week for our site crawl on our zillions of PhotoGallery.asp pages. Unfortunately our e-commerce CMS isn't tremendously flexible, so we believe the only way we can block Rogerbot is in our robots.txt file. Rogerbot keeps crawling all these PhotoGallery.asp pages, so it's making our crawl diagnostics really useless. I've contacted the SEOmoz support staff and they claim the problem is on our side. This is the robots.txt we are using:
User-agent: rogerbot
Disallow: /PhotoGallery.asp
Disallow: /pindex.asp
Disallow: /help.asp
Disallow: /kb.asp
Disallow: /ReviewNew.asp
User-agent: *
Disallow: /cgi-bin/
Disallow: /myaccount.asp
Disallow: /WishList.asp
Disallow: /CFreeDiamondSearch.asp
Disallow: /DiamondDetails.asp
Disallow: /ShoppingCart.asp
Disallow: /one-page-checkout.asp
Sitemap: http://store.jrdunn.com/sitemap.xml
For some reason the WYSIWYG editor is entering extra spaces, but those are all single-spaced. Any suggestions? The only other thing I thought of is to try something like "Disallow: /PhotoGallery.asp*" with a wildcard.
Moz Pro | | kellydallen0 -
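The rules quoted in that question can be sanity-checked locally with Python's standard-library robots.txt parser (this only verifies the file's syntax and matching, not what any particular crawler actually does). A minimal sketch using a trimmed copy of the rules:

```python
# Check that the robots.txt rules from the question do block a "rogerbot"
# user agent from PhotoGallery.asp. Uses only the standard library; the
# URLs are the ones quoted in the question.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: rogerbot
Disallow: /PhotoGallery.asp
Disallow: /pindex.asp
Disallow: /help.asp
Disallow: /kb.asp
Disallow: /ReviewNew.asp

User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Disallow rules are prefix matches, so the rule also covers
# query-string variants; no trailing wildcard is needed.
print(parser.can_fetch("rogerbot", "http://store.jrdunn.com/PhotoGallery.asp"))        # False
print(parser.can_fetch("rogerbot", "http://store.jrdunn.com/PhotoGallery.asp?id=99"))  # False
print(parser.can_fetch("rogerbot", "http://store.jrdunn.com/sitemap.xml"))             # True
```

Because matching is by prefix, the wildcard variant mentioned at the end of the question ("Disallow: /PhotoGallery.asp*") would be redundant for crawlers that follow the original robots.txt convention.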
Local search block seems to mess up ranking calculations!
Time and time again I see SEOmoz come out with a new rankings report telling me I've gone up or down in the rankings by 9, and I get really excited. Then I go look at the search, and once again we are in the exact same position! What it seems like is that sometimes SEOmoz decides to count the local search block, and other times it decides not to. Is there any way to fix this? Making it always count the local search block would be preferred.
Moz Pro | | adriandg0 -
How to get rid of the message "Search Engine blocked by robots.txt"
During the Crawl Diagnostics of my website, I got the message "Search Engine blocked by robots.txt" under Most Common Errors & Warnings. Please let me know how the SEOmoz PRO crawler can completely crawl my website. Awaiting your reply at the earliest. Regards, Prashakth Kamath
Moz Pro | | 1prashakth0 -
Link Blocks
Sorry, perhaps a noob question. In relation to Site Explorer (I have also searched and been unable to find any information), could anyone advise as to what "Linking C Blocks" are? Found under the "Compare Link Metrics" tab. Thanks in advance. Lee
Moz Pro | | LeeMiller0
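For background on the term in that last question (this explanation is not from the thread itself): a "C block" is shorthand for the first three octets of an IPv4 address, i.e. a /24 network, and counting unique C blocks among linking hosts estimates how many distinct networks link to a site. A small sketch with made-up example IPs:

```python
# Hypothetical illustration of counting "C blocks" (/24 networks) among
# linking IPs. The IP addresses are example/documentation values, not
# real linking data.
from ipaddress import ip_network

def c_block(ip: str) -> str:
    """Return the /24 network (the "C block") an IPv4 address belongs to."""
    return str(ip_network(f"{ip}/24", strict=False))

linking_ips = ["203.0.113.5", "203.0.113.77", "198.51.100.9"]
unique_c_blocks = {c_block(ip) for ip in linking_ips}

# The first two IPs share a C block, so only two distinct networks link here.
print(len(unique_c_blocks))  # 2
```

Two links from the same C block are often treated as less independent than links from different networks, which is why a higher Linking C Blocks count is shown as a positive signal.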