What is the best way to eliminate this specific thin image content?
-
The site in question is www.homeanddesign.com, where we are working on recovering from a big traffic loss.
I have finally gotten the site's articles properly meta-titled and meta-described, and now I'm working on removing thin content.
The way their CMS is built, every clickable image gets its own page. This leaves a lot of thin content that I think needs to be removed from the index. Here is an example:
http://www.homeanddesign.com/photodisplay.asp?id=3633
I'm considering the best way to remove these pages from the index without disturbing how users enjoy the site.
What are my options? Here is what I'm thinking:

- Add Disallow: /photodisplay to the robots.txt file.
- See if there is a way to use a lightbox instead of a whole new page for each image. But this still leaves me with hundreds of pages holding just an image, backlinks, etc.
- Add a noindex tag to the photodisplay pages.
-
Disallow: /photodisplay.asp?*
That should do it. But just to be safe you can add another one:
Disallow: /photodisplay.asp
(Since robots.txt rules are prefix matches, that second line on its own already covers every photodisplay URL, query string included.)
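As a sanity check before deploying, you can run the rules through Python's standard-library robots.txt parser. Note this parser only does plain prefix matching (it doesn't implement Googlebot's full wildcard syntax), so treat it as a rough check; the URLs below are just the examples from this thread.

```python
from urllib import robotparser

# Hypothetical robots.txt combining the two Disallow lines above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /photodisplay.asp?*
Disallow: /photodisplay.asp
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The thin image pages should be blocked (the plain prefix rule catches them)...
print(parser.can_fetch("*", "http://www.homeanddesign.com/photodisplay.asp?id=3633"))  # False

# ...while the rest of the site stays crawlable.
print(parser.can_fetch("*", "http://www.homeanddesign.com/"))  # True
```

If the second line ever printed False, you'd know a rule was over-broad and blocking the whole site.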
There is very, very little danger of blocking your entire site from being crawled if you add those Disallow statements to your robots.txt file. If you're an SEO, "messing with" the robots.txt file is part of the job. Furthermore, trying to dynamically change the robots meta tag to noindex based on page type is going to be much trickier, and potentially more dangerous, than adding a line to the robots.txt file.
Don't forget to remove the pages from the index using the URL removal tool in Google Webmaster Tools (GWT) once the block has been added.
Also, I'd stop linking to those pages. It's best practice not to link to pages you don't want indexed if you can help it. I'd go the lightbox route you mentioned above; this is something I do on my WordPress sites too.
Good luck!
-
Hi William,
I would personally go the route of adding the noindex tag to the photo pages. Messing with the robots.txt file would probably be quicker; however, I am a little hesitant to touch robots.txt if I don't have to... one slip and you could block your whole site or an entire directory from being crawled, versus specifically calling out each individual page with the noindex tag.
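For reference, the tag would go in the `<head>` of each photodisplay page; an equivalent server-side option is the X-Robots-Tag response header. Both are shown here as a sketch, not taken from the actual site's templates:

```html
<!-- In the <head> of each photodisplay.asp page: -->
<meta name="robots" content="noindex, follow">

<!-- Or, as an HTTP response header sent by the server instead:
     X-Robots-Tag: noindex -->
```

Note that for noindex to be seen, the pages must stay crawlable — so don't combine this with a robots.txt Disallow on the same URLs.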
Lightboxes are fine, but as you say, they don't really solve the problem of the tons of pages that already exist.
You could look into your CMS and see if there is a way to remove the automatically generated link to photodisplay.asp?id=XXXX, so that the images are still displayed with an <img> tag but without the wrapping <a href="">... you know?</a>
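A hypothetical sketch of that idea — the image paths and alt text here are assumptions, not the CMS's actual markup:

```html
<!-- Before: the CMS wraps every image in a link to its own thin page -->
<a href="/photodisplay.asp?id=3633"><img src="/images/3633.jpg" alt="Living room"></a>

<!-- After: the image is displayed in place, with no photodisplay page to index -->
<img src="/images/3633.jpg" alt="Living room">
```

This removes the thin pages from the crawl path entirely rather than just hiding them from the index.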
<a href="">Hope this helps.
Mike</a>