Is a modified meta description needed on every page for paginated content?
-
I'm currently working on a site where the URL structure is something like www.domain.com/category?page=4, with ~15 results per page.
The pages all canonical to www.domain.com/category, with rel="next" and rel="prev" pointing to www.domain.com/category?page=5 and www.domain.com/category?page=3 respectively.
Webmaster Tools flags all of these as duplicate meta descriptions, so I wondered whether there is value in appending the page number to the end of the description (as we already do with the title, for the same reason), or whether I'm using a sub-optimal URL structure.
Any advice?
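For illustration, here's a minimal sketch of the setup described above, rendering the head tags for one paginated page (the function name, domain, and description wording are hypothetical; a real CMS template would supply its own):

```python
def pagination_head_tags(base_url, page, last_page):
    """Build canonical, rel prev/next, and a page-numbered meta
    description for one page of a paginated category listing."""
    tags = [f'<link rel="canonical" href="{base_url}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    # Appending the page number makes each page's description unique,
    # mirroring what the question already does with the title.
    description = f"Browse our category listings. (Page {page} of {last_page})"
    tags.append(f'<meta name="description" content="{description}">')
    return "\n".join(tags)

print(pagination_head_tags("https://www.domain.com/category", 4, 10))
```

Page 1 gets no rel="prev", and the last page gets no rel="next".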
-
We don't have a view-all page (we found them so slow, so long, and with so many links that we saw a notable improvement in rankings in general when switching to the quicker paginated versions). And other than the first page, none of the other pages are currently in our sitemap.
I'm not entirely sure how that would stop GWT flagging it as a duplicate meta, though, unless you mean we should also noindex them.
-
Do you have "View All" as an option for your paginated pages? If not, you might consider it, and then just include the "View All" version of the page in your sitemap. Just a thought...
-
That scale of unique descriptions is well beyond our capacity. We're actually considering dropping the number of items per page too.
Thanks for the help.
-
Could ignoring it cause any problems (such as with pages that should or shouldn't be indexed)? I was rather surprised to discover that using canonical wasn't enough.
-
I believe appending the page number, for example "(Page 3 of 5)", to the end of the meta description would suffice from the perspective of SEOmoz's crawler or GWT; the best option, however, would be the ability to create completely unique meta descriptions.
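As a rough sketch of that suggestion (the base description text is a placeholder), the suffix could be added at template time:

```python
def paginated_description(base_description, page, last_page):
    # Page 1 keeps the original description; later pages get a
    # suffix so each page's meta description differs.
    if page == 1:
        return base_description
    return f"{base_description} (Page {page} of {last_page})"

print(paginated_description("Shop our full range of widgets.", 3, 5))
# Shop our full range of widgets. (Page 3 of 5)
```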
-
It sounds like you have canonical and rel next/prev set up correctly, so you shouldn't worry about duplicate meta descriptions. You could add ?page= as a query string to "ignore" in WMT; it will then ignore those pages and you won't get any errors from duplicate metas on them.
Hope this helps,