Rotating Content Concern on Deep Pages
-
Hi there,
I apologize if I'm too vague, but this is a tough issue to describe without divulging too much of our project.
I'm working on a new project which will serve results in sets of 3. Let's say someone wants to find 3 titles that match their criteria, either through an organic search that leads them to us, or through our site's internal search.
For instance, if they're looking for classic monster movies, we might display Frankenstein, Dracula, and The Mummy, listing unique descriptions of the movies along with lots of other useful information.
However, there are obviously many more monster movies than those 3, so when a user refreshes the page or returns to it, a different set of results shows up. For this example, assume we have 5 results to choose from, so Google will likely index different, shuffled sets of results.
I'm worried about this causing ranking problems down the line. The meat and potatoes of the page content are the descriptions and information about the movies. If these are constantly changing, I'm afraid the page will look "unstable" to Google, since we have no real static content beyond a header and title tag.
Can anyone offer any insight into this?
Thanks!
-
Thanks for the response. The issue of "hiding" the content with the randomization was a fear of mine. Believe me, I don't like the rotating content design, but it's where we're at right now.
Think of the results as specific businesses: for user-experience reasons, only 3 will be shown at once, and that unfortunately cannot be changed. If more than 3 businesses fall in a given category, we'll be rotating them out (which I don't like) on each refresh.
The only solution I can think of is to keep the top 3 static and let the user click a "Show more" button that loads additional results beneath them (or replaces the original 3). Either way, Google shouldn't have an issue with that, correct?
I know there are "better" ways to accomplish what we're after, but the site is custom built and roughly 95% complete. We're also taking a unique approach to the way we display results and serve them to our clients, so the truly optimal way isn't achievable at this point. It's basically about finding the best approach within what we can actually do, if that makes sense. Thanks for understanding!
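To make the "static top 3 plus Show more" idea concrete, here is a minimal sketch in plain JavaScript. All of the names here (partitionResults, renderHtml, the field names, the element IDs) are my own for illustration, not anything from your build; the key point is that every result is in the server-rendered HTML, with the extras merely hidden until clicked.

```javascript
// Sketch: all matches ship in the HTML; only the first three are visible.
// The rest carry the `hidden` attribute until the user clicks "Show more",
// so nothing is served to Googlebot that a user can't reach.

function partitionResults(results, visibleCount = 3) {
  return {
    visible: results.slice(0, visibleCount),
    hidden: results.slice(visibleCount),
  };
}

function renderHtml(results) {
  const { visible, hidden } = partitionResults(results);
  const item = (r, isHidden) =>
    `<li${isHidden ? ' hidden' : ''}>${r.title}: ${r.description}</li>`;
  return [
    '<ul id="results">',
    ...visible.map((r) => item(r, false)),
    ...hidden.map((r) => item(r, true)),
    '</ul>',
    '<button id="show-more">Show more</button>',
  ].join('\n');
}

// In the browser, the button would simply strip the attribute:
//   document.querySelectorAll('#results [hidden]')
//     .forEach((li) => li.removeAttribute('hidden'));
```

Since the hidden items are real markup that the user can reveal with one click, this should fall on the safe side of Google's hidden-content concerns.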
-
Sorry, I've been mega busy.
First of all, never hide content from Google that a user is unable to view. You will get slapped for it. Even if the algorithm doesn't pick it up, someone will report it at some point. That's a bad foundation to start from.
What you are trying to do is hard to picture fully, which I think explains the lack of response from others in this forum.
You need to describe exactly what will be on this page and why, and what will be on the other pages and why those pages need to be indexed. That way we can work out whether the strategy you're taking is even the right one. There is likely a better way to do it.
-
Hi Gary,
Were you notified of my follow-up posts? I'd love to hear additional information from you.
Thanks a lot!
-
Hi Gary,
You suggested: "One thing you could try is loading all the matches on to a page and only show the top 3 matches with an option to reveal more and mark all the code up with a schema. This way the content will always be on the page and able to be crawled by Googlebot."
This is the idea I've been toying with. Do you have any idea whether we could preload all matches/results and still use the refresh? It would technically (I think) be different, because the user can't load more on command, like with a button, but Google can still see everything.
I feel like it's a little iffy, since Google only seems to approve of hidden text when the user controls whether they see it. Any ideas?
Thanks again!
-
Lesley,
Thanks for the response.
If we scripted the page so Google would ignore the content, I'm afraid we'd be in nearly the same boat we're in now: no content on the page and nothing to rank for.
While it would effectively "solve" the potential rotating-content issues and penalties, we wouldn't have anything to rank with.
Gary,
Thanks for the helpful response!
1. How would we run into internal duplicate-content issues? These 3 results (in full) would only be found on this specific page; they'd just be rotating.
I will say that these results pages include snippets of content that can also be found on each result's individual page. E.g., a snippet of Frankenstein's plot shows on the results page, and clicking it reveals the full entry. So there will be some duplicate content, but that shouldn't be a huge deal, should it?
2. That's exactly why I hate this. Even if Google didn't get pissed, we wouldn't have static content (keywords, long-tails) to build authority and rank for.
Idea #1: I actually have this principle written down, though slightly differently. If we had a JavaScript link at the bottom of the results to "shuffle" or "refresh" the content, the user would get the benefit, but since it's not a new page, Google wouldn't crawl it. The results would only randomize on command and stick with the initial 3 on page load.
I was also toying with the idea of locking 2 of the results and only shuffling the 3rd, so that there's some semblance of continuity in the indexing and we'd always be working toward ranking that content. Thoughts?
Are you saying that with schema markup we can "hide" the additional/rotated results from the user initially while Google sees them immediately? If so, please elaborate or send me a link, because this is interesting!
Idea #2: The snippets actually link to the results' own static pages at their own URLs (this is the only duplicate content, I believe), so that's fine. But yes, we aren't concerned with the static pages ranking, only the grouped results.
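A rough sketch of Idea #1, for what it's worth. I've assumed a Fisher-Yates shuffle here (my choice, not something from the thread), and the function names are hypothetical: the page loads with a fixed initial order, and reshuffling happens only in the browser, so there is no new URL for Googlebot to crawl.

```javascript
// Client-side shuffle: the initial three results are stable on page load
// (so Google indexes a consistent page); clicking a "shuffle" control
// reorders the pool in the browser only, with no new URL to crawl.

function shuffled(pool, random = Math.random) {
  // Fisher-Yates on a copy, so the server-rendered order is untouched.
  const copy = pool.slice();
  for (let i = copy.length - 1; i > 0; i--) {
    const j = Math.floor(random() * (i + 1));
    [copy[i], copy[j]] = [copy[j], copy[i]];
  }
  return copy;
}

function nextThree(pool, random = Math.random) {
  return shuffled(pool, random).slice(0, 3);
}

// Wired to the "shuffle" link, something like:
//   shuffleLink.addEventListener('click', () => render(nextThree(pool)));
```

Injecting an optional `random` function also makes the behavior easy to test deterministically.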
-
You run into a number of issues by having these pages indexed.
1. Lots of internal duplicate content. Google has said this is not a problem and that they will serve up the best result, but it can trigger Panda issues.
2. The content always changes, so you will confuse Googlebot and have trouble ranking for specific terms for any length of time (your SERPs would fluctuate like crazy or trigger a quality algorithm).
Some ideas:
One thing you could try is loading all the matches on to a page and only show the top 3 matches with an option to reveal more and mark all the code up with a schema. This way the content will always be on the page and able to be crawled by Googlebot.
Another option is to not index these pages at all and create static pages for each item instead. But this defeats the purpose of what you are trying to rank for.
Serving up random content is always going to be an issue for Googlebot. But more and more webmasters have responsive designs that hide and show content based on clickable actions on pages; Googlebot indexes all the content but is smart about working out what is also visible to the user, and gives preference to it.
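The "mark it all up with schema" suggestion would presumably look something like a schema.org ItemList emitted as JSON-LD. This is only a hedged sketch (the movie names come from the example earlier in the thread, and which properties you actually need depends on your content type):

```javascript
// Hypothetical JSON-LD ItemList: every match on the page appears in the
// structured data, even when only the first three are visually expanded.

function buildItemList(results) {
  return {
    '@context': 'https://schema.org',
    '@type': 'ItemList',
    numberOfItems: results.length,
    itemListElement: results.map((r, i) => ({
      '@type': 'ListItem',
      position: i + 1,
      name: r.name,
      url: r.url,
    })),
  };
}

const jsonLd = buildItemList([
  { name: 'Frankenstein', url: '/movies/frankenstein' },
  { name: 'Dracula', url: '/movies/dracula' },
  { name: 'The Mummy', url: '/movies/the-mummy' },
]);

// Embedded in the page as a <script type="application/ld+json"> tag
// whose body is JSON.stringify(jsonLd).
```

Structured data like this describes the content to Google; it does not by itself license hiding content from users, so it should accompany a visible reveal control, not replace one.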
-
In my opinion, the safest way to do it would be to have a discrete iframe that loads the contents. The reason is that Google would ignore it; it would be on par with Twitter widgets and Facebook Like boxes.