Technical Automated Content - Indexing & Value
-
One of my clients provides financial analysis tools which generate automated content on a daily basis for a set of financial derivatives. Essentially, they use technical analysis to estimate whether a particular share price will go up or down during the day, as well as its support and resistance levels.
These tools are fairly popular with visitors; however, I'm not sure about the 'quality' of the content from a Google perspective. They keep an archive of these pages which tallies up to nearly 100 thousand, and what bothers me particularly is that the content varies only slightly from one page to the next.
Textually there are maybe 10-20 different phrases which describe the move for the day, but the page structure is otherwise the same, except for the values expected to be reached each day. They believe it could be useful for users to be able to access back-dated information and see what happened in the past. The main issue, however, is that there are currently no backlinks at all to any of these pages, and I assume Google could deem them 'shallow' pages that provide little content and become irrelevant as time passes. I'm also not sure whether this could cause a duplicate content issue; they already add a date to the title tags and to the content to differentiate the pages.
I am not sure how I should handle these pages. Is it possible to have Google prioritize the daily published one? Say I published one today: if I searched for "Derivative Analysis", I would want to see the page dated today rather than the list view or any older analysis.
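For illustration, this is roughly how a daily page could expose its date in a machine-readable way, with the date both in the title tag and in datePublished structured data. This is only a sketch: the instrument, URL wording, and dates are made-up examples, and structured data is a hint rather than a guarantee that Google will prefer the freshest page.

```html
<!-- Sketch of freshness signals on a daily analysis page (hypothetical values) -->
<head>
  <title>Derivative Analysis - EUR/USD - 14 May 2013</title>
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Derivative Analysis - EUR/USD",
    "datePublished": "2013-05-14",
    "dateModified": "2013-05-14"
  }
  </script>
</head>
```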
-
I would appreciate some more feedback. I'm looking to group some of these pages: from the 100k we're bringing it down to around 33k (one way to signal such a grouping is sketched below).
As regards comments, I'm not sure it's very feasible; from the research we did, not many people visit back-dated entries, so it's highly doubtful we'd receive many, if any, comments.
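For the grouping itself, one common approach is to point the near-identical daily pages at a consolidated page with rel="canonical". A minimal sketch, assuming a made-up URL scheme; note that Google treats the canonical as a hint rather than a directive, and it works best when the pages really are near-duplicates:

```html
<!-- On each of the near-identical daily pages, e.g.
     /analysis/eurusd/2013-05-13 and /analysis/eurusd/2013-05-14,
     point to one consolidated page (hypothetical URLs): -->
<link rel="canonical" href="https://example.com/analysis/eurusd/2013-week-20" />
```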
-
Right, I guess that's true, as we still rank for other terms. However, there are concerns that this could affect the Domain Rank (I don't think that's the case). We've decided to try to drop at least a third of these automated pages by loading them via AJAX; this way there should be a bit less content in the Google index.
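For reference, a minimal sketch of that AJAX approach, assuming a hypothetical archive endpoint: the archived analysis is left out of the initial HTML and only fetched on demand.

```html
<div id="archived-analysis">Loading archived analysis...</div>
<script>
  // Fetch the archived content on demand instead of serving it in the
  // page source; the /archive/... endpoint is a made-up example.
  fetch('/archive/eurusd/2013-04-01.html')
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('archived-analysis').innerHTML = html;
    });
</script>
```

One caveat: Googlebot can execute JavaScript, so AJAX loading alone is not a guaranteed way to keep content out of the index; a robots noindex meta tag on the archived pages is the explicit mechanism for that.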
-
If certain areas of the website have duplicate content, Google will only ignore those pages which contain the duplication; the effect will never be on the complete website!
-
I don't exactly need all the content to be deemed unique; what I'm more interested in is making sure that this content does not penalize the rest of the website. It's fine if Google ignores it once it's more than a week or two old; what we don't want is old results coming up when today's values are far more interesting.
I'd be happy if Google prioritized the daily posts with respect to freshness.
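That "ignore it after a week or two" behaviour can actually be requested explicitly: Google supports an unavailable_after value in the robots meta tag, which asks it to stop showing the page in results after a given date. A sketch, assuming each daily page is stamped with an expiry roughly two weeks after publication (the date below is a made-up example):

```html
<!-- Ask Google to drop this daily page from results after two weeks
     (hypothetical expiry date) -->
<meta name="robots" content="unavailable_after: 28-May-2013 23:59:59 GMT">
```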
-
In my personal opinion, slightly varied content can still count as duplicate content, mainly because the vast majority of the content on the different pages is the same.
Given how you explain the content is generated, I don't think there is a way you can change the pages so that each one becomes unique from the others, and manually adding unique content to each page is not a very good idea when there are around 100 thousand pages, as you said earlier!
If I were in your place, I would add a comment section below the content so that users who are interested can share their experience: how the data helped them, what exactly happened in the market, and so on. This user-generated content will help the upcoming pages become unique.
This idea will help to an extent by giving new life to old pages, but making all pages unique is next to impossible in my eyes!
Obviously, these are just my suggestions, but I would love to hear what others would do if they had gone through a similar situation!