Fresh content has had a negative effect on SERPs
-
Hi there,
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems not. One week after these pages were rewritten (21st Feb 2012), they all dropped together. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, more words on each of the new pages than on the previous pages
- Bounce rate lower by at least 30% (via AdWords)
- Time on site longer by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% (new pages) compared to 9% (old content) across all pages
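For anyone wanting to reproduce the density comparison above, keyword density is usually approximated as occurrences of the keyword divided by total words on the page. A minimal sketch (the sample text is made up for illustration):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences as a percentage of total words.

    A rough approximation: each match of the keyword counts once,
    and words are split on simple alphanumeric runs.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    matches = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return 100.0 * matches / len(words)

# Hypothetical sample: 3 occurrences of "flu" in 12 words -> 25%.
sample = "Flu symptoms vary. Treat flu early; flu usually passes in a week."
print(round(keyword_density(sample, "flu"), 1))  # -> 25.0
```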
So since the end of February, these pages still do not rank for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools to make sure it's pointing to Ireland, and I have also added hreflang tags on both websites (just in case).
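For the hreflang setup mentioned above, the key detail is that the annotations must be reciprocal: each page variant lists every alternate (including itself) in its <head>. A sketch that generates the tags, with hypothetical URLs standing in for the real UK/Irish pages:

```python
# Hypothetical URLs for illustration; each variant must list every
# alternate, and the tags must be reciprocal across both sites.
ALTERNATES = {
    "en-gb": "http://www.example.co.uk/health/flu/keyword-1",
    "en-ie": "http://www.example.ie/health/flu/keyword-1",
}

def hreflang_tags(alternates: dict) -> list:
    """Build the <link rel="alternate"> tags for the <head> of every variant."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]

for tag in hreflang_tags(ALTERNATES):
    print(tag)
```

The same set of tags goes on both the .co.uk and .ie versions of the page.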
If anyone can help with this that would be very much appreciated.
Thanks
-
Howdy! Just wanted to point out this question is several months old.
The statement about "The same content in another language can trigger duplicate content issues" was a bit surprising to hear. Can you provide some more information about that?
I'm more accustomed to what Google says in places like http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#3 where they state "Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag."
Matt Cutts also talks about it being OK in a video at http://www.youtube.com/watch?v=UDg2AGRGjLQ.
I'm also interested in knowing more about content needing to be relative to the directory name. Can you give a few more details?
-
Hello. You will not rank well on Google without backlinks; new sites do sometimes get a temporary boost early in their life because of the fresh content. The same content in another language can trigger duplicate content issues. Also look into your directory/URL structure, because the content should be relevant to the directory name. Hope you figure it out; if worse comes to worst, you can also roll back your changes and observe.
Related Questions
-
Duplicate product description ranking problems (off-site duplicate content)
We do business in a niche category and not in the English-language market. We have 2-3 main competitors who use the same product information as us, so they all have the same duplicate product descriptions that we do. Together with one competitor, we have the highest-authority domains in this market; they have maybe a 10-20% better link profile (counting linking domains and total links). The problem is that they rank much better for product names than we do, with the same duplicate product descriptions and almost the same level of internal optimisation, and they haven't done any extra link building for products. The manufacturers' websites aren't a problem, because those don't rank well for product-name keywords.
Most of our new products and some old ones go into the Supplemental Results and are shown under "In order to show you the most relevant results, we have omitted some entries very similar to the ... already displayed. If you like, you can repeat the search with the omitted results included." Unique text for every product isn't an option, although when we have written unique content for a product, it seems to rank much better.
So our question is: what can we do externally to help our duplicate-description products rank better against our main competitor without writing unique text? How important is indexation time, and does getting indexed first give a big advantage? We have thought of using more RSS/Bing services to get faster indexation (both sites get product information at almost the same time); it seems our competitor gets into the index quicker than we do. Also, are farm pages helpful for getting some quick low-value links for new products? We have planned to make 2-3 domains with a few links pointing to these new products, to get a little advantage right after launch while they don't yet have external links. Our sitemap works and our new products are shown on front pages (products that still mostly don't rank well and go into the Supplemental Results).
Some new products have #1 or top-3 rankings, but these are maybe only 1/3 of those that should have top-3 rankings. We have also noticed that when we index products quickly (for example via Fetch as Google), they get good top-3 results and then some drop out of the rankings (into the Supplemental Results).
International SEO | raido0
-
Duplicate content on .co.uk and .com TLDs with different domain authority
What's the best approach for a site that has identical content on the .co.uk and .com versions of the root domain? The .co.uk version has a significantly higher domain authority (54 vs 32 according to Open Site Explorer; see attached screenshot). But it's an international company with its largest customer base in North America and customers in over 60 countries, and the company does not intend to localize content. My initial thought, before seeing the domain authority, was to 301 redirect the .co.uk to the .com domain to consolidate all the link equity under one international TLD. However, I wondered whether the higher domain authority of .co.uk would be passed on if we did this. I figured that a non-UK audience would be more likely to trust a .com site. I still think 301 redirecting .co.uk to .com might be the best strategy in the long term, but is there likely to be a dip in rankings and organic search volume in the short term until .co.uk is replaced in the index by .com? I'd really appreciate your thoughts on this.
International SEO | Torchbox0
-
Do we need to update our sitemaps each time our content changes?
Dear SEO experts! We have created sitemaps to get our international sub-domains indexed; however, we're unsure whether we have to update our sitemaps each time the content changes on our many landing pages, which are translated into 17 different languages. Obviously the goal is to make it dynamic so it updates itself. I hope you can help us with some advice. Thanks a lot! Allan
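One common way to make this dynamic is to regenerate the sitemap from the pages' own last-modified dates, so the <lastmod> values update themselves whenever content changes. A minimal sketch with a hypothetical sub-domain URL:

```python
from datetime import date
from xml.sax.saxutils import escape

def sitemap_xml(pages: list) -> str:
    """Render a minimal sitemap; pages is a list of (url, last_modified) pairs."""
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{lastmod.isoformat()}</lastmod>\n  </url>"
        for url, lastmod in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

# Hypothetical landing page on a language sub-domain.
pages = [("http://de.example.com/landing", date(2012, 2, 21))]
print(sitemap_xml(pages))
```

If the sitemap is rebuilt on a schedule (or on publish), there's no need to resubmit it by hand after every edit.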
International SEO | Todoist0
-
Showing different content according to different geo-locations on same URL
We would like our website to show different content according to different Geo-locations (but in the same language). For example, if www.mywebsite.com is accessed from the US, it would show text (in English) appealing to North Americans, but, if accessed from Japan, it would show text (also in English) that appeals more to Japanese people. In the Middle East, we would like the website to show different images than those shown in the US and Asia. Our main concern is that we would like to keep the same URL. How will Google index these pages? Will it index the www.mywebsite.com (Japan version) in its Asia archives and the www.mywebsite.com (US version) in its North American archives? Will Google penalise us for showing different content across Geo-locations on the same URL? What if a URL is meant to show content only in Japan? Are there any other issues that we should be looking out for? Kindest Regards L.B.
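On the indexing question: a server that varies content by visitor location usually keeps one URL and picks a copy variant from the detected country, falling back to a default. One thing to keep in mind is that Googlebot crawls mostly from US IPs, so in practice it will largely see the default (US) variant. A sketch of the selection logic, with made-up copy:

```python
# Hypothetical variant table: same URL, different copy per region.
VARIANTS = {
    "US": "Copy written for North American visitors.",
    "JP": "Copy written (in English) for Japanese visitors.",
}
DEFAULT_REGION = "US"  # what a crawler arriving from a US IP will see

def pick_variant(country_code: str) -> str:
    """Return the page copy for a visitor's country, falling back to the default."""
    return VARIANTS.get(country_code, VARIANTS[DEFAULT_REGION])

print(pick_variant("JP"))
print(pick_variant("FR"))  # unknown region falls back to the US copy
```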
International SEO | seoec0
-
Should I be deindexing pages with thin or weak content?
If I have pages that rank product categories by alphabetical order should I deindex those pages? Keeping in mind the pages do not have any content apart from product titles? For example: www.url.com/albums/a/ www.url.com/albums/b/ If I deindexed these pages would I lose any authority passed through internal linking?
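Deindexing thin pages is typically done with a robots meta tag set to "noindex, follow": the page stays out of the index, but crawlers still follow its internal links, so authority passed through them is not cut off. A sketch of tagging the alphabetical index pages (the path pattern is from the example above):

```python
THIN_PATTERNS = ("/albums/",)  # the alphabetical index pages in the example

def robots_meta(path: str) -> str:
    """noindex keeps thin pages out of the index; follow lets crawlers follow links."""
    if any(path.startswith(p) for p in THIN_PATTERNS):
        return '<meta name="robots" content="noindex, follow" />'
    return '<meta name="robots" content="index, follow" />'

print(robots_meta("/albums/a/"))
print(robots_meta("/products/widget"))  # hypothetical normal page
```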
International SEO | Jonathan_Hatton0
-
Is having duplicated content on different domains a problem when using alternate tag, but no canonical?
We will be launching a couple of new language versions. I understand that a ccTLD is mostly considered the best option; however, I thought that to start with it might be better to launch each new language version first in a subdirectory of our established domain with its strong backlink profile, as it may rank much better until I can attract some strong links to the new ccTLD. I would wait for the pages of the new language versions to be indexed on the main domain and then, after a month, launch the same content in parallel on the ccTLD, setting up an alternate tag on the main domain pointing to the ccTLD. I would not set up any canonical tag. As I understand it, Google would rank whichever of the two versions ranks higher. This should not cause duplicate content issues, right?
Any thoughts? EDIT:
For clarification: the languages we are launching are mostly spoken in several countries. E.g. for Portuguese I would add an alternate tag on the main domain for Brazilian visitors pointing to the Brazilian ccTLD, but no alternate tag for Portuguese visitors. For Korean I would add an alternate tag on the main domain for visitors in South Korea, but not one for visitors in North Korea.
International SEO | lcourse
-
I have one site translated into several languages on different TLDs: .com, .de, .co.uk, .no, etc. Is this duplicate content?
Three of the sites are English (.co.uk, .com, .us) and the rest are foreign-language (.de, .no, etc.). Are these all seen as having duplicate content on every site? They're hosted under the same EPiServer backend system, if that helps. I am copying and pasting content across the sites, translating where necessary, so I'm concerned this is indexed as a large amount of duplicate content. Site traffic doesn't appear to be suffering, but as I'm currently putting together new SEO strategies, I want to cover this possibility. Any advice on ensuring the sites aren't penalised is appreciated!
International SEO | hurtigruten0
-
How can I view Google.com SERPs from outside the US?
If I go to Google.com I get redirected back to Google.co.uk, and search results have a UK bias. I'm trying to research the US market and have a hazy recollection of Rand demonstrating how you can add a few characters to the google.co.uk URL to see US results; I just can't remember which video I saw it in. Any ideas? Thanks a lot
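One commonly cited approach is to visit google.com/ncr ("no country redirect") and/or append query parameters that request US English results, such as gl=us and hl=en (parameter behaviour is Google's to change, so treat this as an approximation). A sketch that builds such a URL:

```python
from urllib.parse import urlencode

def us_serp_url(query: str) -> str:
    """Build a google.com search URL that asks for US results in English.

    gl = geolocation country, hl = interface language; pws=0 is often
    used to reduce personalisation. These parameters are undocumented
    conventions, not a guaranteed API.
    """
    params = {"q": query, "gl": "us", "hl": "en", "pws": "0"}
    return "http://www.google.com/search?" + urlencode(params)

print(us_serp_url("flu symptoms"))
```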
International SEO | trbaldwin0