Fresh content has had a negative effect on my SERPs
-
Hi there,
I was ranking pretty well for highly competitive keywords without actually doing any link building (please see the graph attached), so I thought I had an opportunity to get to page 1 for these keywords. The plan was to write fresh, original content for these pages, because hey, Google loves fresh content, right?
Well, it seems NOT. One week after these pages were rewritten (21st Feb 2012), all of them dropped altogether. Please note: all the pages were under the same directory:
/health/flu/keyword-1
/health/flu/keyword-2 and so on...
I have compared both versions, as I have backups of the old content:
- On average, there are more words on each of the new pages compared to the previous pages
- Lower bounce rate by at least 30% (via AdWords)
- More time on site by at least 2 minutes (via AdWords)
- More page visits (via AdWords)
- Lower keyword density: on average 4% on the new pages compared to 9% on the old content, across all pages
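For anyone wanting to reproduce that comparison, here's a minimal sketch of a keyword-density check. The page copy below is made up for illustration; in practice you'd paste in the plain-text content of the old and new pages.

```python
# Rough keyword-density check over plain-text page copy.
# This matches a single word; multi-word phrases would need a sliding window.
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` as a share of total words, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

# Toy stand-ins for the old (keyword-stuffed) and new (natural) copy.
old_copy = "flu remedies flu symptoms flu flu advice"
new_copy = "practical advice on flu symptoms, remedies and recovery"

print(keyword_density(old_copy, "flu"))  # noticeably higher density
print(keyword_density(new_copy, "flu"))
```

Running something like this over each old/new pair makes the before/after density numbers easy to verify rather than eyeballing them.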
So, since the end of February, these pages still aren't ranking for these keywords. The funny thing is, these keywords are on page 1 of Bing.
Another note: we launched an Irish version of the website using the exact same content. I have done all the checks via Webmaster Tools to make sure it's targeting Ireland, and I have also put hreflang tags on both websites (just in case).
If anyone can help with this that would be very much appreciated.
Thanks
-
Hi Gary,
Not sure I can add anything not said here, but if you feel inclined to send me a PM with the URLs, I'd be more than happy to take a look at it.
-
Hi Cyrus,
I forgot to mention that I put canonical and hreflang tags on both pages in question (.co.uk and .ie), and it now seems that Google has finally crawled the .ie page. Unfortunately, on Google.ie the keyword is no longer ranked. The good news is that when I paste the first paragraph of the .co.uk page into Google.co.uk, it's the .co.uk page that appears, not the .ie page.
Is there no way for the .ie version to rank at all in Google.ie? It seems a shame that Google cannot get this right.
Thanks
-
Hi Gary,
Good question. Could be a couple of things going on. Let me address each in turn.
1. Duplicate content and the Irish version of your site. This could be an issue if you're duplicating content, even with the hreflang tags. Google also recommends using canonical tags on international versions in addition to hreflang tags if the content is duplicated.
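As an illustrative sketch (the domains here are placeholders, not your actual URLs), each country version would carry a self-referencing canonical plus reciprocal hreflang annotations in its `<head>`:

```html
<!-- On https://www.example.co.uk/health/flu/keyword-1 -->
<link rel="canonical" href="https://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.ie/health/flu/keyword-1" />

<!-- On https://www.example.ie/health/flu/keyword-1 -->
<link rel="canonical" href="https://www.example.ie/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/health/flu/keyword-1" />
<link rel="alternate" hreflang="en-ie" href="https://www.example.ie/health/flu/keyword-1" />
```

One thing worth double-checking: if the canonical on the .ie page points at the .co.uk URL rather than at itself, you're effectively telling Google to consolidate everything onto the .co.uk page, which would explain the .ie version not ranking in Google.ie.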
Good discussion here:
http://googlewebmastercentral.blogspot.fr/2011/12/new-markup-for-multilingual-content.html
and here:
https://plus.google.com/u/0/115984868678744352358/posts/9zA3a96XahN
2. Fresh content. In general, fresh content will help your rankings. But there are a couple of things to look out for when updating your content.
- If the content changes significantly from the original, Google may interpret it as contextually different and alter its ranking score.
- Same for title tags and other on-page factors. If these change too much from the original, Google may do a "reset" on the page, which basically says "this is an entirely new subject, so you have to earn your rankings again."
- Internal links. Sometimes our content ranks from the power of internal text links, and we can inadvertently change these when updating content. In the absence of strong external link signals, this effect can be strong.
If you kept your content, subject matter, and internal links fairly consistent, there may be other factors at work, such as an algorithm update or the aforementioned duplicate content issue.
Hope this helps. Best of luck with your SEO!
-
Hi Aaron,
I have done very little in terms of link building for these pages, and the backlinks the pages do have are from authority websites, so it seems very unlikely that this is the cause of the issue.
I just can't figure out what the issue could be. For all the pages in that directory to no longer rank, it seems too much of a coincidence that they were all rewritten and then, one week later, vanished from the SERPs.
I'd be very grateful for any more suggestions.
-
It really sounds like a mistaken blacklisting if the rankings have dropped altogether. It could be based on content, but if your bounce rate is dropping, your content is better; and if a mistake was made, I think some bad backlinks are the more likely cause.
If this occurred at the end of February, it would predate Penguin, but blacklistings do occur between updates. One way to investigate would be to check some of your lower domain-authority backlinks: go to each linking site, look at the other sites it links to, and see if those sites have also suffered. Once you've located the bad links, you can start your clean-up from there.
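To make that audit concrete, here's a minimal sketch. All of the domains and data structures below are made up for illustration; in practice you'd export your real backlink list (with domain authority and outbound-link samples) from a tool such as Open Site Explorer.

```python
# Hypothetical backlink-neighbourhood check. Each entry is a linking
# domain, its domain authority (DA), and a sample of other sites it
# links to. All data here is illustrative.
backlinks = [
    {"domain": "nhs-example.uk", "da": 82,
     "links_to": ["bbc.co.uk", "who.int"]},
    {"domain": "spammy-directory.example", "da": 12,
     "links_to": ["pills-example.biz", "casino-example.info"]},
]

# Sites you've already confirmed took a ranking hit.
known_penalised = {"pills-example.biz", "casino-example.info"}

def suspicious(link, da_threshold=25):
    """Flag low-authority domains whose other outbound links point at
    sites that have also lost rankings."""
    if link["da"] >= da_threshold:
        return False
    return any(target in known_penalised for target in link["links_to"])

flagged = [link["domain"] for link in backlinks if suspicious(link)]
print(flagged)  # candidate domains to start the clean-up from
```

The DA threshold is an arbitrary starting point; the idea is just to narrow thousands of backlinks down to a short list worth inspecting by hand.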