Increasing content, adding rich snippets... and losing tremendous amounts of organic traffic. Help!
-
I know dramatic losses in organic traffic are a common occurrence, but having looked through the archives I'm not sure there's a recent case that replicates my situation. I've been working to increase the content on my company's website and to advise it on online marketing practices. To that end, in the past four months, I've: created about 20% more pages, most of which are very high-quality blog posts; adopted some rich snippets (though not all that I would like to see at this point); improved and increased internal links within the site; removed some "suspicious" pages flagged by Moz for having a lot of links on them (although the content was actually genuine navigation); and begun to guest blog. All of the blog content I've written has been connected to my G+ account, including most of the guest blogging.
And... our organic traffic is precipitously declining. Across the board. I'm befuddled. I can see no warnings (redirects, etc.) that would explain this. We haven't changed the site structure much — I think the most invasive thing we did was optimize our title tags! So no URL changes, nothing.
Obviously, we're all questioning all the work I've done. It just seems like we've sunk SO much energy into "doing the right thing" to no effect (this site was slammed before for its shady backlink buying — though not from any direct penalty, just as a result of the Penguin update).
We noticed traffic taking a particular plunge at the beginning of June.
Can anyone offer insights? Very much appreciated.
-
I'm trying to determine right now whether this particular post is the symptom of a broader, site-wide demotion, or whether new competition has been introduced for this page. All the peaks and valleys of the site's organic traffic are exactly the peaks and valleys of popularity for this post. Graphing our other major organic landing pages (the top three of which have much less traffic than this one stupid page) does not indicate that they have been similarly affected — their popularity undulates far more gently, with far fewer crazy movements. So I'm pretty sure at this point that it's the one page.
And, yes, this particular blog post accounts for about half of our site's organic traffic. We've reduced the bounce rate on this blog post down to the low 80s (percent), which I think is respectable given what the blog post is and its relationship to the site and the site's purpose as a whole, which is commercial and not immensely related to the post's content.
I suppose that's a new question, isn't it? How much should we care about the fortunes of one page that has a high bounce rate? Obviously, we should reduce the bounce rate (and there are some things we haven't done yet toward that), but the nature of this particular post is just not a super-strong match for the content and direction of our site. The bounce rate will always be fairly high; it's just the way it will always be. Yet it has so. much. traffic. Another site I work on has a similar page, similarly somewhat tangential to the site's content: the "when to use spray foam insulation" page. Thus I always want to call these the "spray foam insulation pages."
-
Ahh, I see. If I were in that position, I would try to have the dodgy links removed where possible, if you think they might be doing more harm to the site. Remember, just because you've not received a warning notice in Webmaster Tools doesn't mean these links aren't negatively affecting your site's rankings; it may just be that there aren't enough of them to have triggered a warning message, or, as mentioned before, they've simply been devalued.
What was it that caused the popularity around this particular blog post?
Do you mean that the decline in overall site traffic is down to a decline in traffic to this specific post? Or that it just correlates with the decline?
-
I've got very little information about these backlinks since they precede my time, but I know there were never any Google warnings about them. I think you're probably right, though — that the effect from the lousy backlinks is ongoing.
I graphed the decline in GA and found that the decline in traffic is exactly mirrored by the fortunes of this one ridiculously popular blog post. So while I continue to root around for confirmation of this, I'm guessing that this particular post has found some new competition on the SERP. Yeesh.
-
Hi Novos Jay,
Do the shady backlinks you mentioned still exist and point to the site?
Have you used the disavow tool at all? The reason I ask is that it might simply be down to the fact that the links that were holding the rankings and traffic up previously are now gradually being devalued through various algorithm updates, so in spite of your recent work to do the right thing, there's still going to be an overall negative effect.
Perhaps with a little more information about the types of links (the shady ones) and quantity/% of the total backlinks, I/others might be able to give you some more specific ideas on what's happened?
Thanks,
Greg
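For reference on Greg's disavow suggestion: the file Google's disavow tool accepts is plain text, with one full URL or `domain:`-prefixed domain per line and `#` starting a comment. A minimal sketch (every domain and URL below is a placeholder, not an actual link pointing at the site):

```text
# Placeholder disavow file - all entries here are examples
# Requested removal from these domains first; no response received
domain:spammy-directory-example.com
domain:paid-links-example.net
# A single bad page can also be listed by full URL
http://link-network-example.org/paid-links-page.html
```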
Related Questions
-
Https pages indexed but all web pages are http - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice. I contacted the host provider, and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website allows pages to load over https. The host said that there is no active configured SSL; it's just waiting as part of the hosting package, just in case. But I found that the SSL certificate is still showing up during a crawl.
It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted in search land have all been the http version. I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https:// page loaded instead of automatically redirecting to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that?
As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this. One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled.
I don't want to enable https in the .htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all. So far all I have is: <code>RewriteEngine On
RewriteCond %{HTTPS} off</code> Or should I disable the SSL completely for now until it becomes a necessity for the website? I would really welcome your thoughts as I'm really stuck as to what to do for the best, short term and long term. Kind Regards
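For what it's worth, the fragment quoted above is incomplete on its own: a RewriteCond does nothing without a RewriteRule after it, and for sending https visitors to http the condition needs to match HTTPS being on, not off. A sketch of the complete rule set in .htaccess (untested, and worth confirming against the host's actual Apache configuration):

```apache
RewriteEngine On
# Match requests that arrived over HTTPS...
RewriteCond %{HTTPS} on
# ...and 301-redirect them to the same path over plain HTTP
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```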
Web Design | | SEOguy10 -
A campaign ghost keeps returning to my Google Analytics - Help!
A couple of campaign tracking links were created on my homepage (leading to internal pages); these were removed a few weeks ago (100% removed from the site). I understand there is a 6-month window, and as long as a user returns (no matter from which source) they will be counted as a session against that campaign. Since these campaign links were set up in error, I hoped creating a fresh new view within Google Analytics would stop them appearing. However, they are still showing as sessions even in the new view (created after removing the campaign links in question). Is there any way to stop this happening?! I want to be able to report on sessions correctly. Thanks, Sam
Web Design | | Sam.at.Moz0 -
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "noindex, follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "noindex" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-through rate. Would it make sense to set them to "noindex" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "noindex"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year, only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April and a disavow file was submitted with Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | | Kingalan10 -
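For reference, the "noindex, follow" setting discussed in this question is implemented with a robots meta tag in each listing page's head; it keeps the page out of the index while still letting crawlers follow its links:

```html
<head>
  <!-- Drop this page from the index, but still crawl and pass value through its links -->
  <meta name="robots" content="noindex, follow">
</head>
```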
How much does on-site duplicated content affect SERPs?
Hi, We've recently gotten into Moz with our e-commerce websites and discovered that its crawler flags about 2,500 pages which it thinks are the same (duplicated). We've now begun to completely rewrite every description of every product (including meta title/description) so that this number may be reduced. Since this is the biggest issue Moz spots, I'm wondering what the effect of fixing it will be on our position in the SERPs (mainly Google). Does anybody have some stories or experience about this topic? Thanks in advance! 🙂 Alexander
Web Design | | WebmasterAlex0 -
How serious is duplicate page content?
We just launched our site on a new platform, Magento Enterprise. We have a wholesale catalog and a retail catalog. We have up to 3 domains pointing to each product. We are getting tons of duplicate content errors. What are the best practices for dealing with this? Here is an example: mysite.com/product.html, mysite.com/category/product.html, mysite.com/dynamic-url
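One standard tool for this situation, where several URLs serve the same product, is rel=canonical: each variant declares the preferred URL in its head. Using the example URLs from the question (and assuming product.html is the version meant to rank):

```html
<!-- Placed in the <head> of mysite.com/category/product.html and the dynamic URL -->
<link rel="canonical" href="http://mysite.com/product.html">
```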
Web Design | | devonkrusich0 -
Google Bot cannot see the content of my pages
When I go to Google Webmaster tools and I type in any URL from the site http://www.ccisolutions.com in the "Fetch as Google Bot" feature, and then I click the link that says "success," Google bot is seeing my pages like this: <code>HTTP/1.1 200 OK Date: Tue, 26 Apr 2011 19:11:50 GMT Server: Apache/2.2.6 (Unix) mod_ssl/2.2.6 OpenSSL/0.9.7a DAV/2 PHP/5.2.4 mod_jk/1.2.25 Set-Cookie: CCISolutions-UT-Status=66.249.72.55.1303845110495128; path=/; expires=Thu, 25-Apr-13 19:11:50 GMT; domain=.ccisolutions.com Last-Modified: Tue, 28 Oct 2008 14:36:45 GMT ETag: "314b26-5a-2d421940" Accept-Ranges: bytes Content-Length: 90 Keep-Alive: timeout=15, max=99 Connection: Keep-Alive Content-Type: text/html Any clue as to why this could be happening?</code>
Web Design | | danatanseo0 -
Is my sitemap going to help me attract more visitors?
Hi, As I await my sitemap going live, can someone tell me its main benefits? A Google sitemap, i.e. an .xml one. I also have an image sitemap, as the site is an e-commerce store. Should I expect to see an increase in visitors when I first implement it? Thanks, Will
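For context, an XML sitemap mainly helps search engines discover and recrawl URLs rather than directly attracting visitors. A minimal file following the sitemaps.org protocol looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, the rest are optional hints -->
  <url>
    <loc>http://www.example-store.com/products/example-widget</loc>
    <lastmod>2013-07-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```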
Web Design | | WillBlackburn0 -
Duplicate Content for index.html
In the Crawl Diagnostics Summary, it says that I have two pages with duplicate content which are: www.mywebsite.com/ www.mywebsite.com/index.html I read in a Dream Weaver tutorial that you should name your home page "index.html" and then you can let www.mywebsite.com automatically direct the user to index.html. Is this a bug in SEOMoz's crawler or is it a real problem with my site? Thank you, Dan
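It's a real duplicate-URL situation rather than a crawler bug, and the usual fix is a single 301 redirect from the explicit file name to the bare directory URL. A sketch for Apache .htaccess (an assumption, since the question doesn't say which server the site runs on; the THE_REQUEST condition avoids a redirect loop with DirectoryIndex):

```apache
RewriteEngine On
# Only match requests where the visitor literally asked for /index.html,
# not internal DirectoryIndex lookups (which would otherwise loop)
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]
```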
Web Design | | superTallDan0