Large traffic loss, how to resolve?
-
So I have a few questions. A site has dropped quite dramatically over the last month. There could be a few factors, less regular updates being one. However, plenty of link spam has happened, some of which appears to have been pointed at one article. That article has been removed. We also disavowed 800+ links, all of which had spam scores between 40 and 100; they all looked ugly and irrelevant, such as forum spam, wallpaper spam, and junk. Is this the right move to make?
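For context, a disavow like the one described above is submitted to Google as a plain text file through Search Console's disavow tool. A minimal sketch of the file format, with made-up domains standing in for the actual spam sources:

```text
# Disavow file sketch -- all domains/URLs below are hypothetical examples.
# Disavow every link from an entire spam domain:
domain:spammy-wallpaper-site.example
domain:junk-forum.example
# Or disavow links from a single page:
http://some-forum.example/thread/12345
```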
What should the site do next? The current SEMrush score is 4,000; it was around 15,000 before. It's an aged site. Less effort is put into it now, as the next site is here.
Awaiting your replies, thanks.
-
Hi there! I'm assuming that you're seeing the traffic loss in your on-site analytics as well as in tools like SEMrush (if not, my first port of call would be to check your GA or similar to make sure this isn't just SEMrush seeing only part of the picture).
The spam could be a factor, but I think you've already addressed that. It's worth bearing in mind that there was a core algorithm update in May, so that could be a factor in the traffic change. However, a couple of things interest me here.
One is that you mention you've been investing less into this site - this could simply be a matter of the site being out of date and no longer a good result.
The other interesting point is you mention "the next site is here". If you have been building another site targeting the same keywords and using roughly the same content, it could just be a matter of the new site taking the old site's rankings, particularly if the new site has had more recent investment.
If the new site is targeting the same things as the old site, it'd probably be a good idea to redirect the old site to the new site to make sure that you have one site that's ranking on page 1 rather than two sites that are ranking page 2 and below.
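If you do consolidate, the usual approach is a permanent (301) redirect from every old URL to its equivalent on the new site. A minimal sketch assuming Apache with mod_rewrite; the domain names are placeholders for your own sites:

```apache
# Hypothetical example: in the OLD site's .htaccess, send every
# request to the same path on the new domain with a 301 redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
RewriteRule ^(.*)$ https://www.new-site.example/$1 [R=301,L]
```

Path-for-path redirects (rather than pointing everything at the new homepage) give the best chance of the old pages' rankings transferring to their replacements.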
If the new site is something very different, then I think it comes down to how much you're willing to invest in the old one. If it's worth the time, the next things I'd look at are:
- GA to see if any pages were particularly badly hit
- Search console to see if there were any topics which were particularly badly hit
- Recent dev changes to make sure it's not a tech issue
- The specifics of that core ranking update to see if any themes jump out at you.
If it seems to be sitewide then it's possible it's a dev/algorithm update issue and I'd focus on improving overall site quality. If it's localised on specific topics/pages then I'd start by focusing on how I can update that content specifically.
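The sitewide-vs-localised check above can be sketched with two Search Console "Pages" exports, one for a window before the drop and one after. This is a hedged sketch: the inline sample data stands in for your real CSV exports, and the column names are assumptions, so check the headers in your own export.

```python
import io
import pandas as pd

# Inline sample data standing in for two real GSC CSV exports
# (e.g. 28 days pre-drop vs 28 days post-drop).
before_csv = io.StringIO("page,clicks\n/guide,900\n/news,400\n/about,50\n")
after_csv = io.StringIO("page,clicks\n/guide,200\n/news,380\n/about,45\n")

before = pd.read_csv(before_csv)
after = pd.read_csv(after_csv)

# Outer merge keeps pages that appear in only one period.
merged = before.merge(after, on="page",
                      suffixes=("_before", "_after"), how="outer").fillna(0)
merged["change"] = merged["clicks_after"] - merged["clicks_before"]

# Biggest losses first: a concentrated loss points at specific content,
# a uniform drop points at a sitewide/technical or algorithm issue.
worst = merged.sort_values("change")
print(worst.head())
```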
For more specifics, this blog post by Dom Woodman is a pretty good resource.
Hope that helps!
Related Questions
-
Traffic going down in all sites in a niche
Hello. A client has three ecommerce sites in a niche. Because of competition and (possibly) a non-manual penalty due to doorways and paid links (though I think it's mainly competition), our traffic is going down. What are the keys to increasing traffic at this point? Feel free to include tactics that cost money. Ahrefs (I love Moz though!) has some neat content tricks. Please give me the best tricks in the industry to increase traffic. We're adding content to the main site of the three, and maybe that's what to focus on, but we're having trouble driving serious traffic with the content. We need serious traffic. We are experts in our field and capable of almost anything as far as information goes in our field. Thanks.
White Hat / Black Hat SEO | BobGW -
Removing duplicated content using only NOINDEX at large scale (80% of the website).
Hi everyone, I am taking care of a large news website (500k pages), which got a massive hit from Panda because of duplicated content (70% was syndicated content). I recommended that all syndicated content should be removed and the website should focus on original, high-quality content. However, this was implemented only partially: all syndicated content is set to NOINDEX (they think that it is good for users to see standard news alongside the original HQ content). Of course it didn't help at all; no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider this site "cheating" and not worthy for the user. What do you think about this theory? What would you do? Thank you for your help!
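For reference, the NOINDEX discussed above is applied per page either as a meta tag or as an HTTP response header. A minimal sketch; the `/syndicated/` path is a hypothetical example of how a whole section might be covered server-side:

```html
<!-- Page-level: placed in each syndicated article's <head> -->
<meta name="robots" content="noindex">

<!-- Or server-side via the X-Robots-Tag header, e.g. in an Apache
     config for a whole section (path is a made-up example):

     <LocationMatch "^/syndicated/">
         Header set X-Robots-Tag "noindex"
     </LocationMatch>
-->
```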
White Hat / Black Hat SEO | Lukas_TheCurious -
80% of traffic lost overnight, Google penalty?
Hi all. I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all estates in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. 3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving visitors the possibility to compare houses, a ton more data to compare pricing and history, extra functionality that the source sites do not offer, and so on. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. If I am penalized: I'm not attempting to do anything black hat, and I really believe that the app gives a lot of value to the users. What tweaks or suggestions do you have for changes to the application, so I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | Hemjakt -
Traffic Generation Visitor Exchange Systems & Google Algo / Punishments
So, in recent years some services have been developed, such as Engageya. I want to ask the experts to weigh in on these types of services that generate traffic. I could not find what Google's Matt Cutts has to say about these affairs; I suspect not very good things. However, I KNOW of sites that have achieved higher rankings via these non-bot, real-browser visitors, with about 30-40% of their traffic coming from systems similar to this. Here's their own explanation; any thoughts will be appreciated.
Join our exclusive readers exchange ecosystem
Engageya offers an exclusive readers exchange ecosystem - either within the network only, or cross-networks as well - enabling participating publishers to exchange engaged readers between them in a 1:1 exchange ratio. No commissions involved!
Why do networks work with Engageya?
- Create traffic circulation within your network - increase your inventory and impressions within your existing properties. Engage readers within your network and experience an immediate increase in the network's page views.
- Enjoy readers' exchange from other networks. Our engine intelligently links matching content articles together, from within your network as well as from other networks. Get new audiences to your network for non-converting users clicking out.
- New revenue channel - monetize pages with reader-friendly content ad units, while making your readers happy! This is the time to move from aggressive and underperforming monetization methods to effective and reader-friendly content advertising. Let our state-of-the-art semantic & behavioral algorithms place quality targeted content ads on your publishers' content pages.
- Enjoy the highest CTRs in the industry. Content ads are proven to yield the highest CTRs in the industry, starting at 2% and up to 12% click-through rates! This is simple: readers click on an article they are interested in, whether it's sponsored or not.
- Enhance your brand - offer your publishers private-label content recommendations today, before someone else does. Content advertising is becoming more and more common. New content advertising networks and suppliers are being introduced into the online advertising market and, sooner or later, they are going to approach your publishers. Engageya offers you a private-label platform to offer your publishers the new & engaging content ad unit - today!
- Comprehensive reports and traffic control dashboard. Trace the effectiveness of the content recommendation ad units, as well as control the traffic within your network.
White Hat / Black Hat SEO | Ripe -
Loss of 85-90% of organic traffic within the last 2 weeks.
Hey everybody, we have a client that recently came to us asking for SEO help. We did some initial analysis on their current SEO status and most everything looked pretty good. On-page work was solid, nothing really lacking there other than missing alt tags on all images. Their linking profile looked good too: lots of good links from quality sources, all relevant, and the client has done some good press releases. They could probably use a bit more focus in their content, as it is somewhat general and not keyword-focused. Initially it didn't look like they needed any help with their SEO, so I was a bit curious as to why they contacted us. Today we got their Google Analytics information and immediately noticed that they have had an 85-90 percent drop in organic traffic from all major search engines, starting about two weeks ago. If all their SEO looks to be done properly, any ideas what would account for the massive drop in traffic? The only thing that looks like it may have happened is that they dropped a couple of spots from position #1 to positions 2-3 for some of their highest-traffic terms. Even if that is the case, I would not expect such a large drop-off in organic traffic. Just curious as to what anyone else can attribute the huge drop in traffic to, or what else may help identify the issue. It's almost as if Analytics was turned off or removed from the site, but that is not the case.
White Hat / Black Hat SEO | Whebb -
My site is having a drop in traffic with each passing month
Hi, I've been running a text message site, mixsms.com, since 2009. It was performing well until Oct 2012. At the end of Nov 2012 I noticed a drop in traffic, and then with each passing month I'm seeing a further 1,500-2,000 unique visitor drop. In Oct 2012 my daily unique visitors were 15,000+ each day, and since the end of Feb it is just 2,000. I've done several things to improve my site: I changed the template, removed all unnecessary HTML elements, and changed the SEO structure (optimized with all modern SEO techniques). I stopped backlinking in Nov 2012, but instead of getting improvements I'm continuously seeing a drop in traffic. I'd highly appreciate your time if you'd look into the site deeply to find out the exact issues causing this drop. I'm even ready to hire an SEO consultant if he is pretty sure to get 100% results. Thanks in advance for your support.
White Hat / Black Hat SEO | intelmixx -
Massive drop in Google traffic after upping pagecount 8-fold.
I run a book recommendation site -- Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage". It's been online for 4+ years. Historically, it's been made up of: a single home page, ~50 "category" pages, and ~425 "book list" pages. (That 50 number and 425 number both started out much smaller and grew over time, but have been around 425 for the last year or so as I've focused my time elsewhere.) On Friday, June 15 we made a pretty big change to the site -- we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every book they have on the site, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805 ...but the vast majority of these author pages have just one book listed, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries. And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google. (Ok, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off for the last week, dropping lower and lower every day, as if they realized it was repurposed content from elsewhere on our site...) Here's the problem: For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends.
This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable. And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem... So: 1. Do you think the drop is related to my upping my page count 8-fold overnight? 2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that only list one author (which would be the vast majority)? 3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books", it's better they land on a page of ours that pulls together the 4 books we have than on a page that happens to have just one book of hers among 5 or 6 others by other authors. What else? Thanks so much, help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
AdWare is Sending Me Traffic - Why?
I have a client who is receiving a good bit of traffic from a source that I believe to be AdWare, which is obviously bad. I didn't commission anyone to do this, so I'm at a loss as to where it originated, but it sends me many unique users per day - of course, the bounce rate is 80%, and the average time on the site is 8 seconds. If I were to change site servers, would that A) stop the AdWare traffic, and B) change my rankings?
White Hat / Black Hat SEO | stubenbordt