Google Algorithm non-manual penalty. How do we fix this (quality?) drop?
-
Hi,
See attached image.
We received a non-manual (algorithmic) penalty on March 22, 2015, and I don't think we ever came out of it. We have moved up since the Penguin update, but judging by our DA/PA we should be on the first page for tons of terms, and most keywords rank lower than their true strength.
What kind of quality errors could be causing this? I assume it was a quality update. I am working on the errors, but I don't see anything severe enough to warrant a penalty.
What errors or quality problems should I be looking for? We have tons of unique content, good backlinks, good design, and a good user experience except for some products. Again, what am I looking for?
Thanks.
-
Hi Bob
Tons of unique content, but is it optimized? Tons, but is it focused on the keywords you want to rank for? Good backlinks, but are you sure? (I'm not talking about SEO metrics only.) Etc.
By the way, what's the domain name (you can send it in a PM)?
Krzysztof
Related Questions
-
Active, old, large site with SEO issues: fix or rebuild?
Looking for opinions and guidance here; I would sincerely appreciate help. I started a site long ago (1996, to be exact) focused on travel in the US. The site did very well in the search results up until Panda, as I built it off templates using public databases to fill in the blanks where I didn't have curated content. The site currently indexes around 310,000 pages. I haven't been actively working on the site for years, and while user content has kept things somewhat current, I am jumping back into this site as it provides income for my parents (who are retired). My question is this: will it be easier to track through all my issues and repair, or to rebuild as a new site so I can ensure everything is in order with today's SEO? And bonus points for this answer: how do you handle 301 redirects for thousands of incoming links? 😕 Some info to help:
- Currently, DA is in the low 40s
- Some pages still rank on the first page of SERPs (long-tail mainly)
- URLs are dynamic (I have built multiple versions through the years, and the last major overhaul was prior to CMS popularity for a site this size)
- The domain is short (4 letters) but not really what I want at this point
- Lots of original content, but oddly that content has been copied by other sites through the years
What I want to do:
- Get into a CMS so that anyone can add/curate content without needing tech knowledge
- Change to a more relevant domain (I have a different vision)
- Remove old, boilerplate content, but keep the original
White Hat / Black Hat SEO | | Millibit1 -
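On the bulk-301 question above: a common approach is to keep the old-URL to new-URL mapping in one place and generate the server-side redirect rules from it. A minimal sketch (not from the thread; the example paths and domain are hypothetical placeholders) that emits an nginx `map` block:

```python
# Sketch: generate bulk 301 redirect rules from an old-URL -> new-URL
# inventory, so thousands of incoming links can be remapped from one
# generated server-config include. All paths below are placeholders.

redirects = {
    "/old.php?id=123": "/new-york/hotels/",
    "/old.php?id=456": "/california/parks/",
}

def nginx_map(pairs):
    """Emit an nginx `map` block keyed on $request_uri."""
    lines = ["map $request_uri $redirect_target {", '    default "";']
    for old, new in sorted(pairs.items()):
        lines.append(f"    {old} {new};")
    lines.append("}")
    return "\n".join(lines)

print(nginx_map(redirects))
```

The server then returns `301 $redirect_target` whenever the map produces a non-empty value, so adding a redirect is a one-line change to the inventory rather than a new rewrite rule.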
How to add "›" category breadcrumbs in Google search
When I look through Google search, I see some websites categorize their site this way. For example, Groupon: www.groupon.com › Coupons › Browse Coupons by Store. How do you do this for a website, for example one on WordPress? Does this help with SEO?
White Hat / Black Hat SEO | | andzon0 -
Dynamic content boxes: how to use them without getting a duplicate content penalty?
Hi everybody, I am starting a project with a travel website which has some standard category pages like Last Minute, Offers, Destinations, Vacations, and Fly + Hotel. Every category contains many destinations with their own landing pages, such as Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc. My question is: to simplify my job, I am thinking about writing dynamic content boxes for Last Minute, Offers, and the other categories, changing only the destination city (Rome, Paris, New York, etc.), repeated X times in X different combinations inside the content box. This would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about getting penalized for duplicate content. Do you think my solution could work? If not, what is your suggestion? Is there a rule for categorizing content as duplicate (for example, the number of identical words in a row)? Thanks in advance for your help! A.
White Hat / Black Hat SEO | | OptimizedGroup0 -
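On the "number of identical words in a row" point in the question above: there is no published Google threshold, but one rough way to gauge how similar two generated boxes end up is word-shingle overlap (Jaccard similarity over n-grams). A minimal sketch, assuming 3-word shingles and illustrative sample sentences:

```python
# Sketch: estimate near-duplicate similarity between two text blocks
# using 3-word shingles and Jaccard overlap. Any threshold you pick is
# illustrative, not a documented Google rule.

def shingles(text, n=3):
    """All n-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    """Shared shingles divided by total distinct shingles."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

paris = "Book last minute deals in Paris with flights and hotels included"
rome = "Book last minute deals in Rome with flights and hotels included"
print(round(jaccard(paris, rome), 2))  # -> 0.5
```

Swapping only the city name leaves half the shingles identical here; varying sentence structure per destination, not just the city token, pushes that score down.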
80% of traffic lost overnight. Google penalty?
Hi all. I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving visitors the ability to compare houses, a ton more data on pricing and history, and extra functionality that the source sites do not offer. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
- How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google.
- If I am penalized: I'm not attempting anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application do you suggest so I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | | Hemjakt0
Is it OK to ask for a non-reciprocal link?
Hey guys, I've got a mini discussion question. With rapgenius.com getting penalized today, it raises some questions about linking. What they did is definitely not OK; a link scheme involving their own affiliate network is against Google guidelines for sure. So is it OK to ask for a non-reciprocal link if there is no incentive involved and no money changes hands? I.e., someone writes an article related to your article's topic, or they reference you without a link. Then you email the webmaster requesting a link, and they add it. Is this against the guidelines?
White Hat / Black Hat SEO | | Anti-Alex0 -
What's up with Google stripping keyword metrics?
I've done a bit of reading on Google now stripping keyword metrics from Analytics ("(not provided)"). I am trying to understand why on earth they would do that. To force people to run multiple AdWords campaigns to set up different keyword scenarios? It just doesn't make sense to me. If I am a blogger or I run an ecommerce site and I get a lot of visits to a particular post through a keyword people clicked on organically, why would Google want to hide this from us? It's great data for carrying on writing relevant content that appeals to people and therefore serves the needs of those same people. There is the idea of doing white hat SEO and focusing on strong links, great content, etc., but how do we know we have great content if we can't see which keywords appeal to people and how they found us organically? Is Google trying to squash SEO as a profession? What do you guys think?
White Hat / Black Hat SEO | | theseolab0 -
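One partial workaround for the "(not provided)" problem above is that Google still exposes organic query data through Search Console's Search Analytics API. A hedged sketch: sending the request requires OAuth credentials and `google-api-python-client` (`service.searchanalytics().query(...)`), so the snippet below only builds the request body; the dates and row limit are illustrative.

```python
# Sketch: organic query data hidden behind "(not provided)" in Analytics
# can still be pulled from Google Search Console's Search Analytics API.
# We only construct the request body here; actually sending it needs an
# authenticated service object from google-api-python-client.

def search_analytics_body(start, end, row_limit=100):
    """Request body asking for clicks broken down by query and page."""
    return {
        "startDate": start,
        "endDate": end,
        "dimensions": ["query", "page"],
        "rowLimit": row_limit,
    }

body = search_analytics_body("2015-01-01", "2015-01-31")
print(body["dimensions"])  # -> ['query', 'page']
```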
Is Google stupid?
Why does buying links still work? I don't mean approaching an individual webmaster and cutting a deal; that seems nearly impossible to detect. But the huge link brokers, like Text Link Ads, Build My Rank, or LinkVine: Google has to be aware of them, right? Can't they just create accounts to see the whole network and ban the sites? Why wouldn't they just do that?
White Hat / Black Hat SEO | | menachemp0 -
Confusing penalties
Dear Mozzers, I've been working on a friend's website that is fighting for pretty competitive keywords (90,000+ global monthly searches) and has been relying almost exclusively on $1,800/mo of comment spam to rank on the first page. Now that I've taken over SEO, my first priorities were to:
- eliminate duplicate content
- improve site structure
- optimize internal links
- build legitimate do-follows
- add some keyword density
- fix titles and H tags
Essentially just the basics, right? But since cancelling the comment spam, rankings for their primary keyword have consistently dropped over the last 3 months. I'm using the same strategies that I've used successfully on at least 6 similar websites. At the moment their homepage is still almost entirely duplicate content, which is obviously a huge problem, but it seems a little odd that they could have been held up exclusively by that comment spam for so long, doesn't it? Even stranger, their authority and trust scores are now higher than any of the competition. Needless to say, my friends are getting pretty antsy and I'm starting to second-guess myself. Do you think I should continue to push them to improve content, eliminate penalties, and build legitimate links, or should I give in and suggest buying links as a short-term solution? Advice is really appreciated!
White Hat / Black Hat SEO | | brevityworks0
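On the "add some keyword density" item in the to-do list above: a quick way to sanity-check a page is occurrences of the keyword divided by total words. A minimal sketch; the sample text is hypothetical, and no particular percentage is a Google guideline.

```python
# Sketch: keyword density = keyword occurrences / total words on the page.
import re

def keyword_density(text, keyword):
    """Fraction of words on the page that are exactly the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

page = "Blue widgets ship fast. Our widgets are durable, and widgets are cheap."
print(f"{keyword_density(page, 'widgets'):.1%}")  # prints 25.0%
```

A number that high usually reads as stuffing; the point of the check is to catch outliers in either direction, not to chase a target.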