Whitehat site suffering from drastic & negative Keyword/Phrase Shifts out of the blue!
-
I am the developer for a fairly active website in the education sector that offers around 30 courses, publishes to its blog a few times a week, and maintains social profiles.
The blog doesn't have comments enabled, and the typical visitor is looking for lessons or a course.
Over the past year we have put active development into keeping the site up to date, fast, and following modern best practices: SSL certificates, quality content, relevant and high-authority backlinks, etc.
Around a month ago we were hit by quite a large drop in our ranked keywords/phrases, which shocked us somewhat. We attributed it to Google's algorithm change muddying the waters, as things did settle down a couple of weeks later.
However, this week we have been hit again by another large change, dropping almost 100 keywords, some by very large positions.
My question is quite simple (I wish)... What gives?
I don't expect to see drops this large when we haven't done anything negative, and I'm unsure it's an algorithm change, as my other clients on Moz don't seem to have suffered. So it's either isolated to this target area, or it's an issue with something occurring to, or on, the site.
-
Snowflake,
When you migrate to HTTPS, I believe you have to add the new protocol to Search Console. Google treats HTTP and HTTPS as two different sites, which is why you might be seeing your index count going down under your HTTP property in Search Console. If you add the HTTPS version of your website to Search Console, you may see that those pages have been indexed under the HTTPS protocol. Check it out, wait a few weeks, and see what happens.
Secure Your Site With HTTPS - Search Console Help
https://support.google.com/webmasters/answer/6073543?hl=en
-
That is a very good shout!
-
Thanks Don,
I had actually read that article initially, which is why I thought a few weeks was enough for it all to have settled back out, but maybe I'm expecting a bit much for a 600-page site.
Many thanks for your help. I'll just be patient if there is nothing glaringly wrong.
-
Also a quick point: if you still have your Google Search Console set up for HTTP, even though you use HTTPS now, I'd suggest looking at what is being reported as indexed in there. That may be the missing link.
Cordialement,
Don
-
So I'm not seeing anything blocking crawling on your site, which is good. But I did notice that you have at some point used both the "http" and "https" URL types, which leads me to believe you may have recently switched to HTTPS. In that case you should know that it may take Google some time to adjust. On a technical level, the HTTPS and HTTP versions are treated as two different sites.
It is highly likely that Google has indexed the HTTP version of some of these pages, which is why your index count may be lower than normal for the HTTPS version.
I do see you properly 301 redirected these pages, and your sitemaps reflect the HTTPS URLs as well. If this was a recent change, it just looks like it's going to take a bit of time for Google to catch up.
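One thing worth double-checking on your end is that every old HTTP URL answers with a single 301 straight to its HTTPS counterpart — no 302s and no redirect chains, which slow Google's consolidation down. A minimal sketch of that comparison logic (the example.com URLs are hypothetical; in practice you'd feed it the status code and Location header from a crawl of your old URLs):

```python
from urllib.parse import urlsplit, urlunsplit

def https_twin(url):
    """Return the HTTPS equivalent of an http:// URL, keeping host, path and query."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    return urlunsplit(("https", netloc, path, query, fragment))

def is_clean_301(status, location, original_url):
    """A migration-friendly hop: one 301 straight to the HTTPS twin, no chains."""
    return status == 301 and location == https_twin(original_url)

print(https_twin("http://example.com/courses?page=2"))
print(is_clean_301(301, "https://example.com/courses?page=2",
                   "http://example.com/courses?page=2"))
```

Anything that comes back as a 302, or redirects somewhere other than the exact HTTPS twin, is worth fixing before waiting on Google.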
This is worth a quick look: https://support.google.com/webmasters/answer/6073543?hl=en (scroll to the bottom and see the section "Migrating from HTTP to HTTPS").
I sent you additional info in PM.
Hope this helps,
Don
-
We did go from HTTP to HTTPS about a month ago, but we were careful that all the redirects and sitemaps were updated correctly. I don't think there is an issue with the robots.txt (it is present and nothing weird is blocked).
I'll take a look at those links and send you a PM. Many thanks, Don.
-
Hi,
There are several possible reasons.
You may have recently changed your URL structure, e.g. gone from www to non-www, from HTTP to HTTPS, or added/removed a trailing slash. In these cases Google could have already indexed the pages under the "other" version.
Google could also be hitting a crawl error, such as a problem with your robots.txt (or the lack thereof), improper canonical tags, blocked access, or improper redirects; or there could be a manual penalty.
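Of the causes above, an accidental `Disallow: /` in robots.txt is the quickest to rule out, and you can test it without any third-party tools — Python's standard library can evaluate a robots.txt against any URL. A small sketch (the rules and example.com URL are hypothetical; paste in your own file's lines):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that looks short and harmless but blocks the whole site
blocking = ["User-agent: *", "Disallow: /"]
# And one that only blocks an admin area
safe = ["User-agent: *", "Disallow: /admin/"]

def googlebot_can_fetch(lines, url):
    """Parse robots.txt lines and ask whether Googlebot may crawl the URL."""
    rp = RobotFileParser()
    rp.parse(lines)
    return rp.can_fetch("Googlebot", url)

print(googlebot_can_fetch(blocking, "https://example.com/courses/"))  # False
print(googlebot_can_fetch(safe, "https://example.com/courses/"))      # True
```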
If you would like to post a link (or pm me) I will take a look and see if I can spot a potential problem for you.
Here are a couple links on Google that should help:
Why Pages Drop From Index
Overview: Pages Not Being Crawled
Hope this helps,
Don
-
Checking Webmaster Tools, it looks like Google has de-indexed 500 out of our 630 pages in the last two weeks.
Is there any reason why this may be?
-
Thanks for your input Donford,
I've had a look in OSE again and I can't see any spam links (all the links are genuine, rated 0 through 3), which looks very good. So it doesn't appear to be a negative campaign against us.
I may try Majestic for peace of mind... it makes it all the stranger that we are being penalised so much.
-
Hi Snowflake,
You can use OSE (Open Site Explorer) here on Moz to check the links it has found. You can download that report to CSV to easily sort it and see if there's a possible negative campaign running against you.
You could also use Majestic or SEMrush to find more links. Just note there is no tool, free or paid, that is going to find all the links pointing to your site.
If you don't find a lot of spam links to your site, chances are nobody is running a negative campaign against you.
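Once you have the CSV export, sorting it by spam score takes a few lines of Python. A sketch with a made-up export — the column names here (`source_url`, `spam_score`) are hypothetical, so match them to whatever headers your tool's CSV actually uses:

```python
import csv
import io

# Hypothetical export; real OSE/Majestic/SEMrush CSVs use their own column names
export = """source_url,spam_score
http://good-blog.example/review,2
http://casino-links.example/footer,14
http://directory.example/listing,9
"""

rows = csv.DictReader(io.StringIO(export))
# Flag anything with a high spam score, worst offenders first
flagged = sorted((r for r in rows if int(r["spam_score"]) >= 8),
                 key=lambda r: int(r["spam_score"]), reverse=True)
for r in flagged:
    print(r["spam_score"], r["source_url"])
```

For a real file, swap `io.StringIO(export)` for `open("links_export.csv", newline="")`.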
Hope it helps,
-
Thanks Eric,
The site is available in a few languages, but as far as I'm aware there is no duplicate content within the same language. I will check with Siteliner just to be sure.
For disavowing backlinks: is it via Webmaster Tools that you're recommending we do that? If so, we haven't done it yet, but it seems sensible to try. When I last checked backlinks, there were a few random sites we certainly hadn't submitted to that looked spammy, but when I visited them I couldn't see our links.
Do you have a recommendation for a better backlink checking tool?
-
I know this is frustrating. There are two areas I would look into that could be causing this: duplicate content and links. First, look to see if you have any duplicate content issues on the site. There could be a duplicate copy of the site (perhaps a dev version that should not be indexed), or even certain content on your site that's causing issues. You might try Siteliner's crawler to identify any issues you can fix.
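If you'd rather spot-check for duplicates yourself, a crude but effective trick is to fingerprint each page's text and look for collisions. A hypothetical sketch (the URLs and page bodies are made up; you'd feed in your own crawled pages):

```python
import hashlib

def content_fingerprint(html_body):
    # Normalize whitespace and case so trivially reformatted copies still collide
    text = " ".join(html_body.split()).lower()
    return hashlib.sha256(text.encode()).hexdigest()

pages = {
    "/course/french": "<p>Learn French in 30 days</p>",
    "/course/french-copy": "<p>Learn  French in 30 days</p>",  # same content, extra space
    "/course/spanish": "<p>Learn Spanish in 30 days</p>",
}

seen = {}
for url, body in pages.items():
    seen.setdefault(content_fingerprint(body), []).append(url)

duplicates = [urls for urls in seen.values() if len(urls) > 1]
print(duplicates)  # → [['/course/french', '/course/french-copy']]
```

This only catches exact (whitespace-insensitive) copies; a crawler like Siteliner will also find near-duplicates.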
Another possible reason is the links pointing to the site. The site could have been hit by negative SEO, with a lot of low-quality or off-topic links pointing at it. I've seen this in the past, and the main thing you can do is identify the links and disavow them. Sometimes you can get them removed at the source, but disavowing them should work.
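For reference, the disavow file Google accepts is just plain text: one `domain:` rule or bare URL per line, with `#` comments. A small sketch that builds one from a list of flagged sources (all the domains and URLs below are hypothetical examples, not real spam sources):

```python
def disavow_lines(domains, urls=()):
    """Build the body of a Google disavow file: one `domain:` rule or
    bare URL per line; lines starting with '#' are comments."""
    lines = ["# Spam sources flagged during the link review (hypothetical)"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines)

disavow_file = disavow_lines(
    {"casino-links.example", "spammy-directory.example"},
    urls=["http://random-blog.example/spun-article.html"],
)
print(disavow_file)
```

You'd save the output as a .txt file and upload it through the Disavow Links tool in Search Console.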