50% of my keywords just had a significant drop for no reason - no changes?
-
I just looked at a report for one of my sites in Moz Pro and 35/75 keywords just dropped on Google. I didn't make any changes recently.
Ranking Changes: 12 improved, 35 declined. Many dropped 5 to 22 places!
Any ideas on what's up? Thankfully it didn't affect traffic - it's pretty low anyway, but it has stayed the same... Still, the site dropped to page 5 for lots of terms. Strange. Did Google just update or something?
Thanks for any ideas.
-
Hello Willem,
Yes, in my opinion you should submit each language. I'm not sure, but I think the main site on .com should be set as the base NL site.
For each sub-site (language version) a sitemap in that particular language is needed. I have no idea how your sitemaps are created; if they are generated automatically (by a plugin or otherwise) it may be a bit complicated, but it's better to have a sitemap than not to have one, so it is worth the effort.
Use rel="alternate" hreflang link tags in the <head> section of each site.
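For example, a rough sketch (the subfolder URLs are only placeholders based on the /NL, /DE and /EN directories discussed below) - each page's <head> would list every language version of that page, including itself:

<!-- placeholder URLs; repeat the same three tags, with the page-level URLs, on every page -->
<link rel="alternate" hreflang="nl" href="http://www.hometextileshop.com/NL/" />
<link rel="alternate" hreflang="de" href="http://www.hometextileshop.com/DE/" />
<link rel="alternate" hreflang="en" href="http://www.hometextileshop.com/EN/" />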
And ... ask other people, maybe there is something new. There's a lot of stuff in the GWT blog and forum...
Marek
-
I do have 3 different subdirectories on the main domain - /NL, /DE and /EN, one for each language.
So should I add a GWT 'site' for each language, set the geographic target to the right region, and delete the hometextileshop.com GWT 'site' (the main domain covering all 3 languages)?
That would mean removing the sitemap too, since it covers all 3 languages as well.
As for the sitemap: would it be better in my case to remove the sitemap altogether and just let the bots crawl and index the site?
Or should I have the sitemap module reprogrammed to use the rel="alternate" tag?
And how would the sitemap be applied if I had 3 GWT sites (one for each language)?
A sitemap per language, with the target language plus the other 2 as alternates, or just one sitemap for all 3 GWT sites with 1 main language and the other languages as alternates?
-
Hi Willem,
I'm not a PrestaShop specialist, but I know that on one licence you can do something like shop-en / shop-cs - that comes from the Polish PrestaShop forum ... forum.prestashop.pl
My "use rel=alternate for language versions" tip refers to the situation where you have language versions and want to prevent duplicate content.
When it comes to your situation and rankings, in my opinion it is better to have 3 different subdomains or subfolders for the different languages.
As I understand it, your site uses a language switcher... and the versions live in subfolders of the main domain. For GWT it doesn't matter which method you use - subdomain or folder.
The key thing is that you should submit the 3 different language versions (subfolders) to GWT and verify each one with a different GWT meta tag (if you use that verification method), or with code, or with three different GA accounts.
So 3 language versions -> 3 GWT "sites" -> 3 GWT verifications.
And last but not least ... you should still use the rel="alternate" tag to tell the crawl bots that it is a different language version, not duplicate content. Remember that not only text can be duplicate content... images and other assets can be too.
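If you do end up keeping one combined sitemap instead of three, the alternates can also be declared right in the sitemap with the xhtml:link annotation - a rough sketch, again with the subfolder URLs only as placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.hometextileshop.com/NL/</loc>
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.hometextileshop.com/NL/" />
    <xhtml:link rel="alternate" hreflang="de" href="http://www.hometextileshop.com/DE/" />
    <xhtml:link rel="alternate" hreflang="en" href="http://www.hometextileshop.com/EN/" />
  </url>
  <!-- repeat a <url> block like this for each language version of each page -->
</urlset>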
It should work...
Cheers,
Marek
-
Hi Marek,
Your comment on rel="alternate" got me thinking.
I run a PrestaShop in 3 languages and use a sitemap module, and it doesn't use the alternate tag
(www.hometextileshop.com/sitemap.xml).
I have always had trouble ranking in more than one language. The default language in the shop seems to affect the language priority, and since I use a multilingual sitemap I see English pages ranking on Dutch keywords (even though the default language is Dutch) - the Dutch keyword is not even on those pages.
Somehow the multi-language part of the site causes trouble and search engines seem to struggle with it, although each page is different in each language (one URL per language, its own title and description, and translated content).
Any suggestions?
-
Hi Syndicate,
As you wrote, you changed nothing. It is possible that one keyword still has good support from content, links etc... but the others do not.
Your competition probably made changes. A few tips:
- analyze bounce rate in correlation with time spent on page
- pages with a high bounce rate, a short time on page and low traffic are better set to noindex and nofollow - they decrease site quality (see the example tags after this list)
- implement microformats; rel="author" has a big impact, and "author" links should point to the author's site, not to the publisher's site
- social media are key to modern SEO positioning
- G+1 is not a social network in itself, but it helps a lot
- employ responsive site content (mobile): more than 20% of internet traffic is on mobile devices, so CSS3 and HTML5 are a must
- use rel="alternate" for language versions
- use Google Map Maker
- and many, many more; I advise you to read the conference materials I mentioned
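For the noindex/nofollow and rel="author" points, the tags themselves are simple - a sketch (the profile URL is only a placeholder):

<!-- on the weak pages you want kept out of the index -->
<meta name="robots" content="noindex, nofollow" />
<!-- in the <head> of an article, pointing to the author's own profile, not the publisher's -->
<link rel="author" href="https://plus.google.com/your-profile-id/" />

The noindex/nofollow pair keeps thin pages out of the index while still letting visitors reach them.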
You have to change your site - it is a must.
Just imagine starting a Formula 1 race in a Ford Model T ...
The distance between the Model T and the McLaren MP4-27 (2012) is the same as the distance between the internet of five years ago and the internet of today.
Marek
-
Anything is possible, but I have a site where I haven't touched ANYTHING for about 5 years now and it still holds #1 for its main keywords. So I'm not sure, but it seems not all sites are punished for old content; that content is nearly 10 years old but it still ranks #1, and top 5 for almost all of its original keywords. Makes it really hard to figure things out!
-
Hi Bron,
Maybe it was because you did nothing. Remember the words "Change or die", said in the 4th century by Claudius Claudianus and, in recent times, by the Harvard marketing guru John Porter.
In 2011 Google made 525 changes to its algorithm, and in 2012 they are still making changes. Google's new trend is to promote sites that are friendly for users, not for robots. Backlinks still have an impact, but not as big as in the past. Now we're stepping into a new world ... semantic search. For Google, a good article is now between 1,000 and 2,000 words, of good quality, published at least once a week on your site/blog. There are also lots of other changes and news ... you should take a look at the SMX conference materials.
...so maybe you have to change something: do a site review and then make changes...
Take care,
Marek
-
Yes, something happened, not sure what. I have 3 competitors in my report.
Most of them dropped only by a normal amount of 1 position, not 25. None had more than 1 or 2 positions of change, while the vast majority of mine went off the page.
I didn't do any link building; all I have done in the past month or so is make some minor content changes to the homepage only. Really strange - maybe some of the off-site links changed or something. I cannot figure out what happened.
-
I noticed the same thing on my websites (all of them).
What stood out was that all the keywords with more competition, and with more link building over the past year, dropped more than keywords with no link building on them.
Around me I have heard more people say that their shop sales have dropped since the middle of this week.
The key question is what to do about this - how do we get rankings back up again?
If there was a Google update it did not seem to affect my competition. The people mentioned above all have websites around the same size as mine; maybe that's got something to do with it (a guess).
Please ask any questions that arise; I'll be glad to answer if that leads to a solution.
Related Questions
-
One of our top visited pages (the login page) is missing the primary keyword - does this make the ranking of our homepage drop for the same keyword?
Hi all, So, I have removed the "primary keyword" from the login page, which is the most visited page on our website, to avoid keywords on unrelated pages. I noticed our homepage ranking dropped for the same "primary keyword". Visitors of this login page land on it directly, without searching for the "primary keyword". So how does removing it from such a page drop our ranking? Thanks
Algorithm Updates | vtmoz
-
Why did Google suddenly change our page title, which has been the same for years?
Hi all, I know Google sometimes shows a different page title. That happens when a title is over-optimised or copies a competitor's page title, but we did neither. Suddenly Google changed our homepage page title in search results: our page title suffix, the "brand name", has been moved to the beginning. Our actual page title has been the same for years.
Algorithm Updates | vtmoz
-
Keyword Stuffing - Where Do You Draw the Line?
I have a tax software website for which there are multiple pages that compete using different keywords. However, all pages but my home page have recently fallen out of the rankings completely and I really just don't know why. For instance, my page - http://www.1099pro.com/prod1099proEnt.asp - has the title keywords "1099 Efile Software | 1099 Software | 1099 Electronic Filing". When I run a Moz report on the keyword "1099 E-File Software" I get an "A" rating and it finds a total of 8 instances of the keyword. However, when I run a Moz report on the keyword "1099 Software" it finds a total of 26 instances of the keyword - still with an "A" rating. When I search the actual text/HTML there are only 6 instances of the keyword "1099 software", which leads me to believe that Moz/Google/search engines are ignoring the middle term in phrases like "1099 printing software" or "1099 e-filing software" and only picking up "1099 software". Is this supposed to happen? Does anyone know why, or how many terms can be ignored in that fashion? I used to have multiple landing pages in the top 3 results and now all of my other landing pages have completely fallen from the rankings, even though I am not keyword stuffing and am providing unique and relevant content. If anyone has an idea as to why my rankings have dropped so drastically I would really appreciate it (I take no part in black-hat link building so that isn't the reason).
Algorithm Updates | Stew222
-
Ideas on why Pages Per Visit Dropped?
Week over week our pages per visit continue to drop. Any ideas on where to look to diagnose?
Algorithm Updates | Aggie
-
Dropped from Universal Result: Local
For quite some time our Google Places listing has been in the Universal Results (for this keyword there is a 7-pack result). Which was great: we had a PPC ad at the top of the page, we were 3rd in the Universal Results (there were 3 Places listings before the natural results)... and we were 6th in the natural results - meaning we were on the first page 3 times... which means a happy boss... and lots of traffic. The old Places listing was linked to our new Google+ page pending the eventual demise of Places and the merge. The merge has happened, all information from the Places listing has migrated (apart from reviews and photos??) and the Places listing has been deleted (the URL returns a 404 error). The problem is that now my Google+ page is not even within the first 2 or 3 pages of Places results, never mind in the Universal Results. So it would appear the rank/authority that the Places listing had... hasn't been transferred to the Google+ page. My competitors... who were 1st and 2nd in the Universal Results above the natural results, and who have Google+ pages with NOTHING on them bar their name, are still there! Why would I be dropped when my Google+ page has more info, more followers, more photos and more relevant content than my 2 competitors (they don't have any content)? It seems I've been penalised... somebody suggested that I had the keyword twice in my "About" and twice in my "Introduction" info and that could be it. I thought the loss of the reviews might be it too... but none of the businesses now occupying the first 3 spots have any reviews at all. Has anybody else suffered from this? Does anybody have any other suggestions as to why I might have been dropped so dramatically in the Places listings? (My SERP listing is unaffected for this keyword.) A keyword being mentioned twice hardly seems like "stuffing"! I'm actually not too concerned about the Places ranking... it's not a great driver of traffic... but appearing in the Universal Results did obviously drive traffic... and to appear in the Universal Results I've now got about 30 positions to climb... The whole Google+ Local / Google Places thing has been a nightmare from start to finish... Thanks in advance for any help or advice!
Algorithm Updates | MarbellaSurferDude
-
Someone just told me that Google doesn't read past the pipe symbol. I find that hard to believe. Is this true?
Someone just told me that Google doesn't read past the pipe symbol.
Algorithm Updates | MarketingAgencyFlorida
-
Changing Google's Sitelinks
Hi all, I know Google will only show sitelinks if the site is deemed authoritative and if they will help the user searching a keyword, but is there any way to order or control which links appear in the sitelinks? I know you can demote a sitelink in Webmaster Tools, but is this not shooting yourself in the foot? If I demote a link, will Google replace it with the next link it thinks is worthwhile, and by doing this eventually show the links you want to appear in your sitelinks? Thanks, Gary
Algorithm Updates | gazza777
-
Don't use an h1 and just use h2's?
We just overhauled our site and as I was auditing the overhaul I noticed that there were no h1s on any of the pages. I asked the company that does our programming why, and he responded that h1s are spammed so much that he doesn't want to put them in; instead he put in h2s. I can't find anything to back this up. I can find that h1s are often over-optimized, but nothing that says to skip them altogether. I think he's crazy. Anyone have anything to back him up?
Algorithm Updates | Dave_Whitty