Impressions in Google SERPs have declined from 3,500 to 1,600 since 5-25-2012. Is it Penguin?
-
It's about the website
http://www.apartments-houseboats-amsterdam.com/
Visitors have declined from 270 to 150 per day. Was this caused by the Google Penguin update? If so, what can I do to solve the problem?
Thank you for your time and effort,
-
You need to look at your rankings and find out which keywords were driving that traffic. Once you identify the keyword(s) that have fallen, you can start to assess whether it could be Penguin.
That said, looking at your domain, it is an obvious keyword-based (exact-match) domain, which the update also cracked down on. Check Analytics before assuming Penguin, though.
-
Hi Sebastiaan, The Google Penguin update targeted 'black-hat webspam', which affected the ranking order on some SERPs.
If your site's rankings did not drop, the change in traffic is unlikely to be connected to the Penguin update. Did you experience a drop in your rankings?
Related Questions
-
Massive Google Search Spam
We have come to know that a competitor of our client is spamming Google search results on a massive scale. If we search for keywords like "iphone spy apps", "text messages spy", etc., most of the results from the 3rd or 4th page onwards are totally irrelevant sites, but when we click on those results/pages they all redirect to either http://topspysoft.com/ or http://www.mspy.com/. They have been doing this on a massive scale for the last few months against hundreds of queries, populating hundreds of search results. If we use a country-specific Google site, hundreds of results again come from totally irrelevant country-specific domains (.au, .nz, .uk, etc.), and they all redirect to topspysoft.com or mspy.com.

Can you please tell us how they are doing it, and how they are able to do it on such a massive scale without getting noticed by Google? Is there any way to report this issue to Google, as the current form only allows one link? The following are some of the spam URLs to give you an idea:

www.crcincva.com/doc/20-best-iphone-spy-apps/
chefitupkids.com/top-10-spy-apps-for-iphone/
jarestaurant.com/text-spying-apps-iphone/
www.lisamishler.com/qn/phone-spy-apps-uk
tigerdenus.com/spy-apps-for-iphone-no-jailbreak
palmhousestl.org/templates/phone-location/iphone-spy-apps-uk.html

I'm also attaching a couple of images which show that almost 80% of the results on those pages are actually spam pages.
White Hat / Black Hat SEO | | shaz_lhr -
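For anyone trying to document this kind of redirect spam at scale, one approach is to fetch each suspicious result, follow its redirect chain, and record where it finally lands. Below is a minimal sketch of the classification step only; the helper name is hypothetical, the two spam domains come from the question, and actually resolving the redirect chain would require a real HTTP fetch (e.g. with `urllib.request`), which is omitted here:

```python
from urllib.parse import urlparse

# The two known spam destinations named in the question.
SPAM_DOMAINS = {"topspysoft.com", "mspy.com"}

def lands_on_spam(final_url, spam_domains=SPAM_DOMAINS):
    """Check whether the URL a result finally redirects to is a known
    spam destination. `final_url` is assumed to be the end of the
    redirect chain, obtained from a prior HTTP fetch."""
    host = urlparse(final_url).netloc.lower()
    host = host.split(":")[0]       # drop any port
    if host.startswith("www."):     # normalise the www. prefix
        host = host[4:]
    return host in spam_domains
```

Running this over the final landing URLs of a few hundred results would give a defensible count of how many positions the scheme occupies, which is more persuasive in a spam report than a single link.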
Does Google crawl and index dynamic pages?
I've linked a (static) category page to my homepage, and a (dynamic) product page to the category page. I crawled my website from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567. Here's a sample category page: http://domain.com/city/area. My full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the property pages even though they are dynamic?
White Hat / Black Hat SEO | | esiow2013 -
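A Screaming Frog crawl with a Googlebot user agent shows the links are discoverable, but it is worth separately confirming that robots.txt does not block the dynamic URLs, since crawlability under robots.txt is a precondition for indexing (though it never guarantees indexing). A sketch using Python's standard-library parser; the robots.txt content here is hypothetical, and the sample paths mirror the ones in the question:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the site in the question.
robots_txt = """User-agent: Googlebot
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Dynamic product page and static category page from the question.
print(rp.can_fetch("Googlebot", "http://domain.com/AB1234567"))
print(rp.can_fetch("Googlebot", "http://domain.com/city/area"))
```

In production you would point `RobotFileParser.set_url()` at the live robots.txt and call `read()` instead of parsing a local string.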
Google admits it can take up to a year to refresh/recover your site after a Penguin penalty is revoked!
I found myself in an impossible situation: I was getting information from various people who seemed to be know-it-alls, but everything in my heart told me they were wrong about the issues my site was having. I have been on a few Google Webmaster Hangouts and found answers to many questions I thought had caused my Penguin penalty. After taking much of the advice, I submitted my reconsideration request for the 9th time (might have been more) and finally got the "revoke" I was waiting for on the 28th of May.

What was frustrating was that on May 22nd there was a Penguin refresh. As far as I knew, that was what was needed to get your site back up in the organic SERPs. My disavow had been submitted in February, and only a handful of links were missing between then and the time we received the revoke. We patiently waited for the next Penguin refresh, assured by John Mueller from Google that we were heading in the right direction (btw, John is a great guy and really tries to help where he can). The next update came on October 4th, and our rankings actually got worse! I spoke with John and he was a little surprised, but did not go into any detail.

At this point you have to start to wonder WHAT exactly is wrong with the website. Is this where I should rank? Is there a much deeper Panda issue? We were on the verge of removing almost all content from the site, or even changing domains, despite the fact that it was our brand name.

I then created a tool that checked the last cached date of every link in our disavow file. The thought process was that Google had not re-crawled all the links, so they were not factored into the last refresh. This proved to be incorrect: all the links had been re-cached in August and September, and nothing later than that, which would have indicated they had not been cached in time. I spoke to many so-called experts who all said the issue was that we had very few good links left, content issues, etc.

Blah blah blah, heard it all before; I've been in this game since the late 90s. The site could not rank this badly unless there was an actual penalty, as spam sites ranked above us for most of our keywords. So just as we were about to demolish the site, I asked John Mueller one more time if he could take a look at it. This time he actually took the time to investigate, which was very kind of him. He came back to me in a Google Hangout in late December, and what he said was both disturbing and a relief at the same time: the site STILL had a Penguin penalty, despite the disavow file being submitted in February, over 10 months earlier, and the revoke in May.

I wrote this to give everyone here who has an authoritative site, or just an old one, hope that not all is lost if you are still waiting to recover in Google. My site is 10 years old and is one of the leaders in its industry. Sites that are only a few years old and have had unnatural link-building penalties have recovered much faster in this industry, which I find ridiculous, as most of the time the older authoritative sites are the big, trustworthy brands. This explains why Google SERPs have been so poor for the last year: the big sites take much longer to recover from penalties, letting the smaller, less trustworthy sites prevail. I hope to see my site recover in the next Penguin refresh, with the comfort of knowing that it is currently still being held back by the Penguin penalty/refresh situation. Please feel free to comment below on anything you think is relevant.
White Hat / Black Hat SEO | | gazzerman1 -
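The cache-date tool described above starts from the disavow file itself. As a hedged sketch of that first step, here is a parser for Google's disavow format (comment lines, `domain:` entries, and bare URLs); the function name and sample entries are illustrative, and the per-link cache-date lookup is omitted since it required querying Google's cache over the network:

```python
def parse_disavow(text):
    """Split a Google disavow file into domain entries and single URLs,
    skipping blank lines and '#' comments."""
    domains, urls = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.lower().startswith("domain:"):
            domains.append(line[len("domain:"):].strip())
        else:
            urls.append(line)
    return domains, urls

# Illustrative file content, not from the author's actual disavow.
sample = """# links disavowed Feb 2013
domain:spammydirectory.example
http://badlinks.example/page1
"""
domains, urls = parse_disavow(sample)
```

From there, the tool would iterate over `domains` and `urls` and record when each was last re-cached, which is how the author ruled out "Google hasn't re-crawled the links yet" as the explanation.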
How will Google deal with the crosslinks for my multiple-domain site?
Hi, I can't find a good answer to this question, so I thought, why not ask Moz.com ;-)! I have a site, let's call it webshop.xx, for a few languages/markets: German, Dutch & Belgian, English, French. I use a different TLD with a different IP for each of these languages, so I end up with: webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr. They all link to each other, and every subpage that is translated from another site gets a link as well from the other languages, so webshop.com/stuff links to webshop.de/stuff. My main website, webshop.com, gets links from every other one of these domains, which Open Site Explorer as well as Majestic SEO see as external links (this is happening). My question: how will Google deal in the long run with the crosslinks coming from these domains? Some guesses I made: I get full external-link juice (the content is translated, so unique?); I get a bit of the juice of an external link; they are actually seen as internal links; I'll get a penalty. Thanks in advance, guys!
White Hat / Black Hat SEO | | pimarketing -
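One widely documented way to tell Google that these TLDs are translated equivalents of each other, rather than independent cross-linking sites, is hreflang annotation: each page emits a `<link rel="alternate" hreflang="...">` tag for every language variant, including itself. A sketch generating those tags for the domains in the question; the language-to-TLD mapping below is an assumption inferred from the question, not something the asker specified:

```python
# Assumed mapping of language/region codes to the question's TLDs.
SITES = {
    "de": "https://webshop.de",
    "nl": "https://webshop.nl",
    "nl-BE": "https://webshop.be",
    "en-GB": "https://webshop.co.uk",
    "en": "https://webshop.com",
    "fr": "https://webshop.fr",
}

def hreflang_tags(path):
    """Build the full hreflang tag set for one translated path,
    e.g. '/stuff', to be emitted in each variant's <head>."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in SITES.items()
    ]

tags = hreflang_tags("/stuff")
```

Every variant must emit the identical set (reciprocal annotation), otherwise Google may ignore the tags; this signals "same content, different locale" far more clearly than the bare crosslinks alone.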
Google penalty from having bad sites, maybe, while working on 1 good site?!
I have a list of websites that are not spam; they are OK sites, it's just that I need to work on the content again, as it might not be 100% useful for users. They are not bad sites with spammy content; I just want to rewrite some of the content to make really great websites. The goal would be to have great content that earns natural links and a great user experience. I have 40 sites, all travel sites related to different destinations around the world. I also have other sites that I haven't worked on for some time. Here are some of them:

www.simplyparis.org
www.simplymadrid.org
www.simplyrome.org

etc. Again, these are not spam sites, but they are not as useful as they could become. I want to work on a few sites only, to see how it goes. Will this penalise the sites I am working on if I have other sites with average or not-as-good content? I want to make great content, good for link bait 🙂
White Hat / Black Hat SEO | | sandyallain -
Are directory listings still appropriate in 2013? Aren't they old-style SEO and Penguin-worthy?
We have been reviewing our off-page SEO strategy for clients, and as part of that process we are looking at a number of superb infographics on the subject. I see that some current ones still list "directories" as part of their off-page strategy. Aren't these directories mainly there for link-building purposes, providing users no real benefit? I don't think I've ever seen a directory that I would use, apart from for SEO research. Surely Google's Penguin algorithm would see directories the same way and give them less value, or even penalise websites that use them to try to boost PageRank? If I were to list my websites in directories, it wouldn't be to share my lovely content with people who use directories to find great sites; it would be to sneakily build PageRank. Am I missing the point? Thanks
White Hat / Black Hat SEO | | Crumpled_Dog
Scott -
How The HELL Is This Site Ranking So Well In Google Places?
When I do a search, this site ranks number 2 on Google, just below the official Federation of Master Builders website, for the keyword phrase "builders in london". This is the site: http://bit.ly/Lypo8E, a nasty-looking blog which has nothing to do with builders, and they don't even have an address anywhere on the site. The only thing I can see is that they are sharing their address with a lot of other businesses, and all of the citations from those other businesses are causing them to rank higher in Google Places. But surely Google can't be that stupid, right?
White Hat / Black Hat SEO | | penn73 -
Why is this ranking first in Google Places for this term....?
"Best Bar In Chicago" - http://www.google.com/search?gcx=w&sourceid=chrome&ie=UTF-8&q=best+bar+in+chicago They have only 5 Google reviews, and their local directory reviews are suspect. One of them goes to rateclubs.com and it's not even a page for their business, while one of them doesn't have user reviews, it's just an editorial review. The other one at superpages.com doesn't even link back to their site, it links to their restaurants.com profile. What is going on here? I've been trying to figure this out for a while as their first place ranking has been solidified for quite some time now. I can also tell you that a few of the bars listed below them have a MUCH higher profile and are better known. You can see that just by the reviews.
White Hat / Black Hat SEO | | MichaelWeisbaum