Google Penguin 2.0 - Coming soon
-
There is an interesting article on SEW reporting that Google is going to update Penguin to the next major version: http://bit.ly/15Vkr6O
So what do you think, what should we expect? And are updated webmaster guidelines available?
-
This kind of link building is a real trap. You must pay every month for these links, and we all know that sooner or later Google finds out how to take down this network. Besides losing your money, you risk being penalized.
-
Actually... looking in more detail, it seems the 30% dip in traffic started on May 7, across all my sites. Not May 2, but May 7. Still, more reason to believe it was an algo change and not my SEO software change that caused the dip.
-
I've been scratching my head because on May 1st I changed my most important site from Genesis SEO to Yoast WordPress SEO, in order to have more control. I noticed my rankings fell about 30% from May 2 onwards.
This whole time I was thinking it was the change in SEO plugin. But... Looking at 2 of my other sites that had no plugin change, they have fallen about 30% as well. Looks like there may indeed have been an algo update and this whole time I thought it was my plugin changes that caused the problem.
-
In some "basic" research I did a month ago, I found over 5,000 websites buying Sape links, and when checking rankings for, say, 100 of those, almost all held the top 1-3 spots for the keywords they were targeting, with a PageRank of 4+. It seems that Google can't take them down. Matt Cutts tweeted that they were working on taking down a pretty huge Russian network (I guess it was Sape), but some of the sites I researched are still ranking in the first spots with PR 4+ while using only those damn links.
-
There are still plenty of people selling SAPE links and they still work. But as a large part of the network is made up of hacked websites, I think Google will target the people at the end of the link rather than the victim website itself, which is probably why it's taking time to crack down on it. It's a targeted penalty rather than an algo update to find & destroy it.
-
It seems they already did: I've searched, and almost every page ranking for the keywords "SAPE link network" has a PR of 0.
-
I really hope they were able to find a way to take down that SAPE link network, which they were apparently working on.
I see hundreds of quality sites outranked by those link buyers, and all of them trace back to Sape. HATE THAT!
Related Questions
-
How to remove the inbound links of a website from Google Webmaster Tools?
Hello viewers, one of my projects (BannerBuzz.com) is linked from this site: http://www.article-niche.com/, and we can see many inbound links in our Webmaster Tools account from it. We have already disavowed the site, but the links still appear in Webmaster Tools, and we have no way to contact the owners because the site is down. Could anyone help us remove these backlinks from our Webmaster Tools account as well as from the search results? The site has been down for a long time, and because of its backlinks our website (BannerBuzz.com) has been penalized by Google.
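For reference, the disavow file uploaded via Google's Disavow Links tool is a plain-text list with one rule per line and `#` comments; a minimal sketch covering the domain named above:

```text
# article-niche.com is down; owners cannot be contacted to request removal
domain:article-niche.com
```

Note that disavowed links typically continue to appear in the Webmaster Tools links report; the tool changes how Google counts the links, not the report itself.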
Industry News | | CommercePundit
-
XML Sitemap Leads to loss in ranking and traffic in Google.
I have submitted an XML sitemap in Google Webmaster Tools for some of the categories of my website. After submitting the sitemap, I have seen a drop in traffic and keyword rankings for the categories for which the sitemap was submitted. The rankings of the other categories, for which I did not submit a sitemap, have not changed. My site's Alexa rank is approx. 3,000. Please help me out: should I remove the sitemap, or change the priority I have set on the URLs in it?
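For context, `<priority>` in a sitemap is only a relative hint (0.0-1.0) among your own URLs and, per the sitemaps.org protocol, is not guaranteed to affect crawling or ranking; a minimal sitemap entry (placeholder URL and date) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category/widgets</loc>
    <lastmod>2013-05-22</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```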
Industry News | | seosogo
-
Hit by Penguin 2.0
Hi, the site I'm working on seems to have suffered from the Penguin 2.0 algorithmic update. It used to rank at position 1 for its top traffic term (which is very competitive) but now ranks at position 12. Traffic seems to have dropped since 22nd May 2013, when the update rolled out. There aren't any dodgy backlinks to the site. The only two things that changed, just a month before Penguin 2.0, were:
1. The meta title, description and H1 of the site's homepage changed.
2. The site received about 30,000+ backlinks from its parent site, which sits on a different domain. These links were the result of the site being included in the parent site's main navigation, which appears sitewide.
The parent site is a reputable, authoritative site, so I don't understand why Google would penalise the site for acquiring links from its parent (even though they are sitewide links). Also, the links acquired from the parent site were not just homepage links but also links to some of its deeper pages. Has anybody had a similar experience and suffered Penguin 2.0's wrath? Any suggestions or solutions are most welcome. Should I be thinking of nofollowing those links? Should I submit a reconsideration request to Google? Many thanks, Pri
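If you do experiment with nofollowing the sitewide links, the change is a `rel` attribute on each navigation anchor in the parent site's template; a sketch with a placeholder URL:

```html
<!-- Before: a followed sitewide navigation link -->
<a href="http://child-site.example.com/">Child site</a>

<!-- After: hints to Google not to pass PageRank through this link -->
<a href="http://child-site.example.com/" rel="nofollow">Child site</a>
```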
Industry News | | pri391
-
We noticed that goods offered in our email newsletters disappear from the first Google search results page!?
We noticed that goods offered in our email newsletters disappear from the first Google search results page. The goods were in the top 5 positions or even higher, but after the email newsletters went out we couldn't find them even in the top 100. We suspect our email service provider is on a blacklist? Could that be the reason? If yes, how could we check that?
Industry News | | Patogupirkti
-
Google Penguin 2.0 - How To Recover?
Hi all,
Industry News | | chanel27
Last year, we engaged an SEO company who promised to bring us to the first page on Google. But after 4 months, we found out that they might have been using low-quality mass link-building tactics, and this caused the rankings of all 3 sites we gave them to drop overnight on 22nd May 2013, after Google Penguin 2.0 rolled out. Is there anything we can do to recover?
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1:
If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2:
Industry News | | webbroi
In order to make your AJAX application crawlable, your site needs to abide by a new agreement. This agreement rests on the following:
1. The site adopts the AJAX crawling scheme.
2. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed.
3. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.
In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why in the following sections). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request for an "ugly URL" tells the server not to return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
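The pretty-to-ugly URL rewrite described above can be sketched in Python. Note that in the scheme as Google documented it, crawlable fragments begin with a `#!` hashbang (not a bare `#`), and `_escaped_fragment_` is the literal query-parameter name from the spec:

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Rewrite an AJAX 'pretty URL' (with a #! hash fragment) into the
    'ugly URL' a crawler would request under the AJAX crawling scheme."""
    if "#!" not in url:
        return url  # not a crawlable-AJAX URL; leave unchanged
    base, fragment = url.split("#!", 1)
    # Append to an existing query string with '&', otherwise start one with '?'
    separator = "&" if "?" in base else "?"
    # The fragment is percent-encoded and passed as _escaped_fragment_
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="=")

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# http://www.example.com/index.html?_escaped_fragment_=key=value
```

The server, on seeing `_escaped_fragment_` in the query string, would return the HTML snapshot instead of the regular page.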
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ - step-by-step instructions
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
-
Google Directory no longer available?
Now we will never know what was in the Google Directory. I just clicked on the link... and everything is dead; it points you to DMOZ. What does this mean for us? Is DMOZ going to get more editor juice, so submissions are actually reviewed for once? The Yahoo! Directory has also been glitching: new submissions have been disabled for over a week now. Any comments?
Industry News | | antidanis
-
Google's New Release "What do you love"
Is this gonna be a game changer for SEO? http://www.wdyl.com/ Regards, Shailendra Sial
Industry News | | IM_Learner