Panda 2.5
-
I'm sure we have all read about the latest round of Google's algorithm changes, also known as the "Panda 2.5" update. This latest update seems to have hit some pretty large press release sites, including PR Newswire and Businesswire (both of which have great PageRank and domain authority, making them a great tool for SEOs when it comes to inbound links).
Ultimately this update has directly affected these sites' traffic, keyword rankings, and number of indexed pages in Google. But what will this do to our smaller sites that benefit from these great links? Will the Panda updates continue to target these content farms and lower their domain authority? Will that extrapolate out and affect the domain authority of our sites?
What are your thoughts for those of us who utilize these services? Should we re-evaluate our process?
I look forward to a great discussion.
Regards - Kyle
-
Oh, if only I felt like agreeing. Understand, it was a challenging day. I have one client site surpassed by two sites with PA and DA of 1: no links, on-page worse than my client's, and, arguably, we have better content.
On top of that, I have another client site surpassed by clowns with Chinese and Australian links to nowhere, etc. The others are all junk: literally 40 to 60 linking root domains, with maybe five even close to unpaid, non-reciprocal, and at least within 90 degrees of the site content (the others are somewhere beyond 180 degrees).
Yes, Google occasionally has a moment and a JC Penney feels the sting. On the whole, I see too many who do not, and the rankings in anything competitive are replete with BS content from BS sites stuffed to the gills with keywords and linked to Bangladeshi laundromats. So, why the rant? I was getting ready to suggest PR Newswire to my clients as a counterposition after seeing so many competing sites simply run the same content as on their homepage over and over through our RSS feeds.
Yes, both Ryan and Justin are correct; I just wish Mr. Google would take some uppers and get about the business of cleaning out the junkers.
Don't worry, I is still smilin' cause we are smarter than them! -
What are your thoughts for those of us who utilize these services? Should we re-evaluate our process?
Yes, on an ongoing basis.
After every Panda update it is important to quickly assess what changes were made (i.e. who was hit) and how these changes affect our clients.
In short, Google has made us clearly aware of what they want to see in terms of links: completely independent links to websites. They do not want to see any form of influence.
-
Press releases are content usually under the complete control of the company that issued the release. They are not unbiased links.
-
Articles published on various content farm sites are not (usually) independent links.
-
Links from various forms of link networks, directories, etc. all fall under this category as well.
While the above links do offer some value, it is greatly diminished compared to the value of an authentic link.
The question is, how do we adjust? My suggestion is to focus more than ever on on-page SEO. Work with clients to ensure their websites are more streamlined, more focused, more usable, liked, trusted, and helpful (and 100 other adjectives) than ever before. There is one word which encompasses everything else: compelling.
In the world of sales there are products and services which need to be sold, and there are products and services which sell themselves. Is the content on your site what you want to write about? Or does it cover topics readers want to hear about? Do you have known detractors from site quality (i.e. ads, keyword stuffing, etc)?
TL;DR: Treat your website like it is the only site you own. Make the site the most helpful and compelling resource on the topic you cover.
Once the above is complete, you only need to let the world know about your site and they will want to link to it. Keep pounding away at on-page until you feel you have attained a level of perfection. Then ask for and openly accept feedback. Ask other SEOs and, more importantly, ask your visitors, whether via surveys or A/B testing. "Build it and they will come."
-
-
I feel Google no longer wants people to build links in the manner of the great bringing up the meek, but rather wants the meek to distinguish themselves with their content and uniqueness so that people generate natural traffic. I think Panda is more about making an organic ecosystem than a silicon hive mind that ranks people based on their involvement with already-established sites. This is just my opinion, but I feel it is for the better, because it actually makes it easier to optimize a site if one takes the initiative and has the discipline to do research and hop on social and economic trends with great content, which will gain all the organic users. Then again, I could be entirely wrong, but I like pandas and Google Panda ^.^
Related Questions
-
Impact of May 2015 quality update and July Panda update on specialty brands or niche retail
We are seeing the following trend in our rankings and traffic after the recent Google algorithm updates (May 2015 quality/phantom, and July 2015 Panda), and I am curious if anyone here has encountered similar and/or has any good ideas on how to react.
Background: we operate in a niche segment, but compete for keywords with large home improvement stores and mass retailers. In the past, prior to May 2015, we generally ranked higher than the large home improvement stores and mass retailers for our key specific terms in our niche. We believed that it was because we have a very specialized focus, so our store was highly relevant for someone searching in that niche (for example, for the name of the product category as a keyword). In general, we ranked #1-3, along with a few of our competitors in our niche, and then the big box home improvement stores filled spots 5-10.
The change we saw starting in May is that now all the home improvement stores and also a few large multi-category retailers took over those top 5 spots and bumped all the specialty retailers and the specialty brand manufacturers (like us) down. Our direct competitors in our niche all seem to have been impacted pretty much the same as us.
So, in summary, it seems like these latest updates may have favored the more general retailers with stronger domain authority over the more specific but smaller retailers. Hard to know for sure, but this is the trend we believe we see.
So, that said, what are some good strategies to respond to this situation? We can't really compete on overall domain authority with these huge retailers. And our previously successful strategy of having a very focused niche, with lots of helpful content, videos, instructional guides, etc. no longer seems to be enough. Has anyone else seen similar results since this past May? Where specialty retail or brand sites lost ground to larger general retailers?
And if so, has anyone found any good strategies to gain back their previous rankings, or at least partially?
Intermediate & Advanced SEO | drewk1 -
Only 285 of 2,266 Images Indexed by Google
Only 285 of 2,266 images indexed by Google. Images for our site are hosted on Amazon's cloud-based CDN hosting service. Our WordPress site is on a virtual private server and has its own IP address. The number of indexed images has dropped substantially in the last year. Our site is for a real estate brokerage firm. There are about 250 listing pages set to "noindex"; perhaps these contain 400 photos, so they do not account for why so few photos have been indexed. The concern is that the low number of indexed images could be affecting overall ranking. The site URL is www.nyc-officespace-leader.com. Is this issue something that we should be concerned about? Thanks,
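(One thing worth checking in a case like this is whether the CDN-hosted images are declared in an image sitemap, so Google knows which off-domain images belong to which pages. A minimal sketch using Google's image sitemap namespace; the listing path and CDN URL below are illustrative, not the site's real ones:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The indexable page the photos appear on -->
    <loc>https://www.nyc-officespace-leader.com/example-listing</loc>
    <!-- One image:image entry per photo on that page -->
    <image:image>
      <image:loc>https://d1example.cloudfront.net/photos/example-listing-1.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Submitting this alongside the regular sitemap in Search Console also makes the indexed-image count easier to monitor per sitemap.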
Intermediate & Advanced SEO | Kingalan1 | Alan0 -
With the new Panda update supposedly only weeks away, is it wise to noindex the products I have not had time to rewrite the product descriptions for?
Hi Mozzers, I read on SEJ yesterday that apparently the Panda update is due in the next 2-4 weeks. I still have a large number of products which I have not got around to writing unique product descriptions for. I know these product descriptions are duplicated on other affiliate sites, so do you think, in light of the Panda update coming, it would be wise to put a noindex meta tag on these product pages until I get around to rewriting the descriptions? That way, I may not get hit by Panda, and it will buy me a bit more time. Just an idea, but I thought I'd run it by you. Thanks, Pete
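(For reference, the noindex meta tag described above is a one-liner in the head of each affected product page; "follow" keeps crawlers passing link equity through the page while the page itself stays out of the index:)

```html
<!-- In the <head> of each product page with a duplicated description -->
<meta name="robots" content="noindex, follow">
```

Once a rewritten description is live, remove the tag; the page typically returns to the index after the next recrawl.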
Intermediate & Advanced SEO | PeteC120 -
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on; heck, every other site I’ve ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain.
The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since grown progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. it wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just now under a different link).
With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt site indexation or ranking. And the more I consider it, the really big sites (the Amazons and eBays of the world) have to contend with broken links all the time due to product pages coming and going.
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis (and I believe that is priority one if we want the bot to come back more frequently) we’ll just have to put up with broken links on the site on a more regular basis. So here’s where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well; hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I’m off base, please set me straight. 🙂
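(On the htaccess-bloat concern: if the server is Apache, large redirect sets are usually handled with a RewriteMap in the server or vhost config rather than per-rule entries in .htaccess. A sketch, assuming a plain-text map of old paths to new URLs; the file paths are illustrative:)

```apache
# httpd.conf / vhost config only — RewriteMap is not allowed in .htaccess
RewriteEngine On

# redirect-map.txt: one "old-path new-url" pair per line,
# e.g.  listings/old-slug https://example.com/listings/new-slug
RewriteMap redirects "txt:/etc/apache2/redirect-map.txt"

# If the requested path has an entry in the map, 301 to it
RewriteCond ${redirects:$1|NONE} !=NONE
RewriteRule ^/(.+)$ ${redirects:$1} [R=301,L]
```

For a map in the millions of entries, the txt: map can be converted to an indexed dbm: map with Apache's httxt2dbm tool so each lookup is constant-time instead of a linear file scan.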
Intermediate & Advanced SEO | ufmedia0 -
Can we have 2 websites with same business name and same business address?
I have 2 websites with the same business name and same business address, and obviously 2 different domain names. I am providing the same services from the 2 websites. Is this a problem?
Intermediate & Advanced SEO | AlexanderWhite0 -
We have been hit by Penguin 2.0, what to do?
Hi, Last week we got hit by Penguin 2.0. Our sites dropped an average of 10 places on most keywords. We had held steady positions for 2 to 3 years. We have sitewide links in the top of our websites to the other websites (about 9 e-commerce sites). Today I have put rel="nofollow" tags on all these links (except on the homepages) to prevent spammy links. Is there anything else we can do? Url: ww.klokkenpaleis.nl; most important keyword = klokken (previous position: 2nd place); search engine = google netherlands. Thanks a lot for your help.
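(For anyone following along, a nofollowed sitewide cross-link is just the standard rel attribute on the anchor; the domain below is illustrative:)

```html
<!-- Sitewide header link to a sister shop, nofollowed so it passes no PageRank -->
<a href="https://www.example-sister-shop.nl/" rel="nofollow">Sister shop</a>
```

An alternative worth weighing is removing the sitewide links entirely and linking the sister shops only from one page (e.g. an "our shops" page), since Penguin looks at link patterns, not just followed status.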
Intermediate & Advanced SEO | GTGshops0 -
2 sites or one sites: 2 locations
Hello, I have a dog training client who is offering services in 2 separate locations. We're looking to be first in the non-local search results and also rank well in Google Places. Would it be better to go with 2 separate sites, or one site that tries to rank for the 2 different locations? There are both local and standard search results when we type in our keywords. Thanks!
Intermediate & Advanced SEO | BobGW0 -
Using 2 wildcards in the robots.txt file
I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string. So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in it? So something like /*_Q1*. Will that pick up and block every URL with those characters in the string? Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as /*/*_Q1* as it will be in the second folder, or will just using /*_Q1* pick up everything no matter what folder it is in? Thanks.
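(For what it's worth, in Google's robots.txt matching the * wildcard matches any sequence of characters, including slashes, so a single leading-wildcard rule covers the marker at any folder depth; the trailing * is redundant but harmless:)

```
User-agent: *
# Blocks /anything_Q1, /folder/page_Q1foo, etc.
# The leading * matches any path prefix, including subfolders.
Disallow: /*_Q1
```

The result can be checked against specific URLs with the robots.txt tester in Search Console before deploying.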
Intermediate & Advanced SEO | seo1234560