SEO results are down. Is my "All in One SEO Pack" to blame?
-
My website www.noobtraveler.com has seen a 40% dip since Penguin's last update in November. I also transferred hosting around that time, but I was wondering if I'm over-optimizing with the All in One SEO Pack.
I would appreciate it if someone could do a quick sweep and share their thoughts.
Thanks!
-
You have a huge number of high-DA links, specifically from BoardingArea, Hyatt blogs, and AppSumo. You need to build a better link profile: rely less on three high-DA sites and earn links more naturally as often as possible. Almost every link in your OSE report is a "controlled" link - blog comments, profiles, AppSumo pages, etc.
These links aren't bad by themselves - they're actually good. But because they make up such a high percentage of your profile as a whole, it can look to Google like you're not really all that popular except to three sites.
You have 1,850 links in OSE from 46 domains - about 40 links per domain, which is a LOT. Also, nearly 100% of your anchor text is branded. That tells me you're controlling it, and if I can see you're controlling the anchor text, Google can see it too. Again, diversity is best.
202 of your 1,850 links are nofollow; the remaining ~1,650 are dofollow. As before, an unusually high nofollow ratio signals spam or controlled link building. Natural link profiles tend to sit around 5-10% nofollow; yours is about 11% (202 of 1,850), which is a touch high.
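Just as a sanity check, the ratios above are plain arithmetic on the OSE figures quoted in this answer (1,850 links, 46 domains, 202 nofollow) - nothing more official than that:

```python
# Back-of-envelope link-profile ratios from the OSE figures quoted above.
total_links = 1850
linking_domains = 46
nofollow_links = 202

links_per_domain = total_links / linking_domains   # ~40 links per domain
nofollow_ratio = nofollow_links / total_links      # ~0.109, i.e. ~11%

print(f"Links per domain: {links_per_domain:.0f}")
print(f"Nofollow ratio: {nofollow_ratio:.1%}")
```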
So I would start with your actual link profile. It's not likely AIOSEO; unless you're doubling up on titles and meta tags, you should be OK.
Also, when you pin things, don't use the non-www version of your site. Several of your pins point to "http://noobtraveler.com" instead of "http://www.noobtraveler.com". I would be as consistent as possible on that particular issue.
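One low-effort way to stay consistent is to normalize any URL to the www host before you share or pin it. A minimal sketch using Python's standard library - the domain is the one from the question, but the helper itself is hypothetical, not part of any tool mentioned here:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, preferred_host: str = "www.noobtraveler.com") -> str:
    """Rewrite a URL so it always uses the preferred (www) host."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # Map the bare domain onto the www version; leave other hosts alone.
    if host == preferred_host.removeprefix("www."):
        host = preferred_host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonicalize("http://noobtraveler.com/some-post/"))
# -> http://www.noobtraveler.com/some-post/
```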
-
The All in One SEO Pack plugin is pretty good and used quite widely among WordPress-based websites. It focuses on on-site optimization including keywords/density, appropriate linking, canonical URLs, and the like.
Use of this plugin should not cause a 40% drop in traffic (you didn't say what you mean by a "dip", but I assume you mean traffic or search traffic).
Such a significant drop in traffic after the Penguin update is more likely related to what Google perceives as non-organic backlinks, such as paid links or link exchanges. If you engaged in such practices, it would be best to work on removing those backlinks, using the link disavow tool for any you can't get removed.
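For reference, the file Google's disavow tool accepts is plain text: one full URL or `domain:` entry per line, with `#` starting a comment. A small sketch of assembling one - every domain below is a made-up placeholder, not a site from this thread:

```python
# Sketch of building a disavow file in the plain-text format Google's
# disavow tool accepts: one "domain:" entry or full URL per line,
# "#" starts a comment line. All domains here are hypothetical.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://low-quality-blog.example/comment-page-9/"]

lines = ["# Links we could not get removed manually"]
lines += [f"domain:{d}" for d in bad_domains]
lines += bad_urls

disavow_txt = "\n".join(lines) + "\n"
print(disavow_txt)
```

You would save that output as a UTF-8 text file and upload it through Search Console's disavow tool.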
-
Hi Geoff,
I'd start by doing a website crawl using SEOmoz's excellent toolset: http://www.seomoz.org/tools. Your question is rather general - essentially a "will you audit my site?" request, which many people on this board do for a living. You're better off asking specific questions, as people can answer those with best-practice recommendations.
The All-in-One SEO Pack has its limitations but is usually pretty decent for a basic WordPress website facing low-to-medium competition.
If you believe you were hit by Penguin, I would recommend checking your backlink profile and disavowing any nefarious links.
-Phil