Should I Wait Until the "Dust Settles" on the Algorithm Update or Get Busy Now?
-
We were hit hard on both of our sites yesterday and can't afford to wait for the dust to settle, as some folks are advising. We have been trying to work off a penalty on one of our sites with a massive (and expensive) link removal project over the last four months. We are on our third reconsideration request and, hopefully, this last round of link removal will have done the job. I'm hesitant to go in and "de-optimize" the site by changing title tags and anchor text until the penalty is removed, but I'm not sure that's the right course of action. I am, however, going to dig into the non-penalized site and change some title tags and anchor text.
Any thoughts on this strategy would be greatly appreciated.
-
Thanks, Peter. I appreciate your feedback. We are going to move slowly until we get a response (or not) to our reconsideration request.
-
I'd echo Rand but will just add this. You've got two factors here:
(1) On the one hand, the "Penguin" update hit hard and, I expect, it isn't going to go away. Google may make minor adjustments, but those could roll out over weeks and months, and the philosophy is here to stay. Keep in mind that we're over a year into Panda updates now. So, absolutely don't wait for the dust to settle.
(2) On the other hand, you're dealing with an existing penalty, and you do need to resolve that. Don't panic, in other words. Act decisively, but follow through on solving the current problem.
If you go changing things site-wide, like titles/URLs, it's going to be hard to get a clean read of the data. If you can, tackle the worst culprits proactively (a spammy home-page title, for example). Tackle anything that's obviously a win - dupe content is a great example - you know it isn't helping you. From a link-building perspective, diversify and show a positive forward path.
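One way to surface the "obvious wins" mentioned above is to group crawl data by title tag; pages sharing an identical title are usually the first duplicate-content culprits to fix. A minimal sketch, assuming a hypothetical list of URL/title pairs such as you might export from a crawler (the data here is illustrative only):

```python
from collections import defaultdict

# Hypothetical crawl export: (url, title) pairs, e.g. from a crawler CSV.
crawl = [
    ("/widgets/red", "Buy Cheap Widgets | Example Store"),
    ("/widgets/blue", "Buy Cheap Widgets | Example Store"),
    ("/about", "About Us | Example Store"),
]

# Group URLs by their exact title text.
titles = defaultdict(list)
for url, title in crawl:
    titles[title].append(url)

# Titles shared by more than one URL flag likely duplicate-content pages.
dupes = {t: urls for t, urls in titles.items() if len(urls) > 1}
for title, urls in dupes.items():
    print(title, "->", urls)
```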
-
Cool - glad to hear you're having a positive experience. I think waiting until you hear from Google is a wise decision, too.
-
Rand, I appreciate your feedback. We have been doing this for 9 years, thought we were doing a fairly decent job of building a brand (until we took a hit), and plan to continue for the long term, so I think your advice on building the new site(s) is good.
Once Google responds (or doesn't) to our latest request, we'll make that decision. By the way, we have been using an SEOMoz-referred company to help us with link-removal and, despite Google's refusal to budge after a huge number of links have been removed, I think they're doing a very good job. I know what they're up against, so I'm hoping to be able to give them rave reviews once this is done.
-
Richard - this is just my personal opinion, and it's not based on a ton of recent experience (I haven't had sites banned/penalized by Google in a very long time now). However, my feeling is that if you've been hit by this update, it's a great catalyst for immediate change. I'd be thinking about all of the following:
- Is now the time to start a new site that's exclusively white hat?
- Can I salvage this site and what's involved in that?
- How do I get rid of the bad links?
- How do I explain all the things I've done to Google (through reconsideration) and are they likely to let me back in?
Obviously, it depends on your long- and short-term goals, why you're doing web marketing in the first place, what you want to accomplish in the next few years, etc. If your goal is short term (make money on the web fast with as little effort as possible, without building a brand), then it's a question of whether 301'ing the domain to a new one and trying tactics Google hasn't yet devalued beats attempting reconsideration and removing the bad links.
If you're thinking long term, building a brand, and willing to put in sweat equity now that might take years to pay off, I'd be weighing a new domain vs. trying to salvage this one (or two).
Wish you luck whatever you choose!
-
Try to generate more natural-looking links. Diversify your anchor text portfolio so it looks natural, and continue your link building efforts. Do it the way your grandma would do it: with or without www, capital letters in the domain name, the same anchor text pointing to different URLs; use as many reasonable permutations as possible when generating anchor texts. Bring in good-quality links (links from PR, links from guest blogging) and post even more unique, genuine content. That will pay off. You can refocus once the "dust settles." If you wait, you will definitely lose time, and you will still be starting from the same point you are at now.
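The permutation idea above is easy to sketch mechanically: cross a set of anchor variants with a set of URL variants (www/non-www, different paths) and work through the combinations. A toy illustration with a hypothetical domain, not a tool recommendation:

```python
from itertools import product

# Hypothetical anchor and URL variants for an example domain.
anchors = ["Example", "example.com", "www.example.com", "click here"]
urls = [f"http://{host}/{path}"
        for host, path in product(["example.com", "www.example.com"],
                                  ["", "blog/"])]

# Every anchor/URL pairing is one "natural-looking" permutation to draw from.
portfolio = list(product(anchors, urls))
print(len(portfolio))  # 4 anchors x 4 URLs = 16 combinations
```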
Related Questions
-
Descriptive domain vs business name domain
I originally set up my domain as "overlandparkphotographer.com" and then have my "jpshots.com" pointing to it. What I recently discovered is that even though I set the Yoast SEO title of my pages to "JPShots Senior Pictures | Wedding Photographer", when you search "overland park photographer" the snippet title is just "overland park photographer", which sounds super sketchy. I don't know if this is something to do with Yoast, or if my sneaky domain isn't worth much and I should simply use my regular jpshots.com domain as the primary. I know it works like a charm with Yahoo, but I'm not sure how much the domain name factors into Google these days.
Algorithm Updates | | JPRichardson0 -
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? So SEL (SearchEngineLand) said recently that there's no such thing as "duplicate content" penalties. http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers, aka TAGFEE'ers, to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years into its current incarnation, with a very simple, basically hand-coded e-commerce cart. I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, if we assume 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search, it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget," having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well for 100 pages up to 10,000 pages or more might well incur a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal-page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | | seo_plus0 -
Is there a way to get Google to index our site quicker?
I have updated some pages on a website, is there a way to get Google to index the page quicker?
Algorithm Updates | | webguru20140 -
Why does Google love Moz for "Directory Submission Service"?
I just searched for "directory submission service" in Google.com (geo location USA). I got two results from the Moz community for the same thread. Does Google not understand the 301 redirect from seomoz.org to moz.com? What about domain clustering? PFA: SERP screenshot.
Algorithm Updates | | SanketPatel0 -
Google Algorithm Update .. Author-rank finally kicking in ?
These past few days I've been seeing great movement, with my sites growing by 70-100% in traffic spikes. Somehow I think this has something to do with AuthorRank maybe kicking in now as more of a ranking factor. Anyone have an idea what's going on?
Algorithm Updates | | NikolasNikolaou0 -
How to get global search results on Google ? Also, is it possible to get results based on some other geographic location?
I don't want results based on my geographic location. When I am in India, I don't want local search results. In fact, I want results which are not dependent on my current location. Also, can I change my current location to some other city, and will it affect the results? For example: while I am in London, can my search results be modified as if I were sitting in New York?
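Not from the thread, but one commonly used mechanism: Google's search URL accepts a `gl` (country) parameter and an `hl` (interface language) parameter that bias results toward a chosen region. A minimal sketch of building such a URL; how strongly Google honors these versus your detected location can vary:

```python
from urllib.parse import urlencode

def google_search_url(query, country="us", language="en"):
    # gl = country to bias results toward; hl = interface language;
    # pws=0 has historically disabled personalized results.
    params = {"q": query, "gl": country, "hl": language, "pws": "0"}
    return "https://www.google.com/search?" + urlencode(params)

url = google_search_url("pizza near me", country="us")
print(url)
```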
Algorithm Updates | | EricMoore0 -
Local SEO-How to handle multiple business at same address
I have a client who shares the same address and suite number with multiple businesses. What should be done to optimize their website and citations for local SEO? Is this a huge issue? What should we do so our rankings aren't affected? Will changes take a long time to take effect? Thanks
Algorithm Updates | | caeevans0 -
Conveying Farmer Update To Client
I work with a site whose super-competitive top terms dropped off page one with the Farmer update. So #4 to #12, that kind of thing. In the last year they've added a huge catalog of 500,000 item pages. The catalog has climbed to a 76% bounce rate, whereas the handful of top pages is in the 20s +/-. To date, I haven't had much of anything to do with the catalog. That makes for a sitewide average bounce rate of almost 70%, which has almost doubled in the past year as the catalog has ramped up. The catalog gets a ton of search traffic and sells a lot of items via that organic traffic. I'm advocating for a variety of measures, including cleaning up the catalog:
- 301ing out-of-stock pages to the homepage
- 301ing 100% bounce rate pages that have had hundreds/thousands of visits over time
- Improving the user experience
- Offering rainchecks for out-of-stock items
They generally don't believe that the huge bounce rate (bad user-experience stats) is hurting their top terms on their top pages. They see it as two different issues. Any thoughts on how to present evidence that the catalog is the culprit? In researching it, I found these two quotes:
"In particular, it's important to note that low quality pages on one part of a site can impact the overall ranking of that site," the Google spokesman said.
and...
"Google spokesman told PCMag that sites that believe they have been adversely impacted should 'extensively evaluate their site quality.'"
Not only that, but the item descriptions are straight from the manufacturer, so the pages aren't that unique text-wise. Any industry standard on catalog-page bounce rates? Not that it's the only possible area of SEO improvement, because it's not. I thought those quotes were pretty conclusive, but I guess not. Is there some straight-from-Google additional info to support this? Or am I just wrong to focus on user experience (bounce rate, pages, time on site, etc.)? Thanks! Mike
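For what it's worth, the figures in the post are enough to show how dominant the catalog is in the blend: treat the catalog's share of visits as the unknown in a traffic-weighted average (the 25% top-pages rate below is an assumed midpoint of "in the 20s"):

```python
# Figures from the post: catalog bounces at 76%, top pages ~25% (assumed),
# and the sitewide blended rate is about 70%.
catalog_rate, top_rate, sitewide = 0.76, 0.25, 0.70

# sitewide = w * catalog_rate + (1 - w) * top_rate, solved for w,
# where w is the catalog's share of total visits.
w = (sitewide - top_rate) / (catalog_rate - top_rate)
print(f"catalog share of visits: {w:.0%}")  # ~88%
```

If the blended rate sits that close to the catalog's rate, the catalog must account for the overwhelming majority of visits, which is one concrete way to argue it drives the sitewide number.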
Algorithm Updates | | 945010