Solving a Mystery
-
I've been banging my head trying to solve a mystery for a few months, with little luck. Maybe one of you has an answer.
My home page organic SERP rankings have dropped from #1 or #2 late last year, to between #8 and #25 a few months ago for a couple of important keywords.
I did not make any on-page changes during this period.
Meanwhile, many inner pages continue to rank well (top 5) for long-tail keywords.
All of my SEO is white hat: no unusual on-page tactics, no purchased or otherwise fishy backlinks. In fact, looking closely at the on-page and backlink tactics of some of the websites that have leapfrogged me, I'd say some of them have engaged (and continue to engage) in shady practices.
No errors reported by SEOmoz. I've been judiciously applying canonical tags and other techniques to ensure minimal duplicate content.
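For anyone wanting to double-check that their canonical tags are actually in place, a quick script like this can pull the declared canonical out of a page's HTML (a hypothetical sketch using only the standard library; the sample HTML is made up for illustration):

```python
# Sketch: extract the rel="canonical" URL from a page's HTML, if one exists.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # <link rel="canonical" href="..."> lives in <head>
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

sample = '<html><head><link rel="canonical" href="https://example.com/"></head></html>'
print(find_canonical(sample))  # https://example.com/
```

Running it against the live home page (fetched however you like) and comparing the result to the URL you expect is a cheap sanity check alongside what SEOmoz reports.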
I've not received a letter from Google warning me of any unnatural backlink profiles.
The biggest part of the mystery is that it seems like my home page was the only page really affected. As mentioned above, the inner pages continue to rank well for their respective keywords. Thus, I don't think I've been hit by any kind of site-wide penalty.
Also, I continue to rank well on Bing/Yahoo.
Does anybody have an insight into what could have happened? Has anybody else experienced anything similar (home page dropping rank while other pages stay put)?
-
I think Guy Warner nailed it. I wrote the copy for my home page back in 2004. I've tweaked it over the years, but it remained largely the same. During that time, lots of others have been copying the text for their own websites. Now, if you search for an exact match, a few of the other sites come up even higher in the SERP than my own site!
I've done my best to get folks to remove the copies (serving DMCA complaints when necessary), but I think my time is better spent simply rewriting the home page.
It's frustrating on one hand, but it's also good to finally have an idea of what the problem might be.
Thanks, Guy!
-
Yeah, that is weird. Rankings shift all the time, though. If it wasn't a named algo update, it could just be some change Google made to the algo.
For example, if you have a lot of sitewide links (or blog links, or directory links) it could be that Google lowered the value of those for your type of query, relative to other types. Or some of the sites that are linking to you could have been penalized for selling links, or seen a drop in pagerank, or got shifted into a different niche from your site, which lowered their value.
Whatever the case, unless you're being directly penalized for something, your best bet is to just build more links. If your competitors have high value links that you don't, that could be a good starting point for regaining your ranking for your keyword and improving your site traffic overall.
-
I did a review of his keywords and his site is competitive link-wise with the top 3. The only thing I could find was duplicate content, which would fit the timing of the Panda update.
-
Hi Takeshi,
My competitors for the specific keywords I'm talking about have stayed fairly static in the rankings. I haven't noticed any on-page changes for any of them during this period (I check frequently) and I haven't noticed any major changes in backlink profiles (I use Raven and Majestic in addition to SEOmoz Pro).
Over the same period, my site's on- and off-page SEO factors did not change either, and yet my site dropped >10 spots relative to competing sites for these specific keywords.
Meanwhile, other pages of my site continued to rank well for other keywords - so if it was an algo change, it's one that only affected my home page for certain keywords, while leaving my other pages and my competitors' pages alone.
Strange, no?
-
Google's algorithm is changing all the time, even outside of the big named updates, so it's natural for rankings to fluctuate for specific keywords.
Have you noticed any fluctuations among the other sites in SERPs? If that's the case, Google could have rolled out a change that affects queries of that type.
It's also possible that your competitors have stepped up their SEO efforts over that time. How have their backlinks been growing, and how do their link profiles compare to yours?
Finally, links tend to lose value over time as blog posts are pushed into archives, sites go down or get penalized, pagerank goes down, etc. How has your link building been during that time period? If you're not suffering from an actual penalty like Penguin, the best strategy would be to make sure your on-page factors are solid and continue building better links.
-
I'm sure it IS algorithm-related. The strange thing is that it doesn't seem to have affected all of my pages/rankings equally. I'll PM you shortly.
-
You can PM it to me and I'll see if I can help you track it down.
In early February, Google announced 17 search quality highlights, which mentioned an update to Panda targeting site quality. Your site might have been caught up in that.
-
There's actually been a steady and gradual decline in organic traffic since around early February.
I'd rather not say which website or what keywords in a public forum, but I'd be happy to discuss it privately.
The Google Webmaster Tools "site performance" chart shows a period of slow performance for the first half of April. There was also a brief website outage about 2 weeks ago - I don't think it lasted more than 10 minutes.
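For catching slowdowns and short outages like that going forward, a minimal polling script can leave a record you can line up against ranking changes. This is a hypothetical sketch (the URL and thresholds are placeholders, and real monitoring would run on a schedule):

```python
# Sketch: poll a URL, record (ok, seconds) samples, and summarize uptime.
import time
import urllib.request

def check(url, timeout=10):
    """Fetch `url` once; return (ok, seconds elapsed)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read(1)
            return resp.status == 200, time.monotonic() - start
    except Exception:
        return False, time.monotonic() - start

def summarize(samples, slow_threshold=2.0):
    """samples: list of (ok, seconds). Return (uptime %, slow-response count)."""
    if not samples:
        return 0.0, 0
    up = sum(1 for ok, _ in samples if ok)
    slow = sum(1 for ok, secs in samples if ok and secs > slow_threshold)
    return 100.0 * up / len(samples), slow

# Fake samples standing in for a day of checks:
fake = [(True, 0.4), (True, 3.1), (False, 10.0), (True, 0.5)]
print(summarize(fake))  # (75.0, 1)
```

Even a cron job calling `check()` every few minutes would have pinned down exactly when that 10-minute outage happened and whether the April slowness was worse than GWT's chart suggests.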
Thanks for your thoughts on this.
-
About what date did the drop occur? If it was around April 24th, then it was an algorithm update that hit you.
Has your server had any downtime or slowness to it?
And lastly, what's the keyword you dropped for (I assume it's for the site in your profile)?