Showing strong for the last 2 years for search terms, NOW GONE! What happened?
-
Hi All! I have a 911: my website www.popscrap.com has been showing strong (top 2) for about 2 years now for many of the search terms we are targeting (scrap software; scrap metal software; recycling software; etc.), and I just noticed today that we are nowhere. What do you suggest for troubleshooting this to find the cause and fix it?
Thanks!
-
Well, I removed the suspect content, and after 2 weeks, nothing. Then I added Google Authorship to each page, and the NEXT DAY the site is back in the top positions for our target terms, and the leads are pouring in. Was it the Google Authorship? It certainly felt like it. But I thought that was not a ranking factor.
Anyway, thanks for all the support! BB
-
On a quick look my gut instinct is that this is ok. However, on a site: search I'm seeing that you have over 19,000 pages indexed in Google. That's a bit of a Panda flag for me as most likely there are not 19,000 unique pages that add value on your site.
-
Thanks for the response, Marie
I asked the question as I was wondering whether I'd need to add "boilerplate" text to each description to fill it out. I'd rather not, as a) it's not very scalable and b) I'm not sure it would add value to our users per se, since in the main people want to see pictures. Here's an example of one of the shorter descriptions we run.
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
Of the 4,500 pages, 95-98% are content that's unique to our site (the other ~2-5% are managed by individual realtors who I'm guessing probably copy and paste descriptions from their own sites. We're not in the US so aren't part of the MLS network).
-Do users engage with your content? Mos' def.
-
It's hard to say what Google views as thin. Here are some factors I would consider when making that decision:
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
-Do users engage with your content? Short content can be useful. If Google sees that people are actually engaging with your site then they will have no problem with thin content.
It sounds to me like these pages are probably ok. But I can't say for certain.
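If it helps, here's a rough way to pull word counts for the description block on each listing so you can see which pages fall on the short side. This is only a sketch in Python (assuming requests and beautifulsoup4 are installed); the URLs and the ".property-description" selector are placeholders, not your real markup:
```
# Rough sketch: flag property pages whose description text may be "thin".
# Assumptions: Python 3 with requests and beautifulsoup4 installed, and that the
# description lives in an element we can select -- ".property-description" below
# is a placeholder, not the site's real markup.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://example.com/property/1",  # hypothetical URLs -- substitute your own list
    "https://example.com/property/2",
]

THIN_THRESHOLD = 100  # words; pick whatever cutoff you're comfortable with

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    block = soup.select_one(".property-description")  # placeholder selector
    words = len(block.get_text(" ", strip=True).split()) if block else 0
    if words < THIN_THRESHOLD:
        print(f"{url}: {words} words -- review for thin content")
```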
-
"Thin content" question:
I run a real estate website and carry about 4,500 property pages (each page consisting of between 5-13 photos and about 50-300 words of a property description). Might the pages of ~50 words run the risk of being deemed "thin content" even though they have photos on them?
I also have around 200-250 article pages that are far more text-heavy.
FWIW, I don't think I've been hit by Panda 4.0. (I've slid from about #8 to #12 over the past 2 weeks, but I suspect that's more to do with sluggish content marketing/link-building.)
-
If unpublishing causes the pages to either be removed from your site or noindexed then yes, that's the same thing.
-
Thank you! But what about unpublishing? Is that the same thing as removing, in the eyes of Google? I want to remove ALL pages under the "Scrap Laws" menu, because I think that is where the issue is. But I don't want to delete them totally and have to recreate them all later. Thanks again!
-
While you can test this over time, it would be difficult because you will never know if you've done enough to satisfy Panda. And really, you don't even know for sure if Panda is the culprit. (I think it is, but no one can say for sure.)
So, let's say you took out some of the low quality content and a month later nothing has changed. That could mean that you didn't take out enough to make the Panda algorithm see your site as high quality. But, it could be that you just need more time. While some sites recover within one Panda refresh (and that usually happens approximately monthly), others seem to need several refreshes.
In regards to unpublishing vs deleting the content, you can either delete the pages or you can use a noindex tag to tell Google not to include the pages in the index. Having low quality pages on your site that are noindexed will not hurt you in the eyes of Panda.
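If you go the noindex route, it's worth double-checking that the directive is actually being served once you make the change. Here's a quick sketch in Python (assuming requests and beautifulsoup4 are installed; the URL is a placeholder) that checks both the meta robots tag and the X-Robots-Tag header:
```
# Quick check that a page is actually serving a robots "noindex" -- either in a
# meta tag or an X-Robots-Tag header. Sketch only; the URL below is a placeholder.
import requests
from bs4 import BeautifulSoup

def is_noindexed(url):
    resp = requests.get(url, timeout=10)
    # Header-level directive
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    # Meta robots tag in the page itself
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    return bool(meta and "noindex" in meta.get("content", "").lower())

for url in ["https://example.com/low-quality-page"]:  # hypothetical URL
    print(url, "noindexed" if is_noindexed(url) else "still indexable")
```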
-
Thanks Marie! I'm getting the feeling it's the content. Quick question: Could I just unpublish the content and then test over time, OR do I need to completely delete the questionable content from the site? Does Google see it if it is unpublished and still penalize?
-
There were two major algorithm updates last week - Panda and the Payday loans algorithm. Payday loans affects sites that had done really spammy link building and it is very unlikely that this affected you. But, Panda is certainly possible.
I haven't had a good look at the site, but I see that you have 263 pages indexed in Google. Are all of these pages high quality pages that Google would be proud to show to searchers? If you've got duplication amongst the pages or if you've got "unhelpful" pages that are indexed then you need to remove or noindex them. On a quick look here are some examples of pages that should be removed or noindexed:
http://www.popscrap.com/component/content/category/11-demo-articles
http://www.popscrap.com/component/users/?view=remind
http://www.popscrap.com/24-products/120-scrapshield - It looks like a good amount of the text on this page is on multiple pages of your site.
Of course, there could be other issues. If you've made any changes to the site recently then I'd look at those changes first, but otherwise I'd go on a thorough cleanup so that only the pages that are the best are shown to Google.
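For the cleanup itself, it helps to get the full page list in front of you. Here's a small sketch (assuming Python with requests installed, and assuming the site exposes an XML sitemap at /sitemap.xml, which may not be the case; a crawler export would do the same job) that dumps every sitemap URL so you can review them against what's indexed:
```
# Sketch: pull every URL from the XML sitemap so you can review the full page
# list in one place. Assumes the site has a sitemap at /sitemap.xml (an
# assumption -- adjust or use a crawler export instead).
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://www.popscrap.com/sitemap.xml"  # assumed location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]
print(f"{len(urls)} URLs in the sitemap")
for url in urls:
    print(url)
```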
-
To help figure out what is causing the 404 errors, do the following in Webmaster Tools:
-Log in to your website's profile, then in the left-hand navigation go to Crawl > Crawl Errors > Not found. Under Not found, review the list of URLs for clues (you can also click on an individual link to see where the 404 page was linked from). Depending on how large your site is, if those 747 not-found URLs are a large percentage of your total page count, you could be experiencing a temporary rankings drop that will disappear once you fix your error pages. If you could add a link to a few of the 404 error pages, we could help you figure out what is wrong with your site code or server setup.
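If you export that Not found list to a CSV, you can also re-check the URLs in bulk to see which ones still return 404 and which have already been fixed. A rough sketch (assuming Python with requests installed; the file name and column layout are placeholders for whatever the export gives you):
```
# Sketch: re-check the URLs from a Webmaster Tools "Not found" CSV export to see
# which ones still return 404. Assumes a CSV whose first column is the URL --
# "crawl_errors.csv" is a placeholder file name.
import csv
import requests

with open("crawl_errors.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader, None)  # skip the header row (remove this line if your export has none)
    for row in reader:
        url = row[0]
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            status = f"error: {exc}"
        print(status, url)
```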
-
Just my two cents, friend.
4 days back, Google released Panda 4.0. You can check if that caused the drop.
Here is a tool that can help you find if any of the Google penalties are behind the drop:
http://www.barracuda-digital.co.uk/panguin-tool/
Once on the page, click on the 'Log-in to Analytics' button and allow the tool to access your Google Analytics account and check if the recent Panda caused the drop. Hope this helps.
Good luck. By the way, thin content is of no use these days, so you should be investing your time in producing quality content.
Best,
Devanur Rafi
-
I looked at some of your content, and some of it seems quite thin, such as the regulations for each state. There are really only a couple of sentences (in the instances that I saw) that deal with the individual state, and then there's a lot of boilerplate content, navigation, and other site elements that are the same from page to page. Just one more thing to think about.
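One rough way to put a number on that is to compare the visible text of two of the state pages and see how much they share. This is just a sketch (assuming Python with requests and beautifulsoup4 installed; the two URLs are placeholders for any pair of the "Scrap Laws" pages), and the similarity ratio is only a proxy for how much boilerplate the pages share:
```
# Rough sketch: estimate how much text two pages have in common, as a proxy for
# how much of each page is shared boilerplate. The URLs are placeholders.
import difflib
import requests
from bs4 import BeautifulSoup

def visible_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return soup.get_text(" ", strip=True)

a = visible_text("https://example.com/scrap-laws/state-a")  # placeholder URLs
b = visible_text("https://example.com/scrap-laws/state-b")
overlap = difflib.SequenceMatcher(None, a, b).ratio()
print(f"~{overlap:.0%} of the text is shared between the two pages")
```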
-
It looks like Google penalized you. It happened to one of my websites in January; I was going nuts because I didn't see any message in Google Webmaster Tools until two weeks later. I would recommend waiting a couple of days to see if anything shows up. If not, check your links; if a couple of the websites you are linking to got penalized, you can get in trouble too.
-
Kevin, any insight into where to start with respect to the 747 missing URLs? What causes that? How do I fix it? Thanks!!
-
haha! Ok! Thanks Kevin!
-
No, no. My bad. You mentioned above that you've been ranking strong for two years, and then when I peeked at your site I saw the RT template. I wrongly assumed the Joomla template was released at the same time as the Magento template (I actually use the same exact template for Magento at www.88k.com.tw, although heavily modified). I was just thinking that if you had done a site revamp with a new template, that might be a factor in your recent bump off the SERPs. Sorry to worry you about that. But it looks like you found an issue with the 404 errors. Good job.
-
Also, I just noticed this (see image). 747 missing URLs!?
-
What do you mean by "it's not 2 years old"? Is being under 2 years old a factor?
-
Thanks! Yes, it's Google. We actually are ranking better on Bing and Yahoo now!
Looked at Google Webmaster and it shows a steep drop on 5-21. (image attached)
-
A couple of things I'd do right away:
Look in Google Webmaster Tools to see if there are any notices there (I'm going to assume that it's Google where you are no longer ranking).
Look in your analytics to see if there was a particular day that you dropped off. You can then look to see if that coincided with any known algorithm update.
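If it helps, once you export daily organic sessions from Analytics, a small script can find the exact drop date and compare it against known update dates. A rough sketch (assuming Python and a CSV with "date" and "sessions" columns; the file name is a placeholder, and the update dates below are approximate examples only, not an authoritative list):
```
# Sketch: find the day organic traffic dropped and see whether it lines up with a
# known algorithm update. "organic_sessions.csv" is a placeholder file name and
# the dates in KNOWN_UPDATES are approximate examples -- check a published
# algorithm-change history for the real list.
import csv
from datetime import date

KNOWN_UPDATES = {
    date(2014, 5, 20): "Panda 4.0 (approx.)",
    date(2014, 5, 16): "Payday Loan 2.0 (approx.)",
}

rows = []
with open("organic_sessions.csv", newline="") as f:
    for row in csv.DictReader(f):
        rows.append((date.fromisoformat(row["date"]), int(row["sessions"])))

rows.sort()
drops = [(prev[1] - cur[1], cur[0]) for prev, cur in zip(rows, rows[1:])]
worst_drop, worst_day = max(drops)
print(f"Biggest day-over-day drop: {worst_drop} sessions on {worst_day}")
for update_day, name in KNOWN_UPDATES.items():
    if abs((worst_day - update_day).days) <= 3:
        print(f"That is within a few days of {name}")
```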
-
My bad. Looks like it is. It was released for Magento only late last year.
-
Always great to help out a fellow Rocketeer! Did you recently update your website? That template is not 2 years old. This could certainly be a factor.
-
Thanks, Kevin. I haven't made any changes in months, and do not do any crazy linking schemes. Competitors seem to be at the same places on the page. We are the only one hit by this.
-
That's a tough one without more to go on. Google releases updates to its ranking algorithm every so often and some sites get hit hard. If your content hasn't changed and you haven't engaged in any unusual activity in terms of link building or advertising, then I'd say wait it out. Give it a week or two, which is how long it's taken many other quality sites to bounce back from a Google update. It's unlikely you'll have issues here, but you still might want to check your Webmaster Tools to see if any manual actions have been applied.
This might be a good time to go over your site, again, for the first time;-) See what could be done to answer visitor questions and lead them to the right pages.