Showing strong for the last 2 years for search terms, NOW GONE! What happened?
-
Hi All! I have a 9-1-1: my website www.popscrap.com has been showing strong (top 2) for about 2 years now for many of the search terms we are targeting (scrap software; scrap metal software; recycling software; etc.), and I just noticed today that we are nowhere. What do you suggest for troubleshooting this to find the cause and fix it?
Thanks!
-
Well, I removed the suspect content, and after 2 weeks, nothing. Then I added Google Authorship to each page, and the NEXT DAY the site is back in the top positions for our target terms, and the leads are pouring in. Was it the Google Authorship? It certainly felt like it. But I thought that was not a ranking factor.
Anyway, thanks for all the support! BB
-
On a quick look my gut instinct is that this is ok. However, on a site: search I'm seeing that you have over 19,000 pages indexed in Google. That's a bit of a Panda flag for me as most likely there are not 19,000 unique pages that add value on your site.
-
Thanks for the response, Marie
I asked the question as I was wondering whether I'd need to add "boilerplate" text to each description to fill it out. I'd rather not, as a) it's not very scalable and b) I'm not sure it would add value to our users per se, as, in the main, people want to see pictures. Here's an example of one of the shorter descriptions we run.
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
Of the 4,500 pages, 95-98% are content that's unique to our site (the other ~2-5% are managed by individual realtors who I'm guessing probably copy and paste descriptions from their own sites; we're not in the US, so we aren't part of the MLS network).
-Do users engage with your content? Mos' def.
-
It's hard to say what Google views as thin. Here are some factors I would consider when making that decision:
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
-Do users engage with your content? Short content can be useful. If Google sees that people are actually engaging with your site then they will have no problem with thin content.
It sounds to me like these pages are probably ok. But I can't say for certain.
-
"Thin content" question:
I run a real estate website and carry about 4,500 property pages (each page consisting of between 5-13 photos and about 50-300 words of a property description). Might the pages of ~50 words run the risk of being deemed "thin content" even though they have photos on them?
I also have around 200-250 article pages that are far more text-heavy.
FWIW, I don't think I've been hit by Panda 4.0. (I've slid from about #8 to #12 over the past 2 weeks, but I suspect that's more to do with sluggish content marketing/link building.)
-
If unpublishing causes the pages to either be removed from your site or noindexed then yes, that's the same thing.
-
Thank you! But what about unpublishing? Is that the same thing as removing, in the eyes of Google? I want to remove ALL pages under the "Scrap Laws" menu, because I think that is where the issue is. But I don't want to delete them totally and have to recreate them all later. Thanks again!
-
While you can test this over time, it would be difficult because you will never know if you've done enough to satisfy Panda. And really, you don't even know for sure if Panda is the culprit. (I think it is, but no one can say for sure.)
So, let's say you took out some of the low quality content and a month later nothing has changed. That could mean that you didn't take out enough to make the Panda algorithm see your site as high quality. But, it could be that you just need more time. While some sites recover within one Panda refresh (and that usually happens approximately monthly), others seem to need several refreshes.
In regards to unpublishing vs deleting the content, you can either delete the pages or you can use a noindex tag to tell Google not to include the pages in the index. Having low quality pages on your site that are noindexed will not hurt you in the eyes of Panda.
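For reference, the noindex approach described here is usually implemented as a robots meta tag in each page's &lt;head&gt; (a generic example, not PopScrap's actual markup):

```html
<!-- Keeps the page out of Google's index while still letting
     crawlers follow the links on it -->
<meta name="robots" content="noindex, follow">
```

The page stays live for visitors; it just stops counting toward what Google evaluates for quality.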
-
Thanks Marie! I'm getting the feeling it's the content. Quick question: Could I just unpublish the content and then test over time, OR do I need to completely delete the questionable content from the site? Does Google see it if it is unpublished and still penalize?
-
There were two major algorithm updates last week - Panda and the Payday loans algorithm. Payday loans affects sites that had done really spammy link building and it is very unlikely that this affected you. But, Panda is certainly possible.
I haven't had a good look at the site, but I see that you have 263 pages indexed in Google. Are all of these pages high quality pages that Google would be proud to show to searchers? If you've got duplication amongst the pages or if you've got "unhelpful" pages that are indexed then you need to remove or noindex them. On a quick look here are some examples of pages that should be removed or noindexed:
http://www.popscrap.com/component/content/category/11-demo-articles
http://www.popscrap.com/component/users/?view=remind
http://www.popscrap.com/24-products/120-scrapshield - It looks like a good amount of the text on this page is on multiple pages of your site.
Of course, there could be other issues. If you've made any changes to the site recently then I'd look at those changes first, but otherwise I'd go on a thorough cleanup so that only the pages that are the best are shown to Google.
-
To help figure out what is causing the 404 errors do the following in webmaster tools:
-Log in to your website's profile, then in the left-hand navigation go to Crawl > Crawl Errors > Not Found. Review the list of URLs there for clues (you can also click an individual link to see where the 404 page was linked from). Depending on how large your site is, if the 747 not-found URLs are a large percentage of your total page count, you could be experiencing a temporary rankings drop that will disappear once you fix your error pages. If you could add a link to a few of the 404 error pages, we could help you figure out what is wrong with your site code or server setup.
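To gauge whether those 747 not-found URLs are a meaningful share of the site, you can export the crawl-errors list from Webmaster Tools as a CSV and compare it against your total page count. A minimal sketch (the export's file name and column layout are assumptions; adjust to the actual download):

```python
import csv

def not_found_share(urls, total_pages):
    """Return (error_count, share_of_site) for a list of
    not-found URLs against the site's total page count."""
    count = len(set(urls))  # de-duplicate repeated reports
    share = count / total_pages if total_pages else 0.0
    return count, share

def load_error_urls(error_csv_path):
    """Read URLs from a crawl-errors CSV export. Assumes one URL
    per row in the first column, with a single header row."""
    with open(error_csv_path, newline="") as f:
        rows = list(csv.reader(f))
    return [row[0] for row in rows[1:] if row]

# Hypothetical usage:
# urls = load_error_urls("crawl_errors.csv")
# count, share = not_found_share(urls, total_pages=5000)
# print(f"{count} errors, {share:.1%} of pages")
```

If the share is in the double digits, fixing or redirecting those URLs should be the first priority.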
-
Just my two cents friend..
4 days back, Google released Panda 4.0. You can check if that caused the drop.
Here is a tool that can help you find if any of the Google penalties are behind the drop:
http://www.barracuda-digital.co.uk/panguin-tool/
Once on the page, click on the 'Log-in to Analytics' button and allow the tool to access your Google Analytics account and check if the recent Panda caused the drop. Hope this helps.
Good luck. By the way, thin content is of no use these days, so you should be investing your time in producing quality content.
Best,
Devanur Rafi
-
I looked at some of your content, and some of it seems quite thin, such as the regulations for each state. There are really only a couple of sentences (in the instances that I saw) that deal with the individual state, and then there's a lot of boilerplate content, navigation, and other site elements that are the same from page to page. Just one more thing to think about.
-
It looks like Google penalized you. It happened to one of my websites in January; I was going nuts because I didn't see any message until 2 weeks later in my Google Webmaster Tools. I would recommend waiting a couple of days to see if a message shows up. If not, then check your links: if a couple of websites you are linking to got penalized, you can get in trouble too.
-
Kevin, any insight into where to start with respect to the 747 missing URLs? What causes that? How do I fix it? Thanks!!
-
haha! Ok! Thanks Kevin!
-
No, no. My bad. You mentioned above that you've been ranking strong for two years, and then when I peeked at your site I saw the RT template. I wrongly assumed the Joomla template was released at the same time as the Magento template (I actually use the same exact template for Magento at www.88k.com.tw, although heavily modified). I was just thinking that if you had done a site revamp with a new template, that might be a factor in your recent bump off the SERPs. Sorry to worry you about that. But it looks like you found an issue with the 404 errors. Good job.
-
Also, I just noticed this (see image). 747 missing URLs!?
-
What do you mean by "it's not 2 years old"? Is being under 2 years old a factor?
-
Thanks! Yes, it's Google. We actually are ranking better on Bing and Yahoo now!
Looked at Google Webmaster and it shows a steep drop on 5-21. (image attached)
-
A couple of things I'd do right away:
Look in Google Webmaster Tools to see if there are any notices there (I'm going to assume that it's Google where you are no longer ranking).
Look in your analytics to see if there was a particular day that you dropped off. You can then look to see if that coincided with any known algorithm update.
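If you export daily sessions from your analytics, a short script can flag the steepest day-over-day drop and check it against known update dates. A rough sketch (the update list below is illustrative, covering only the two updates mentioned in this thread):

```python
from datetime import date

# Illustrative update dates (not a complete or authoritative list)
KNOWN_UPDATES = {
    date(2014, 5, 20): "Panda 4.0",
    date(2014, 5, 16): "Payday Loans 2.0",
}

def steepest_drop(daily_sessions):
    """Given {date: sessions}, return (date, fractional_change)
    for the largest day-over-day percentage decline."""
    days = sorted(daily_sessions)
    worst_day, worst_change = None, 0.0
    for prev, cur in zip(days, days[1:]):
        before = daily_sessions[prev]
        if before == 0:
            continue
        change = (daily_sessions[cur] - before) / before
        if change < worst_change:
            worst_day, worst_change = cur, change
    return worst_day, worst_change

def nearby_update(drop_date, window_days=3):
    """Name any known update within window_days of the drop, else None."""
    for d, name in KNOWN_UPDATES.items():
        if abs((drop_date - d).days) <= window_days:
            return name
    return None
```

A drop landing within a day or two of a known update date is suggestive, not proof, but it tells you which direction to investigate first.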
-
My bad. Looks like it is. It was released for Magento only late last year.
-
Always great to help out a fellow Rocketeer! Did you recently update your website? That template is not 2 years old, and this could certainly be a factor.
-
Thanks, Kevin. I haven't made any changes in months, and do not do any crazy linking schemes. Competitors seem to be at the same places on the page. We are the only one hit by this.
-
That's a tough one without more to go on. Google releases updates to its ranking algorithm every so often, and some sites get hit hard. If your content hasn't changed and you haven't engaged in any unusual activity in terms of link building or advertising, then I'd say wait it out. Give it a week or two, which is how long it's taken many other quality sites to bounce back from a Google update. It's unlikely you'll have issues here, but you still might want to check your Webmaster Tools to see if any manual actions have been applied.
This might be a good time to go over your site, again, for the first time;-) See what could be done to answer visitor questions and lead them to the right pages.