Subfolder ranks worse than the rest of the site
-
We have the strangest problem. The blog for our website ranks very poorly:
www.lifeionizers.com/blog has an average SERP position of 200, while the site itself has an average position of 12. The blog does rank #1 for a few terms, such as branded terms and:
is mineral water alkaline = 1.3
kangen water vs alkaline water = 2.6
kangen water pyramid = 1.2
ph of redbull = 1.1 (used by Google as the answer in the knowledge graph)
But the blog ranks terribly for most search terms. It has about 440 pages of in-depth, well-written, authoritative content. Readers are well engaged: the blog has a bounce rate of ~3.5% and an average time on page of over 6 minutes. The problem can't be the quality of the content.
Does Google levy penalties against specific subdirectories? Or is this a configuration problem? Bad links have been disavowed.
-
Awesome! Thanks for the update.
-
UPDATE: Organic traffic to the blog has doubled in the last few days. It started going up about 4 days after I unblocked the categories from being crawled. I'm not certain that it's all Google organic traffic, but it sure looks encouraging since the blog hasn't responded to any SEO fixes for nearly two years!
-
I would definitely check out the list on Clarity: https://clarity.fm/browse/technology/wordpress - and for developers it's often best to look in your own network - so I'd ask friends / colleagues for referrals, or you can search your LinkedIn connections as well.
-
Got to work on those blocked scripts. It turns out they are all outside resources:
- Hubspot
- Zopim Live Chat
According to Google, if outside resources are blocked, you have to contact the vendor about unblocking them. I contacted both Hubspot and Zopim, they will get back to us in about 2 days. Thankfully, we didn't have any of our scripts/CSS blocked.
I'm also working on that redirect. It turns out that if I shut it down and serve the 200 OK page directly, the blog will render search result pages that get indexed. That would give us a massive duplication problem; it's because of this that we did the redirect in the first place.
We're considering bringing in a WordPress pro to fix it properly. Any suggestions?
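As a side note on the duplication issue: a common alternative to redirecting is to keep internal search result pages from being crawled in the first place. A minimal sketch, assuming the blog uses WordPress's default `?s=` search parameter (the actual URL pattern here is an assumption):

```
# Hypothetical robots.txt rules to keep internal search results
# from being crawled. Assumes WordPress's default ?s= parameter;
# adjust the paths to match the blog's real search URLs.
User-agent: *
Disallow: /blog/?s=
Disallow: /blog/*?s=
```

Note that Disallow only blocks crawling; for URLs Google already knows about, a noindex robots meta tag on the search results template is usually the more reliable way to keep them out of the index.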
-
Great! So glad it's helped so far, please keep us updated
-
You were right about the Fetch and Render. I found tons of scripts and images that were blocked. We're unblocking them all. I'm still working on your other suggestions; we do have a lot of old content. My plan is to leave all the evergreen content and purge everything else.
-
I've unblocked the categories in WordPress. Let's keep our fingers crossed!
-
Hi - I wouldn't focus too much on that - I would take the suggestions made in my first answer and start with those! You really don't want to block crawling of categories!
-
My mistake, we deleted a page with a similar URL. That page was published on Dec 9th. Three days is not an uncommon lag for Google to index a new blog post. WMT shows that we are only indexed through December 7th. Google appears to re-index our site once per week:
Latest index 12/7
Previous index 11/30
Previous index 11/23
Is this unusual? And thanks for your help! This has been a very frustrating problem!
-
Did you reactivate it? It's still live:
http://www.lifeionizers.com/blog/health-more/benefits-alkaline-water-hair-loss
-
We recently deleted that page. It was ranked at ~1000 in SERPs, so that indicated to us that Google had a major problem with it. Since we couldn't figure it out, we got rid of the page.
-
I just want to add one thing for the record that was really interesting. This page: http://www.lifeionizers.com/blog/health-more/benefits-alkaline-water-hair-loss
was cached in Google but not indexed, which is odd, and to me a sign that Google is not crawling and processing the blog correctly. I've attached screenshots, since they may very well index the page shortly.
Cache - http://screencast.com/t/cZcGbIHb
Site: search not indexed - http://screencast.com/t/IJQbyMhd
-
Hi - this was an interesting one! But I think I have found some of the issues.
- I'd really let Google crawl the categories. They are currently blocked from crawling in robots.txt (http://www.lifeionizers.com/robots.txt). This is an issue because I suspect part of the problem may be crawl efficiency. One reason I say this is that Google has yet to index a blog post from about 2-3 days ago.
- This is a small thing, but link to the 200 OK version of the blog from your main menu. Right now, it links to /blog but then redirects to /blog/ with the trailing slash. Every little bit of friction you can reduce, the better.
- Because you have a lot of things in that robots.txt file, I would definitely perform some fetch and render tests in Webmaster Tools. Google has said that if you block CSS or JS from being crawled, it can harm your site's ranking, so definitely run fetch and render and make sure that's not the case.
- The posts under "Recent Articles" in the main content area of the blog homepage (http://www.lifeionizers.com/blog/) don't seem to be recent at all; at least, they're not in chronological order. This is confusing for me (and probably other users), so it's likely very confusing for Google. Most would expect the /blog/ homepage to list the most recent posts by publish date, especially since it's labeled "recent". If these are actually supposed to be "popular" posts, I would label them as such.
- Lastly, with this much old content, I would do a thorough content audit (directions here or here) of your blog. You should prune old, poor, outdated, low-traffic content just like you'd prune a plant; this will certainly help user metric signals and keep your indexed-to-trafficked ratio healthy!
Those are just some of the immediate things I saw. I'd start there.
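To illustrate the first point above, the fix is removing the category block from robots.txt. A hypothetical before/after sketch (the exact paths in the live robots.txt may differ):

```
# Before (hypothetical): category archives blocked from crawling
User-agent: *
Disallow: /blog/category/

# After: remove the Disallow line entirely, or if a broader block
# must stay for other paths, explicitly allow the categories:
User-agent: *
Allow: /blog/category/
```

In robots.txt, the most specific matching rule wins for most major crawlers, so an explicit Allow can override a broader Disallow when one is needed.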
-
I think that it could just be that the key terms are extremely competitive. I would advise actually having someone take a look at the site in depth. Anything someone says without actually seeing the site is just speculation and maybes. I'm sorry I can't be more help!
-
I have been treating them as separate entities. I've focused on testing the blog and fixed everything I could find; it had no effect. I've aggressively pursued scrapers with takedown orders, etc.; that had no effect either.
I'm reaching out now because I'm out of ideas; I've done everything I could, and nothing has worked. We are considering abandoning SEO entirely because there seems to be nothing we can do to improve our rankings. I'm hoping someone here can help me figure out what the problem is before we abandon ship.
-
All on the subdomain? Google treats your subdomain as a separate site from your domain. If the penalty was on the subdomain level, that is where you need to focus your efforts. You have to treat them as separate entities.
-
We've seen a lot of keywords improve significantly (+200 positions in SERPs) but then, a week or so later, simply drop back down to where they were. We've seen other terms improve and stay improved. We've also picked up about 500 keyword phrases since Penguin 3.0. The site as a whole has improved its SERP position by about 20 positions since Penguin.
So the answer is a definite "we don't know."
-
Did you have a Penguin penalty, by chance? You said you disavowed bad links, but if you were penalized, was the penalty removed? I think the terms you are targeting are extremely competitive, and you need to do some more off-site optimization to get them ranking well. Run a competitive SERP analysis and see what page one looks like.
-
Yes, this blog targets specific terms related to alkaline water, water ionizer, ionized water, kangen water. We used to be competitive for all those terms, until Google nuked us. We fixed everything we could find, SEO-wise, but have seen zero improvement for those search terms.
I'd expect that if you improve the copy on a page and promote it on social, it should do better than position 200 (Google's supplemental index). But on-site optimization has had no effect on how this blog ranks. It improved from 220 to 200 after Penguin ran recently, but that's it.
-
Without seeing the site I am not sure what else it could be. Are the blogs targeting specific key terms? If so, did you analyze them to see what metrics you need in order to compete with the people on page 1?
-
Yes, we've built good links for the site, and the blog has acquired good organic links all on its own. We share regularly on social media, and the blog has videos from YouTube on it.
This is the strangest thing. The blog has been built and maintained using white hat techniques, with every effort to provide value to the user, and play by the rules. Yet Google still treats it like we're pushing payday loans or something.
I've been fighting this for a year and a half with no improvement. As a company, we are at our wits' end and may just shut the blog down if this persists.
-
The bad links have been disavowed, but have any good links been built? Refresh some of those links with good, high-quality, relevant links. Share some of the pages on social media and, if you can, add a couple of videos from YouTube. All of those things should help you.