Weird behavior with site's rankings
-
I have a problem with my site's rankings.
I rank for higher difficulty (but lower search volume) keywords, but my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off. I have thought very seriously about starting over with a new domain name, because whatever I do doesn't seem to be working.
I will admit that in the past (2-3 years ago) I used some of those "SEO packages" I had found, but those links, which numbered no more than 50, are all deleted now, and the domains are disavowed.
The only thing I can think of is that somehow my site got flagged as suspicious or something like that in Google. About a month ago, I wrote an article on a topic related to my niche, around a keyword with a difficulty of 41%. The first page for that search term contains high-authority domains, including a Wikipedia page, and I currently rank in 3rd place.
On the other hand, I would expect to rank easily for keywords with a difficulty of 30-35%, but the exact opposite is happening. The pages I am trying to rank are not spammy; they check out with Moz tools and also with CanIRank's spam filters. Everything is good and green. Plus, the content of those pages has a Content Relevancy Score that varies from 98% to 100%...
Your opinion would be very helpful, thank you.
-
Hi Nikos,
It's important to remember that Keyword Difficulty scores are a Moz metric, not a Google metric. They are based on Moz's ability to judge how well other sites are competing for that term, and may not capture the entire competitive landscape (since nobody except Google knows everything that Google looks at).
Based on your ability to rank well for some terms and not others, it doesn't seem likely to me that you are under any sort of penalty, so much as that Google just isn't ranking you for some terms. In addition to the Keyword Difficulty score for each term, take a look at which sites rank for the term (you can do this in the SERP Analysis feature of the Keyword Difficulty tool). Ask yourself:
- What kinds of sites rank for this term? For example, if you are an individual business, but all of the sites and pages that are ranking for that term are aggregators or lists of multiple sites, it may be that Google has determined that an individual business site is not a good fit for that query. Similarly, if your page is a blog post and no other blog posts appear in the SERP, Google may have decided that a blog post isn't what people are looking for when they search that term.
- What is the search intent of the query? Based on the other pages that rank, what is the question or task that Google has decided users are trying to answer or complete when they search this term? Does your page do a better job of helping answer that question or complete that task than the other pages that rank?
- What types of content are ranking? Do they all have rich snippets? Are there images, video, shopping or maps results? All of these will tell you more about the kind of content Google thinks will match this query.
- Is there a specific page or website that is ranking for that term that you think you could push out of the top 10? Look for areas of opportunity. For example, maybe there is a site with high authority, but the page that ranks has very low page authority and doesn't fit the query very well. Try to create a page that is better than that page, specifically.
- How closely is the phrase related to your niche? You can tell, from the keywords you are successfully ranking for, which topic areas Google associates with your site. If you have a whole site about chocolates, it will be harder to rank a page about asparagus, even if the difficulty score is lower.
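One way to keep this audit honest is to tabulate the top results for each keyword and look at them side by side. Here is a minimal sketch in Python: the result types, DA, and PA values are hypothetical placeholders you would fill in by hand from the SERP Analysis view, not data pulled from any API.

```python
from collections import Counter

# Hypothetical snapshot of page 1 for one keyword, filled in by hand.
# All numbers and page types here are made up for illustration.
serp = [
    {"pos": 1, "type": "wiki page",  "da": 98, "pa": 75},
    {"pos": 2, "type": "aggregator", "da": 85, "pa": 68},
    {"pos": 3, "type": "blog post",  "da": 60, "pa": 44},
    {"pos": 4, "type": "aggregator", "da": 72, "pa": 22},  # high DA, but a weak page
    {"pos": 5, "type": "blog post",  "da": 40, "pa": 35},
]

def audit(serp):
    """Tally what kinds of pages rank, and flag the weakest page by PA."""
    type_counts = Counter(r["type"] for r in serp)
    weakest = min(serp, key=lambda r: r["pa"])
    return type_counts, weakest

type_counts, weakest = audit(serp)
print(dict(type_counts))   # does your page type appear on page 1 at all?
print(weakest["pos"])      # the low-PA page is the "area of opportunity"
```

The tally answers the "what kinds of sites rank" question, and the weakest-PA result is the specific page you might realistically push out of the top 10.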
Also, don't forget to continue promoting your content to earn high-authority links to individual content pieces. Where it makes sense to do so, you may also want to link internally from some of your more popular and successful pages to some of the pages that are struggling.
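The internal-linking suggestion can be made systematic. A minimal sketch, assuming you can export a list of your pages with rough traffic figures and their existing internal links (the URLs and numbers below are hypothetical):

```python
# Suggest internal links from popular pages to struggling ones they
# don't already link to. Pages, visits, and thresholds are placeholders.
pages = {
    "/best-widgets":   {"visits": 5000, "links_to": {"/widget-reviews"}},
    "/widget-reviews": {"visits": 3200, "links_to": set()},
    "/new-gadget":     {"visits": 40,   "links_to": set()},  # struggling
    "/old-gizmo":      {"visits": 25,   "links_to": set()},  # struggling
}

def suggest_links(pages, struggling_max=100, popular_min=1000):
    """Pair each popular page with each struggling page it isn't linking to yet."""
    popular = [u for u, p in pages.items() if p["visits"] >= popular_min]
    struggling = [u for u, p in pages.items() if p["visits"] <= struggling_max]
    return [(src, dst) for src in popular for dst in struggling
            if dst not in pages[src]["links_to"]]

for src, dst in suggest_links(pages):
    print(f"consider linking {src} -> {dst}")
```

This only surfaces candidates; as the answer says, add the links only where they genuinely make sense for the reader.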
I hope that helps!
-
Hi!
I have the same question as before.
If someone has an idea, I would love to hear it.
-
Hi Nikos! Did EGOL answer your question? If so, please mark his response as a "Good Answer." If not, what questions do you still have?
-
Thanks for your answer.
User experience was one of my first concerns. So I purchased a Bootstrap theme, which actually looks very good and is very user-friendly. You can check it here. The pages I am trying to rank look very similar to that one.
Time on site and bounce rate
The average bounce rate is 60%, and the average time on page is 4 minutes and 10 seconds (last month's averages). My site is actually a review site, if that helps. I often receive link requests from other webmasters (meaning other people think my site's look and content are good), so overall, I don't think my site deserves those rankings. Unless some "old sins" are chasing me.
-
my site gets pushed back for lower difficulty, higher volume keywords, which literally pisses me off.
We often focus too much on competitive metrics and not enough on the presentation that we are making to our visitors. Many search professionals believe that Google is looking at the behavior of visitors: how long they stay, how far they scroll, how many click in, whether they bookmark, whether they share your site with friends... and more important... Are They Asking for You By Name in navigational and domain queries?
This is much of the "machine learning" that Google has patented and what they say they are using in some of their new algorithms. I believe that this has been important for a long time, and I was willing to stick my neck out about it and bet my ranch a long time ago.
lower difficulty, higher volume keywords
The numbers you are looking at are not based upon what visitors think of your site and how they behave; they are based upon completely different things. I don't think that Moz or others who publish keyword difficulty estimates have much ability to determine how visitors behave. Google is the one who has that data, both from the SERPs and from Chrome, and from engagement platforms like bookmarks and Google+ and other things that they either control or can count.
Keyword difficulty is a brute force metric. Visitor satisfaction is much more discerning and very hard to measure.
which literally pisses me off.
How do your visitors feel when they try to use your website? Compare your site to the sites at the top of the SERPs. Do they have better content? Do they give a better visitor experience? Do they have a broader menu? Is their design better for navigation, comfort of reading, scanning, sharing, and all of the things that people want to do on a website? How do visitors feel when they click in?
Lots of people believe that it is really easy to earn good metrics. Really easy. But it is harder than hell to please your visitors. How are you doing there? Take a look and be honest.