No longer to be found for "certain" keywords.
-
I'd like to see if anyone could shed some light on this rather strange scenario:
Yesterday I noticed that we can no longer be found for 'certain' keywords for which we had page 2-3 rankings. Yet for other keywords we still appear on pages 2-3. These keywords are very competitive, and our rankings have steadily improved over the course of 5-6 months.
My question is: what could have contributed to the fact that we can no longer be found for only some keywords? A second question: can Google remove you from their SERPs for certain keywords 'only'? Thank you,
Maximilian. -
Thank you once again. I will get in touch with you. Please PM me your contact details. We are located in downtown Manhattan.
-
Strategies for dealing with external duplicate content are outlined here.
A much bigger concern is duplicating your OWN content. I know it sounds kind of silly, but it is a very real issue, especially since Google's most recent update (Panda). Is there more than one URL for any page on your website? Does http://website.com redirect to http://www.website.com? Do you have canonicalization problems related to pagination or something similar?
These issues are discussed in-depth here.
If you think that your issues may be related to on-site or site architecture factors, the best thing you can do is hire a qualified SEO consultant that can assess these issues and make actionable recommendations for correcting them.
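One way to sanity-check the duplicate-URL problem described above: every variant of a page's URL (http vs. https, www vs. non-www, trailing slash, tracking parameters) should collapse to a single canonical form. A minimal sketch of that idea in Python, where the preferred host, the https preference, and the "drop all query strings" rule are illustrative assumptions you'd adjust for your own site:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url, preferred_host="www.website.com"):
    """Collapse common URL variants onto one canonical form.

    Assumptions for illustration: https is preferred, the www
    host is preferred, query strings and fragments carry no
    content, and trailing slashes are insignificant.
    """
    parts = urlsplit(url)
    host = parts.hostname or ""
    bare_host = preferred_host[4:] if preferred_host.startswith("www.") else preferred_host
    # Fold the www/non-www variants onto the preferred host.
    if host in (preferred_host, bare_host):
        host = preferred_host
    # Drop the trailing slash, query string, and fragment; force https.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit(("https", host, path, "", ""))

variants = [
    "http://website.com/page/",
    "https://www.website.com/page?utm_source=x",
    "http://www.website.com/page",
]
# Every variant collapses to a single canonical URL.
canonical = {canonicalize(u) for u in variants}
```

If a check like this yields more than one "canonical" for the same page, that is exactly the kind of issue a 301 redirect or rel=canonical tag should resolve.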
-
Anthony, our website has been online since 2002 and we have been earning quality organic backlinks for a very long time. Directory submission was added literally 4 weeks ago in an effort to 'add more value'. It was never intended to be the only source; it was just an effort to improve rankings. Who knew it could backfire.
On another note, you mentioned that duplicate content may cause this issue. Our content is very rich and well-written, and a number of sites have copied and pasted it. Despite all of our efforts, including contacting their hosting companies to shut down their websites, no changes have been made. My question is: are we affected if "others" have copied our content onto their websites? Does Google have any algorithm to determine which site copied which site's data?
Thank you once again for your insightful information.
-
Maximilian -
If Google has devalued some of your links, there's no way to "recover" these links. That's the bad news.
The good news is that this happens to websites every day, and it's not by any means a permanent penalization or anything of that sort. All it means is that the time and effort spent to build these links was squandered.
What can you do to speed up your recovery? Switch the way you think about link building. Directory submissions are useful, but they shouldn't by any means comprise the majority of your link building strategy.
I would suggest reading this article cover to cover. It will teach you the proper mindset you should have when approaching linkbuilding, and it will give you dozens and dozens of ideas to get started with building high quality links from relevant, authoritative domains.
Replace directory submissions with strategies like writing guest blog posts, creating amazing content (linkbait) and subsequently promoting it on social media and social bookmarking websites, and/or creating a widget that would be an invaluable addition to any website in your industry. All of these are covered under the "Content-Based Link Building Strategies" section of the SEOmoz Professional Guide to Linkbuilding that is linked to above.
-
"You have been building low quality links and they have been devalued by Google, causing you to lose hundreds (thousands?) of keyword-rich backlinks."
Anthony, thank you for your response. The above statement could potentially be the source of the issue, considering one of our staff members has been doing directory submissions (30 per day).
You mentioned that Google may have devalued the high-quality backlinks we had, due to our gaining low-quality backlinks. In your experience, what can be done to resolve this issue? Is this devaluation permanent or temporary? What can be done to speed up the recovery?
Thank you in advance.
-
There are tons of reasons why your website might see a drop in rankings for certain keywords. Here are a few:
-
Increased competition in your niche pushed your website down as new websites started to rank above you.
-
You have been building low quality links and they have been devalued by Google, causing you to lose hundreds (thousands?) of keyword-rich backlinks.
-
Your website has been affected by a Google algorithm update. For one reason or another (duplicate content? shallow or nonexistent content on some pages? too many ads / too big of a "footprint"?), your website is being seen as less trustworthy, and this has affected rankings sitewide. (Note, some rankings would be maintained, particularly for keywords with low competition.)
-
You were logged into your Google account when you saw these rankings, and your website was appearing higher than it typically does because of your personalized search data. When you logged out and checked your rankings, you noticed a "drop" in ranking that was not actually real. (Okay, this one is unlikely, but we've all seen it before.)
-
Related Questions
-
How do I rank for 1,000 keywords?
I have DR 25 and 200 referring domains, and I currently rank for 90 keywords in the USA. I've noticed a trend: if you rank for more keywords, the chances improve that you can rank in positions 1-5 for high-traffic keywords. What I mean is that it increases your odds. Possible answer 1: increase DR and DA, as well as UR and PA (Ahrefs and Moz metrics). I know PageRank matters, but these are metrics we can look at for now. Possible answer 2: get a lot of backlinks, maybe from the same site. But how can my backlinks help me rank for 1,000 keywords, so that at least 100 of them rank in positions 1-5? Detailed answers will definitely be appreciated.
Intermediate & Advanced SEO | Sam09schulz
-
Disallowed "Search" results with robots.txt and Sessions dropped
Hi,
I've started working on our website and I've found millions of "Search" URLs which I don't think should be getting crawled and indexed (e.g. .../search/?q=brown&prefn1=brand&prefv1=C.P. COMPANY|AERIN|NIKE|Vintage Playing Cards|BIALETTI|EMMA PAKE|QUILTS OF DENMARK|JOHN ATKINSON|STANCE|ISABEL MARANT ÉTOILE|AMIRI|CLOON KEEN|SAMSONITE|MCQ|DANSE LENTE|GAYNOR|EZCARAY|ARGOSY|BIANCA|CRAFTHOUSE|ETON). I tried to disallow them in the robots.txt file, but our Sessions dropped about 10% and our Average Position in Search Console dropped 4-5 positions over one week. It looks like over 50 million URLs were blocked; all of them look like the example above and aren't bringing any traffic to the site. I've allowed them again, and we're starting to recover. We've been fixing problems with getting the site crawled properly (sitemaps weren't added correctly, products were blocked from spiders on category pages, canonical pages were blocked from crawlers in robots.txt), and I'm thinking Google was doing us a favour and using these pages to crawl the product pages, as they were the best/only way of accessing them. Should I be blocking these "Search" URLs, or is there a better way of going about it? I can't see any value in these pages except Google using them to crawl the site.
Intermediate & Advanced SEO | Frankie-BTDublin
-
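For what it's worth, the effect of a Disallow rule like the one described can be checked offline before deploying it, using Python's standard-library robots.txt parser. The rules and URLs below are made-up examples, not the poster's actual site:

```python
from urllib.robotparser import RobotFileParser

# Candidate robots.txt rules: block on-site search results while
# leaving product pages crawlable (illustrative rules only).
rules = [
    "User-agent: *",
    "Disallow: /search/",
]

parser = RobotFileParser()
parser.parse(rules)

# search_ok is False (blocked); product_ok is True (still crawlable).
search_ok = parser.can_fetch("*", "https://example.com/search/?q=brown")
product_ok = parser.can_fetch("*", "https://example.com/products/some-item")
```

A dry run like this confirms what a rule blocks, but not what it costs: if blocked pages are the only crawl path to products, fixing sitemaps and internal links first is the safer order of operations.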
Pages excluded from Google's index due to "different canonicalization than user"
Hi Moz community,
A few weeks ago we noticed a complete collapse in traffic on some of our pages (7 out of around 150 blog posts in question). We were able to confirm that those pages disappeared for good from Google's index at the end of January '18; they were still findable via all other major search engines. Using Google's Search Console (previously Webmaster Tools), we found the unindexed URLs in the list of pages being excluded because "Google chose different canonical than user". Content-wise, the page that Google falsely determines as canonical has little to no similarity to the pages it thereby excludes from the index.
About our setup: we are a SPA, delivering our pages pre-rendered, each with an (empty) rel=canonical tag in the HTTP header that is then dynamically filled with a self-referential link to the page's own URL via JavaScript. This seems to work fine for 99% of our pages but happens to fail for one of our top-performing ones (which is why the hassle 😉).
What we tried so far:
- going through every step of this handy guide: https://moz.com/blog/panic-stations-how-to-handle-an-important-page-disappearing-from-google-case-study --> inconclusive (healthy pages, no penalties, etc.)
- manually requesting re-indexation via Search Console --> immediately brought back some pages; others briefly re-appeared in the index, then got kicked out again for the aforementioned reasons
- checking other search engines --> pages are only gone from Google and can still be found via Bing, DuckDuckGo, and other search engines
Questions for you: How does the Googlebot operate with JavaScript, and does anybody know if their setup changed in that respect around the end of January? Could you think of any other reason for the behavior described above? Eternally thankful for any help!
Intermediate & Advanced SEO | SvenRi
-
"WWW" versus non "WWW" on domain
We plan on migrating our site to a new, shorter domain name. I like the idea of removing "www" to gain an additional three letters in the URL display. Is there any disadvantage to doing so from a technical or SEO perspective? Thanks,
Alan
Intermediate & Advanced SEO | Kingalan1
-
Ranking drop for "Mobile" devices category in Google Webmaster Tools
Hi, our rank dropped, and we noticed it's a major drop in the "Mobile" devices category, which is contributing to the overall drop. What exactly drops mobile rankings? We do not have any messages in Search Console. We have made a few redirects and removed footer links. How do these affect rankings? Thanks,
Satish
Intermediate & Advanced SEO | vtmoz
-
Risk of using the "nofollow" tag
I have a lot of categories (like e-commerce sites), and many have pages 1-50 for each category (a view-all page is not possible). Much of the content on these pages is present across the web on other websites (duplicate content). I have added quality unique content to page 1, added "noindex, follow" to pages 2-50, and added rel=next/prev tags to the pages. Questions:
1. By including the "follow" part, Google will read the content and links on pages 2-50 and may think: "we have seen this stuff across the web... low-quality content, and though we see a noindex tag, we will consider even page 1 thin content, because we are able to read pages 2-50 and see the thin content." So even though I have "noindex, follow", the 'follow' part causes the issue (in that Google sees a lot of low-quality content). Is this possible, and if I had added "nofollow" instead, might that solve the issue and give page 1 a better chance of looking unique?
2. Why don't I add "noindex, nofollow" to pages 2-50? That way I ensure Google does not read the content on pages 2-50, and my site may come across as more unique than with the "follow" tag. I understand that in that case (with the nofollow tag on pages 2-50) no link juice flows from pages 2-50 to the main pages (assuming there are breadcrumbs or other links to the indexed pages), but I consider that of minimal SEO value.
3. I have heard that using "follow" is generally lower risk than "nofollow". Does this mean a website with a lot of "noindex, nofollow" tags may hurt its indexed pages because it comes across as a site Google can't trust, since 95% of its pages carry such a tag? I would like to understand what the risk factors may be. Thank you very much.
Intermediate & Advanced SEO | khi5
-
"Jump to" links in Google: how do you get them?
I have just seen yoast.com results in Google and noticed that nearly all the indexed pages show a "Jump to" link. So instead of showing the full URL under the title tag, Google shows links of this type:
yoast.com › SEO
yoast.com › Social Media
yoast.com › Analytics
with SEO, Social Media, and Analytics all being clickable. How has he achieved this? And is it something to try to incorporate into my sites?
Intermediate & Advanced SEO | JohnPeters
-
Meta keywords vs. tags
On a blog, from an SEO perspective, how do you choose keywords to use in the "meta keyword" tag vs. post tags? Will the choice differ based on the search volume/competition of the targeted keywords?
Intermediate & Advanced SEO | saravanans