Are they going to deindex everyone?
-
Looking at the over-optimisation list, it pretty much covers things that everyone is doing, so are we suddenly going to find the best results actually de-indexed?
Maybe Google will slowly shut off indexing stage by stage so everyone changes.
What are your thoughts?
-
Alan, I like most of your responses. A fat thumbs up for you! I don't risk my clients' websites. Imagine me losing all that income? I had a talk with my client: I could get him to the top by doing a few shady things (spam), OR I could take the long hard road. We took the long hard road, based on his decision to retire in the next two years. We can't afford to get Google-smacked.
@Gareth, not everyone spams or over-optimizes. I definitely don't over-optimize. It lowers conversions because the content becomes unreadable. ACTUALLY, my sites are ranking higher now, so I am glad this is happening.
-
I would be inclined to agree with the gentlemen who have answered above me. I do not know what Google will do for sure, as I believe almost no one does, including even some who work at Google. However, I would imagine Google would be more apt to tackle the people who are manipulating their rankings by use of keyword stuffing, link farms, and so on. The fact that someone can be penalized for having a website that is too highly optimized I find disturbing; however, I really doubt, and hope, it is only the people that have practiced what Google has told us are black/grey-hat tactics. I consider myself an ethical businessman; like the gentlemen above me said, I would never use any of the black/grey-hat techniques to improve my own or my clients' rankings. It just doesn't make sense to do that. I would imagine if they were to delist everyone that has done something to make sure their on-page SEO is optimized, they would devalue some of the results that everyday people count on Google for. To put it plainly, if they did that, Google would hurt itself, because ordinary people would not get what they are looking for when they search. I hope this is of some help.
-
At the end of the day, Google is trying to catch poor websites that are ranking because they stuffed title tags, crammed keywords into filler content in a footer, and used other poor SEO tactics. If you optimized your site properly, with keyword research applied in a logical manner, I do not expect you to see an impact. The only impact you could see is the devaluing of links from poor websites to yours, with a trickle-down effect.
-
What list is this?
From all I have heard, it is spam that is getting de-indexed. I would not say everyone is spamming; I certainly do not risk my clients' sites with spam.
Related Questions
-
Site redesign makes Moz Site Crawl go haywire
I work for an agency. Recently, one of our clients decided to do a complete site redesign without giving us notice. Shortly after this happened, Moz Site Crawl reported a massive spike of issues, including but not limited to 4xx errors. However, in the weeks that followed, it seemed these 4xx errors would disappear and then a large number of new ones would appear afterward, which makes me think they're phantom errors (and looking at the referring URLs, I suspect as much because I can't find the offending URLs). Is there any reason why this would happen? Like, something wrong with the sitemap or robots.txt?
Technical SEO | | YYSeanBrady1 -
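A quick way to test the phantom theory is to diff the 4xx URL lists from consecutive crawls: real broken links persist across crawls, while phantom ones churn. A minimal sketch, assuming the crawl reports have been exported as plain URL sets (the example.com URLs are invented):

```python
# Hypothetical 4xx URL sets exported from two consecutive site crawls.
# The example.com URLs are made up for illustration.
crawl_week1 = {
    "https://example.com/old-page",
    "https://example.com/ghost-1",
}
crawl_week2 = {
    "https://example.com/old-page",
    "https://example.com/ghost-2",
}

resolved = crawl_week1 - crawl_week2    # vanished without being fixed: phantom suspects
appeared = crawl_week2 - crawl_week1    # brand-new errors this week
persistent = crawl_week1 & crawl_week2  # stable across crawls: likely real broken links

print(sorted(resolved), sorted(appeared), sorted(persistent))
```

If `persistent` stays small while `resolved`/`appeared` churn every week, that points at transient issues (redesign fallout, flaky server) rather than genuinely broken internal links.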
Please help (going bananas): trying to troubleshoot a sitemap submitted to Bing
We need help to figure out what seems to be an error in our sitemap.
Technical SEO | | IMSvintagephotos
We have submitted the sitemap to Bing, and it includes 1.2 million pages that should be crawled. After the initial submission, Bing's dashboard says 1.2 million pages have been submitted. Then, always after 2-4 days, the number drops to either 500,000 pages or, like now, 250,000 pages. Why is that? Is there an error in our sitemap, and Bing is excluding pages, lowering the submitted number after going through them and discovering the error? We need to figure this out and fix it so that Bing can crawl and index all 1.2 million pages. See the screenshot showing the Bing dashboard.
We are also having issues with Google, but we can't figure out what is going on. Here are the sitemaps: https://imsvintagephotos.com/google_sitemap/sitemap.xml and here: https://imsvintagephotos.com/sitemap.xml. Our website is www.imsvintagephotos.com.
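Worth noting: the sitemaps protocol caps each sitemap file at 50,000 URLs (and 50 MB uncompressed), so 1.2 million pages have to be split across many files under a sitemap index, and a malformed entry in any one file can cause an engine to discard part of the batch. A minimal per-file sanity check using only the standard library (the toy sitemap below is invented):

```python
import xml.etree.ElementTree as ET

# A toy sitemap standing in for one file of the real sitemap index;
# the page URLs are invented for illustration.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/photo-1</loc></url>
  <url><loc>https://example.com/photo-2</loc></url>
  <url><loc>  </loc></url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit(xml_text, max_urls=50_000):
    """Count entries and flag common per-file problems."""
    root = ET.fromstring(xml_text)
    urls = root.findall(f"{NS}url")
    empty_locs = [u for u in urls if not (u.findtext(f"{NS}loc") or "").strip()]
    return {
        "total": len(urls),
        "empty_locs": len(empty_locs),
        "over_limit": len(urls) > max_urls,  # per-file cap in the sitemaps protocol
    }

report = audit(SITEMAP)
print(report)
```

Running this across every file in the index would quickly show whether the dropped 700,000+ URLs cluster in specific files.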
How do I deindex URL parameters?
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL Parameters tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex, I'm worried about deindexing the product page. What can I do to deindex just the URL parameter version? Thank you!
Technical SEO | | BT20090 -
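A gentler alternative to noindex here is a rel=canonical from each parameter URL to the clean URL, which consolidates the duplicates without any risk of the product page itself dropping out; the already-indexed parameter URLs then fold in as they are recrawled. A sketch of computing the canonical target by stripping tracking parameters (the parameter list is an assumption for this site):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameters assumed to be pure tracking noise on this site -- adjust per site.
TRACKING_PARAMS = {"campaign", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url):
    """Return the URL with tracking parameters stripped (the rel=canonical target)."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://site.com/products?campaign=email"))
# -> https://site.com/products
```

Parameters that actually change the page content (sorting, pagination) should stay out of `TRACKING_PARAMS`, since canonicalizing those away can hide real content from the index.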
Time to deindexing: WMT Request vs. Server not found
Google indexed some subdomains (13!) that were never supposed to exist, but apparently they returned a 200 code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed just from this "server not found" error. I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server, and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. I'm not even sure, with this being the case, if I can get WMT to recognize and remove these subdomains. Should I fret about this, or will the "server not found" message get Google to remove them soon enough?
Technical SEO | | erin_soc0 -
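Either route does lead to deindexing eventually: Google drops URLs that persistently return 404/410 or fail to resolve, though an explicit 404/410 is generally processed faster than a DNS failure, which gets retried for longer. A toy classification of the situation (all hostnames and statuses invented):

```python
# Invented snapshot of what each stray subdomain now returns
# (None stands for "server not found", i.e. a DNS failure).
observed = {
    "a.example.com": 200,
    "b.example.com": 410,
    "c.example.com": None,
}

def will_drop_out(status):
    """True if a URL that keeps returning this status should eventually be deindexed."""
    return status in (404, 410, None)

# Subdomains still answering 200 are the only ones that will linger in the index.
still_indexed = sorted(host for host, s in observed.items() if not will_drop_out(s))
print(still_indexed)
```

So the main thing to verify is that none of the 13 subdomains still sneaks back a 200 through the wildcard.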
Ranking going down and down, then disappearing
I asked a question a few weeks ago about a main keyword that we are targeting that is fluctuating up and down. The keyword is "trash bags," and that is what my company sells: all different colors, thicknesses, and sizes. This isn't just a random keyword we are trying to optimize for; this is our business. Before I started optimizing the website for "trash bags," we used the term "garbage bags." Now that we've started, we have been off the charts and on the charts, but we have never regained rankings. The trend finally looked like it was heading upwards... but now I see we dropped off Google again. Is this normal? Should I be worried that Google is penalizing us for this keyword? (There are many links with "trash bags" in the anchor text, but we do sell that!) Here is a screenshot of our ranking history for trash bags: https://www.diigo.com/item/image/3vpdp/no01
Technical SEO | | EcomLkwd1 -
Why is Google not deindexing pages with the meta noindex tag?
On our website www.keystonepetplace.com we added the meta noindex tag to category pages that were created by the sorting function. Google no longer seems to be adding more of these pages to the index, but the pages that were already added are still in the index when I check via site:keystonepetplace.com Here is an example page: http://www.keystonepetplace.com/dog/dog-food?limit=50 How long should it take for these pages to disappear from the index?
Technical SEO | | JGar-2203710 -
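One thing worth double-checking before just waiting: a noindex only works if Googlebot can actually fetch the page, so the sorted URLs must not be blocked in robots.txt, and already-indexed pages only drop out as they are recrawled, which can take weeks for deep category pages. A small stdlib check that a fetched page really carries the tag (the sample HTML is invented):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Flags whether any <meta name="robots"> tag on the page contains noindex."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "meta":
            return
        a = {k.lower(): (v or "") for k, v in attrs}
        if a.get("name", "").lower() == "robots" and "noindex" in a.get("content", "").lower():
            self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

# Invented sample standing in for a fetched category page:
sample = '<html><head><meta name="robots" content="noindex,follow"></head><body></body></html>'
print(has_noindex(sample))
```

Running this against the live `?limit=50` URLs confirms the tag survives whatever template logic generates the sorted views.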
We just fixed a Meta refresh, unified our link profile and now our rankings are going crazy
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening. Here is what happened: we recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted towards one URL, and our internal links were counting for the other URL. In addition to that, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages.

Here is what happened to our link profile:
Total External Links: Before - 2,757 | After - 4,311
Total Internal Links: Before - 125 | After - 3,221
Technical SEO | | danatanseo
Total Links: Before - 2,882 | After - 7,532

Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking from spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July Fabio Riccotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms that were on the bottom of page 1. So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third column shows the change. For obvious reasons I haven't included the keywords.

11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22

Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. The further down a keyword was to begin with, it seems the further it dropped.

What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!
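For anyone hunting the same issue: a meta refresh is a page-level instruction rather than an HTTP redirect, which is why it can fragment link signals the way described above, while a server-side 301 consolidates them. A rough stdlib sketch for auditing templates for leftover refreshes (the regex and sample markup are only illustrative, and it assumes http-equiv appears before content in the tag):

```python
import re

# Quick-and-dirty scan; assumes http-equiv precedes content inside the tag,
# which is good enough for a first-pass template audit.
META_REFRESH = re.compile(
    r'<meta[^>]+http-equiv=["\']?refresh["\']?[^>]*'
    r'content=["\']?\s*\d+\s*;\s*url=([^"\'>\s]+)',
    re.IGNORECASE,
)

def find_meta_refresh(html):
    """Return the refresh target URL if the page contains a meta refresh, else None."""
    match = META_REFRESH.search(html)
    return match.group(1) if match else None

# Invented homepage snippet illustrating the problem:
sample = '<head><meta http-equiv="refresh" content="0; url=https://example.com/home"></head>'
print(find_meta_refresh(sample))
```

Any page where this returns a URL is one whose inbound link equity is likely being split rather than passed along.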
Opinions on SEOHosting.com - will this get me deindexed?
On a recent post in /r/SEO I mentioned that I used SEOHosting.com in the past, and it was met with several warnings that this could result in deindexation or penalization. I just wanted to know if there is anything to back this up. I did some digging on my own and it looks like some of the private blog networks that got shut down recently were using SEOHosting.com, and they were speculating (screenshot of pertinent parts included below) that this is how Google was able to track down their network and shut 'em down. But I've also heard a lot of speculation that the smartest way for Google to map out these networks would be to create tons of content, submit it all, and look for patterns - which has me wondering what role, if any, SEOHosting.com played in taking down the private blog networks earlier this month. Is using SEOHosting.com a legitimate concern? Is it a threat even to websites that meet Google's quality guidelines? Is it especially a threat to those that don't? Any thoughts you guys have would be greatly appreciated.
Technical SEO | | AnthonyMangia0