Big rise in "keyword not provided"
-
Hi, all.
Hi, all.
Has anyone else seen a massive increase in "not provided" keywords in their analytics over the past couple of weeks? It's probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): "In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide 'extra protection' for searchers, and the company may be aiming to block NSA spying activity."
Other than the unreliable stats in Webmaster Tools, there don't seem to be many ways left to find out what is sending traffic to our sites!
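For anyone who wants to put a number on the rise, here's a rough sketch in Python. It assumes a hypothetical GA export called organic_keywords.csv with "Keyword" and "Visits" columns (adjust both to whatever your own export actually uses) and simply works out what share of organic visits are now hidden behind (not provided):

```python
import csv

# Rough sketch: estimate the share of organic visits hidden behind "(not provided)".
# Assumes a hypothetical "organic_keywords.csv" exported from GA with
# "Keyword" and "Visits" columns -- adjust the names to match your own export.
total_visits = 0
not_provided_visits = 0

with open("organic_keywords.csv", newline="") as f:
    for row in csv.DictReader(f):
        visits = int(row["Visits"])
        total_visits += visits
        if row["Keyword"].strip().lower() == "(not provided)":
            not_provided_visits += visits

if total_visits:
    share = 100.0 * not_provided_visits / total_visits
    print(f"(not provided): {not_provided_visits} of {total_visits} visits ({share:.1f}%)")
```

Run it against exports from a few different date ranges and you can see how quickly the hidden share is climbing.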
-
Can anyone confirm whether this will have an impact on the traffic data shown in Moz? I'm assuming that data comes from Google Analytics and will therefore be affected in the same way?
-
I hope Bing steps up its game and offers everyone a free analytics suite more on par with GA in response to the loss of keywords. I've never been enamored with Bing, but they have been looking for a way to lure people away from Google. They could even market it as Microsoft saving the little guy/small business while Google hoards information.
-
Anyone think Google is going to come up with a way to charge businesses/SEO companies to view keyword data?
-
Google has turned into a black box.
-
Absolutely agree with you, Grumpy Carl! I can see that this change is just going to increase the need to check rankings in order to find out which page is ranking.
Why we can't get this link in the Google Webmaster Tools data I just don't know (but I know it's nothing to do with privacy!). I just want to know which pages the keywords are sending traffic to... grr.
I just get the feeling it's all going to get messy and I'm going to be spending a lot more time in front of spreadsheets.
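If it helps anyone cut down on that spreadsheet time: one rough way to approximate the page/keyword picture is to join the top queries from Webmaster Tools with landing-page traffic from GA. A minimal sketch in Python (pandas), assuming hypothetical export file and column names, and bearing in mind that WMT clicks are only a loose proxy for the keyword data we've lost:

```python
import pandas as pd

# Sketch only: join hypothetical exports on landing page to guess which
# queries are likely driving traffic to which pages.
# "wmt_queries.csv" -- assumed columns: page, query, clicks (from Webmaster Tools)
# "ga_landing.csv"  -- assumed columns: page, organic_visits (from Google Analytics)
queries = pd.read_csv("wmt_queries.csv")
landing = pd.read_csv("ga_landing.csv")

combined = queries.merge(landing, on="page", how="left")

# For each landing page, list its top queries by WMT clicks alongside GA visits.
top = (combined.sort_values(["page", "clicks"], ascending=[True, False])
                .groupby("page")
                .head(5))
print(top[["page", "query", "clicks", "organic_visits"]].to_string(index=False))
```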
-
It is, and I don't think there is any coming back from this either. It will be interesting to see how this changes SEO.
-
I would agree, in part. However, even if you don't know which keyword is sending you traffic, if anything this makes ranking reports more important. If we see traffic going up but cannot directly see which keyword is sending it, we can still draw a link (however tenuous) between the rise in rankings and the rise in traffic.
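To make that link a little less hand-wavy, you could line up your rank-tracker history against organic sessions and see how strongly they move together. A minimal sketch, assuming hypothetical weekly numbers you've already pulled out (and remembering that correlation isn't causation):

```python
from statistics import correlation  # Python 3.10+

# Hypothetical weekly data: average rank for a tracked keyword set (lower is better)
# and total organic sessions for the same weeks.
avg_rank = [8.2, 7.9, 7.1, 6.5, 6.4, 5.8, 5.5, 5.1]
organic_sessions = [1200, 1260, 1340, 1510, 1490, 1620, 1700, 1810]

# Invert rank so "better rankings" and "more traffic" point the same way.
r = correlation([-x for x in avg_rank], organic_sessions)
print(f"Rank vs. traffic correlation: {r:.2f} (tenuous evidence, not proof)")
```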
-
Scary how the 100% date in the chart has become this December. It was scary enough when it was 2017!!!
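For anyone curious how that chart arrives at a date, it's essentially a straight line fitted through recent (not provided) percentages, read off where it crosses 100%. A toy sketch with made-up monthly figures (the real chart uses its own data and fit):

```python
# Toy extrapolation: fit a straight line through recent "(not provided)" percentages
# and estimate when it reaches 100%. The monthly figures below are made up.
months = [0, 1, 2, 3, 4]          # months since the start of the sample
pct = [48.0, 52.0, 61.0, 74.0, 86.0]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(pct) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, pct)) / \
        sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

months_to_100 = (100 - intercept) / slope
print(f"Projected to hit 100% roughly {months_to_100:.1f} months after the sample start")
```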
-
I think Google is on a covert mission to napalm the SEO industry....
-
Apparently things have taken a turn today. Look at this: http://www.notprovidedcount.com/ and this: http://www.searchenginejournal.com/google-gone-100-provided-secure-search/70799/
Related Questions
-
Does Google push down for not ranking top for branded keywords?
Hi all, usually websites rank for their own branded keywords. Sometimes, though, third-party websites outrank them for those branded keywords. If there are a number of such queries where a website is not ranking (at the top) for its branded keywords, does Google push the website down in its overall rankings? Is there any correlation? Thanks
Algorithm Updates | vtmoz
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many pages.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of such URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation? Like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well from 100 pages up to 10,000 pages or more might carry a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
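To put some numbers behind point 4 above: before worrying about penalties, it can help to measure how many of those crawled URLs collapse into the same page once facet parameters are stripped. A rough Python sketch; the export file name, column name, and facet parameter list are all assumptions you'd swap for your own:

```python
from urllib.parse import urlsplit
import csv

# Rough sketch: gauge how much of a crawl is facet/parameter noise.
# Assumes a hypothetical "screaming_frog_internal_urls.csv" export with an
# "Address" column -- adjust the file and column names to your own export.
FACET_PARAMS = {"color", "size", "sort", "page", "price"}  # assumed facet parameters

seen_clean = set()
total = 0

with open("screaming_frog_internal_urls.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        parts = urlsplit(row["Address"])
        # Treat URLs as duplicates of each other once facet parameters are stripped.
        kept = [p for p in parts.query.split("&")
                if p and p.split("=")[0] not in FACET_PARAMS]
        seen_clean.add(parts.path + ("?" + "&".join(kept) if kept else ""))

print(f"{total} crawled URLs collapse to {len(seen_clean)} once facet parameters are ignored")
```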
Algorithm Updates | seo_plus
-
How is this possible? #2 ranking with NO on-page keywords, no backlinks, no sitemap...
Hi everybody. I have a question... I'm totally stumped. This question is being asked today (November 16th, 2015), just after Google updated something in their algorithm. Nobody seems to know what they did, and it has something to do with the new "RankBrain" system they're now using. My niche is logo design software (https://www.thelogocreator.com). I had the keyword "logo creator" on the page roughly 7 times. After Google updated, I lost about 10 spots, and as of this writing I've dropped to #15. So maybe I over-optimized, fine. But I noticed that for the keyword "logo creator", NONE of the top 14 spots actually have "logo creator" in their page title, and NONE of them have more than 2 instances (if any) of the keyword "logo creator" on the actual page. So I removed ALL instances of my keyword "logo creator" from my home page, used the Webmaster's Fetch tool, and moved up a few spots instantly. So what the heck? And the #2 spot for that keyword is www.logomakr.com - they have NO words at all on their pages, no blog, no sitemap, and far fewer links than anybody in the top 10. Can anybody reading this shed some light? Marc Sylvester
Algorithm Updates | Laughingbird Software
-
Keyword Targeting - How to Properly Target Two Similar Terms?
Hi all, so I have a question about "best practices" when you have two unique but highly similar keywords you are targeting. Let's use the examples of "raincoats for women," which gets 9,900 searches a month, and "rain jackets for women," which gets 4,400. I am in the process of selecting keywords for my client's "keyword portfolio" and need to come up with a strategy when faced with two similar keywords that use different terminology. I'm well aware that there should only be one page for "women's raincoats," but there is no doubt in my mind that Google will give preferential treatment to whichever version of the keyword (raincoats/rain jackets) I include in my title tag, meta description, content, etc. I know that the modern philosophy is that Google is sophisticated enough to understand that the two terms are essentially synonymous. That said, would you A) only pick "raincoats for women" for your client's keyword portfolio and focus exclusively on that term in your optimizations, B) pick both terms and try to strike an even balance between them in your optimizations, or C) pick both terms, only optimize for "raincoats for women," and hope that "rain jackets for women" gets some peripheral benefit from your optimizations via Google's understanding of synonyms? Thanks!
Algorithm Updates | FPD_NYC
-
How on earth is a site with ONE LINK ranking so well for a competitive keyword?
Ok, so I'm sure you get the gist of what I'm asking about from my question. The query is 'diy kitchens' in Google UK and the website is kitchens4diy[dot]com, which is ranking third from where I'm viewing. The thing is, the site has just ONE BACKLINK and has had for a good while. Yet it's ranking really well. What gives?
Algorithm Updates | Webrevolve
-
SEO results are down. Is my "All in One SEO Pack" to blame?
My website www.noobtraveler.com has shown a dip of 40% since Penguin's last update in November. I also transferred hosting at that time, but I was wondering if I'm over-optimizing with the All in One SEO Pack. I would appreciate it if someone could do a quick sweep and share their thoughts. Thanks!
Algorithm Updates | Noobtraveler
-
Why am I getting different Google SERP results for the same keywords?
Hi Mozzers, I have noticed recently that Google (.com.au) has been serving up different SERP results for the same keywords. For example, one of our main keywords is "car loan". One result will show our site ranking #5 organically out of 242,000,000 results. A refresh of the search will then show our site not ranking at all out of 133,000,000 results. We have been noticing this happen only in the last few days, and more frustratingly, Google is throwing up the SERP with 133,000,000 results more often. Would anyone know why this is occurring? And what can we do, if anything, to ensure we are shown regardless of how many results Google calls from? Is it from a recent algo update, and will it settle down over time? Any help would be greatly appreciated. (Just to add: I'm not logged in to Google when completing this test and I regularly clear cookies etc., so I don't believe it's a personalised search issue.)
Algorithm Updates | 360Finance
-
New Google "Knowledge Graph"
So according to CNN an hour ago, regarding a new Google update: "With Knowledge Graph, which will begin rolling out to some users immediately, results will be arranged according to categories with which the search term has been associated." http://www.cnn.com/2012/05/16/tech/web/google-search-knowledge-graph/index.html?hpt=hp_t3 Does this mean we need to start optimizing for categories as well as keywords?
Algorithm Updates | JFritton