Big rise in "Keyword not defined"
-
Hi, all.
Has anyone else seen a massive increase in "(not provided)" keywords in their analytics over the past couple of weeks? It's probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): _In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide “extra protection” for searchers, and the company may be aiming to block NSA spying activity._
Other than the unreliable stats from WMT, there don't seem to be many ways left to find out what is sending traffic to our sites!
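For what it's worth, one rough way to track the damage is to export the keyword report from GA and measure what share of organic visits are now "(not provided)". A minimal sketch, assuming a hypothetical `keywords.csv` export with `keyword` and `visits` columns (the file name and columns are my own assumptions, not a standard GA format):

```python
import csv

def not_provided_share(path):
    """Return the fraction of visits whose keyword is '(not provided)'.

    Assumes a hypothetical CSV export with 'keyword' and 'visits' columns.
    """
    total = hidden = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            visits = int(row["visits"])
            total += visits
            if row["keyword"].strip().lower() == "(not provided)":
                hidden += visits
    return hidden / total if total else 0.0
```

Run it weekly against a fresh export and you can at least chart how fast the keyword data is disappearing for your own site.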
-
Can anyone confirm whether this will have an impact on the traffic data shown in Moz? I'm assuming that data comes from Google Analytics and will therefore be affected in the same way?
-
I hope Bing steps up its game and offers everyone a free analytics suite more on par with GA in response to the loss of keyword data. I've never been enamored with Bing, but they have been looking for a way to lure people away from Google. They could even market it as Microsoft saving the little guy/small business while Google hoards information.
-
Anyone think Google is going to come up with a way to charge businesses/SEO companies to view keyword data?
-
Google has turned into a black box.
-
Absolutely agree with you, Grumpy Carl! I can see that this change is just going to increase the need to check rankings in order to find out which page is ranking.
Why we can't get this information in the Google Webmaster Tools data I just don't know (but I know it's nothing to do with privacy!). I just want to know which pages the keywords are sending traffic to... grr.
I just get the feeling that it's all going to get messy and I'm going to be spending a lot more time in front of spreadsheets.
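Since spreadsheets came up: a lot of that manual work can be scripted. A sketch, assuming two hypothetical exports (the file names and columns are mine, not a real WMT or rank-tracker format): `wmt.csv` with query-level clicks, and `ranks.csv` from a rank tracker with the landing page and position for each query, joined so you can at least see which page ranks for each query:

```python
import csv

def join_queries(wmt_path, ranks_path):
    """Join a hypothetical WMT query export (query, clicks) with a
    hypothetical rank-tracker export (query, landing_page, position)."""
    ranks = {}
    with open(ranks_path, newline="") as f:
        for row in csv.DictReader(f):
            ranks[row["query"]] = (row["landing_page"], int(row["position"]))
    merged = []
    with open(wmt_path, newline="") as f:
        for row in csv.DictReader(f):
            page, pos = ranks.get(row["query"], ("?", None))
            merged.append({"query": row["query"],
                           "clicks": int(row["clicks"]),
                           "landing_page": page,
                           "position": pos})
    return merged
```

It doesn't replace the session-level keyword data we lost, but it gets the query-to-page mapping out of copy-and-paste territory.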
-
It is, and I don't imagine there's any coming back from this either. It will be interesting to see how this changes SEO.
-
I would agree, in part. Even if you don't know which keyword is sending you traffic, if anything this makes ranking reports more important. If we see traffic going up but cannot directly see which keyword is sending it, then one could draw a link (however tenuous) between the rise in rankings and the rise in traffic.
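That tenuous link can at least be quantified: with weekly ranking and traffic series, a Pearson correlation between inverted rank (lower rank is better, so we flip the sign) and visits tells you whether they move together. The two series below are made-up sample data, not real figures:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

ranks = [8, 7, 5, 4, 3]             # hypothetical average weekly rank, improving
visits = [120, 140, 210, 260, 300]  # hypothetical weekly organic visits, rising
score = pearson([-r for r in ranks], visits)
```

A score near 1 doesn't prove which keyword drove the traffic, but it does support the "rankings up, traffic up" inference the ranking report is making.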
-
Scary how the 100% date in the chart has become this December. It was scary enough when it was 2017!
-
I think Google is on a covert mission to napalm the SEO industry....
-
Apparently things have taken a turn today. Look at this: http://www.notprovidedcount.com/ and this: http://www.searchenginejournal.com/google-gone-100-provided-secure-search/70799/