Nov 19th & 20th Update?
-
Did anyone see any big changes around Nov 19th & 20th? Mozcast had some high temps around there.
If you saw any big changes in organic search, any ideas WTH that was all about?
Any guesses?
One site I work with took about a 15% hit and has since sort of skidded sideways.
-
Evidence of something around 11/18-19 is pretty strong at this point. Glenn's article that Peter N. posted is worth checking out. I've heard rumors of a mobile connection, but that's been hard to pin down. There does seem to be a "quality" aspect, but "quality" is such a hard word to define. No confirmation from Google, but MozCast and similar data definitely saw spikes, and there was solid chatter.
EGOL is right, though - that time period before Black Friday is a hairy one, search-wise, and there are so many variables to disentangle. I think something algorithmic happened, but that doesn't mean that any particular problem or drop was due to a Google change, and it's going to be really tough to piece together any particular story, I'm afraid.
-
There have been a few reports of people experiencing drops. Nothing has been released from Google, as per usual, but at least other people are seeing an effect.
-
I'm still trying to understand a total crash in Google organic search traffic that started on 25 November and hit rock bottom on 27 November. It has not recovered. At first I thought it was because I accidentally had "Discourage search engines from indexing this site" ticked in my WordPress settings after a new site version was pushed live, but it's now been three days since I corrected that and there's no sign of even a small recovery.
I resubmitted my sitemap and forced a re-index using GWT, and my site is definitely indexed. But Google traffic has dropped from over 200 visits a day to around 10.
I have a lot of links to TripAdvisor on my site (it's a travel blog), so I don't know if that's the issue, or if it's something in the new premium theme?
I read something about affiliate links using keyword anchors being punished by this update, but I don't use affiliate links as such, just direct links to the relevant TA listing, which are converted on click by a JS script provided by TA. They are all do-follow, which might be an issue (though it never has been in the past), and they are all anchored by the actual hotel name or a photo of the hotel. Not all my posts have them, either, but the whole site has suffered just as I was getting some traction in the SERPs.
There is no manual penalty notice in GWT.
Site is http://www.asiantraveltips.com if you would be kind enough to offer any advice or opinions about what's happened.
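For what it's worth, one quick way to rule out a lingering noindex after unticking that setting is to check your page source for a robots meta directive. Here's a minimal sketch in Python, standard library only; you'd feed it the HTML of your own pages (note it doesn't cover the `X-Robots-Tag` response header, which is worth checking separately):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                # Attribute values can be None when empty, so guard with "or"
                self.directives.append((a.get("content") or "").lower())

def is_noindexed(html):
    """Return True if the page HTML carries a robots noindex directive."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return any("noindex" in d for d in checker.directives)
```

Running this over a handful of key pages (fetched however you like) should confirm whether the blocking tag is really gone; if it still returns True anywhere, Google is being told not to index those URLs regardless of sitemap resubmissions.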
-
Hi Egol,
Thanks for the message. For this particular site, I'm looking at a very noticeable dip in position tracking that is then partially obscured by the following week's Thanksgiving holiday. This week, the week after the holiday week, is down about 15% traffic-wise compared to two weeks prior. As a point of reference, last year this site was flat for the same period. So I don't really think it's just seasonality.
I might think it was only this site, but it happened to coincide with that increase in MozCast temps and other stories of flux.
Has anybody benefited or suffered with the same kind of timing?
-
According to Glenn Gabe, there was something:
http://www.hmtweb.com/marketing-blog/november-19-google-algorithm-update/
http://www.hmtweb.com/marketing-blog/unconfirmed-google-algorithm-updates-2015/
Google says: https://www.seroundtable.com/google-update-no-21225.html
"Don't have anything more specific to announce, sorry!"
-
Traffic on many non-retail sites in the United States started to dip on the 19th and 20th in advance of Thanksgiving Week. Many schools take a break that week and lots of adults plan trips for the holiday. Traffic on my informational sites went down then, but has been nothing less than volcanic this week. I was wondering if something happened yesterday to give me a traffic blast that continues today. Up over 20% on a site that is already busy.
Related Questions
-
Internal Links & Possible Duplicate Content
Hello, I have a website that has kept losing positions since February 6. I have not received any manual actions in Search Console. However, I read the following article a few weeks ago and it looks a lot like my case: https://www.seroundtable.com/google-cut-down-on-similar-content-pages-25223.html I noticed that Google has removed 44 of my website's 182 pages from its index. The pages that have been removed can be considered similar, like the ones on the website mentioned in the article above, and the problem is that there are about 100 pages like these. They are pages that describe the cabins of various cruise ships, each containing one picture and one sentence of at most 10 words. To a human this is not duplicate content, but what about the engine, bearing in mind that sometimes that little sentence can be the same? And let's say I remove all these pages and present the cabin details on one page instead of 15, for example, dynamically, reducing the size of the website from 180 pages to around 50: how will this affect SEO with regard to internal links? Thank you for your help.
White Hat / Black Hat SEO | Tz_Seo
-
Google Organic Ranking & Traffic Dropped
Hello, We have been struggling to keep our website (http://goo.gl/vS37qA) ranking well in Google since April 30, 2015. For some reason, at that time there were around 15,000 blocked pages (mainly Magento layered navigation pages) showing in Google's Search Console. We used canonical tags, and now all these pages have been removed from Google's index and from Search Console. We didn't do anything that is against Google's Guidelines. Currently in Google Search Console we see: around 50 crawl errors, no malware, no blocked pages, and no other error messages. We have never practiced black hat SEO, paid for links, or used tactics that Google penalizes. We noticed in the last few months there are around 1,000 Chinese/Russian/Japanese links pointing to our website, and we have used the disavow tool to notify Google of these attacks. Any help would be greatly appreciated in advance!
White Hat / Black Hat SEO | NancyH
-
Update: Copied Website
So I discovered a website the other day that is a complete duplicate of ours: justinchina.co.uk. This is our website: petmedicalcenter.com. Thanks to help from Erica, I dug in deeper to see why this was happening. It seems that justinchinca.co.uk, which is hosted by GoDaddy, has its A record pointing at our web host. So, that being said, our website does not seem to be hacked, which is good news. Would this still cause an issue with our Google rankings? Our host, HostMonster, said to contact GoDaddy, and GoDaddy said that a domain owner can point their URL anywhere they choose. Anyway, any feedback would be helpful. Thanks to everyone who has helped me thus far. Brant
White Hat / Black Hat SEO | BCB1121
-
If our site hasn't been hit with the Phantom Update, are we clear?
Our SEO provider created a bunch of "unique URL" websites with direct-match domain names. The content is pretty much the same across over 130 websites (only the city name is different) that link directly to our main site. For me this was a huge red flag, but when I questioned them, they said it was fine. We haven't seen a drop in traffic, but I'm concerned that Google just hasn't gotten to us yet. DA for each of these sites is 1 after several months. Should we be worried? I think yes, but I am an SEO newbie.
White Hat / Black Hat SEO | Buddys
-
Duplicate Content due to Panda update!
I can see that a lot of you are worrying about this new Panda update just as I am! I have such a headache trying to figure this one out; can any of you help me? I have thousands of pages flagged as "duplicate content", which I just can't for the life of me understand... take these two for example: http://www.eteach.com/Employer.aspx?EmpNo=18753 http://www.eteach.com/Employer.aspx?EmpNo=31241 My campaign crawler is telling me these are duplicate content pages because of the same title (which I can see) and because of the content (which I can't see). Can anyone see how Google is interpreting these two pages as duplicate content?? Stupid Panda!
White Hat / Black Hat SEO | Eteach_Marketing
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe". Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino." Basically, what would happen on the back end of our site is that we would detect the user-agent of all traffic, and once we found a search bot, serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black hat cloaking going on. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. 
When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com
-
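As an aside, the user-agent switch described in the Dust.js question above can be sketched independently of any templating library. This is only an illustration of the idea, not a recommended detection method; the bot substrings are examples, not an exhaustive list:

```python
# Illustrative sketch of the user-agent switch described above: serve
# pre-rendered HTML to known crawlers (which may not execute JavaScript)
# and the client-side template to ordinary browsers.
KNOWN_BOTS = ("googlebot", "bingbot", "slurp", "duckduckbot")  # example substrings

def render_mode(user_agent):
    """Return 'server' for known crawlers, 'client' for everything else."""
    ua = (user_agent or "").lower()
    return "server" if any(bot in ua for bot in KNOWN_BOTS) else "client"
```

Whether this counts as cloaking hinges on both paths emitting genuinely identical content, which is exactly the point being debated in the question, not something this sketch settles.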
Google Sitemaps & punishment for bad URLS?
Hoping y'all have some input here. This is a long story, but I'll boil it down: Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs in Site X to relevant URLs in Site Y, but 2 days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generating sitemap somehow added URLs from Site Y to the sitemap of Site X, so essentially the sitemap contained URLs that were not URLs of Site X. Is there any documentation out there that Google would punish Site X for having essentially unrelated URLs in its sitemap by downgrading organic search rankings, because it may view that mistake as a black hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo & Bing, yet is suddenly nonexistent on Google. Thoughts?
White Hat / Black Hat SEO | RUNNERagency