Nov 19th & 20th Update?
-
Did anyone see any big changes around Nov 19th & 20th? Mozcast had some high temps around there.
If you saw any big changes in organic search, any ideas WTH that was all about?
Any guesses?
One site I work with took about a 15% hit and has since sort of skidded sideways.
-
Evidence of something around 11/18-19 is pretty strong at this point. Glenn's article that Peter N. posted is worth checking out. I've heard rumors of a mobile connection, but that's been hard to confirm - there does seem to be a "quality" aspect, but "quality" is such a hard word to pin down. No confirmation from Google, but MozCast and similar data definitely saw spikes, and there was solid chatter.
EGOL is right, though - that time period before Black Friday is a hairy one, search-wise, and there are so many variables to disentangle. I think something algorithmic happened, but that doesn't mean that any particular problem or drop was due to a Google change, and it's going to be really tough to piece together any particular story, I'm afraid.
-
There have been a few reports of people experiencing drops. Nothing has been released from Google, as per usual, but at least other people are seeing an effect.
-
I'm still trying to understand a total crash in Google organic search traffic that started on 25 November and hit rock bottom on 27 November. It has not recovered. At first I thought it was because I accidentally had the "discourage search engines from indexing" option ticked in my WordPress settings after a new site version was pushed live, but it's now been three days since I corrected that and there's no sign of even a small recovery.
I resubmitted my sitemap and forced a re-index using GWT, and my site is definitely indexed. But Google traffic has dropped from over 200 visits a day to around 10 a day.
I have a lot of links to TripAdvisor on my site (travel blog), so I don't know if that's the issue, or if it's something in the new premium theme?
I read something about affiliate links using affiliate keyword anchors being punished by this update but I don't use affiliate links as such, just direct links to the relevant TA listing which are converted on click by a JS script provided by TA. They are all do-follow, which might be an issue (but has never been in the past). And they are all anchored by the actual hotel name or photo of the hotel. And not all my posts have them either, but the whole site has suffered just as I was getting some traction in SERPs.
There is no manual penalty notice in GWT.
Site is http://www.asiantraveltips.com if you would be kind enough to offer any advice or opinions about what's happened.
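For anyone else debugging the same mistake, the first thing worth ruling out is whether the page is still telling search engines not to index it. Here's a minimal sketch of that check, assuming the page's HTML and response headers have already been fetched; the function name and regex are illustrative only, not part of any tool mentioned in this thread:

```javascript
// Hypothetical helper (name and regex are assumptions, not from the thread):
// given a page's HTML and its X-Robots-Tag response header, report whether
// search engines are being told not to index the page. WordPress's
// "discourage search engines" checkbox emits a robots meta tag along the
// lines of <meta name='robots' content='noindex,follow'>.
function isNoindexed(html, xRobotsTag) {
  const metaMatch = html.match(
    /<meta[^>]+name=["']robots["'][^>]+content=["']([^"']*)["']/i
  );
  // Collect directives from both the meta tag and the HTTP header.
  const directives = [
    ...(metaMatch ? metaMatch[1].split(',') : []),
    ...(xRobotsTag ? xRobotsTag.split(',') : []),
  ].map((d) => d.trim().toLowerCase());
  return directives.includes('noindex') || directives.includes('none');
}
```

If this returns false and the pages are indexed, the noindex setting is genuinely cleared and the drop has some other cause.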
-
Hi Egol,
Thanks for the message. For this particular site, I'm looking at a very noticeable dip in position tracking that is then partially obscured by the following week's Thanksgiving holiday. This week, the week after the holiday week, is down about 15% traffic-wise compared to two weeks prior. As a point of reference, last year this site was flat for the same period. So I don't really think it's just seasonality.
I might think it was just this site, but it happened to coincide with that increase in MozCast temps and other stories of flux.
Has anybody benefited or suffered with the same kind of timing?
-
According to Glenn Gabe, there was something: http://www.hmtweb.com/marketing-blog/november-19-google-algorithm-update/ http://www.hmtweb.com/marketing-blog/unconfirmed-google-algorithm-updates-2015/
Google says: https://www.seroundtable.com/google-update-no-21225.html
"Don't have anything more specific to announce, sorry!" -
Traffic on many non-retail sites in the United States started to dip on the 19th and 20th in advance of Thanksgiving Week. Many schools take a break that week and lots of adults plan trips for the holiday. Traffic on my informational sites went down then, but has been nothing less than volcanic this week. I was wondering if something happened yesterday to give me a traffic blast that continues today. Up over 20% on a site that is already busy.
Related Questions
-
Does Google crawler understand & flag a blog post has text asserting sponsorship with dofollow outbound link?
I kind of know the answer, but just wanted to get some feedback from others. For the sake of argument, assume there are no other issues with the linking blog, such as: too many ads, thin content, etc. Question: If you make a payment for a blog post with a dofollow link, and in the blog post there is something to the effect of: "this post has been sponsored by..." Will Google crawlers detect that and flag that as an unnatural link?
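For what it's worth, Google's long-standing guidance for paid placements is that the links themselves should not pass PageRank, regardless of whether the crawler can parse the disclosure sentence. A sponsored post that wants to stay safe would typically mark its paid link up like this (the URL and anchor text are placeholders):

```html
<!-- Disclosure in the copy, plus rel="nofollow" on the paid link so it
     passes no PageRank. -->
<p>This post has been sponsored by Example Co.</p>
<p><a href="https://www.example.com/" rel="nofollow">Example Co.</a></p>
```

In other words, the safe assumption is that Google may or may not connect the disclosure text to the dofollow link, so the nofollow attribute is what actually answers the question.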
White Hat / Black Hat SEO | kekepeche -
Internal Links & Possible Duplicate Content
Hello, I have a website which since February 6 has kept losing positions. I have not received any manual actions in the Search Console. However, I read the following article a few weeks ago and it looks a lot like my case: https://www.seroundtable.com/google-cut-down-on-similar-content-pages-25223.html I noticed that Google has removed from its index 44 of the 182 pages of my website. The pages that have been removed can be considered similar, like those on the website mentioned in the article above. The problem is that there are about 100 pages that are similar to these. They are pages that describe the cabins of various cruise ships, each containing one picture and one sentence of at most 10 words. So, to a human this is not duplicate content, but what about the engine, bearing in mind that sometimes that little sentence can be the same? And let's say that I remove all these pages and present the cabin details in one page, instead of 15 for example, dynamically, and that reduces the size of the website from 180 pages to 50 or so - how will this affect the SEO concerning the internal links issue? Thank you for your help.
White Hat / Black Hat SEO | Tz_Seo -
Homepage not ranking for targeted keywords (established site with somewhat ok UR&DR)
Hello everyone, I have a question regarding a homepage issue. My homepage is not showing up in Google search results for any keywords except the brand name. I have checked the following things to make sure my homepage is working properly. 1. The page is indexed. 2. No canonical issues. 3. No robots.txt issues. 4. Ahrefs UR45/DR55, while my competitors ranking on the 2nd and 3rd pages have lower UR and DR. We have tens of thousands of backlinks, and I think most of them are legit. I suspect the problem might be that the homepage has more than 70 anchor-text internal links working as a directory, and many of them contain the keywords we are targeting. Could that be the reason my homepage is not ranking at all, since Google might consider it keyword stuffing and penalize my homepage for that? What are your thoughts on this? Any suggestion would be greatly appreciated!
White Hat / Black Hat SEO | sufanfeiyan -
Google Organic Ranking & Traffic Dropped
Hello, We have been struggling to keep our website (http://goo.gl/vS37qA) ranking well in Google since April 30, 2015. For some reason at that time, there were around 15,000 blocked pages (mainly Magento layered navigation pages) showing in Google's Search Console. We used canonical tags, and now all these pages have been removed from Google's index and Google Search Console. We didn't do anything that is against Google's Guidelines. Currently in Google Search Console we see: around 50 crawl errors, no malware, no blocked pages, and no other error messages in Webmaster Tools. We have never practiced black hat SEO, paid for links, or used tactics that Google penalizes. We noticed in the last few months there are around 1,000 Chinese/Russian/Japanese links pointing to our website, and we have used the disavow tool to notify Google of these attacks. Any help would be greatly appreciated in advance!
White Hat / Black Hat SEO | NancyH -
Is article syndication still a safe & effective method of link building?
Hello, We have an SEO agency pushing to implement article syndication as a method of link building. They claim to only target industry-relevant, high authority sources. I am very skeptical of this tactic but they are a fairly reputable agency and claim this is safe and works for their other clients. They sent a broadly written (but not trash) article, as well as a short list of places they would syndicate the article on, such as issuu.com and scribd.com. These are high authority sites and I don't believe I've heard of any algo updates targeting them. Regarding linking, they said they usually put them in article descriptions and company bylines, using branded exact and partial matches; so the anchor text contains exact or partial keywords but also contains our brand name. Lately, I have been under the impression that the only "safe" links that have been manually built, such as these, should be either branded or simply your site's URL. Does anyone still use article syndication as a form of link building with success? Do you see any red flags here? Thanks!
White Hat / Black Hat SEO | David_Veldt -
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe". Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino." Basically, what would be happening on the backend of our site is that we would be detecting the user-agent of all traffic, and once we find a search bot, serve up our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side will be identical content and there will be NO black-hat cloaking going on. The content will be identical. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. 
When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
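The server-side fallback described in this question ultimately boils down to a user-agent check at request time. A minimal sketch of that decision is below; the bot list and function names are assumptions, not LinkedIn's actual implementation, and the key point for staying on the right side of the cloaking guidelines is that both branches must render the exact same dust.js template to identical markup:

```javascript
// Hypothetical crawler check (the pattern is an assumption; real deployments
// typically match many more user agents).
const CRAWLER_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider/i;

function isCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// Decide how to render a request: the same dust.js template is rendered
// server-side (via node.js or Rhino) for crawlers and client-side for
// everyone else, producing identical content either way.
function renderMode(req) {
  return isCrawler(req.headers['user-agent']) ? 'server-side' : 'client-side';
}
```

Because the content is identical on both paths, this is closer to what later came to be called dynamic rendering than to the penalized form of cloaking, where the crawler sees content users never do.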
White Hat / Black Hat SEO | Bodybuilding.com -
Video & Image Spam?
We have 50 product videos and 100 product images to distribute. For the sake of increasing nofollow Linking Root Domains, my manager wants to distribute them in the following manner: 10 company profiles on 10 video sites, each with 5 videos. The sites to be used are sites like YouTube, Vimeo, DailyMotion, MetaCafe, etc. 10 company profiles on 10 image sites, each with 10 images. The sites to be used are sites like Photobucket, Flickr, ImageShack, Imgur, etc. My thoughts are that we should stick to one service for video (YouTube) and one service for images (Flickr). We can increase nofollow LRDs by doing some quality blog commenting. Keep in mind that the product images look great, but the videos are amateur and consist of someone holding the product and discussing its features. Each video is around one minute in length. What do you think of the two approaches, and which do you prefer? Do you think creating many profiles will come off as too spammy? We are also weathering a Panda penalty and will be submitting a Reinclusion Request to Google within the next two weeks. Your thoughts are very welcomed and appreciated. Thanks 🙂
White Hat / Black Hat SEO | Choice -
How many times should one submit the same article to various websites? 1 time? 10 times? What is okay to do with the most recent Panda update?
For link-building purposes, it was seemingly okay in the past to post the same article to multiple sites for links. However, after the most recent Panda update, our thought is that this may not be a good practice. So the question is, how many times is it okay to submit an article for link-building purposes? Should you always submit to only one site? Is it okay to do it more than once? What is the right way to submit for link building in Google's eyes? Thanks
White Hat / Black Hat SEO | Robertnweil1