A Bizarre 10 Days in WMT & Moz
-
Hey All,
Just wondering if anyone else has experienced anything like this, or has any idea what happened to one of my sites last week.
I have a lot of questions and unanswered weirdness that seems to be occurring in the data.
It kind of plays out like this.
- Noticeable drop off in organic SE traffic
- WMT kicks out 70% of my structured schema data
- Moz drops DA to 14 from 17
- Moz MR drops from 3.50 to 2.74
- Total indexed pages (as per the SEOquake toolbar) drops from 500 to 224
- All the images submitted via a sitemap have been de-indexed according to WMT.
This is only a 3-month-old website with a minimal backlink profile; we're talking fewer than 100 links. No one has been working on the site; the only thing that has happened in the last month is blog posts being published. All unique, well-formatted content.
Then at the beginning of this week I saw traffic increasing again, and this was confirmed when WMT updated and I got my Moz update.
- All schema data back in WMT, with no acknowledgement of the drop; the graph just continues its steady rise as though nothing happened.
- Traffic back up to normal levels, as though nothing happened.
I have noticed a 100% increase in the number of pages being blocked by robots.txt. I haven't changed it, nor has anyone else; it has been the same since the site was launched.
I'm hoping that Moz will put the DA and MR back to where they were, and no damage will be done.
But my question is: what the hell happened? It nearly wiped the site off the face of the earth for 10 days. How do I find out what caused it and why, and stop it from happening again?
I have also noticed that the number of pages indexed by Google, as reported by the SEOquake toolbar (not WMT), regularly jumps anywhere from 200 to 500. Any ideas?
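One way to keep an eye on these swings is to log the SEOquake count each day and flag big day-over-day changes. A minimal sketch in Python, with made-up numbers standing in for the real history:

```python
def flag_swings(counts, threshold=0.3):
    """Flag day-over-day swings in indexed-page counts larger than `threshold` (30%)."""
    alerts = []
    for (day_a, count_a), (day_b, count_b) in zip(counts, counts[1:]):
        change = abs(count_b - count_a) / count_a
        if change > threshold:
            alerts.append((day_b, count_a, count_b, round(change, 2)))
    return alerts

# Hypothetical daily history mirroring the 500 -> 224 -> 500 swings described above.
history = [("day 1", 500), ("day 2", 480), ("day 3", 224),
           ("day 4", 230), ("day 5", 500)]

for day, before, after, change in flag_swings(history):
    print(f"{day}: {before} -> {after} indexed pages ({change:.0%} swing)")
```

Nothing fancy, but a log like this tells you whether a drop is a one-day blip or the start of a sustained slide, which is exactly the distinction that was impossible to see here in hindsight.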
I know there are many different elements at play here. Where do I start? I can't sit back and hope it never happens again.
Thank you to anyone who wants to roll their sleeves up and have a go at this one.
Best, Ian
-
Ian I don't think anyone is going to be able to help you without being able to see your site, do some checking around the SERPs, etc... If you don't want to provide that information, we understand, but I don't think anyone could provide you with a useful answer.
As far as new sites going up and then dropping down in the first few months, we used to call this the "Google Sandbox effect". Nobody talks about it anymore for various reasons, but many of us believe that there is still a period for a new site where Google will give it the benefit of the doubt in order to collect metrics about it from the SERPs, such as users' reactions to it appearing in the search results for different queries. Once that data is crunched, the rankings often go down. That's just a theory, but I thought I'd throw it out there. It doesn't seem to make sense in this case though, as it doesn't explain the Moz metric drops, which are not associated with Google at all.
Google recently cleaned house a bit on structured data in the SERPs, which might explain that part of it. Even if your data markup was using the correct syntax, it wouldn't help if you don't have the trust metrics necessary for Google to deem the site worthy of structured data in the SERPs.
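If you want to rule out the markup itself silently disappearing from the pages, it's worth checking the served HTML directly rather than trusting WMT's reporting. A rough sketch using only the Python standard library; the sample HTML and the `BlogPosting` markup are stand-ins for whatever your site actually serves:

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collects and parses <script type="application/ld+json"> blocks from a page."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self._buf = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_data(self, data):
        if self._in_jsonld:
            self._buf.append(data)  # script content may arrive in chunks

    def handle_endtag(self, tag):
        if tag == "script" and self._in_jsonld:
            self.blocks.append(json.loads("".join(self._buf)))
            self._buf = []
            self._in_jsonld = False

# Stand-in for HTML fetched from one of the site's blog posts.
html = """<html><head><script type="application/ld+json">
{"@context": "https://schema.org", "@type": "BlogPosting", "headline": "Example"}
</script></head><body></body></html>"""

parser = JSONLDExtractor()
parser.feed(html)
print(len(parser.blocks), parser.blocks[0]["@type"])
```

If a daily check like this shows the markup is still in the HTML while WMT says it's gone, the problem is on Google's side of the fence, not yours.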
-
Thanks for your reply,
All the pages that are blocked in robots.txt are ones I want blocked: admin, login, etc. But the robots.txt hasn't been changed for over 6 weeks. What jumped out at me was: why all of a sudden would there be a spike of blocked pages in WMT? Why didn't these show up 6 weeks ago? They have been blocked the whole time. Maybe it's nothing, but why did they show up at the same time the site was dropped?
No pages that shouldn't be blocked are blocked.
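For what it's worth, this is the kind of check I'm relying on to say that: asking Python's standard-library robots.txt parser what Googlebot may fetch. The paths below are examples, not the real file:

```python
import urllib.robotparser

# Hypothetical robots.txt mirroring the setup described above:
# admin and login blocked since launch, nothing else.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /login/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Confirm only the intended paths are blocked for Googlebot.
for path in ["/admin/", "/login/", "/blog/latest-post"]:
    status = "allowed" if rp.can_fetch("Googlebot", path) else "blocked"
    print(path, status)
```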
Thanks,
Ian
-
I see. Well, first of all, MozRank and Domain Authority don't actually feed information to Google; they're nothing more than a rough indicator of what's happening. If your DA went from 17 to 14, this could be for any number of reasons, but even if it went back up to 17 all of a sudden, you wouldn't see any difference in traffic or the SERPs based on this number alone.
I'm not sure why it would disappear for 10 days and then reappear, to be honest. Could be any number of things. But I wouldn't worry about it if your placement has since returned.
I would worry about the pages being blocked by your robots.txt that you don't want blocked. Am I understanding that correctly? Do you have an example of that?
-
Hi Jesse,
Sorry, That's my fault for not clarifying myself properly. Allow me to elaborate.
What I mean by "no one is working on the site" is that no coding or backend work is going on: nothing that could have disturbed the site from Google's point of view, or caused a sudden change in the code that would make the markup data disappear from WMT.
The site is very much being actively optimised: onsite, with blog posts and additional site pages being added almost daily, and offsite, through social media channels and outreach programmes. There are 100 backlinks that have been picked up in WMT; there are an awful lot more than that out there.
Traffic is building nicely considering the age of the site (3 months). This is why it was such a shock when it was virtually wiped out for 10 days.
My question is not about SEO or how to do it (although I always appreciate someone else's point of view and interesting spins); it's about why WMT and Moz would have a knee-jerk reaction to something and almost simultaneously drop the site, then a week later pick it straight back up again.
Nothing was changed on site: the code, site structure, and backend haven't been touched for 6 weeks, and only content has been added.
I'm trying to find out what caused this, for want of a better word, "seizure", and how to stop it from ever happening again.
I hope this clarifies my position a little, sorry for being vague before.
Best, Ian
-
Based on this quote: "This only a 3 month old web site with a minimal back linking profile, we're talking less than 100. No one has been working on the site," none of this is surprising to me. You have a young site and you're not working to market it. So yeah, you're gonna see your rankings disappear.
Why some of the pages have been de-indexed could be for any number of reasons. I think the only way to tell would be to post your URL and some of the fine folks here could take a look.
But besides that mystery, the best thing for you to do is roll up your own sleeves and get a good SEM campaign going for this domain. That's the name of the game!
Good luck!
Related Questions
-
Google AMP or CDN?
Hello. I'm running a CMS that cannot currently support both CDN and Google AMP. I would have to choose one or the other. Does anyone have any insight on which may be the better choice until I can figure out how to have both? I installed CDN first to reduce the time it took for my pages/images to load. I'd like to have AMP because it can do the same, and perhaps be a little more Google friendly (their product). I would appreciate any thoughts. Thanks! Steve
On-Page Optimization | recoil0 -
Dynamic links & duplicate content
Hi there, I am putting a proposal together for a client whose website has been optimised to include many dynamic links and so there are many pages with duplicate content: only the page title, h1 and URL is different. My client thinks this isn't an issue. What is the current consensus on this? Many thanks in advance.
On-Page Optimization | lorraine.mcconechy0 -
When To And When To Not Use AMP
Hi All, I hope someone might be able to help me with this. I am looking at implementing AMP on a website. I have been doing a stack of research into the advantages of using AMP etc. However, I am struggling to find an answer on when and where I should not implement AMP. More specifically, I see large sites that are running AMP, but it does not seem to be on all pages. What is the best way to determine when I should or should not be using AMP? Thanks in advance, Mark
On-Page Optimization | amevamark0 -
Should title contain the term Top 10?
One of my clients runs a hyperlocal marketplace (with different pages for each city) for hiring music teachers. Should the title be "Hire Violin Teachers, Tutors from Boston" or "Top 10 Violin Teachers of Boston"? I prefer the former title as it has 2 keywords, "violin teachers" and "violin tutors", which the latter lacks. But my client argues that "Top 10" has a strong affinity to attract users, which would increase CTR. Am I right? Or is he right? And why?
On-Page Optimization | Avin1230 -
EMD vs brand? I'm new to Moz
Hi to all, this is my first month using the Moz tools. When I check my competitors and keywords, everyone close to Google #1 is an EMD site. But with Google always getting smarter, is it smart, in order to compete, to get an EMD? Say I rank first page #9, or not at all, for my brand. My site is a construction site, so if instead of, say, abcconstruction.com I used coloradohomebuilders.com etc., would I maybe do better? Or would Google give me a bad ranking? I am just so lost. One last question: if I built an EMD site and then just pointed my brand at it, would this be good or stupid for SEO? Thank you for all advice and tips, Chris
On-Page Optimization | asbchris0 -
Multiple Duplicate Titles & Descriptions
Hi, I reviewed a site for a client recently and noticed, when I produced a crawl report (SEOmoz), that there were thousands of pages with the same content. This is because the developer added a ticket booking system which has a page for each day up to 2017 (5 years). I see in my campaign monitoring that this results in 9,445 duplicate content errors. The site is only 3 months old, has a Google PR of 4, and is ranking reasonably well for its main keywords. Just wondering if there is a way to fix this, or should I leave it? Is it eventually going to result in problems? I can't see these pages when I view the server with FTP; perhaps they are dynamically created. Pat
On-Page Optimization | Patff0 -
Waiting 3 days for Crawl Test to complete
Being new to SEOmoz, I'm not sure if I understand the crawl test completely. You set up a campaign, enter all your info, rogerbot goes out and crawls your site and gives you results as to what you're doing right and what is wrong or could use looking into. So once I get my results, I make edits to my site pages. In my case I'm getting lots of duplicate content and duplicate titles. So I go back, make adjustments, and then submit a crawl test to see the changed results. In other tools I've used in the past I was able to re-run the crawl immediately and fine-tune results on the fly. The SEOmoz crawl test is still pending after three days. Is this normal? Or is there another way to make changes and run reports to see results instantly? If you're working on many sites and making changes, having to wait 3 or more days to see how your changes were received seems like a long time.
On-Page Optimization | anthonytjm0 -
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example, the link to http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category are the same as every other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to develop a unique Meta Title and Meta Description for each page, using the tag and/or category and post date in each Meta field to make it more "unique". But my question is: in the REAL WORLD, will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, Bing, etc. are constantly tweaking their algorithms, as of now having duplicate content primarily means that this content won't get indexed, and there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions, we could 'no-follow' these links (for tag and category pages) or just not use them within our blogs. I am confused about this. Any insight others have about this, and recommendations on what action you would take, is greatly appreciated.
On-Page Optimization | RoyMcClean0