Major Ranking Drop
-
My site bestdslrreview dot net was ranking number one for "Canon T3i review." I was amazed, considering my competition. "Canon T3i" was usually at number 3 to 5 for US traffic. "Canon 7D review" was ranking number 2, and "Canon 7D" was on page one.
The Canon 7D keywords dropped about 10 places, but "Canon T3i review" dropped about 50, and the same for "Canon T3i."
I worry about a number of things, including it being a thin affiliate site, but I don't know; my articles are original. I also worry about too many links with my keywords in the anchor text.
My on-page results from SEOmoz look great to me. This drop was my main reason for going ahead and signing up, hoping I could find out what happened.
The site is only about six months old. I waited probably three months before doing any significant link building.
The good thing is, I was ranking high from Black Friday until a few days before Christmas. It can't just be someone else starting to do SEO, as I fell far too fast. It looks like some kind of penalty. Google did this to another of my sites once, though, and after a couple of months it bounced back for most of my keywords. A site very much like this one.
Is this just the Google Dance?
Thanks
-
Oh, I laughed about blaming it on the kids, but personally I think it is more likely the dog. Haha.
-
I don't THINK I have a thin content issue. But I'll rewrite those features just to be on the safe side.
I think I built too many links too fast, thousands of them, and then stopped. I won't do that again. I think it is a sudden decrease in link velocity. I've struggled with link velocity a lot, and finally I think I understand why it is important to keep it up.
You touched on it. To me it could look unnatural, but like I said, a small site could be popular and get lots of links. Another SEO expert told me that is true, but if the backlink velocity drops too fast, Google may take that as a signal the page is no longer popular, on top of the possible unnatural linking problems.
I will continue with my blog network, slower but with better quality than the others I built. However, I will also work on the duplicate content issue. I have lots of content with no links just to help with the thinness, but now I realize that a page could still be thin anyway. Fixing that will make things a bit better too.
As always, I'm all ears.
-
I don't mean that duplicate content on the same site is OK, only that you won't be penalized for it. But only one version of the duplicate content will rank, meaning you won't get any credit for the others. The search engines will pick one version to rank; using a canonical tag, you can hint to them which version that should be.
The penalty you want to avoid is for thin content. Take away the advertising and the duplicate content found elsewhere on the web; if what is left is not substantial and useful, then it is thin content. Duplicate content won't hurt you in itself, and it won't help you, but thin content will hurt you.
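If it helps, the canonical hint is just one line in the page's head. A minimal sketch, with a made-up URL standing in for whichever version you want to rank:

```html
<!-- Goes in the <head> of each duplicate/variant page.
     The href below is a hypothetical example, not a real URL from this site. -->
<link rel="canonical" href="http://example.com/canon-t3i-review/" />
```

Every duplicate points at the same preferred URL, and the preferred page can point at itself.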
-
Yes, I made them noindex,follow. I've now removed the nofollow on category and tag pages, but that's the way it was before the drop.
As for the duplicate content, only the bullet text is duplicated. That's 105 words out of 890, about 12%, and that's too high for me. I'll rewrite that and resolve it just to be on the safe side. It's really two of the bullets causing it, but I'll change them all; it looks like I got lazy there. I'll check the other articles on the site as well to make sure I didn't do the same thing on those.
The duplicate content issue is as clear as mud. Now, I know Google says they don't penalize a site for duplicate content that also appears on other sites. Of course, that doesn't hold true for autoblogs, and it doesn't seem to be true when one considers backlinks. I'm uncertain. You seem very knowledgeable, and I'm sure you are, but what are you basing the statement on that on-site duplicate content will not hurt you? I'm pretty sure I've read on Google's own site that they HATE that, the reason being that the same page shows up in the SERPs more than once. And I clearly had that going on before.
Yeah, it was Matt Cutts; I just couldn't remember his last name.
I'm really having trouble with this issue, and based on what I've seen around the web, I'm not alone. I've got one fairly large site with over 1,500 indexed pages. Google Webmaster Tools showed a LOT of pages with duplicate titles, and it took me forever to get those to disappear. They finally did after I noindexed categories and tags.
I'm having a lot of trouble with canonical URLs, and I know I'm not alone. I've read much and understood little. I learned the most from an SEOmoz help article.
If I understand it, the intent is to tell search engines where the main article is. I turned the canonical option back on in SEO Platinum but then turned it off again, and I'll show you why: it gave me two different canonical links in the header, one for my category and one for my article. That seems counterproductive to me and exactly what I don't want to happen.
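Roughly, the plugin was putting two conflicting hints in the header at once, something like this (the URLs here are made-up placeholders, not my real ones):

```html
<!-- Two rel="canonical" tags on the same page send mixed signals;
     search engines expect exactly one per page. -->
<link rel="canonical" href="http://example.com/category/dslr-reviews/" />
<link rel="canonical" href="http://example.com/canon-t3i-review/" />
```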
Thanks for all of your help.
I probably shouldn't have marked your message "Good Answer," as that marked the thread answered. LOL, oh well, I'll know next time. I don't want to take up all of your time.
Thanks very much for the help.
-
I noticed you made them follow; that is good, because link juice can still flow back out. But the fact that a page is not indexed does not change the flow of link juice to it. If you have two duplicates, then use a canonical tag, or get rid of one of them. Things like tag pages with snippets of other pages are OK; at worst they won't rank, but they will not hurt you. It's duplicate content from other sites that is the problem. As a rule, blocking via robots.txt is a last resort. I would concentrate on the duplicate content from other pages; you need to add enough original content to make the page useful. As Matt Cutts said: take away the advertisements and the content that is available elsewhere, and ask whether what is left is useful and original.
I wouldn't bother with the tags in the sitemap; category pages, yes. Bing, for one, says they only want the main pages in the sitemap, not every page. Having said that, I asked Duane Forrester from Bing, and he said that for a small site, listing all pages is not a problem.
We have all been in panic mode when we lose rankings. I am a computer programmer, and when things are not working I start to believe anything and everything is responsible. I will uninstall things, delete things, blame my kids, assume I have a virus, but when I settle down I usually find it's something simple.
-
As users build links to a website or URL, we see that pattern emerge and can track the time over which links are built. Fast growth can indicate popularity. Normally links build in a similar fashion across all websites, with spikes potentially indicating popularity. When we compare that data against the other signals we track, we get a truer picture of whether that popularity is real or not. Machines can build links quickly, but if we know the source of the links is a common location or service that is "less than organic", we'll discount the value of the links.
-
I'm really glad you brought up the noindex issue.
Most of that was probably done after the drop, and I'll tell you why I did it, so hopefully I can learn why I shouldn't.
I noindex privacy policy and earnings disclaimer type pages. I do that because I don't want those pages to dilute my money pages. Is that a mistake?
I also usually go in from the beginning and set tags and categories to noindex. I'm not sure when I did that on this site, but I think it was after the drop. The reason I do it is that those pages seem to create duplicate listings in the SERPs; or maybe I was just afraid of duplicate content. Is my thinking wrong on that? I see the same pages/articles listed when I do site:domain.com, and sometimes I get two listings for the same page on page one of Google: one for the article and one for the tag or category. I use SEO Platinum to accomplish this. I also removed categories and tags from my sitemap using the XML sitemap generator plugin.
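If I understand the plugin right, the setting I'm using amounts to a single meta tag in each tag/category page's head, something like:

```html
<!-- Keeps the page out of the index while still letting crawlers
     follow its links, so link juice can keep flowing through it -->
<meta name="robots" content="noindex,follow" />
```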
On the scripts, now that is something totally new to me. I'm pretty sure I did that after the drop. When I did site:domain.com I saw a lot of them listed. My robots.txt file looks like this:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://bestdslrreview.net/sitemap.xml
This is likely what is blocking the scripts/plugins. So I should remove those two Disallow lines? I'm going to go ahead and do that, since I was never sure they should have been there in the first place.
Also, I do not include categories and tags in my sitemap; should I start including them?
Thanks very much for the help. Sometimes I overthink when I don't know exactly what the answers are.
-
Thank you. I think it will come back too, but the too-many-links issue...
I have always wondered about that, and I seem to get different advice.
I forgot his name, the Google guy, Matt ??? Most of you will know. He wrote a post indicating you can't have too many links. That is what most of the people who say you can't build too many links too fast base their advice on. But I've always wondered if it makes sense to have 2,000 links to a site with five articles. Then again, people could love a site with only a few pages; I'm sure that happens naturally at times?
If you're right, then I've made it worse, because I've been building more links, aiming at higher-PR sites when I do it. I've got a lot of links from PR 0 sites, but in Open Site Explorer it appears I have a pretty natural-looking distribution of links across different PR sites, so I'm not at all sure.
I'm all ears if you and others have more to say to help me with this issue.
-
There are a few pages with noindex. They seem to have content, so I can't see why you would noindex them, like this one:
/tag/canon-7d/
You have also blocked a few scripts. I believe this is a mistake, because a search engine knows the scripts are being used but does not know what you are doing with them, and that may lead to a lack of trust. There are a few other technical issues, but nothing that would lead to a huge drop in ranking.
You do have a fair bit of duplicate content; the reviews are on other pages. If you wrote the reviews, I would not worry; search engines usually know who is copying and who is the creator.
If you are getting this content from elsewhere, you need to add enough content to each page so that it appears original.
-
So you are saying you have only been building links for 3 months, but looking at OSE the Canon T3i page has 425 linking domains and the Canon 7D page has 751? That is way too many for that time frame, IMHO. I would say you have tripped a filter. Wait 60 days and see if it comes back up... it should.