On-site links triggering an anchor text algorithmic penalty?
-
I'm trying to figure out why a drop in rankings occurred and think it may be related to an increase in on-site links. I've attached images of the SEOmoz report showing a jump in links from a few hundred to around 15,000 within the space of a week. I think this may be due to some on-site work I did when I created categories (I use WordPress) for a large number of cities and towns in the UK. I soon realised I'd run into duplicate content issues and removed all these categories within a few days. As I added categories I also ran into 'too many on-page links' warnings, since each category I added created a new link and I ended up with hundreds on each page.
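For anyone who wants to sanity-check a 'too many on-page links' warning, here is a minimal sketch that counts the anchors on a single page. The URL is a placeholder, and requests and BeautifulSoup are assumed to be installed:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL - substitute any category or post page you want to check.
url = "http://www.example.com/category/london/"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Count every anchor that actually links somewhere.
links = [a["href"] for a in soup.find_all("a", href=True)]
print(f"{len(links)} links found on {url}")
```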
If you look at the Analytics reports, I suffered a huge drop in rankings on the 10th of March, and I think this could be due to an on-site anchor text problem caused by adding the categories and in turn creating many on-site links. SEOmoz found these links on the 11th and 25th of February, and my guess is that Google found them around the same time. But if these links are the problem, why didn't my rankings drop until the 10th of March? Surely they would have dropped sooner? Would this cause a drop in rankings?
I've received an email from Google saying that no manual penalty was applied to the site after I submitted a reconsideration request, so it must be some kind of algorithmic penalty. Could this be the problem, and if not, what else should I look at? My backlink profile appears to be okay and I've been careful to vary my anchor text with inbound link building.
I'm at a loss as to what to do next. Any help will be much appreciated!
-
Ok thanks.
Sam.
-
I'll need to wait until tomorrow to check on this in OSE, when they revert to the newer index once again. All of my link exports are currently showing the link count prior to the increase. I should be able to update you tomorrow after I get a chance to look.
OK, to update my response here: OSE is showing 14,000+ links as a result of your on-site changes. You can see them as a list of 745 top pages: http://www.opensiteexplorer.org/pages.html?page=16&site=www.top-10-dating-reviews.com&sort=page_authority. It looks like those pages have at least 70 links each, which easily accounts for the 14,000+ links being found.
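If you want to reproduce that estimate from an OSE top-pages export yourself, a rough sketch follows. The filename is a placeholder and the export is assumed to have a single header row; adjust both to match the actual file:

```python
import csv

# "top_pages_export.csv" is a placeholder name for an OSE top-pages export.
with open("top_pages_export.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)                       # skip the assumed header row
    total_pages = sum(1 for _ in reader)

# 745 pages with roughly 70 on-page links each is 50,000+ potential
# internal link instances, so a count of 14,000+ is easily explained.
links_per_page = 70
print(f"{total_pages} pages x {links_per_page} links/page = "
      f"{total_pages * links_per_page} potential internal links")
```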
Open Site Explorer is updated roughly 1-2 times per month and shows data that is roughly 20-50 days old, depending on when you look at it and when the index was crawled. That's why you're still seeing these links in the report. If they don't go away within the next 1-2 OSE updates, I'd look into it further.
--
Regarding the original question about whether internal links can hurt the domain, a Matt Cutts video was released yesterday partially addressing this:
Will multiple internal links with the same anchor text hurt a site's ranking?: http://www.youtube.com/watch?v=6ybpXU0ckKQ
That doesn't mean all of those duplicate content pages couldn't have hurt rankings, but the links themselves were not the issue.
--
I'm still confused by the Analytics drop, but that could be due to a number of things. I'd say the answer lies in digging through Analytics and finding out exactly what dropped that day.
-
Thanks for your reply. Creating an extreme number of categories is what I did. I've deleted them now, but my SEOmoz link analysis still says over 14,000 links, and I have no idea why. The site is http://www.top-10-dating-reviews.com (there is some adult content there). Any ideas appreciated!
-
OK, so assuming that the large jump in links is coming from internal links, here are a few ways that WordPress might create that many pages:
- Creating an extreme number of categories (more than 20-30) while using permalinks that contain /%category%/ and assigning posts to multiple categories (see the sketch after this list).
- Using a theme that appends parameterized URLs such as ?replytocom= to every comment reply button.
- Using an unusual permalink setting that generates unexpected URL variations.
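To illustrate the first point, here is a rough sketch of the combinatorics; the post and category counts are made up to mirror this case:

```python
# Rough illustration of how /%category%/ permalinks multiply URLs.
# The figures below are hypothetical, chosen to mirror this case.
posts = 50                 # posts on the site
categories_per_post = 15   # e.g. one category per city/town covered

# With /%category%/ in the permalink, the same post resolves at one
# URL per category it belongs to - each a duplicate of the others.
duplicate_urls = posts * categories_per_post
print(f"{posts} posts x {categories_per_post} categories = "
      f"{duplicate_urls} crawlable URLs for {posts} pieces of content")
```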
If all of those pages really are new internal URLs then I suppose it could have confused Google and affected your rankings, but since I haven't dealt with such an extreme amount of duplicate content added so quickly, I couldn't say for sure.
There are also plenty of ways that you could have triggered that many external links. Any sidebar or footer link on a large site could easily add thousands of links. I highly doubt this type of link would have caused a ranking drop on its own - it's no different than someone adding you to their blogroll.
This is a difficult question to answer properly without looking at the site or the exact links, because all I can do is list lots of hypothetical causes. If you'd like to include the domain or PM it to me, I'm happy to look at the website itself.
-
Thanks for your reply. The URLs I removed are 404'ing, so should I remove these URLs in Webmaster Tools or let them drop out of the index naturally? They keep popping up in Webmaster Tools as crawl errors.
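Whichever route you take, it's worth confirming the removed URLs really do return 404s first. A minimal sketch, with placeholder URLs:

```python
import requests

# Placeholder list - substitute the category URLs you removed.
removed_urls = [
    "http://www.example.com/category/london/",
    "http://www.example.com/category/manchester/",
]

for url in removed_urls:
    # A HEAD request is enough to read the status code without the body.
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    # 404 (or 410) tells Google the page is gone; anything else means
    # the old category URL is still resolving somehow.
    print(f"{status}  {url}")
```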
-
It's a tricky situation. It seems like you were making many changes to your site. It's always risky to place links with keyword-rich anchors, and when there are too many of them, built in a short time period, that's definitely dangerous.
First of all, get rid of everything you made in a "dangerous way", like your many internal links. Normally Google has strict parameters for evaluating a page, and when you're above a certain threshold you get hit. However, I think that to recover, the threshold is even lower; it seems like Google is stricter with you once you've tried to game their algorithm.
Now, these are just my ideas and nothing confirmed, but I think you should try to clean up all the new links first, then have a look at your pages. Creating that many pages in such a short time suggests they were programmatically generated pages without any valuable content, so they may be toxic for your recovery. Take a step back, restart creating them at a slower pace, and maybe Google will reconsider your position. However, if you don't have a manual penalty, you'll just have to wait until you recover; reconsideration requests won't help you at all.