Why am I not getting my allowance of 10,000 inbound links in the CSV download file? 370 out of 4700??
-
Hi,
I'm desperately trying to audit my backlinks to remove a Penguin penalty on my site livefit.co.uk
When I do the inbound link report, I'm not getting all the links in the download. I know there is a limit of 25 links from each linking site so we get the full picture of links, but:
-
I have 4700 links so why does it need to limit it when we are supposed to see up to 10,000?
-
When you check the link profile on the report it doesn't seem there are many sites with anything close to 25, so surely that rule is invalid as an explanation here?
Should I just work off OSE? But it has less useful info than the CSV...
I'd be very grateful for your thoughts.
Thanks!
James
-
-
Hi James,
Kyle provided some very good responses. Has your issue been resolved?
Cheers,
Christy
-
The rule of thumb is to try to keep below 100 links per page, as well as limiting the number of site-wide links.
Check out the crawl section in the Pro campaign - it shows a warning if you have too many on-page links. That would be a great place to start checking.
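If you want to spot-check a page against that 100-links rule of thumb yourself, here's a quick Python sketch (just an illustration, not how the Moz crawler counts) that counts the anchor tags with an href on a page:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a ... href="..."> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # Only count anchors that actually carry an href (i.e. real links)
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Tiny self-contained example; for a live page you'd fetch the HTML
# with urllib.request.urlopen(...) first.
sample = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">anchor</a></p>'
print(count_links(sample))  # 2 (the named anchor has no href, so it isn't counted)
```

Run that against the HTML of any page you're worried about and compare the number to the ~100 guideline.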
-
Thanks Kyle. I was already looking at the root domain. But your response triggered me to check external vs. internal, and I found that the 370 links in the CSV were my external links AND the other 4300 were internal... is having that many internal links OK? The site has 92 indexed pages according to GWMT (Google Webmaster Tools)...
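For anyone hitting the same confusion, here's a rough sketch of how you can split a links CSV into internal vs. external yourself (the column name `URL` for the linking page is an assumption - check the header row of your actual export):

```python
from urllib.parse import urlparse

ROOT = "livefit.co.uk"  # your root domain

def is_internal(link_url, root=ROOT):
    """True if the linking page lives on the root domain or a subdomain of it."""
    host = urlparse(link_url).hostname or ""
    return host == root or host.endswith("." + root)

def split_links(rows, url_field="URL"):
    """Split CSV rows (dicts) into (external, internal) lists by linking domain."""
    external = [r for r in rows if not is_internal(r[url_field])]
    internal = [r for r in rows if is_internal(r[url_field])]
    return external, internal

# Example with in-memory rows; in practice you'd load them with
# csv.DictReader(open("inbound_links.csv"))
rows = [
    {"URL": "http://www.livefit.co.uk/some-page"},
    {"URL": "http://someblog.example/post"},
]
ext, intl = split_links(rows)
print(len(ext), len(intl))  # 1 1
```

If the external list is tiny and the internal one is huge, you've found the same thing James did - the report wasn't short, it was mostly internal links.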
Thanks for your help
James
-
This has happened to me in the past. Make sure when you are setting your filter that you have everything set correctly to get your full link profile. Make sure you have the inbound links tab selected and your filter is set like this:
Show "all" links from "only external" pages to "pages on this root domain" and "show links ungrouped".
The biggest issue is making sure you have it set to root domain; if not, you will only be pulling the backlink profile of your homepage. Let me know if that did the trick. If not, I would submit a help ticket to help@seomoz.org. They are normally really responsive!
Regards - Kyle