Getting an error most of the time in Linkscape
-
Hi,
I have been getting this error most of the time in Linkscape since last month:
"Sorry dude, no inlinks found matching this criteria."
Please advise: is this a bug? The sites I am trying to use Linkscape for had a lot of pages crawled by SEOmoz earlier.
Thanks,
Preet
-
Hey Preet,
That's a good question, and there's actually a lot of information involved! I'm so sorry that you still haven't been able to see your links in Linkscape. Most new sites and links will be indexed by our spiders and available in Linkscape and Open Site Explorer within 60 days, but some take even longer for a number of reasons, including the crawlability of sites, the amount of inbound links to them, and the depth of pages in subdirectories. Just so you know, here's how we build our index: we take the last index, take the 10 billion URLs with the highest mozRank (with a fixed limit on some of the larger domains), and start crawling from the top down until we've crawled 40 billion pages (about 1/4 of the size of Google's index). Therefore, if a site is not linked to by one of these seed URLs (or by one of the URLs they link to in the next update), it won't show up in our index.
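The seed-selection step described above can be sketched roughly like this (a toy illustration with made-up function names and tiny numbers, not SEOmoz's actual pipeline, which runs at 10-billion-URL scale):

```python
from collections import defaultdict

def pick_seed_urls(last_index, per_domain_cap, seed_limit):
    """Toy sketch: rank the previous index's URLs by mozRank and keep
    the top `seed_limit`, with a fixed cap per domain so the largest
    domains can't crowd everything else out of the seed set."""
    taken_per_domain = defaultdict(int)
    seeds = []
    # last_index: iterable of (url, domain, mozrank) tuples
    for url, domain, mozrank in sorted(last_index, key=lambda t: t[2], reverse=True):
        if taken_per_domain[domain] >= per_domain_cap:
            continue  # this domain already hit its fixed limit
        taken_per_domain[domain] += 1
        seeds.append(url)
        if len(seeds) >= seed_limit:
            break  # seed set is full; crawling starts from these URLs
    return seeds
```

The key consequence for Preet's question is visible even in the toy version: a URL that no seed (or seed-linked page) points to never enters the crawl at all.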
We update our Linkscape index every 3 to 5 weeks. Crawling the whole internet to look for links takes 2-3 weeks, and then we've got 1-2 weeks of processing to do on those links to determine which are the most important, and so on. You can see how often we update, and the planned updates, here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
Linkscape focuses on a breadth-first approach, so we nearly always have content from the homepages of websites, externally linked-to pages, and pages higher up in a site's information hierarchy. However, deep pages that are buried beneath many layers of navigation are sometimes missed, and it may be several index updates before we catch all of these.
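A breadth-first crawl under a page budget is why deep pages get missed: the frontier visits everything one link from the seeds before anything two links away, and the budget can run out before the deep layers are reached. A minimal sketch (the `get_links` callback stands in for fetching a page and extracting its outlinks; all names are illustrative):

```python
from collections import deque

def crawl_breadth_first(start_urls, get_links, page_budget):
    """Toy breadth-first crawl: shallow pages are fetched first, and
    deeply buried pages are only reached if the budget allows."""
    seen = set(start_urls)
    queue = deque(start_urls)
    crawled = []
    while queue and len(crawled) < page_budget:
        url = queue.popleft()      # FIFO queue => breadth-first order
        crawled.append(url)
        for link in get_links(url):
            if link not in seen:   # don't enqueue the same URL twice
                seen.add(link)
                queue.append(link)
    return crawled
```

With a budget of 3 pages, a homepage and its two direct children get crawled, while pages another layer down are left for a later index update.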
If our crawlers or data sources are blocked from reaching those URLs, they may not be included in our index (though links that point to those pages will still be available). Finally, the URLs seen by Linkscape must be linked to by other documents on the web, or our index will not include them.
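One quick self-check on the "blocked crawlers" point: Python's standard-library robots.txt parser can tell you whether a given user agent is allowed to fetch a URL (a sketch with made-up rules and URLs; "rogerbot" is used here as an example crawler user agent):

```python
from urllib.robotparser import RobotFileParser

def is_crawlable(robots_txt, user_agent, url):
    """Return True if the robots.txt rules allow `user_agent` to fetch
    `url`. Blocked pages can still show up as link *targets*, but their
    own content won't make it into a crawler's index."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, url)

# Example rules blocking one directory for all crawlers:
RULES = """User-agent: *
Disallow: /private/
"""
```

If this returns False for pages you expected to see indexed, the robots.txt rules (or a firewall/CDN block) are a likely culprit.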
For now, the best thing you can do to help your domain become indexed is to work on earning links from sites with high mozRank. If you need help with that, you may want to ask the PRO Q&A community here!
I hope this information helps! While the site and links may not be indexed yet, give it some time - maybe we'll see it in OSE next month.
Best of luck,
Aaron -
Can you share the site you are getting that error for?
You might also want to email help@seomoz.org
Related Questions
-
How do I get the crawler going again?
The initial crawl only hit one page. I set up another campaign for another site and it crawled 260 pages. How can I get the crawler started up again, or do I really have to wait a week?
Moz Pro | martJ
-
Errors for duplicate content, but on checking they shouldn't be possible
Dear all, Every week I look at the different crawl reports for our website. Since the start of my SEOmoz membership, the errors for duplicate content and duplicate titles have been rising. But if I take out the .csv file, look in more detail, and select a page which is marked as duplicate content, a canonical actually exists on that page, so it shouldn't be a warning. I have no idea what the issue could be. For example, these pages are marked as duplicate content:
| http://www.zylom.com/es/descargar-juegos/3-en-raya/?sortby=2 |
| http://www.zylom.com/es/descargar-juegos/3-en-raya/?startnumber=60&sortby=2 |
| http://www.zylom.com/es/descargar-juegos/3-en-raya/?startnumber=80&sortby=2 |
The parameters after the '?' (question mark) are necessary for our internal system. To overcome duplicate content, we coded it so that a canonical tag is placed on every page with parameters, pointing to the main page http://www.zylom.com/es/descargar-juegos/3-en-raya/, but it doesn't seem to work, because my error warnings are still rising. Please advise me. Kind regards, Ms Letty van Eembergen
Moz Pro | Letty
-
Campaign web crawl has failed the last 4 times
I have 4 websites set up in my Pro dashboard. The only site that isn't getting crawled is an HTTPS site. It has worked for over a year, but the past 4 crawls (an entire month now) have returned only one page crawled. Is there something going on with the crawler? I really need to be able to see these stats. Has anyone else experienced this issue?
Moz Pro | nbyloff
-
Status errors generated from XML sitemap
I just ran a crawl test on our site and I'm seeing a lot of 404 errors that are referred from the XML sitemap. Anyone know how to fix it?
Moz Pro | IITWebTeam
-
Getting PA & DA for a list of links
I have a list of links and I want to get PA and DA for each individual link. Can this be done in some way other than one at a time? I've heard this can be done with Excel using the API, but I don't know the specifics. Help would be appreciated.
Moz Pro | Fergclaw
-
How do I get a list of my non-followed links?
Our site has a lot of non-followed links showing up in the latest SEOmoz Competitive Link Analysis update, and I would like to know how to get a list of the sites that are setting no-follow.
Moz Pro | oznappies
-
4xx (not found) errors seem spurious, caused by a "\" added to the URL
Hi SEOmoz folks, We're getting a lot of 404 (not found) errors in our weekly crawl. However, the weird thing is that the URLs in question all have the same issue: they are all a valid URL with a backslash ("\") added. In URL encoding, this is an extra %5C at the end of the URL. Even weirder, we do not have any such URLs in our (WordPress-based) website. Any insight on how to get rid of this issue? Thanks
Moz Pro | GPN