Open Site Explorer doesn't update correctly
-
We have been using OSE for a little over two months now, and our link demographics haven't changed, even though we have been getting lots of backlinks from other blogs and web pages.
Google Webmaster Tools displays these links, but OSE, even after the updates, doesn't.
Our domain is about one year old and has had indexed content on it since June this year. Google Webmaster Tools shows 24 links to our domain, but OSE shows only 3. The domain is http://www.wellbo.de
-
Yep, I found Majestic pretty useful; I am also now using a combination.
-
I posted the same question to the latest blog post on the Linkscape update and got some great advice. Essentially, don't rely on just one source (even though SEOmoz rocks) for reviewing backlinks and authority. Some people referred me to Majestic SEO, and so far it is extremely close to YSE and WT. Check it out:
The free account will let you explore basic backlink stats for any URL.
Good Luck - Kyle
-
Can't say I'm happy with OSE; it seems inaccurate to me.
-
OSE can require up to 60 days to find new links. The updates are based on the Linkscape crawl of the web, which takes 2-3 weeks to complete. Once the crawl is complete, it takes 1-2 weeks to process and publish the data. Depending on when your link is published and the importance of the web page involved (i.e., PA/DA), the link may not be discovered during the current crawl cycle.
Also keep in mind Linkscape only crawls the top 25% of web pages on the internet. For the most part, if a link does not appear in OSE after 2 cycles the link has little to no value.
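As a rough back-of-the-envelope illustration of the timing above (this is not how Linkscape actually schedules anything, just the arithmetic of a 2-3 week crawl plus 1-2 weeks of processing, using the upper bounds):

```python
from datetime import date, timedelta

def earliest_ose_appearance(link_published: date, crawl_start: date) -> date:
    """Estimate the earliest date a new link could surface in OSE.

    Illustrative only: assumes a link published after a crawl begins
    must wait for the next full cycle, and uses the upper bounds of
    the crawl (3 weeks) and processing (2 weeks) windows.
    """
    cycle = timedelta(weeks=3 + 2)  # crawl + process/publish
    start = crawl_start
    while link_published > start:
        start += cycle  # link missed this cycle; wait for the next one
    return start + cycle

# A link published Nov 10 against a crawl that began Nov 1 misses that
# cycle, so it could take until early January to appear in the index.
```

This is why a link that goes live just after a crawl kicks off can take close to the full 60 days to show up.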
The index was updated last on November 28th. It will be updated again on January 4th. The update calendar can be seen here: http://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
It had been updated previously on November 2nd. If you are aware of a link which was definitely in place in October but does not appear in OSE, please share the link and we can offer feedback. Most often the link is on a page with no importance. A few other likely possibilities:
-
the page is noindexed or blocked by robots.txt
-
the page is buried deep on the site
-
the page is an island, with no links pointing to it
-
the site has indexing / navigation issues
-
the site has low DA/PR and the page is too many clicks from the home page to be crawled.
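The first two checks in the list above are easy to verify yourself. Here is a minimal sketch using Python's standard library; it tests supplied robots.txt and HTML text rather than the live crawler's logic (which is not public), so the checks are illustrative, not definitive:

```python
import re
import urllib.robotparser

def discovery_blockers(robots_txt: str, page_html: str, path: str = "/") -> list:
    """Return a list of reasons a crawler might skip the page at `path`.

    Illustrative checks only: robots.txt disallow rules for a generic
    user agent, and a crude scan for a noindex robots meta tag.
    """
    reasons = []

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch("*", path):
        reasons.append("blocked by robots.txt")

    # Crude check for <meta name="robots" content="...noindex...">
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', page_html, re.I):
        reasons.append("page is noindexed")

    return reasons

robots = "User-agent: *\nDisallow: /private/"
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(discovery_blockers(robots, html, "/private/page"))
```

If either reason comes back for the page hosting your link, no crawler-based index (OSE included) will ever report it.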
-