Does the SEOmoz crawler that crawls for the on-page reports have a set IP?
-
I would like to test my site, but it's not launched yet and I don't want anybody to see it. I can allow myself and others to view the site if I have their IP addresses.
So does the SEOmoz crawler have a static IP or a range?
James
-
Thanks, Istvan.
-
Hi James,
For this question I would contact the Help Desk team.
You can go to https://seomoz.zendesk.com/home and submit a ticket, or contact them directly via help@seomoz.org.
I am sure they will come up with an answer or advise you on how to resolve your issue.
Good luck,
Istvan
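In the meantime, a common workaround (a sketch, not official Moz guidance) is to lock the staging site down by IP in an .htaccess file while letting Moz's crawler, rogerbot, through by its user agent. The IP below is a placeholder for your own address, and note that user agents can be spoofed, so this is not airtight:

```apache
# Deny everyone by default, then allow your own IP(s).
# 203.0.113.42 is a placeholder -- replace with your real address.
Order deny,allow
Deny from all
Allow from 203.0.113.42

# Let Moz's crawler (user agent "rogerbot") through as well.
SetEnvIfNoCase User-Agent "rogerbot" allow_moz
Allow from env=allow_moz
```

This uses Apache 2.2-style access directives; on other servers the equivalent allow/deny mechanism would apply.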
Related Questions
-
Duplicate content in Shopify reported by Moz
According to the Moz crawl report, there are hundreds of duplicate pages in our Shopify store ewatchsale.com. The main duplicate pages are:
https://ewatchsale.com/collections/seiko-watches?page=2
https://ewatchsale.com/collections/all/brand_seiko
(the canonical page should be https://ewatchsale.com/collections/seiko-watches)
https://ewatchsale.com/collections/seiko-watches/gender_mens
(the canonical page should be https://ewatchsale.com/collections/seiko-watches/mens-watches)
Also, I want to exclude indexing of page URLs with "filter parameters" like https://ewatchsale.com/collections/seiko-watches/color_black+mens-watches+price_us-100-200. Shopify advised that we can't access our robots.txt file. How can we exclude search-engine crawling of the page URLs with filter names? How can we access the robots.txt file? How can we add canonical code to the preferred collection pages? Which templates, and what code should we add? Thanks for your advice in advance!
On-Page Optimization | ycnetpro101
-
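For reference, since robots.txt is not editable on Shopify, one approach people take is to noindex tag-filtered collection pages from the theme layout instead. A minimal sketch, assuming a standard theme.liquid and Shopify's built-in `current_tags` and `canonical_url` Liquid objects:

```liquid
{% comment %}
  In the <head> of theme.liquid: ask search engines to skip
  tag-filtered collection pages (e.g. /collections/x/color_black),
  while still following their links.
{% endcomment %}
{% if template contains 'collection' and current_tags %}
  <meta name="robots" content="noindex, follow">
{% endif %}

{% comment %} Shopify exposes a canonical URL for the current page. {% endcomment %}
<link rel="canonical" href="{{ canonical_url }}">
```

Whether `canonical_url` points where you want for every collection alias is worth verifying against your own theme; some themes hard-code their own canonical tag, which would need removing first.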
Help: my WordPress Blog generates too many onpage links and duplicate content
I have had a WordPress blog since November last year (so I'm pretty new to WordPress), and the effects on rankings for some keywords are really good, so I thought tag clouds were good. Crawl Diagnostics now tells me I have too many on-page links; my author page, for example, breaks the record with 256:
http://inlinear.com/blog/author/inlinear/. I think that's because a link is generated for each word in the tag cloud. On this page (and many other pages) WordPress displays teasers (the beginning of each post with a "read more" link), producing duplicate content and even new canonical tags. The page titles are also too long, because I installed "All in One SEO Pack" and now this plugin and WordPress itself mix titles together. What can I do to avoid all this? Is there a plugin that can help? I think millions of blogs will have the same problems. My blog has very little content yet. Thanks for your answers! :)
On-Page Optimization | inlinear
-
Seomoz.org Category and Tags practice
Hello, I have been checking the SEOmoz source code and architecture these days in order to learn and apply it to my site, but I haven't managed to find "tags" at all, just the "Posts by Categories" list on the right sidebar. Is this the only way you are categorising content? In that case, is the only way spiders can find your content via these category archive pages and the general sitemap? Thanks!
On-Page Optimization | antorome
-
Is SEOmoz's On-page Checker up to date?
Hello Mozzers, just wondering if SEOmoz's on-page optimisation checker is up to date with Google's recent updates? If not, what do you suggest?
On-Page Optimization | Prestige-SEO
-
What does this mean on the first step of setting up a campaign? "Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here."
I am BRAND new to this and setting up my first campaign. I chose subdomain and entered www.pdsaz.com. This is the message I received: "We have detected that the domain www.pdsaz.com and the domain pdsaz.com both respond to web requests and do not redirect. Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here."
On-Page Optimization | cschwartzel
-
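For reference, the usual fix for this warning is a permanent (301) redirect from the bare domain to the www host (or vice versa, as long as you pick one). A minimal sketch, assuming Apache with mod_rewrite enabled and the pdsaz.com domain from the question:

```apache
# In .htaccess at the site root: send pdsaz.com to www.pdsaz.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^pdsaz\.com$ [NC]
RewriteRule ^(.*)$ http://www.pdsaz.com/$1 [R=301,L]
```

After the redirect is in place, only the www variant resolves directly, so the two "twins" no longer compete in the SERPs.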
Is it possible to have the crawler exclude URLs with specific arguments?
Is it possible to exclude from the crawl specific URLs that contain certain arguments, like you can in Google Webmaster Tools?
On-Page Optimization | djangojunkie
-
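For reference, on sites where you control robots.txt, Webmaster Tools' parameter handling can be approximated with wildcard Disallow rules targeted at Moz's crawler. A sketch, assuming rogerbot honors `*` wildcards the way Googlebot does (wildcards are not part of the original robots.txt standard, and the `sort` parameter below is just an example):

```text
# robots.txt at the site root
User-agent: rogerbot
# Skip any URL that contains a query string
Disallow: /*?

# Or, instead, block only URLs carrying a specific argument
# Disallow: /*?sort=
```

Scoping the rules to `User-agent: rogerbot` keeps them from affecting how search engines crawl the same URLs.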
Only 1 Page Being Crawled
I have a website I'm tracking, www.alhi.com, but my report says that only 1 page is being crawled each update. My campaign is set up for the subdomain www.alhi.com, so I'm not sure why I would have this issue. Can you help? Thanks!
On-Page Optimization | LeslieVS
-
Not making a change to one of the 100s of issues in Crawl Diagnostics
Based on the PRO Crawl Diagnostics: if we don't make a change on one page, does that just affect the SEO of that one page, or does it affect the SEO of all pages on the site? E.g., if we get a "Too many on-page links" warning for a certain page that we don't really want to rank for, does not fixing that particular page affect the site as a whole? Hope I explained this OK.
On-Page Optimization | inhouseninja