We are using Hotlink Protection on our server, mostly for JPGs. What is the moz.com address to allow crawl access?
-
We are using Hotlink Protection on our server, mostly for JPGs. What is the moz.com crawl URL/address so we can add it to the list of allowed domains? The reason is that the crawl statistics give us a ton of 403 Forbidden errors.
Thanks.
-
Hi there!
Thanks for reaching out to us! I can certainly understand your need to have our crawler accepted by your hotlink protection setup. Unfortunately, our crawler doesn't operate from a single URL; we use a collection of IP addresses behind the scenes to mimic a search engine crawler and provide the best diagnosis possible. With that said, a lot of our customers have had success allowing our crawler at the server level, through either their .htaccess file or other methods. Unfortunately I am not a web developer or a server admin, so I couldn't tell you exactly how to implement it. I would recommend posting a new question asking about workarounds for your particular software.
Thanks for your time, I hope that helps.
Peter
Moz Help Team.
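For what it's worth, the server-level workaround Peter mentions often looks something like the following .htaccess sketch. This assumes your hotlink protection is the common Apache mod_rewrite referer check (the `example.com` pattern and the rule itself are placeholders; adapt them to your actual rules). Moz's crawler identifies itself as "rogerbot" in its User-Agent header, so you can exempt it from the referer check. Note that User-Agent strings can be spoofed, so this is more permissive than a strict IP allow-list.

```apache
# Skip the hotlink-protection block when the request comes from
# Moz's crawler, which identifies itself as "rogerbot".
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !rogerbot [NC]
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.jpe?g$ - [F,NC]
```

All conditions on a rule are ANDed, so if the User-Agent contains "rogerbot" the first condition fails and the 403 (`[F]`) is never applied to the crawler's image requests.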
Related Questions
-
Unable to site crawl
Hi there, our website was revamped last year, and Moz has been unable to crawl the site since then. Could you please check what the issue is? @siteaudits @Crawlinfo gleneagles.com.my
Technical SEO | | helensohdg380 -
How to create sitemap for example.com and blog.example.com ?
Hi, I'm trying to create a sitemap for www.example.com; this website links to www.blog.example.com. After creating the sitemap using different tools, the sitemap doesn't include www.blog.example.com and its files. How can I get both example.com and blog.example.com in one sitemap?
Technical SEO | | fogtheagency0 -
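One common approach (a sketch; the filenames are hypothetical) is to generate a separate sitemap per host and tie them together with a sitemap index file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Sitemap covering www.example.com URLs -->
  <sitemap>
    <loc>https://www.example.com/sitemap-www.xml</loc>
  </sitemap>
  <!-- Sitemap covering blog.example.com URLs -->
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

One caveat: under the sitemaps protocol, a sitemap may normally only list URLs from the host it's served on, which is why most generators won't mix www and blog subdomains. Cross-host submission generally requires verifying both hosts in the search engine's webmaster tools or using the robots.txt cross-submission mechanism.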
Redirect Impact - Moving From SEOmoz to Moz
Hey Guys, This has been on my mind ever since the big announcement, so today I did some searching around for posts/talk about what the impact of their full-site redirect has been for them, and didn't find anything. Have they posted on this, or are there any threads I'm missing out on? I'd love to hear more about what the impact has been, or any general thoughts/insights people may have. Thanks!
Technical SEO | | TakeLessons
Jon1 -
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two sub-domains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them. So the CDN-based content is out-ranking the root domain in a small number of cases. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appears to be contributing to this problem as well. As I understand it, these canonical tags are telling the SEs that each sub domain is the "home" of the content/URL. Thanks! Scott
Technical SEO | | Scott-Thomas0 -
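One sketch of a fix, assuming the CDN subdomains are Apache-served and you can attach response headers there: send an `X-Robots-Tag: noindex` header from the CDN hosts. Unlike a robots.txt `Disallow` (which stops crawling but can leave already-indexed URLs in place), this tells engines to drop the URLs from the index, and switching the canonical tags to absolute URLs on the root domain reinforces which version is authoritative.

```apache
# On the CDN subdomains only: tell search engines not to index
# anything served from these hosts. robots.txt Disallow alone would
# stop crawling but could leave already-indexed URLs lingering.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```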
Blocking AJAX Content from being crawled
Our website has some pages with content shared from a third-party provider, and we use AJAX as our implementation. We don't want Google to crawl the third party's content, but we do want them to crawl and index the rest of the web page. However, in light of Google's recent announcement about more effectively indexing AJAX content, I have some concern that we are at risk of that content being indexed. I have thought about x-robots, but I'm concerned about implementing it on the pages because of the potential risk of Google not indexing the whole page. These pages get significant traffic for the website, and I can't risk losing it. Thanks, Phil
Technical SEO | | AU-SEO0 -
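A sketch of one way to do this without putting x-robots on the whole page (the `/ajax/` path is a placeholder for wherever the third-party endpoint actually lives): block only the AJAX endpoint in robots.txt, so the host page stays fully crawlable and indexable while the fetched fragment does not.

```
# robots.txt - block only the AJAX endpoint, not the pages that call it
User-agent: *
Disallow: /ajax/
```

Because the directive applies to the fragment URL rather than the page URL, the page itself is unaffected, which avoids the whole-page-noindex risk described above.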
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
We utilize a dedicated server to host roughly 60 sites on. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago we realized we had a group of sites down, thanks to monitoring alerts, and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support, at first we were stonewalled, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems. As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down, and again it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed-tested on multiple computers and were 22MB down and 9MB up, +/-2MB.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?
Technical SEO | | RobertFisher0 -
Mobile site - allow robot traffic
Hi, If a user comes to our site from a mobile device, we redirect to our mobile site. That is, www.mysite/mypage redirects to m.mysite/mypage. Right now we are blocking robots from crawling our m. site; previously there were concerns the m. site could rank for normal browser searches. To make sure this isn't a problem, we are planning to rel=canonical our m. site pages to reference the www pages (mobile is just a different version of our www site). From my understanding, having a mobile version of a page is a ranking factor for mobile searches, so allowing robots is a good thing. Before doing so, I wanted to see if anyone had any other suggestions/feedback (looking for potential pitfalls, issues, etc.)
Technical SEO | | NicB10 -
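The setup described above is usually expressed as a bidirectional annotation (a sketch; the URLs are placeholders matching the example hostnames): the desktop page declares its mobile alternate, and the mobile page canonicalizes back to the desktop URL.

```html
<!-- On the desktop page: www.mysite.com/mypage -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.mysite.com/mypage">

<!-- On the mobile page: m.mysite.com/mypage -->
<link rel="canonical" href="http://www.mysite.com/mypage">
```

The pairing tells engines the two URLs are one document with two presentations, which addresses both concerns at once: the m. pages won't rank for desktop searches, and the mobile version still gets credited for mobile ones.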
Appropriate Use of Rel Canonical
When using the On-Page Report Card I get a critical error on Rel Canonical. I'm not sure if I have understood this right, but I think my problem is that I own a Norwegian domain name, www.danske-båten.no. This domain works great in Norwegian, but I get problems with English (foreign) browsers. My English domain name is http://www.danske-båten.no. When you buy a domain name with the letter Å, you get a non-Norwegian domain name as well (don't quite get the technical aspect of it). So when I publish a page (using WordPress, if that means anything) I get this message:

Appropriate Use of Rel Canonical - Moderate fix

Canonical URL: "http://www.danske-båten.no/ferge-oslo-københavn/"

Explanation: If the canonical tag is pointing to a different URL, engines will not count this page as the reference resource and thus it won't have an opportunity to rank. Make sure you're targeting the right page (if this isn't it, you can reset the target above) and then change the canonical tag to reference that URL.

Recommendation: We check to make sure that IF you use canonical URL tags, they point to the right page. If the canonical tag points to a different URL, engines will not count this page as the reference resource and thus it won't have an opportunity to rank. If you've not made this page the rel=canonical target, change the reference to this URL. NOTE: For pages not employing canonical URL tags, this factor does not apply.

So what do I do to fix this?
Technical SEO | | stlastla0
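For background on why a domain containing å behaves oddly: internationalized domain names are stored in DNS as an ASCII "punycode" form, so a browser or crawler may see a different hostname string than the one you type, and a canonical tag using one form can mismatch a page served under the other. A quick Python sketch shows the conversion (the exact xn-- label depends on the domain):

```python
# Convert an internationalized domain name (IDN) to the ASCII
# "punycode" form that DNS and many crawlers actually use.
domain = "www.danske-båten.no"
ascii_form = domain.encode("idna").decode("ascii")
print(ascii_form)  # the å label is replaced by an xn--... label
```

Making sure the canonical URL and the published page URL use one consistent form of the hostname is usually the first thing to check with an error like the one above.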