Cloudflare - Should I be concerned about false positives and bad-neighbourhood IP problems?
-
I am considering using Cloudflare for a couple of my sites.
What is your experience? I have researched a bit, and there are three issues I am concerned about:
-
Google may consider a site part of a bad neighbourhood if other sites on the same DNS/IP are spammy.
Is there any way to prevent this? Has anybody had a problem? -
A DDoS attack on another site on the same DNS/IP could affect our site's stability.
-
Blocking false positives: legitimate users may be forced to answer CAPTCHAs etc. before they can see the page. Another Moz member reported that 1-2% of legitimate visitors were flagged as false positives.
Can I effectively prevent this by reducing Cloudflare's basic security level?
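For reference, the security level is also adjustable programmatically, not just through the dashboard. Below is a minimal sketch that builds such a request. It assumes Cloudflare's v4 zone-settings endpoint (`/zones/{id}/settings/security_level`) and its documented set of values; `ZONE_ID` and `MY_TOKEN` are placeholders, and you should verify the endpoint and values against the current API docs before relying on them:

```python
# Build (but don't send) a PATCH request lowering a zone's security level.
# NOTE: the endpoint path and accepted values are assumptions based on
# Cloudflare's v4 API -- verify them against the current documentation.
import json
import urllib.request

API_BASE = "https://api.cloudflare.com/client/v4"
# Values Cloudflare documents for this setting (assumed):
VALID_LEVELS = {"essentially_off", "low", "medium", "high", "under_attack"}

def build_security_level_request(zone_id: str, level: str, token: str):
    """Return a ready-to-send urllib Request setting the zone's security level."""
    if level not in VALID_LEVELS:
        raise ValueError(f"unknown security level: {level}")
    url = f"{API_BASE}/zones/{zone_id}/settings/security_level"
    body = json.dumps({"value": level}).encode()
    return urllib.request.Request(
        url,
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

req = build_security_level_request("ZONE_ID", "low", "MY_TOKEN")
print(req.get_full_url())
# Sending it would be: urllib.request.urlopen(req) -- omitted here.
```

Dropping the level to "low" or "essentially_off" is the usual lever for reducing CAPTCHA challenges to legitimate visitors, at the cost of letting more borderline traffic through.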
Also, did you find that Cloudflare genuinely helped with site uptime? In our case, whenever our server was down for even a few seconds, Cloudflare showed an error page too, and sometimes it showed a "could not connect" error even when our server was merely slow to respond while pages on other domains were still loading fine.
-
-
Thanks Cyrus.
-
You may be interested in this post titled "Cloudflare and SEO" : https://blog.cloudflare.com/cloudflare-and-seo/
"We did a couple things. First, we invented a new technology that, when it detects a problem on a site, automatically changes the site's CloudFlare IP addresses to isolate it from other sites. (Think of it like quarantining a sick patient.) Second, we worked directly with the crawl teams at the big search engines to make them aware of how CloudFlare worked. All the search engines had special rules for CDNs like Akamai already in place. CloudFlare worked a bit differently, but fell into the same general category. With the cooperation of these search teams we were able to get CloudFlare's IP ranges listed in a special category within search crawlers. Not only does this keep sites behind them from being clustered to a least performant denominator, or incorrectly geo-tagged based on the DNS resolution IP, it also allows the search engines to crawl at their maximum velocity since CloudFlare can handle the load without overburdening the origin."
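A practical side-effect of those shared IP ranges: since all traffic reaches your origin from Cloudflare's edge, server logs show Cloudflare IPs rather than visitor IPs. If you need to check whether a request really came through Cloudflare (for example, before trusting a forwarded-visitor-IP header), you can test the source address against their published ranges. A sketch using a small sample of those ranges; the full, current list is published by Cloudflare and changes over time, so don't hardcode it in production:

```python
# Check whether an address falls inside Cloudflare's edge ranges.
# The ranges below are a small SAMPLE of Cloudflare's published IPv4
# list -- fetch the authoritative list from Cloudflare instead.
import ipaddress

SAMPLE_CLOUDFLARE_RANGES = [
    ipaddress.ip_network("173.245.48.0/20"),
    ipaddress.ip_network("103.21.244.0/22"),
    ipaddress.ip_network("104.16.0.0/13"),
]

def is_cloudflare_edge(ip: str) -> bool:
    """True if `ip` belongs to one of the (sampled) Cloudflare ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in SAMPLE_CLOUDFLARE_RANGES)

print(is_cloudflare_edge("104.16.1.1"))  # inside 104.16.0.0/13
print(is_cloudflare_edge("8.8.8.8"))     # not a Cloudflare range
```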
-
Thanks Tom.
I will now move one of my main domains over and use their Pro plan. I noticed they have quite a number of settings to address the false positives. Our problem with Cloudflare error pages may have been a temporary one while they were building the site's cache. In any case, it is easy to enable/disable Cloudflare protection, so there is not much risk here. It could save us a lot of potential headaches in the future if it works as advertised. -
Hi,
-
I have used CloudFlare for a few sites and never had an issue with this. It is a risk/concern with all shared hosting, but CloudFlare are very proactive about addressing anything impacting their customers, so I would not have a concern on this side of things at all.
-
Again, I wouldn't have concerns here. CloudFlare are very adept at handling large-scale DDoS attacks. Having read some of their post-attack analysis reports, they usually mitigate any impact on customers very quickly. They have loads of customers, and if this sort of thing were a problem I think we'd hear about it fairly often.
-
I can't speak to the percentage of users that might get falsely identified as a risk and presented with a CAPTCHA, but I'd be very surprised if it was as high as 1-2%; I've rarely seen that CAPTCHA screen myself. You should check what CloudFlare have to say on this issue, but I would have no concern here either.
I have never had an issue with CloudFlare impacting SEO performance or the user experience. It has generally performed well for me, but the biggest issue I see with it is people hoping it is a "cure-all" that means they don't need to properly address issues affecting their site's performance. If your database performance is very poor, meaning dynamic pages take a long time to load, then CloudFlare is not the answer (it may help, but you should address the underlying issue).
I am unsure about the issue of CloudFlare failing when your server is slow. I'd imagine CloudFlare support could help you with this; there may be a configuration option somewhere.
Overall - my suggestion would be that you go for it.
-