Can a cloud-based firewall affect my search ranking?
-
Hi,
I recently implemented a firewall on my website to prevent hacking attacks. We were getting a crazy number of attempts per day to brute-force our website. I used the Sucuri cloud proxy firewall service, which they claim actually helps SEO because of its super fast caching. I was just wondering, is this true? Because we're slowly falling further and further down the SERPs and I really don't know why. If not, has there been any major Google update recently that I don't know about?
Thanks,
Robert
-
The only thing I can think of is if you are located in the UK. It might have affected your rankings if the firewall uses US data centres.
If this is your site: http://bearsbys.com/
You are using a .com domain for a UK website, so having your IP change to the US could mess things up a little. The first thing I would do is go to Google Webmaster Tools and make sure you have set your target location to the United Kingdom.
Check whether your site is hosted with a UK hosting provider, then check whether Sucuri only uses US data centres.
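One way to sanity-check where the firewall is serving you from: resolve the domain and see which provider range the IP falls into. A minimal Python sketch of the idea, with placeholder CIDR ranges — you would substitute the provider's real published ranges:

```python
# Minimal sketch: check which region a resolved IP falls into, given a
# mapping of CIDR ranges to data-centre locations. The ranges below are
# documentation placeholders, NOT real Sucuri ranges -- substitute the
# provider's published list before drawing any conclusion.
import ipaddress

DATACENTRE_RANGES = {
    "US": ["192.0.2.0/24"],      # placeholder range
    "UK": ["198.51.100.0/24"],   # placeholder range
}

def locate_ip(ip: str) -> str:
    addr = ipaddress.ip_address(ip)
    for region, cidrs in DATACENTRE_RANGES.items():
        if any(addr in ipaddress.ip_network(c) for c in cidrs):
            return region
    return "unknown"

# In practice you would first resolve the domain, e.g.:
#   import socket; ip = socket.gethostbyname("bearsbys.com")
print(locate_ip("192.0.2.10"))   # -> US
```

If the IP lands in a US range while your audience and hosting are in the UK, that is the mismatch described above.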
-
I don't think so... that's what the whole rebuild was for, mainly to improve all of the content and the design of the site. I'm not sure what it is, but it's massively affected me.
-
With them there is no telling. Maybe a quality update? Who knows.
-
Hey, thanks so much for getting back to me. According to the link you sent, Google did a core algorithm update in January, which actually sounds about right. We put a brand new site up at the beginning of Feb, so I think we might've got some false placement for a while, and now it's settled back in and we're losing traction. I wonder what they've done that's changed so much. Hmmm...
-
Hey,
I can't speak to whether or not the firewall offers any SEO benefits, but I would imagine it would, especially if it is caching faster. However, Moz has compiled a list of algorithm updates. I would look at when your rankings started to fall and see if it matches up with the updates, then check your backlink profile and do some competitor research to see if anyone is doing anything different.
I hope that helps some.
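The matching exercise suggested above can be sketched in a few lines: compare the date your rankings started to slide against a list of known update dates and flag any update within a chosen window. The one update date below is illustrative only; fill in the real dates from Moz's change history:

```python
# Sketch: flag known algorithm updates that fall within a window of the
# date your rankings started to drop. The date below is an illustrative
# example -- populate this dict from Moz's algorithm-update history.
from datetime import date, timedelta

KNOWN_UPDATES = {
    "January core update": date(2016, 1, 8),   # example date only
}

def nearby_updates(drop_start: date, window_days: int = 21):
    window = timedelta(days=window_days)
    return [name for name, d in KNOWN_UPDATES.items()
            if abs(drop_start - d) <= window]

print(nearby_updates(date(2016, 1, 20)))   # -> ['January core update']
```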
-
Related Questions
-
Can I safely assume that links between subsites on a subdirectories-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based in subdirectories (of the mainsite.com/site1 kind) where the main site is like a company site, and subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all). Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but since the structure is the same as for a single site, you should assume that for SEO purposes the network will be treated as one site." This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because then, if Google establishes that my multisite structure is actually a collection of different sites, links between subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel, that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site? P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that with the subdirectories-based multisite feature I will be able to map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really was a different website.
Web Design | PabloCulebras
-
Core Web Vitals hit Mobile Rankings
Hey all, Ever since Google announced "Core Web Vitals", our mobile rankings have nose-dived. At first, I thought it was optimisation changes to the page titles we had made, which might still be part of the issue. However, desktop rankings actually increased for the same pages where mobile decreased.
There is a plan to introduce a new ranking signal into the Google algorithm called "Core Web Vitals", and this was discussed around late May. Even though it's supposed to become an official ranking signal later this year or early next, I think Google continuously tests and releases these items before any official release. If you weren't aware, there is a section in Google Search Console related to Core Web Vitals, which looks at:
1. Loading
2. Interactivity
3. Visual Stability
This overlays some of the other basic requirements of a good website and mobile experience. Taking a look at our Google Search Console, we see the following:
1. Mobile: 1,006 poor URLs, 100 URLs need improvement and 475 good URLs.
2. Desktop: 0 poor URLs, 379 need improvement and 1,200 good URLs.
Source: https://search.google.com/search-console/core-web-vitals?resource_id=https%3A%2F%2Fwww.griffith.ie%2F
In the report, we can see two distinct issues with the mobile pages:
1. CLS issue: more than 0.25 (mobile) - 1,006 cases
2. LCP issue: longer than 4 secs (mobile) - 348 cases
CLS (Cumulative Layout Shift): This is a developer issue and needs fixing. It's basically when a mobile screen jumps for the user. It is explained in this article: https://web.dev/cls/ It seems to be an issue with all pages.
LCP (Largest Contentful Paint): Again, another developer fix that needs to be implemented. It's connected to page speed, and can be viewed here: https://web.dev/lcp/ Looking at GSC, it looks like the blog content is mostly to blame.
It's worth fixing these issues and also looking at the other items in the page speed score tests:
1. Leverage browser caching: https://gtmetrix.com/reports/griffith.ie/rBtvUC0F
2. https://developers.google.com/speed/pagespeed/insights/?url=griffith.ie - the mobile score for the home page is 16/100, and https://www.griffith.ie/people/thamil-venthan-ananthavinayagan is 15/100.
I think this is the biggest indicator of the issue at hand. Has anybody else noticed their mobile rankings go down while desktop stayed the same or increased?
Kind regards,
Rob
Web Design | robhough909
-
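For reference, the buckets in that Search Console report follow fixed thresholds. The report above already hints at two of them (CLS is "poor" above 0.25 and LCP is "poor" above 4 seconds); the "good" cut-offs of 0.1 and 2.5 s are web.dev's published values. A small sketch of the classification:

```python
# Classify CLS and LCP values into Search Console's three buckets, using
# the thresholds implied by the report above (poor: CLS > 0.25, LCP > 4 s)
# plus web.dev's published "good" cut-offs (CLS <= 0.1, LCP <= 2.5 s).
def classify_cls(cls_score: float) -> str:
    if cls_score <= 0.1:
        return "good"
    if cls_score <= 0.25:
        return "needs improvement"
    return "poor"

def classify_lcp(lcp_seconds: float) -> str:
    if lcp_seconds <= 2.5:
        return "good"
    if lcp_seconds <= 4.0:
        return "needs improvement"
    return "poor"

print(classify_cls(0.3), classify_lcp(4.5))   # -> poor poor
```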
How does ARIA-hidden text appear to search engines?
I'm having trouble getting my accessibility team to add alt text to our site's images for SEO benefits, as they feel some of it would add noise for screen readers. They proposed using aria-hidden attributes to hide the text, but I'm wondering if that will be interpreted as a cloaking tactic by search engines. Also, I'm wondering if the alt text will carry the same weight if aria-hidden is used. Has anyone had any experience with this? I'm having trouble finding any research on the topic.
Web Design | KJ600
-
Ranking of non-homepage leads to decrease in website ranking?
Hi all, Google picks a non-homepage to rank for our primary keyword, where the homepage is actually the page optimised to rank for that keyword. This means Google is ignoring the intended page and ranking another page. Does this scenario mean that we are ranking lower because the homepage is not considered here? Might we rank much better if the homepage were preferred by Google? Thanks
Web Design | vtmoz
-
Curious why our site isn't ranking; rather, it seems to be penalized for duplicate content, but there are no issues reported in Google Webmaster Tools...
So we have a site, ThePowerBoard.com, and it has some pretty impressive links pointing back to it. It is obviously optimized for the keyword "powerboard", but it is not even in the top 10 pages of Google rankings. If you search site:thepowerboard.com, and/or Google just the URL thepowerboard.com, you will see that it appears in the search results. However, if you quote-search just the title of the home page, you will see, oddly, that the domain doesn't show up; rather, at the bottom of the results you will see where Google places "In order to show you the most relevant results, we have omitted some entries very similar to the 7 already displayed". If you click on the link below that, then the site shows up toward the bottom of those results. Is this a case of duplicate content? Also, the developer that built the site said the following: "The domain name is www.thepowerboard.com and it is on a shared server in a folder named thehoverboard.com. This has caused issues trying to ssh into the server, which forces us to ssh into it via its IP address rather than by domain name. So I think it may also be causing your search bot indexing problem. Again, I am only speculating at this point. The folder name difference is the only thing different between this site and any other site that we have set up." Would this be the culprit? Looking for some expert advice, as it makes no sense to us why this domain isn't ranking.
Web Design | izepper
-
New re-design: will my website rankings drop?
Hi guys, I have had to re-design my site. Although we are only 4 months into the SEO game, we have seen some good progress with our rankings. My question is: is there anything I need to consider before implementing the new designs so it doesn't affect my current rankings or any of our SEO work? Our current designs are content thin, so we have had to create more content to better optimize our site; however, in doing so will we lose our current ranking position? Appreciate any advice around this. Thanks
Web Design | edward-may
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | danatanseo
-
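For a rough first pass at the unused-CSS question above, a script comparing class selectors against crawled HTML can get part of the way. A real audit would need a proper CSS parser and a full crawl (e.g. the HTML Screaming Frog extracts), so treat this as an illustration of the idea only:

```python
# Rough sketch: collect class selectors from a stylesheet with a regex and
# report any that never appear in the supplied HTML pages. A regex is not
# a real CSS parser -- this only illustrates the technique.
import re

def unused_classes(css: str, html_pages: list) -> set:
    # Class selectors declared in the stylesheet, e.g. ".hero"
    declared = set(re.findall(r"\.([A-Za-z_][\w-]*)", css))
    # Classes actually used in class="..." attributes across the pages
    used = set()
    for page in html_pages:
        for attr in re.findall(r'class="([^"]*)"', page):
            used.update(attr.split())
    return declared - used

css = ".hero { color: red } .old-promo { display: none }"
pages = ['<div class="hero main">hi</div>']
print(unused_classes(css, pages))   # -> {'old-promo'}
```

Duplicate-rule detection would be a separate pass (hashing normalized declaration blocks), which is where a dedicated tool earns its keep.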
Will changing our URLs to MVC-friendly URLs have a positive or negative effect on our rankings and link juice?
We've recently moved our site over to a new hosting system. We've got similar pages and are now looking at changing the URLs to ensure we do not lose the link juice from our previous site. My question is regarding the URLs: will changing our URLs to MVC-friendly URLs have a good or bad effect on our rankings and/or link juice? Thanks
Web Design | SimonDixon