How do I fix the 500 error when trying to use the page optimization tool?
-
I keep getting an error when using the page optimization tool. Moz staff replied when I used the chatbot and said that they're receiving a 500 error from my server and that I should whitelist pagella; however, my server is not blocking anything. I don't know how to fix this issue. Any ideas?
I've attached a picture of the error message I'm receiving for reference.
-
Start a new Moz campaign with https as the URL.
-
This is the best guide.
Use this setup so you don't end up duplicating your website between HTTP and HTTPS.
-
Put your http URL in Search and your https URL in Replace.
I hope this will help:
https://devnet.kentico.com/articles/url-redirection
https://devnet.kentico.com/articles/real-world-examples---part-ii
-
Yes, we're using Kentico.
-
Hi Brittany,
Are you using a CMS?
Here is some information on MS Azure.
-
Hi Tom,
Yes, we just moved to https. When I look at the Moz campaign it just says it's tracking the subdomain business.gogoair.com. It doesn't say whether it's http or https, so I started a new campaign using https, tried the page optimization tool, and got the same error as before.
We are hosted on Azure using a Web Application Gateway. Are there any specific things we need to do in Moz to set this up properly?
Thanks,
Brittany
-
You just made the move to https, right?
I ran a 301 check; http 301s to https.
Google shows that the site is not yet in the SERPs as https.
What I think is happening is that you have to run a search & replace on the database, turning all http URLs into https URLs, so the bot will not have to crawl http and follow a 301 to https.
I will test using DeepCrawl if you say it's ok.
Is the Moz campaign set up to crawl http or https?
The site should work just fine after a search & replace.
I will send you the DeepCrawl results.
Hope that helps,
Tom
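The search & replace idea above can be sketched in a few lines. This is a minimal illustration, not Kentico's actual tooling: the function name and sample HTML are mine, and the domain is taken from the thread. The key detail is to only rewrite links on your own domain, so external http links are left alone.

```python
import re

# Domain from the thread; swap in your own.
SITE = "business.gogoair.com"

def force_https(html, domain=SITE):
    # Replace "http://" only when it is immediately followed by our domain,
    # so links to other sites are untouched.
    return re.sub(r"http://(?=%s)" % re.escape(domain), "https://", html)

before = ('<a href="http://business.gogoair.com/plans">Plans</a> '
          '<a href="http://example.org/">elsewhere</a>')
print(force_https(before))
```

In a real migration you would run the equivalent replace against the content tables in the database (and then verify with a crawl) rather than per page at render time.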
-
This error is occurring on all our URLs
-
Thanks, I will test it. Did this happen on all URLs or just one or two?
-
Sure our domain is https://business.gogoair.com
-
If it's a 301 issue, perhaps you'll get lucky and the .htaccess will be the cause.
-
This might be a 301 issue. Could you please send the URL via PM?
-
Can you send me your domain? PM or here if you're ok with that?
-
Hi Tom,
Thanks for your response, we ran our site through the links you provided and didn't have any issues. My developer is still not seeing any issues in our error logs on our side either. Any other ideas on how to fix this?
Thanks,
Brittany
-
Try running your site through https://technicalseo.com/seo-tools/fetch-render/ or https://redbot.org or https://www.screamingfrog.co.uk/seo-spider/ (the last one only for the pages returning 500s).
I have never seen a web application firewall produce 500 errors. It sounds to me like your server is underpowered or hitting some sort of code issue; definitely look through your logs.
If you can't get through redbot.org, call your developer.
Hope this helps,
Tom
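When digging through the logs, a quick scan for 500 responses usually narrows things down to specific URLs. Here is a minimal sketch; the sample log lines are hypothetical and assume a common/combined log format, so adjust the regex to your server's actual format:

```python
import re
from collections import Counter

# Hypothetical sample in common log format: request line, then status code.
SAMPLE_LOG = """\
203.0.113.5 - - [10/Jan/2018:10:00:01 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
203.0.113.9 - - [10/Jan/2018:10:00:02 +0000] "GET /pricing HTTP/1.1" 500 312 "-" "rogerbot/1.2"
203.0.113.9 - - [10/Jan/2018:10:00:03 +0000] "GET /features HTTP/1.1" 500 312 "-" "rogerbot/1.2"
"""

# Capture the request path and the 3-digit status after the quoted request.
LOG_PATTERN = re.compile(r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def count_500s(log_text):
    """Return a Counter of request paths that produced a 500 response."""
    hits = Counter()
    for line in log_text.splitlines():
        m = LOG_PATTERN.search(line)
        if m and m.group("status") == "500":
            hits[m.group("path")] += 1
    return hits

print(count_500s(SAMPLE_LOG))
```

If the 500s cluster on a crawler user agent (like Rogerbot here), that points at rate limiting or a bot rule rather than broken application code.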
-
My experiences with a 500 error on the WordPress platform have, in most circumstances, been caused by the memory limit set by WordPress. Assuming you're using WordPress, the way you would increase your memory limit is within wp-config.php, by changing the memory limit value to something along these lines:
define( 'WP_MEMORY_LIMIT', '256M' );
I'm pretty sure even an AWS EC2 micro instance has enough RAM to cover 256M. If it's not that, I would just run down this list of diagnostics:
https://www.lifewire.com/500-internal-server-error-explained-2622938
-
Hi there,
Tawny from Moz's Help Team here. I can confirm that we are getting a 500 response from your server. Unfortunately, other than making sure you're not blocking AWS and that Rogerbot is allowed, I'm not sure what else you can do. Have you checked over your server logs with your web developer to see if you can find any additional information as to why your server responded with a 500 error when we tried to crawl your site?
If you've got any more questions, feel free to reach out to us at help@moz.com and we'll do our best to sort things out!