Redirects and sitemap aren't showing
-
We had a malware hack and spent three days trying to get Bluehost to fix things. Since they made their changes, two things have been happening:
1. Our XML sitemap at https://www.caffeinemarketing.co.uk/sitmap.xml cannot be created. We have tried external tools as well.
2. We had 301 redirects from the http (www and non-www versions) and the https:// (non-www version) throughout the whole website to https://www.caffeinemarketing.co.uk/ and its subsequent pages.
Whilst the redirects seem to be happening, when you check in tools such as https://httpstatus.io, every version of every page now returns only a 200 code, whereas before they were showing the 301 redirects.
Have Bluehost messed things up? I hope you can help.
Thanks
-
I agree with what effectdigital said. It looks like everything is in place, and your non-www and http versions of the website are redirecting to the https-www version of the site.
-
That attachment shows that non-HTTPS and non-WWW URLs are being 301 redirected to the HTTPS-WWW version(s). That's what you want, right? From your screenshot it looks like it is working as intended.
Just so you know, when you put one architecture into Screaming Frog (e.g. you enter HTTP with no WWW), it doesn't limit the crawl to that specific architecture. If the crawler is redirected from the non-WWW, non-HTTPS version to HTTPS-with-WWW, it will carry on crawling THAT version of the site.
If you wanted to crawl all of the old HTTP-non-WWW URLs, you would need to feed them to Screaming Frog in list mode and alter the crawler's settings to 'contain' it to just the list of URLs you entered. I'm pretty sure you would then see that most of the HTTP-non-WWW URLs are redirecting properly, as they should be.
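If you don't have Screaming Frog to hand, the same check can be sketched with a short script that requests each legacy URL but refuses to follow redirects, so a 301 is reported as a 301 rather than as the 200 at the end of the chain (the URL list is just an illustration):

```python
import urllib.error
import urllib.request


class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Redirect handler that refuses to follow redirects."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # returning None makes urllib raise HTTPError(3xx)


def first_status(url):
    """Return the status code of the FIRST response for `url`."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        # With redirects suppressed, 3xx (and 4xx/5xx) land here.
        return e.code


# Example usage (requires network access):
# for url in ["http://caffeinemarketing.co.uk/",
#             "http://www.caffeinemarketing.co.uk/",
#             "https://caffeinemarketing.co.uk/"]:
#     print(url, first_status(url))
```

A page behind a working redirect should report 301 here; if everything comes back 200, the redirects really have been removed at the server.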
As for the XML issue, it's very common, especially for people using Yoast. I think Yoast is really good, by the way, but for some reason, on some hosting environments, the XML sitemap starts rendering blank. Most of the time hosting companies say they can't fix it and that it's Yoast's fault, but I don't really believe that. If a file (e.g. sitemap.xml) cannot be created, it's more likely that they went in via FTP and changed some file read/write permissions; because things are now more locked down, the XML can no longer be created. Since you were hacked by malware, they were likely over-zealous when locking your site back down, and that's causing problems for your XML feed(s).
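If you want to test the permissions theory, a quick sketch is to check whether the user your site runs as can actually write to the sitemap location (the path and the `www-data` user below are placeholders, not details from this thread):

```python
import os


def can_write(path):
    """True if the current user could create or overwrite `path`."""
    if os.path.exists(path):
        return os.access(path, os.W_OK)
    # The file doesn't exist yet, so creation depends on the directory.
    return os.access(os.path.dirname(path) or ".", os.W_OK)


# Example usage, run as the web server user on a typical LAMP host:
#   sudo -u www-data python3 check_sitemap.py
# print(can_write("/var/www/html/sitemap.xml"))  # placeholder path
```

If this comes back False after a clean-up, loosening the file or directory permissions (or asking the host to) is the first thing to try. Note that running the check as root is misleading, since root can write almost anywhere regardless of permissions.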
-
See attachment.
-
Hi, are you able to interpret this for me, please? It looks like the non-www versions are showing as the https://www version with a 200. The home page looks like the only 301?
-
Hi Carrie,
For your 301 redirects at the root level, it sounds like the .htaccess file has changed on the server. Can you try validating those other http and non-www versions of the website through other tools like Screaming Frog? If you're still getting 200 response codes, I would advise raising the issue with Bluehost, as this is something they can fix.
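For what it's worth, a sitewide force-to-HTTPS-www redirect is typically just a few mod_rewrite lines in .htaccess; something along these lines (a sketch, assuming Apache with mod_rewrite, not necessarily what Bluehost had in place):

```apache
RewriteEngine On

# Anything that is not already HTTPS + www gets a single 301 to the
# canonical host, preserving the requested path and query string.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.caffeinemarketing.co.uk%{REQUEST_URI} [R=301,L]
```

If the malware clean-up replaced or restored .htaccess, rules like these may simply have been dropped, which would explain every URL answering 200.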
As for the XML sitemap, do you mean that you're unable to upload a file to that location? Have you tried SFTP?