Sorry, but that URL is inaccessible
-
Our site is entirely on the https protocol, so every time I use the On-Page Grader it tells me the link is unavailable. What can we do? When I use the http protocol (which is 301-redirected to the https version), it still gives me the same message.
-
Seconded.
-
I would also definitely like to see this added, as ours is also in the finance sector and we have every page using https.
-
I would definitely like to see this added, as I have about 6 sites in the finance sector and nearly every page now uses https.
Cheers
-
It worked on the previous version, didn't it? I remember running a few reports for a client's website that now don't work. I seem to be getting a lot of "Website cannot be accessed" errors on websites that can be accessed.
I, too, think there is a bug in the system. I'm not one to cause a fuss, but I don't like the thought of paying a lot of money for a service when my main usage is of a product that just doesn't work, and the response being "Well, yeah, sorry, it just doesn't work. We MIGHT add it in."
I know it's not your fault at all; it's just a shame that the service doesn't work on very common websites.
-
Abe,
Thanks for the response. The only thing I find odd is that the reports Moz runs weekly are able to give me On-Page Grades, even when hitting pages on the same site. It seems like there might be something else going on. Any ideas?
bart
-
Hello Bart,
At this time, our On-Page Grader is not able to recognize https (secure) sites. This is the reason for the error you are seeing. Because the http URL you are entering actually 301-redirects to the https site, our On-Page Grader will still see it as an https site. We do hope to add this functionality in the future. I am truly sorry for any inconvenience this may cause.
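This also explains why entering the http address doesn't help: the grader follows the 301 and still lands on an https URL. A small sketch of that logic, with the redirect rules modeled as a dictionary and example.com standing in for the real site:

```python
from urllib.parse import urlparse

# Hypothetical redirect map standing in for the site's 301 rules:
# every http URL redirects to its https twin.
REDIRECTS = {"http://example.com/": "https://example.com/"}

def final_url(url):
    """Follow redirects in the map until the URL stops changing."""
    while url in REDIRECTS:
        url = REDIRECTS[url]
    return url

# The grader ends up on the https scheme either way.
print(urlparse(final_url("http://example.com/")).scheme)
```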
Related Questions
-
Changing a page URL
I have a page that ranks well (#4) for a good keyword. However, the URL contains the keyword but it is misspelled. I would like to change the URL to the correct spelling, but I do not want to lose the ranking I have. What is the best and safest way to proceed?
On-Page Optimization | bhsiao0
-
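The usual answer to the question above is a 301 (permanent) redirect from the misspelled URL to the corrected one, which passes along most of the ranking signals. A minimal sketch, assuming an Apache server and hypothetical paths:

```apache
# .htaccess sketch (hypothetical paths): permanently redirect the
# misspelled URL to the correctly spelled one.
Redirect 301 /old-misspelled-url /correctly-spelled-url
```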
Content with changing URL and duplicate content
Hi everyone, I have a question regarding content (user reviews) that changes URL all the time. We get a lot of reviews from users who have dined at our partner restaurants, which get posted on our site under (new) “reviews”. My worry, however, is that the URL for these reviews changes all the time. The reason is that they start on page 1 and then get pushed down to page 2, and so on, as new reviews come in. http://www.r2n.dk/restaurant-anmeldelser I’m guessing this could cause serious indexing problems? I can see in Google that some reviews are indexed multiple times with different URLs, and some are not indexed at all. We furthermore have the specific reviews under each restaurant profile. I’m not sure if this could be considered duplicate content? Maybe we should tell Google not to index the “new reviews” section by using robots.txt. We don’t get much traffic on these URLs anyway, and all reviews are still under each restaurant profile. Or maybe the canonical tag can be used? I look forward to your input. Cheers, Christian
On-Page Optimization | Christian_T2
-
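A per-page alternative to the robots.txt idea mentioned in the question is a robots meta tag on the paginated "new reviews" listing pages, leaving the copies under each restaurant profile as the indexable versions. A sketch (the tag would go in each listing page's head):

```html
<!-- Sketch: keep paginated review-listing pages out of the index while
     still letting crawlers follow the links to the restaurant profiles. -->
<meta name="robots" content="noindex, follow" />
```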
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide, and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood it is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is, {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path: www.mydomain.com/{area} and www.mydomain.com/{state}.
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes).
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix0
-
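Option 2's nested path from the question can be sketched as a simple slug join; www.mydomain.com and the slugs below are the placeholders the question itself uses:

```python
def location_url(state, city, location):
    """Build the Option 2 nested location path from its slugs."""
    return f"www.mydomain.com/locations/{state}/{city}/{location}"

print(location_url("fl", "orlando", "waterford-lakes"))
# → www.mydomain.com/locations/fl/orlando/waterford-lakes
```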
Errors in URLs
SEOmoz is showing quite a lot of URL errors like this: http://trampoliny.net.pl/akcesoria/pokrowiec-basic?frontend=1825cb1eea3af8ee6ee2d96617d32ff6 All these URLs use the parameter "?frontend=". In Webmaster Tools we told Google not to index this parameter. Unfortunately, at the moment we cannot set this parameter to "NOINDEX". We also don't want to use a robots.txt file. How do we get rid of these URLs in SEOmoz?
On-Page Optimization | drgoodcat0
-
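Since all the duplicates in the question above hang off the single "?frontend=" parameter, one canonical-tag angle (if page headers can be edited at all) is to point each parameterized URL at its parameter-free version. A sketch of computing that clean URL, assuming Python is available to generate the tags:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_param(url, param):
    """Return the URL with the given query parameter removed."""
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunparse(parts._replace(query=urlencode(query)))

print(strip_param(
    "http://trampoliny.net.pl/akcesoria/pokrowiec-basic?frontend=1825cb1eea3af8ee6ee2d96617d32ff6",
    "frontend",
))
# → http://trampoliny.net.pl/akcesoria/pokrowiec-basic
```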
What are the best practices for Google indexing AJAX response URLs?
We just did a build on our site, and our server errors went up to over 9,500. After looking into it, it seems like Google is crawling the AJAX URLs and coming back with the errors. Here is one example: http://www.rockymountainatvmc.com/productDetail.do?navType=type&vehicleId=1423&webTypeId=68&navTitle=Drive&webCatId=9&prodFamilyId=29660 If you know of any good articles on this, please send them my way.
On-Page Optimization | DoRM0
-
URL and SEO
How much weight do search engines give the URL? We're a medical call center provider and medicalcallcenter is part of our URL. Does that help us much? Thanks!!
On-Page Optimization | THMCC0
-
Duplicate Content: Best-Practice Usage of the Canonical URL
Canonical URLs stop self-competition from duplicate content. So instead of two pages each with a rank of 5 out of 10, it is one page with a rank of 7 out of 10. However, what disadvantages come from using canonical URLs? For example, am I excluding some products, like "green widget" or "blue widget"?
I have a customer with two e-commerce websites (selling different manufacturers of a type of jewellery). Both websites have massive duplicate content issues. It is a hosted CMS system with very little SEO functionality, no plugins, etc. The crawling report comes back with 1000s of pages that are duplicates. It seems that almost every page on the website has a duplicate partner or more. The problem starts with their having two categories for each product type, instead of one: a wholesale category and a small-pack category. So I have considered using a canonical URL, or de-optimizing the small-pack category, as I believe it receives less traffic than the wholesale category. On the original website I tried de-optimizing one of the pages that gets less traffic. I did this by changing the order of the meta title (keyword at the back, not the front, by starting with "small"). I also removed content from the page. This helped a bit. Or I was thinking about just using a canonical URL on the page that gets less traffic. However, what are the implications of this? If someone searches for "small packs" of the product, will that page no longer be indexed?
The next problem I have is the other 1000s of pages that are showing as duplicates. These are all the different products within the categories. The CMS does not have a front office that allows canonical URLs to be inserted; instead it would have to be done by going into the HTML of the pages, which would take ages. Another issue is that these product pages are not actually duplicates, but I think they have so little content that Roger (the SEOmoz crawler, and probably Google's too) can't tell the difference. Also, even if I did use the canonical URL, what happens if people search for the product by its attributes (the variations of each product type), like "blue widget", "black widget", "brown widget"? Would these all be excluded from Google's index?
On the one hand I want to get rid of the duplicate content, but I also want these pages included in search. Perhaps I am taking too idealistic an approach, trying to optimize the website for too many keywords. Should I just focus on the category keywords and forget about product variations? Perhaps I should look into Google Analytics to determine the top landing pages and which ones should get a canonical. Also, this website (hosted CMS) seems to have more duplicate content issues than other e-commerce sites I have applied SEOmoz to.
One final related question: the first website has two landing pages; I think this is a technical issue. For example, www.test.com and www.test.com/index. I realise I should use a canonical URL on the page that gets less traffic. How do I determine this? (Or should I just use the SEOmoz page rank tool?)
On-Page Optimization | WMA
-
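For the www.test.com vs. www.test.com/index duplicate raised at the end of that question, the usual fix is a canonical tag on the secondary URL pointing at the primary one (Analytics landing-page traffic, as suggested, is a reasonable way to pick which is primary). A sketch, assuming the root URL is the keeper:

```html
<!-- Sketch: placed in the <head> of www.test.com/index, telling search
     engines that www.test.com is the canonical version of the page. -->
<link rel="canonical" href="http://www.test.com/" />
```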
20 x '400' errors in site but URLs work fine in browser...
Hi, I have a new client set up in SEOmoz and the crawl completed this morning. I am picking up 20 x '400' errors, but the pages listed in the crawl report load fine in a browser... any ideas? Example: http://www.morethansport.co.uk/products?sortDirection=descending&sortField=Title&category=women-sports clothing
On-Page Optimization | Switch_Digital0
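A likely culprit in the example URL above is the literal space in the category value ("women-sports clothing"): browsers quietly percent-encode it before sending the request, while a crawler sending the raw space can get a 400 Bad Request back. A quick sketch of the encoding browsers apply:

```python
from urllib.parse import quote

# The category value from the example URL contains a literal space.
raw_value = "women-sports clothing"

# Browsers send the percent-encoded form; a client sending the raw
# space may be rejected with a 400 Bad Request.
print(quote(raw_value))
# → women-sports%20clothing
```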