PageSpeed Insights DNS Issue
-
Hi
Is anyone else having problems with Google's PageSpeed Insights tool? I am trying to benchmark a couple of my sites but, according to Google, my sites are not loading. They will work when I run them through the test at one point, but if I try again, say 15 minutes later, they present the following error message:
An error has occurred
DNS error while resolving DOMAIN. Check the spelling of the host, and ensure that the page is accessible from the public Internet. You may refresh to try again.
If the problem persists, please visit the PageSpeed Insights mailing list for support.
This isn't too much of an issue for testing page speed, but I am concerned that if Google is getting this error on the speed test, it will also get it when trying to crawl and index the pages.
I can confirm the sites are up and running. The sites are pointed at the server via A records, and the records haven't been changed for many weeks, so it cannot be a DNS propagation issue. I am at a loss to explain it.
Any advice would be most welcome. Thanks.
-
Hi
Thanks for looking at the issue. There should be four working nameservers: I have four set in both WHM and at my domain registrar. I added two more (the third and fourth), so maybe they are taking a while to propagate around the web.
I will look at the SOA, thanks. Server and domain setup isn't at the top of my skill set. The domain mentioned in this thread is just a testing domain, set up to see what happens with a certain kind of content, so it hasn't been treated too seriously, to be honest.
-
It looks like you have some NS problems, according to IntoDNS. The biggest issue is that you have four nameservers listed with your registrar but only two listed in your DNS zone:
~]$ dig NS yoactive.co.uk
;; ANSWER SECTION:
yoactive.co.uk. 86400 IN NS ns2.ccgdigitalmarketing.com.
yoactive.co.uk. 86400 IN NS ns1.ccgdigitalmarketing.com.

You should have an NS record for each DNS server. Also, your SOA is set to expire in... 5 weeks?! That's crazy long.
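If you want to double-check the delegation yourself, something along these lines with dig should show the mismatch (just a sketch; the exact responses will vary):

dig NS yoactive.co.uk +trace                                 # walks down from the root and shows which nameservers the .co.uk registry hands out
dig NS yoactive.co.uk @ns1.ccgdigitalmarketing.com +short    # the NS records the zone itself publishes
dig SOA yoactive.co.uk +multiline                            # prints the SOA with its timers labelled, including the expire value

If the first two lists don't match, that's the registrar/zone mismatch described above.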
If you can't get this resolved with your host, you can always try CloudFlare, a free DNS + proxy service.
-
Thanks, very useful tool. None of my domains passed, but I can still access them. I will try moving away from the A-record setup and pointing the domains at nameservers instead.
-
Hi, you can check your DNS configuration here: http://dnscheck.pingdom.com. If there are problems resolving your domain, ask your hosting provider to set the DNS right; otherwise you will have to do it yourself in your hosting control panel, such as DirectAdmin or Plesk.
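If you prefer the command line, a rough equivalent of that check with dig looks like this (example.co.uk and ns1.example-host.com are placeholders for your own domain and nameserver; 8.8.8.8 and 1.1.1.1 are Google's and Cloudflare's public resolvers):

dig A example.co.uk @8.8.8.8 +short                # ask a public resolver, the same way an outside service like PageSpeed would
dig A example.co.uk @1.1.1.1 +short                # second public resolver, to rule out a one-off lookup failure
dig A example.co.uk @ns1.example-host.com +short   # ask your own nameserver directly to confirm it serves the record

If the public resolvers return nothing while your own nameserver answers, the problem sits in the delegation at the registrar rather than on the server.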
Regards, Leonie