A website that will not load on a particular computer? Help Me Please!
-
We took on a new client about two weeks ago, moved them off a proprietary CMS onto a WordPress site, optimized the site, etc., and were finishing up small details three days ago. Then the PC in my personal office suddenly would not load the site, whether from a Google search or from a direct URL.
Our office was using a D-Link wireless router, but my PC is hardwired. I cranked up my MacBook Pro with a solid-state drive (six months old), got on wireless, and... the site would not load there either. PCs and Macs in the offices around me would all load the site. A search online brought up a fix for the PC; I tried it and it did not work. I had my lead dev try it; it did not work. I called a server-side friend and he had never heard of such a thing. Every fix I found revolved around changing IP addresses and the like. I uninstalled the antivirus programs on my PC and installed every outstanding update; there was no new software installed on either box prior to the problem.
Can you help??? Is there any chance someone not associated with us, just searching for my client or entering the URL directly, could experience the same problem?
-
Yes, Woj, we were able to get to it from other PCs in the office. But good way to check, thanks.
-
Can you get to it using a proxy service like proxify.com?
-
OK, I will be working till late, so let us know how you go.
-
Thanks Alan,
Will try that when I go into the office in an hour or so.
-
Steve, I think it is something around a proxy server / DNS setting. I tried changing it based on a fix for XP, but to no avail.
The Mac is working from outside the office; I will recheck at the office in an hour or so. (OMG, working on a Saturday again.)
Thanks for help.
-
Thanks EGOL, did both and no good. I completely uninstalled AVG and disabled the Windows firewall. Still no good. The only fix I can find online is for Windows (I use XP Pro on that machine), and even that did not work. ???
-
Disable firewall or antivirus for a moment and try visiting the site. My firewall blocks a couple of sites or makes them perform poorly.
-
Rob, you will need to run nslookup on the PC with the problem.
It may be that the other, unaffected machines have the DNS entry cached and will eventually have the same problem.
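If cached DNS is the suspect, a quick way to see what a given machine's resolver actually returns is a few lines of Python (a sketch; the hostname you pass in would be the client's domain, which is not named in this thread):

```python
import socket

def resolve(hostname):
    """Return the IPv4 address the local resolver gives for hostname,
    or None if resolution fails (the symptom described above)."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# "localhost" should always resolve; run this with the client's domain
# on both an affected and an unaffected machine and compare the results.
print(resolve("localhost"))
```

For what it's worth, the OS-level caches can also be cleared by hand: `ipconfig /flushdns` on the XP box and `dscacheutil -flushcache` on a recent Mac (commands from general knowledge, not from this thread).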
-
Thanks Doug, I will check with Lead Dev as I know he worked on it for about 30 minutes in the command window.
We changed IP addresses, etc. to no avail.
-
Did you try nslookup from the PC, as Doug suggested?
-
Will PM the URL to you, Alan. Thanks.
-
Different computers on the same network / IP segment failing to load the site could point to a proxy server or a DNS setting.
I'm on Mac so that's what I know about.
What are your Mac network settings?
Do you get an IP address?
Can you load other websites from the same virtual hosting server?
Can you load the website via the IP address?
What's the URL of the website?
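On the "load the website via the IP address" test: with name-based virtual hosting, hitting the bare IP in a browser may serve the wrong site, so you have to send the Host header yourself. A sketch in Python (the IP and hostname shown are hypothetical placeholders):

```python
import http.client

def fetch_via_ip(ip, hostname, path="/", port=80):
    """GET a page directly from an IP address, supplying the Host header
    so a name-based virtual host still serves the intended site."""
    conn = http.client.HTTPConnection(ip, port, timeout=10)
    try:
        conn.request("GET", path, headers={"Host": hostname})
        return conn.getresponse().status
    finally:
        conn.close()

# Hypothetical example: if this returns 200 while the hostname itself
# won't load in a browser, the problem is name resolution, not the server.
# fetch_via_ip("203.0.113.10", "www.example.com")
```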
Steve
-
Are other sites loading normally?
Can you resolve the host name from a command window on the PC:
nslookup {hostname}
If you can't resolve the hostname, then it's probably a DNS issue. (You could try adding an entry to your hosts file and see if that gets around the problem.)
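For reference, a hosts-file entry is just an IP address followed by the hostname, one mapping per line. The address below is hypothetical; you would use whatever an unaffected machine resolves. On Windows XP the file lives at C:\WINDOWS\system32\drivers\etc\hosts, and on a Mac at /etc/hosts:

```
192.0.2.10    www.example.com
```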
-
When you say it does not load, what actually happens?
Can you PM a URL to me? Can it be loaded from outside the office?