A website that will not load on a particular computer? Help Me Please!
-
We took on a new client about two weeks ago, migrated them off a proprietary CMS onto a WordPress site, optimized the site, etc., and were finishing up small details three days ago. Then the PC in my personal office suddenly would not load the site, whether from a Google search, a direct URL, or anything else.
Our office was using a D-Link wireless router, but my PC is hardwired. I cranked up my MacBook Pro (six months old, solid-state drive), got on wireless, and... the site would not load there either. PCs and Macs in the offices around me would all load the site. A search online brought up a fix for the PC and I tried it; it did not work. I had the lead dev try it; it did not work. I called a server-side friend and he had never heard of such a thing. Every fix revolved around changing IP addresses, etc. I uninstalled the antivirus programs on my PC and installed every outstanding update; no new software had been installed on either box prior to the problem.
Can you help??? Is there any chance someone not associated with us, just searching for my client or entering the URL directly, could be experiencing the same thing?
-
Yes, Woj, we were able to get to it from other PCs in the office. But that's a good way to check, thanks.
-
Can you get to it using a proxy service like proxify.com?
-
OK, I will be working till late, so let us know how you go.
-
Thanks Alan,
Will try that when I go into office in an hour or so. -
I think it is something around a proxy server / DNS setting, Steve. I tried changing it based on a fix for XP, but to no avail.
The Mac is working from outside the office, and I will recheck in an hour or so at the office. (OMG, working on a Saturday again.)
Thanks for the help.
-
Thanks EGOL, did both and no good. Completely uninstalled AVG. Disabled Windows Firewall. Still no good. The only thing online is a fix for Windows (I use XP Pro on that machine), and even that did not work. ???
-
Disable firewall or antivirus for a moment and try visiting the site. My firewall blocks a couple of sites or makes them perform poorly.
-
Rob, you will need to do the nslookup on the PC with the problem.
It may be that the other, unaffected machines have the DNS result cached, and will eventually have the same problem.
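On that note, the affected machine may equally be holding a stale cached lookup, so flushing the local DNS cache is a cheap thing to try before anything else. A rough sketch of the usual per-OS commands (an assumption on my part that your machines use the stock tooling; exact commands vary by OS version):

```shell
# Pick the usual DNS cache flush command for the current OS (sketch only;
# these are the common commands, not guaranteed for every OS version).
os=$(uname -s 2>/dev/null || echo Windows)
case "$os" in
  Darwin) flush_cmd="dscacheutil -flushcache" ;;             # macOS
  Linux)  flush_cmd="sudo systemd-resolve --flush-caches" ;; # systemd distros
  *)      flush_cmd="ipconfig /flushdns" ;;                  # Windows / XP
esac
echo "Try: $flush_cmd"
```

Run the suggested command in a terminal (or the XP command window), then retry the site.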
-
Thanks Doug, I will check with Lead Dev as I know he worked on it for about 30 minutes in the command window.
We changed IP addresses, etc. to no avail.
-
Did you try nslookup from the PC, as Doug suggested?
-
Will PM the URL to you Alan. thx
-
Different computers on the same network / IP segment not loading the site could be a Proxy server OR DNS Setting.
I'm on Mac so that's what I know about.
What are your Mac network settings?
Do you get an IP address?
Can you load other websites from the same virtual hosting server?
Can you load the website via the IP address?
What's the URL of the website?
Steve
-
Are other sites loading normally?
Can you resolve the host name from a command window on the PC:
nslookup {hostname}
If you can't resolve the hostname, then it's probably a DNS issue. (You could try adding an entry to your hosts file and see if that gets around the problem.)
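To make the same check without fighting the command window, here is a minimal Python sketch of a resolver test (the hostname passed in is a placeholder; swap in the client's real domain):

```python
import socket

def can_resolve(hostname):
    """Return the resolved IPv4 address, or None if DNS resolution fails."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# "localhost" should resolve on almost any machine; replace it with the
# client's domain to test the PC's resolver against that site.
print(can_resolve("localhost"))
```

If this returns None for the client's domain on the problem PC but an IP elsewhere, that points squarely at DNS on that machine, and the hosts-file workaround above should get you browsing while you chase the real cause.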
-
When you say it does not load, what does happen?
Can you PM a URL to me? Can it be loaded from outside the office?
Related Questions
-
Can you help by advising how to stop a URL from referring to another URL on my website with a 404 error, please?
How do I stop a URL from referring to another URL on my site? I'm getting a 404 error on a referred URL, (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/[null id=43484]), referred from the URL (https://webwritinglab.com/know-exactly-what-your-ideal-clients-want-in-8-easy-steps/). The referred URL is the page that I want, and I do not need it redirecting to the other URL, as that's presenting a 404 error. I have tried re-saving the permalink in WordPress and recreating the .htaccess file, and the problem is still there. Can you advise how to fix this, please? Is it a case of removing the redirect? Is that advisable, and how do I do it? Thanks
Technical SEO | | Nichole.wynter20200 -
My Android website not showing in results
My website indexes fine, and it's at the top for a few keywords, but it's not at the top for many others. Sometimes it shows in the results but then disappears after a while. What do you think? The URL is https://android-apk.org
Technical SEO | | moztabliq10 -
Will canonical solve this?
Hi all, I look after a website which sells a range of products. Each of these products has different applications, so each product has a different product page. For example:
Product one for x application
Product one for y application
Product one for z application
Each variation page has its own URL, as if it were a page of its own. The text on each of the pages is slightly different depending on the application, but generally very similar. If I were to create a generic page for product one and add canonical tags to all the variation pages pointing to this generic page, would that solve the duplicate content issue? Thanks in advance, Ethan
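For what it's worth, the usual shape of that setup is a rel=canonical link in the head of each variation page pointing at the generic product page. A minimal sketch with placeholder URLs (not the site's real ones):

```html
<!-- Placed in the <head> of each variation page,
     e.g. /product-one-x-application/ (hypothetical paths) -->
<head>
  <link rel="canonical" href="https://www.example.com/product-one/" />
</head>
```

Note that search engines treat the canonical as a hint, not a directive, and are likelier to honour it when the pages really are near-duplicates, as described above.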
Technical SEO | | Analoxltd0 -
Please Help! Crawl & Site Errors - Will This Impact My SEO?
Hello Moz, I need urgent help. I removed a tonne of product pages and put everything into one product page to deal with duplicate content. I thought this was a good thing to do until I got an email from Google saying: "Googlebot identified a significant increase in the number of URLs on ****.com that return a 404 (not found) error." I checked it out and found the problem: 4 soft 404s and 41 Not Founds. What do I need to do to fix this? Is it a problem, or should I just ignore it? I removed all the pages in WordPress, but do I need to do something manually through Google as well? I have worked so hard on my SERPs that this will destroy me if I'm penalised. Please can someone advise?
Technical SEO | | crocman0 -
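Since the question above is about removed pages now returning 404s, the standard cleanup is a 301 (permanent) redirect from each removed product URL to the combined page. A minimal .htaccess sketch, with made-up paths for illustration only:

```apache
# Hypothetical paths: send each removed product page (301, permanent)
# to the single combined product page so the 404s stop accumulating.
Redirect 301 /products/old-widget-red/  /products/widgets/
Redirect 301 /products/old-widget-blue/ /products/widgets/
```

With the redirects in place, Googlebot follows them on recrawl and the 404 reports drain away on their own.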
Duplicate content on charity website
Hi Mozers, We are working on a website for a UK charity – they are a hospice and have two distinct brands, one for their adult services and another for their children's services. They currently have two different websites which have a large number of pages that contain identical text. We spoke with them and agreed that it would be better to combine the websites under one URL – that way a number of the duplicate pages could be reduced, as they are relevant to both brands. What seemed like a good idea initially is beginning to not look so good now. We had planned to use CSS to load different style sheets for each brand – depending on the referring URL (adult / child), the page would display the appropriate branding. This will work well up to a point. What we can't work out is how to style the page if it is the initial landing page – the brands are quite different and we need to get this right. It is not such an issue for the management-type pages (board of trustees etc.) as they govern both identities. The issue is the donation and fundraising pages – they need to be found, and we are concerned that users will be confused if one of those pages is the initial landing page and they are served the wrong brand. We have thought of making one page the main page and using rel canonical on the other one, but that will affect its ability to be found in the search engines. Really not sure what the best way to move forward would be; any suggestions / guidance would be much appreciated. Thanks Fraser.
Technical SEO | | fraserhannah0 -
Check my website loading time
Kindly check my website loading time for the home page and deep pages. Do I need to make it faster, or is it okay? Website - brandstenmedia.com.au
Technical SEO | | Green.landon0 -
A few pages deindexed from Google .. PLEASE HELP!
My client has a fairly new site and we were aggressively building content for the website. It is an ecommerce store, and we have a blog as well. We guest blogged in a few places and wrote 3-5 articles a day. In the last few days, I noticed that 3-4 pages we were building links to got deindexed. What could be the reason? We weren't using any bots to build links, only a couple of them, around 5-10 links to a page. Google WMT is not showing any messages, and no manual action is seen. I've submitted those URLs for reindexing, and so far nothing seems to work. Any idea? Please help.
Technical SEO | | WayneRooney0 -
403 Forbidden error on website
Hi Mozzers, I got a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked on http://gsitecrawler.com/tools/Server-Status.aspx
Technical SEO | | MaartenvandenBos
result:
URL=http://www.eindexamensite.nl/ **Result code: 403 (Forbidden / Forbidden)**
When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess. The .htaccess code:
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php
# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching
DirectoryIndex index.html
CMS is TYPO3. Any ideas? Thanks!
Maarten