I have a 404 error on my site that I can't find.
-
I have looked everywhere.
I thought it might have just shown up while I was making some changes, so in Webmaster Tools I marked it as fixed... it's still there. Even Moz Pro found it.
The error URL is http://mydomain.com/mydomain.com
No idea how it even happened. I thought it might be a plugin problem.
Any ideas how to fix this?
-
Well, I am testing a couple of things:
1. I changed a link on my site to point to an existing page.
2. I deleted a "coming soon" plugin that I'm not using.
3. I deleted an older sitemap I found via my FTP file manager.
If none of this works, then I have no idea what the problem is and will have to start eliminating plugins, I guess.
Thank you all for the help.
-
Perhaps you set the link to mydomain.com rather than http://mydomain.com, and your setup has prefixed it with the domain.
Screaming Frog is good.
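That would match how browsers resolve scheme-less hrefs: a link written as `href="mydomain.com"` is treated as a relative path and appended to the current page's URL, producing exactly the doubled URL in the 404. A quick sketch with Python's `urljoin` (mydomain.com stands in for the real domain):

```python
from urllib.parse import urljoin

# A href without a scheme ("mydomain.com" instead of "http://mydomain.com")
# is resolved relative to the page it appears on.
bad_link = urljoin("http://mydomain.com/", "mydomain.com")
print(bad_link)  # http://mydomain.com/mydomain.com

# With the scheme included, the link resolves as intended.
good_link = urljoin("http://mydomain.com/", "http://mydomain.com")
print(good_link)  # http://mydomain.com
```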
-
I couldn't find it with Open Site Explorer, but I did find it with Screaming Frog.
By then I already knew what it was, though.
I have a button on my home page that currently didn't have anywhere to go, so I had linked it back to my homepage. Not sure if that was the problem, but I have changed it to link to an existing page and will see what happens.
Thank you for the help.
If that fixes it, I will let you know.
-
Crawl the site with Screaming Frog. It will report the linking page for each broken URL.
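What a crawler does here can be sketched in a few lines of Python: parse each page's anchors and record which page every href came from. This is only a toy illustration (the page and URLs are made up), not a substitute for a real crawler:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects every <a href> found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_link_sources(pages):
    """pages maps URL -> HTML. Returns href -> list of pages that link to it."""
    sources = {}
    for url, html in pages.items():
        parser = LinkCollector()
        parser.feed(html)
        for href in parser.links:
            sources.setdefault(href, []).append(url)
    return sources

# Toy example: the home page carries the scheme-less link behind the 404.
pages = {"http://mydomain.com/": '<a href="mydomain.com">button</a>'}
print(find_link_sources(pages))  # {'mydomain.com': ['http://mydomain.com/']}
```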
-
Thank you
I will give it a try and let you know.
-
Hi Nathan,
Search for the page in Open Site Explorer and see if you can find where it is being linked from. Then you can go and update the link, and also 301 redirect the page to your homepage as an additional safety net for any links you miss.
Hope this helps!
Related Questions
-
Mobile first - what about content that you don't want to display on mobile?
ANOTHER mobile-first question. I have searched the forum and didn't see anything similar, so feel free to passive-aggressively link to an old thread. TL;DR: some content would just clutter the page on mobile but is worth having on desktop. Will this now be ignored on desktop searches? Long form: we have a few ecommerce websites. We're toying with the idea of placing a lot more text on our collection/category pages, primarily to try and set the scene for our products and sell the company a bit more effectively. It's also, obviously, an opportunity to include a couple of long-tail keywords. Because mobile screens are small (duh) and easily cluttered, we're inclined not to display this content on mobile. In this case, will any SEO benefit be lost entirely, even to searchers on desktop? Sorry if I've completely misunderstood mobile-first indexing! Just an in-house marketing manager trying to keep up! (cries into keyboard) Thanks for your time.
Technical SEO | MSGroup | Ross0
-
Why Can't Googlebot Fetch Its Own Map on Our Site?
I created a custom map using Google Maps creator and embedded it on our site. However, when I ran fetch and render through Search Console, it said the map was blocked by our robots.txt file. I read in the Search Console Help section that: "For resources blocked by robots.txt files that you don't own, reach out to the resource site owners and ask them to unblock those resources to Googlebot." I did not set up our robots.txt file. However, I can't imagine it would be set up to block Google from crawling a map. I will look into that, but before I go messing with it (since I'm not familiar with it), does Google automatically block their maps from their own Googlebot? Has anyone encountered this before? Here is what the robots.txt file says in Search Console:
User-agent: *
Allow: /maps/api/js?
Allow: /maps/api/js/DirectionsService.Route
Allow: /maps/api/js/DistanceMatrixService.GetDistanceMatrix
Allow: /maps/api/js/ElevationService.GetElevationForLine
Allow: /maps/api/js/GeocodeService.Search
Allow: /maps/api/js/KmlOverlayService.GetFeature
Allow: /maps/api/js/KmlOverlayService.GetOverlays
Allow: /maps/api/js/LayersService.GetFeature
Disallow: /
Any assistance would be greatly appreciated. Thanks, Ruben
Technical SEO | KempRugeLawGroup1
-
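The behavior described in that question can be reproduced with Python's stdlib `urllib.robotparser`: the trailing `Disallow: /` blocks every path the `Allow` lines don't explicitly permit, which is why an embedded map URL fails fetch and render. A minimal sketch (the example URL is made up):

```python
from urllib.robotparser import RobotFileParser

# A condensed version of the robots.txt quoted in the question.
rules = """\
User-agent: *
Allow: /maps/api/js?
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# An embedded-map URL matches no Allow line, so the final
# "Disallow: /" applies and the fetch is reported as blocked.
print(rp.can_fetch("*", "http://www.google.com/maps/d/embed?mid=xyz"))  # False
```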
Why doesn't SEOmoz see internal/external links on my site?
My SEOmoz analysis says that my site contains neither external nor internal links. I have used other tools and they have all seen the internal and external links on the pages. There aren't many, but they are there. Why isn't SEOmoz seeing them?
Technical SEO | iain0
-
I always get this error: "We have detected that the domain or subfolder does not respond to web requests." I don't know why. Please help.
Subdomain: www.nwexterminating.com
Subfolders:
www.nwexterminating.com/pest_control
www.nwexterminating.com/termite_services
www.nwexterminating.com/bed_bug_services
Technical SEO | NWExterminating0
-
Error msg 'Duplicate Page Content', how to fix?
Hey guys, I'm new to SEO and am getting the error message 'Duplicate Page Content'. Of course I know what it means, but my question is: how do you delete the old pages that have duplicate content? I used to run my website through Joomla! but have since moved to Shopify. I see that the duplicated site content is still from the old Joomla! site, and I would like to learn how to delete this content (or the best practice in this situation). Any advice would be very helpful! Cheers, Peter
Technical SEO | pjuszczynski0
-
Get rid of a large amount of 404 errors
Hi all, The problem: Google pointed out to me that I have a large increase in 404 errors. In short, I had software before that automatically created pages for long-tail search terms and fed them to Google. Recently I quit this service, and all those pages (about 500,000) were deleted. Now Google Webmaster Tools reports about 800,000 404 errors. What I noticed: I had a large number of 404s before, when I changed my website. I fixed it (proper 302) and as soon as all the 404s in GWM were gone I had around 200 more visitors a day. It seems that a clean site is better positioned. Anybody have any suggestions on how to tell Google that all URLs starting with www.domain/webdir/ should be deleted from its cache?
Technical SEO | hometextileshop0
-
What would you do if a site's entire content is on a subdomain?
Scenario: there is a website called mydomain.com. It is a new domain with about 300 inbound links (some going to the product pages and categories), and some of them are high-trust links. The website has categories a, b, c, etc., but they are all on a subdomain, so instead of being mydomain.com/categoryA/productname, the entire site's structure looks like subdomain.mydomain.com/categoryA/productname. Would you go to the effort of 301ing the subdomain URLs to the correct URL structure of mydomain.com/category/productname, or would you leave it as it is? Just interested as to the extent of the issues this could cause in the future, and whether this is something worth resolving sooner rather than later.
Technical SEO | Kerry220
-
Google has not indexed my site in over 4 weeks, what's the problem?
We recently put in permanent redirects to our new URL, but Google seems to not want to index the new URL. There were no problems with the old URL, and the new URL is brand new, so it should have no 'black marks' against it. We have done everything we can think of in terms of submitting sitemaps, telling Google our URL has changed in Webmaster Tools, mentioning the new URL on social sites, etc., but still nothing. It has been over 4 weeks now since we set up the redirects to the URL. Any ideas why Google seems to be choosing not to index it? Thanks
Technical SEO | cewe0