I have a 404 error on my site that I can't track down.
-
I have looked everywhere.
I thought it might have just shown up while I was making some changes, so in Webmaster Tools I marked it as fixed... but it's still there. Even Moz Pro found it.
The error is http://mydomain.com/mydomain.com
No idea how it even happened. I thought it might be a plugin problem.
Any ideas how to fix this?
-
Well, I am testing a couple of things:
1. I changed a link on my site to point to an existing page.
2. I deleted a "coming soon" plugin that I'm not using.
3. I deleted an older sitemap I found in my FTP file manager.
If none of this works, then I have no idea what the problem is and will have to start eliminating plugins, I guess.
Thank you all for the help.
-
Perhaps you set the link to mydomain.com rather than http://mydomain.com, and your setup prefixed it with the domain.
Screaming Frog is good.
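To see why a missing scheme produces that exact URL, here is a small sketch using Python's standard-library URL resolution (the domain is the placeholder from the question): a protocol-less href is treated as a relative path and resolved against the current page's address, which is how `http://mydomain.com/mydomain.com` appears.

```python
from urllib.parse import urljoin

# An href without "http://" is a *relative* reference, so the browser
# resolves it against the page it appears on:
broken = urljoin("http://mydomain.com/", "mydomain.com")
print(broken)  # http://mydomain.com/mydomain.com

# With the scheme included, the link resolves as intended:
fixed = urljoin("http://mydomain.com/", "http://mydomain.com")
print(fixed)  # http://mydomain.com
```

This is the same resolution rule (RFC 3986) that browsers apply, which is why the malformed link only shows up once a crawler follows it.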
-
I couldn't find it with Open Site Explorer, but I did find it with Screaming Frog.
By then I already knew what it was, though. I have a button on my home page that didn't have anywhere to go yet, so I had linked it back to my homepage. Not sure if that was the problem, but I have changed it to link to an existing page and will see what happens.
Thank you for the help. If that fixes it, I will let you know.
-
Crawl the site with Screaming Frog. It will report the referring (linking) page.
-
Thank you. I will give it a try and let you know.
-
Hi Nathan,
Search for the page in Open Site Explorer and see if you can find where it is being linked from. Then you can go and update it, and also 301 redirect the page to your homepage as an additional safety net for any links you missed.
Hope this helps!
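For the 301 safety net, a minimal sketch for an Apache `.htaccess` file, assuming the broken path is exactly `/mydomain.com` (adjust the path and domain to your actual setup; a WordPress redirect plugin would do the same job):

```apache
# Hypothetical rule: permanently redirect the malformed path to the homepage.
# mod_alias's Redirect matches by path prefix, so keep the path as specific as possible.
Redirect 301 /mydomain.com http://mydomain.com/
```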