"Removed by request" error in Google Search Console
-
Hey there,
I have a website that has been doing well on Google, but the last 2-3 times I've checked Google Search Console I've gotten a "Removed by request" error. Here is a screenshot: http://prntscr.com/iva5y0
My question is: I have not asked Google to remove my homepage URL, so:
1. Why am I getting this warning again and again?
2. Will this harm my site's current traffic and rankings?
3. What steps do I need to take to stop this warning from coming back?
Thanks in advance, and please help me out with this query.
-
Sorry for disappearing on you - any updates here?
-
Yes, just the homepage gets this error, and it still ranks - but when we type site:domain.com, the homepage's meta description doesn't show. Clicks have also dropped this week.
-
Wait - so, the pages that Google says have been removed are still ranking?
-
Yes, I have checked - all pages are indexed properly, yet I keep getting the warning every two days.
-
You're right - this will lose you traffic! If Google says a page is blocked, that means Google will not list it on its results pages.
Have you checked to see if there's a noindex tag on the pages that Google says have been removed?
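If it helps, here's a quick way to check both places a noindex directive can hide - the HTTP headers and the page's head. This is a minimal sketch assuming Python with the requests and beautifulsoup4 packages installed; the URL is a placeholder for your homepage:

```python
# Sketch: look for noindex directives on a page.
# The URL below is a placeholder - swap in your homepage.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
resp = requests.get(url, timeout=10)

# A noindex directive can be sent as an HTTP response header...
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

# ...or as a robots meta tag in the page's <head>.
soup = BeautifulSoup(resp.text, "html.parser")
for tag in soup.find_all("meta", attrs={"name": "robots"}):
    print("meta robots:", tag.get("content"))
```

It's also worth opening the Remove URLs tool in Search Console to see whether a removal request was ever submitted for the homepage - even accidentally, or by someone who worked on the site before you - since that's what normally produces a "removed by request" status.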
Related Questions
-
How do I fix a fatal error message?
I am trying to remove a robots.txt file I put in my root domain a while back, because I didn't know what I was doing. Every time I enter my domain (domain.com/robots.txt) I get a fatal error message. How do I fix it?
Technical SEO | icebergsal
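One quick diagnostic for the question above: see what the server actually returns for that path. A fatal error page usually means the request is being handed to the application (e.g. PHP) instead of being served as a static file. A minimal sketch, with a placeholder domain:

```python
# Sketch: inspect what the server serves at /robots.txt.
# The domain below is a placeholder.
import requests

resp = requests.get("https://www.example.com/robots.txt", timeout=10)
print(resp.status_code)                  # 200 means something was served
print(resp.headers.get("Content-Type"))  # a static file should be text/plain
print(resp.text[:300])                   # the start of whatever came back
```

If the body is a PHP fatal error rather than your rules, the file on disk is likely missing or a rewrite rule is routing /robots.txt into the CMS.
-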
Spammy Structured Data Markup Removal
Hi there, I'm in a weird situation and I'm wondering if you can help me. Here we go: we had some of our developers implement structured data markup on our site, and they obviously did not know what they were doing. They messed up our results in the SERP big time, and we wound up getting manually penalized for it. We removed those markups and got rid of the penalty (phew). However, we are now stuck with two issues. We changed the URLs of some pages, so the old URLs are now dead pages that redirect to the newer version of the same page, but two things have happened:
a) For some reason, two of the old dead pages still come up in the Google SERP, even though it's been over six weeks since we changed the URLs. We made sure we aren't linking to the old version of the URL anywhere on our site.
b) Those two old URLs show up in the SERP with the old spammy markup. There is nowhere to remove the markup from, because those pages no longer exist, so obviously the markup code isn't anywhere anymore.
We need a solution for getting the markup out of the SERP. We thought of one idea that might help: create new pages at those old URLs, make sure there is nothing spammy on them, and tell Google not to index the pages - hopefully that will get Google to de-index them. Is this a good idea? If yes, is there anything I should know about or watch out for? Or do you have a better one for me? Thanks so much
Technical SEO | Joseph-Green-SEO
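If you do go the route described in the question above (recreating the old URLs as clean placeholder pages), the usual way to tell Google not to index them is a robots meta tag in each placeholder's head. A minimal sketch - the page itself is hypothetical:

```html
<head>
  <title>Placeholder for a retired URL</title>
  <!-- Tells crawlers not to keep this page in the index -->
  <meta name="robots" content="noindex">
</head>
```

One thing to watch out for: Google has to be able to crawl the page to see the tag, so don't block those URLs in robots.txt at the same time - and note you'd be trading the existing redirects for these placeholders.
-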
Duplicate content and 404 errors
I apologize in advance, but I am an SEO novice and my understanding of code is very limited. Moz has raised a lot (several hundred) of duplicate-content and 404-error flags on the ecommerce site my company takes care of. For the duplicate content, some of the pages it says are duplicates don't even seem similar to me; additionally, a lot of them are static pages where we embed images of size charts that we use as popups on item pages. It says these issues are high priority, but how bad is this? Is it just a problem because, if pages have similar content, the spider won't know which one to index? Also, what is the best way to handle the URLs returning 404 errors? I should probably have a developer look at these issues, but I wanted to ask the extremely knowledgeable Moz community before I do 🙂
Technical SEO | AliMac26
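Where the flagged pages really do share content (like the size-chart pages described above), the usual fix is a canonical link in each variant's head pointing at the version you want indexed - a sketch with a placeholder URL:

```html
<head>
  <!-- Points crawlers at the preferred version of this content -->
  <link rel="canonical" href="https://www.example.com/size-chart/">
</head>
```

For the 404s: if a URL has simply moved, a 301 redirect to its replacement is the standard fix; if it never existed or nothing worth keeping points at it, letting it return 404 is fine.
-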
How to fix an 803 error?
Error Code 803: Incomplete HTTP Response Received. How can I fix this error?
Technical SEO | netprodjb
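For context, that error means the crawler's connection was cut off before the full response body arrived. One way to test from your own machine whether responses are being truncated - a minimal sketch with a placeholder URL, assuming Python's requests package:

```python
# Sketch: detect an incomplete/truncated HTTP response.
# The URL below is a placeholder.
import requests

try:
    resp = requests.get("https://www.example.com/", timeout=30)
    body = resp.content  # forces the full body to be read
    print("OK:", resp.status_code, len(body), "bytes received")
except requests.exceptions.ChunkedEncodingError:
    print("Server closed the connection before the body finished")
except requests.exceptions.ConnectionError as err:
    print("Connection dropped:", err)
```

If this fails intermittently, the usual suspects are an overloaded server, an aggressive firewall, or a proxy/CDN timing out mid-response - something for your host to look into.
-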
Schema Markup Errors - Priority or Not?
Greetings all... I've been digging through Search Console on a few of my sites and noticing quite a few structured data errors. Most of the errors are related to hcard, hentry, and hatom: most are missing author and entry-title, while the others are missing fn. I recently saw an article on SEL about Google's focus on spammy markup. The sites I use are built and managed by vendors, so I would have to impress upon them the impact of these errors and have them prioritize, then fix them. My question is whether or not this should be prioritized. Should I have them correct these errors sooner rather than later, or can I take a phased approach? I haven't noticed any loss in traffic or anything like that; I'm more focused on what negative impact a "phased approach" could have. Any thoughts?
Technical SEO | AfroSEO
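For reference, those names all come from the hAtom and hCard microformats that many themes ship with: each hentry is expected to carry an entry-title and an author, and the author is an hCard whose formatted name is the fn. A minimal, hypothetical post template with all of them present looks roughly like this:

```html
<article class="hentry">
  <h1 class="entry-title">Post title here</h1>
  <time class="published updated" datetime="2018-04-06">April 6, 2018</time>
  <address class="author vcard">
    By <span class="fn">Author Name</span>
  </address>
  <div class="entry-content">
    Post body here...
  </div>
</article>
```

Since this markup lives in the theme templates rather than in individual posts, a vendor would typically only need to patch the template once per site.
-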
500 Server Error on RSS Feed
Hi there, I am getting multiple 500 errors on my RSS feed. Here is the error:

Title: 500 : Error
Meta Description:
Traceback (most recent call last):
  File "build/bdist.linux-x86_64/egg/downpour/__init__.py", line 391, in _error
    failure.raiseException()
  File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException
    raise self.type, self.value, self.tb
Error: 500 Internal Server Error
Meta Robots: Not present/empty
Meta Refresh: Not present/empty

Any ideas as to why this is happening? They are valid feeds.
Technical SEO | mistat2000
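The traceback shown is just the crawler re-raising the error, and the last line is the relevant part: the server answered with a 500. One way to check whether the feed only fails for non-browser clients - a sketch where the feed URL is a placeholder and the user-agent string is my assumption for Moz's crawler:

```python
# Sketch: fetch the feed the way a crawler would.
# Feed URL is a placeholder; "rogerbot" is assumed to be the crawler's UA.
import requests

resp = requests.get(
    "https://www.example.com/feed.xml",
    headers={"User-Agent": "rogerbot"},
    timeout=30,
)
print(resp.status_code)   # 500 here would confirm a server-side failure
print(resp.text[:300])    # the error body often names the failing script
```

If the feed returns 200 in a browser but 500 here, the server is likely treating unfamiliar user agents differently - security plugins and bot-blocking rules are common causes.
-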
Weird 404 Errors in Webmaster Tools
Hi, in a regular check of Webmaster Tools, I noticed a sudden increase in the number of "not found" 404 errors. So I've been looking at them, and something weird is going on. There are well over 100 pages with 404 errors, and the funny thing is, none of the URLs are correct. For example, if the actual URL is something like www.domain.com/latest-reviews, the 404 error points to a non-existent URL like www.domain.com/latest-re. And when I checked where they were linked from, they are all from spammy sites. Anyone know what could be causing these links? Why would anyone link on purpose to a non-existent page? Cheers,
Technical SEO | Gamer07 -
How to remove crawl errors in Google Webmaster Tools
In my Webmaster Tools account it says that I have almost 8,000 crawl errors, most of which are HTTP 403 errors. The URLs look like:
http://legendzelda.net/forums/index.php?app=members&section=friends&module=profile&do=remove&member_id=224
http://legendzelda.net/forums/index.php?app=core&module=attach&section=attach&attach_rel_module=post&attach_id=166
and similar URLs. I recently blocked crawl access to my members folder to remove duplicate errors, but I'm not sure how I can block access to these kinds of URLs, since it's not really a folder thing. Any idea how to?
Technical SEO | NoahGlaser78
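robots.txt rules are plain prefix matches against the path plus query string, so you don't need a folder - you can disallow by the query pattern itself. A sketch built from the two URLs above (assuming the parameter order is consistent across these URLs):

```
User-agent: *
# Block member-profile action URLs
Disallow: /forums/index.php?app=members
# Block attachment-module URLs
Disallow: /forums/index.php?app=core&module=attach
```

Google (though not every crawler) also understands * wildcards, e.g. Disallow: /*module=attach, if the parameter order does vary. Once the URLs stop being crawled, the 403 errors should age out of the report on their own.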