Has my site been penalized by Google?
-
Hi all
I have noticed a sudden drop in rankings for most of my keywords on kerryblu.co.uk and was thinking the site may have been manually penalized by Google.
I have not received any notification of this in Webmaster Tools, but I can't think of any other reason for the loss of rankings.
I have searched the web for info on this but can't find a definitive answer. Is there any way of knowing for sure?
At the time of the crash, the only real change I made was adding Google AdSense to my blog. Could this be responsible?
Thanks for looking.
-
I have the same issue. After the 8th of May, my blog traffic dropped nearly 20%. I haven't used any black-hat SEO methods, and I've seen a few other webmasters reporting the same problem.
-
Hi
Thanks for looking. The problem with eBay sellers is that they take my photographs and use my descriptions. They often get indexed before me, which then makes it look like I am the one putting duplicate content on my site. I am not sure how to get around this.
-
Hi,
It does indeed look like your visibility has hit a new low:
http://screencast.com/t/mR2Uu6saTV
Anyway, based on the site's size, look and feel, format, and niche coverage, it doesn't look like a manual penalty. There could be an algorithmic filter in place, though. Either way, there is a lot of improvement that can and should be made to the site to gain better visibility.
You should do a full assessment, and the first thing I would do is remove the thin pages from the index: the pages that are indexed but don't rank or receive little to no traffic. That would be a good start (that is, if your products are not "unique"; I've found some on eBay with more or less the same descriptions).
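As a rough illustration of that triage step, here is a minimal sketch for flagging noindex candidates. The URLs, session counts, and threshold below are all hypothetical, not taken from the site in question:

```python
# Hypothetical example: flag indexed pages with little or no organic
# traffic as candidates for a noindex tag. All URLs, session counts,
# and the threshold are made up for illustration.

def thin_page_candidates(organic_sessions, threshold=5):
    """Return URLs whose organic sessions fall below the threshold."""
    return sorted(url for url, sessions in organic_sessions.items()
                  if sessions < threshold)

sessions_last_90_days = {
    "/": 1200,
    "/collars/red-leather": 340,
    "/collars/blue-nylon": 2,        # indexed, but effectively no traffic
    "/about-us": 45,
    "/collars/old-discontinued": 0,
}

for url in thin_page_candidates(sessions_last_90_days):
    print(url)
```

In practice the traffic numbers would come from your analytics export, and each flagged URL would get a review before you noindex it.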
There are also some other areas you can work on to improve engagement and branding.
Just my opinion - hope it helps.
Related Questions
-
How long will Google take to disavow a link?
I just want to know how long Google will take to disavow a link. I uploaded my file on 18 Dec 2020, and today is 5 January 2021, yet that link still appears in my Search Console under Top linking domains. Has anyone done this recently, and how long did it take? I listed the domain below, hoping it will disavow all the links (subdomain + www + without www) coming from that domain: domain:abcd.com. Help me out, please.
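For reference, a disavow file is just a plain-text list, one entry per line: a `domain:` prefix covers every link from that domain and its subdomains, and lines starting with `#` are comments. A minimal sketch of generating one (using the placeholder domain from the question):

```python
# Sketch: build a Google disavow file. A "domain:" line covers all
# links from that domain and its subdomains; "#" lines are comments.

def build_disavow_file(domains, urls=()):
    lines = ["# Disavow file uploaded via Search Console"]
    lines += [f"domain:{d}" for d in domains]
    lines += list(urls)          # individual URLs can also be listed as-is
    return "\n".join(lines) + "\n"

content = build_disavow_file(["abcd.com"])
print(content)
```

Note that the disavow tool does not remove the links from the Top linking domains report; that report shows discovered links regardless of whether they have been disavowed.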
White Hat / Black Hat SEO | seotoolsland.com
-
How to fix site breadcrumbs in mobile Google search
For the past month, I have been researching how to fix this issue on my website, but none of my efforts have worked. I really need help because I'm worried about this. I was hoping that Google would cache and understand the structure of my site and correct the error. The breadcrumb is working correctly on desktop but is not shown on mobile. For example, take a look at: https://www.xclusivepop.com/omah-lay-bad-influence/
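One common cause of breadcrumbs appearing on desktop but not in mobile results is missing or invalid `BreadcrumbList` structured data. As an illustration only (the page names and URLs below are placeholders, not taken from the site in the question), valid JSON-LD for a breadcrumb trail looks like this:

```python
import json

# Hypothetical schema.org BreadcrumbList JSON-LD. Names and URLs are
# placeholders; a real page would embed this JSON inside a
# <script type="application/ld+json"> tag.
breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Music",
         "item": "https://www.example.com/music/"},
        # The final crumb (the current page) may omit "item".
        {"@type": "ListItem", "position": 3, "name": "Article title"},
    ],
}

print(json.dumps(breadcrumbs, indent=2))
```

Running the page through a structured-data validator with mobile rendering enabled should show whether the markup is actually present in the mobile HTML.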
White Hat / Black Hat SEO | Ericrodrigo
-
What would you say is hurting this site, Penguin or Panda?
Would you say this is both Penguin and Panda, and that no penalty has ever been lifted? What would be your general recommendations for this site?
White Hat / Black Hat SEO | BobGW
-
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as Google requires, but I am not sure which option is better:
A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta tag on every page. But in this case, I am not sure what will happen if we link to these articles from other parts of the website. We will waste our link juice, right? Also, Google will not crawl these pages, so it will never see the noindex tag. Is this OK for Google and Google News?
B) A NOINDEX meta tag on every page. Google will crawl these pages but will not show them in the results. We will still lose our link juice from links pointing to these pages, right?
So, is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
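The key problem with combining robots.txt and noindex can be demonstrated directly: once robots.txt disallows a path, a compliant crawler never fetches those pages, so any NOINDEX meta tag on them is never seen. A small sketch using Python's standard robots.txt parser (the `/syndicated/` path is from the question; the hostname is a placeholder):

```python
from urllib import robotparser

# Option A from the question: Disallow /syndicated/ in robots.txt.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /syndicated/",
])

# A compliant crawler may not fetch anything under /syndicated/ ...
blocked = not rp.can_fetch("Googlebot", "https://example.com/syndicated/article-1")

# ... so a <meta name="robots" content="noindex"> on those pages is
# never downloaded and therefore never honored. For crawled-but-not-
# indexed pages, the noindex tag alone (option B) is the usual choice.
print("blocked:", blocked)
```

Other paths remain fetchable, so the crawler can still see meta tags on the rest of the site.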
White Hat / Black Hat SEO | Lukas_TheCurious
-
Penalty for all new sites on a domain?
Hi @all, a friend has an interesting problem. He got a manual link penalty at the end of 2011. It is an old domain with a domain pop of >5,000, but with a lot of bad links (widget and banner links and other SEO domains, though nothing like ScrapeBox). He lost most of his traffic a few days after the "unnatural links" notification in WMT, and lost more after the first Penguin update in April 2012. At the end of 2012, after deleting (or nofollowing) and disavowing a lot of links, Google lifted the manual penalty (per a WMT notification). But nothing happened after the lifting; the rankings didn't improve (it has been 4 months already!). Almost all money keywords aren't in the top 100, there are no traffic increases, and he has good content on this domain. We built a handful of new trusted links to test some pages, but nothing improved. In February we ran a test and built a completely new page on this domain; it's in the menu and got some internal links from content. We did it because some pages that weren't optimized before the penalty (no external backlinks) still rank on the first Google results page for small keywords. After a few days, the new page started to rank for our keyword between positions 40-45. That was OK and as we expected. The page ranked there constantly for almost 6 weeks, and now it has been gone for ten days. We didn't change anything. It's the same phenomenon as the old pages on this domain; the page doesn't even rank for its own title! Could there still be a manual penalty on the whole domain, or what other causes are possible? Looking forward to your ideas, and I hope you understand the problem! 😉 Thanks!
White Hat / Black Hat SEO | TheLastSeo
-
Why does Google recommend schema for local businesses/organizations?
Why does Google recommend schema for local businesses/organizations? The reason I ask is that I was in the Structured Data Testing Tool, running some businesses and organizations through it, yet every time it said this "information will not appear as a rich snippet in search results, because it seems to describe an organization. Google does not currently display organization information in rich snippets." Additionally, many times when you search for a restaurant or a related query, it will still show the telephone number, reviews, and location. Would it be better to list it as a place, since I want its reviews and location to show up? I would be interested to hear everyone else's opinions on this. Thanks.
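For what it's worth, the warning usually goes away when the markup uses a specific local type instead of the generic `Organization`. A hedged sketch of `Restaurant` markup (a `LocalBusiness` subtype on schema.org), with every value a placeholder:

```python
import json

# Hypothetical LocalBusiness (Restaurant) JSON-LD. All values are
# placeholders; a real page would embed this JSON in a
# <script type="application/ld+json"> block.
restaurant = {
    "@context": "https://schema.org",
    "@type": "Restaurant",          # more specific than "Organization"
    "name": "Example Diner",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Main St",
        "addressLocality": "Springfield",
        "postalCode": "00000",
    },
}

print(json.dumps(restaurant, indent=2))
```

The specific `@type` is what lets a validator treat the entity as a place rather than a bare organization.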
White Hat / Black Hat SEO | PeterRota
-
Google sitemaps & punishment for bad URLs?
Hoping y'all have some input here. This is a long story, but I'll boil it down: Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs on Site X to relevant URLs on Site Y, but two days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generating sitemap somehow added URLs from Site Y to the sitemap of Site X, so the sitemap essentially contained URLs that did not belong to Site X. Is there any documentation out there suggesting Google would punish Site X for having essentially unrelated URLs in its sitemap by downgrading its organic search rankings, because it may view that mistake as a black-hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo and Bing, yet is suddenly nonexistent on Google. Thoughts?
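Whatever the ranking impact, the immediate fix is to regenerate the sitemap so it contains only URLs on Site X's own host. A minimal sketch (the hostnames and URLs are placeholders for Site X and Site Y) that filters a sitemap down to one host:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder sitemap mixing Site X and Site Y URLs, as described above.
sitemap_xml = f"""<urlset xmlns="{NS}">
  <url><loc>https://site-x.example/page-1</loc></url>
  <url><loc>https://site-y.example/page-2</loc></url>
  <url><loc>https://site-x.example/page-3</loc></url>
</urlset>"""

def own_host_urls(xml_text, host):
    """Keep only <loc> URLs whose hostname matches the given host."""
    root = ET.fromstring(xml_text)
    locs = (el.text for el in root.iter(f"{{{NS}}}loc"))
    return [u for u in locs if urlparse(u).hostname == host]

print(own_host_urls(sitemap_xml, "site-x.example"))
```

After cleaning the sitemap, resubmitting it in Webmaster Tools lets the search engine pick up the corrected URL set.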
White Hat / Black Hat SEO | RUNNERagency
-
What are the biggest optimization factors for Google Places?
I know some of the basic factors for ranking better on Google Places, but I'd like to see where the priority lies and whether there are negative factors.
White Hat / Black Hat SEO | anchorwave