Have we suffered a Google penalty?
-
Hello,
In January, we started a new blog to supplement our core ecommerce website. The URL is www.footballshirtblog.co.uk, and the idea behind it was that we would write articles related to our industry to build a community that would ultimately boost our sales.
We would add several posts per day: a mix of shorter news stories of around 150 words and more detailed content pages of around 500 words.
Everything was going well. We were making slow but sure progress on the main generic keywords and were receiving several thousand visitors a day, most of them finding the posts themselves on Google.
The surge in traffic meant we needed to move servers, which we did around 6 weeks ago. When we did this, we had a few teething problems with file permissions, etc., which meant we were temporarily unable to add new posts. As our developers were tied up with other issues, this continued for a 7-10 day period, with no new content being added.
In this period, the site completely dropped from Google, losing all its rankings and traffic, to the extent that it now doesn't even rank for its own name. This is very frustrating, as we have put a huge amount of work and content into developing this site.
We have added a few posts since, but not a huge amount, as it is frustrating to do so with no return, and we are concerned that the site has been banned forever. I cannot think of any logical reason why this penalty has occurred, as we haven't been link spamming, etc.
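Given the file-permission problems during the move, one thing worth ruling out is an accidental crawl block: a robots.txt that disallows everything will empty a site out of Google exactly like this, including its own name. A minimal sketch using Python's standard `urllib.robotparser` (the `Disallow: /` content below is a hypothetical worst case, not the site's actual file):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- substitute the live file served at
# http://www.footballshirtblog.co.uk/robots.txt to test the real site.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# If Googlebot cannot fetch the homepage, rankings vanish site-wide,
# matching the symptoms described above.
blocked = not parser.can_fetch("Googlebot", "http://www.footballshirtblog.co.uk/")
print(blocked)  # True here, because "Disallow: /" blocks every crawler
```

If the live file blocks Googlebot, fixing it and resubmitting the site in Webmaster Tools is usually enough for rankings to return once the site is recrawled.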
Does anyone have any feedback or suggestions as to how we can get back on track?
Regards,
David -
Does anyone have any further feedback on this? We are at a complete loss as to why the penalty may have occurred.
-
Thanks for your feedback. I have checked these issues, but that doesn't seem to have solved the problem.
We moved the site to a www.eapps.com server, but I can't find anything there that would suggest why Google might have penalised the site.
I have also checked OpenDNS and, as far as I can see, there seems to be no issue there.
Does anyone have any further suggestions on how to tackle this issue?
Regards,
Simon
-
Make sure all your DNS is pointed correctly. If you have a lame nameserver somewhere, that could cause you problems. Use this tool to make sure your site is resolving properly.
Did you move and then just hard shut off the old site? If so, you might have created an issue where bots would still resolve the old site from cache and think your site was down.
Have you checked Google Webmaster Tools to see if Google has reported any problems?
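The lame-nameserver point can be checked systematically: query each authoritative nameserver for the domain and compare the answers. A rough sketch of the comparison logic (the nameserver names and IPs below are made-up examples; in practice you would gather the real answers with `dig @ns1.example-host.com footballshirtblog.co.uk A` for each server):

```python
from collections import Counter

def find_lame_answers(ns_answers):
    """Flag nameservers whose A-record answers disagree with the majority.

    ns_answers maps nameserver hostname -> set of IPs it returned for the
    domain. An empty set means the server gave no answer at all, i.e. a
    lame delegation: it is listed as authoritative but does not respond.
    """
    counts = Counter(frozenset(ips) for ips in ns_answers.values())
    majority = counts.most_common(1)[0][0] if counts else frozenset()
    return sorted(ns for ns, ips in ns_answers.items()
                  if frozenset(ips) != majority or not ips)

# Example data only -- substitute real dig results for each nameserver.
answers = {
    "ns1.example-host.com": {"203.0.113.10"},
    "ns2.example-host.com": {"203.0.113.10"},
    "ns3.example-host.com": set(),  # lame: delegated but not answering
}
print(find_lame_answers(answers))  # ['ns3.example-host.com']
```

If any listed nameserver answers differently (or not at all), crawlers can intermittently resolve the old server or nothing, which looks to them like the site going down.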
-
Is it a shared server you moved to? You could be sharing with a dodgy site that's got a blacklisted IP.
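One way to check the shared-IP theory is a DNS blacklist lookup, which works by reversing the IP's octets and querying them as a hostname inside the blacklist zone. A sketch of the mechanism (the IP is a documentation example, and zen.spamhaus.org is just one well-known zone; DNSBLs track mail spam rather than Google's index, so treat a listing as a hygiene signal about the neighbourhood, not as the penalty itself):

```python
import socket

def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the reversed-octet lookup name used by DNS blacklists."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip, zone="zen.spamhaus.org"):
    """True if the IP resolves inside the blacklist zone, i.e. is listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:  # NXDOMAIN means the IP is not listed
        return False

print(dnsbl_query_name("203.0.113.10"))  # 10.113.0.203.zen.spamhaus.org
```

To test the actual server, resolve the site's IP first (`socket.gethostbyname("www.footballshirtblog.co.uk")`) and pass that in.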