Penalization.... please help me...
-
First of all, sorry for my English, but I'm an Italian girl doing SEO.
Before the Panda update, SEO was clear and easy: good quality, good natural backlinks, and so on...
Now there is an update almost every week and it's a mess!
I work as an SEO for a big Italian e-commerce site, and about one month ago in Google Webmaster Tools I received a message from Google telling me that the site www.giordanoshop.com has been penalized for unnatural backlinks.
But I haven't done anything against Google's policies: no paid backlinks, no link farms, and so on.
There are some strange links, but I can't delete them because I didn't create them.
I asked for reconsideration of the website, but Google still tells me it found unnatural links.
What should I do?
The PR of the site is the same, but every keyword has lost ranking: from page 1 to page 3, and from page 1 to page 6...
What can I do? I risk losing my job, sob.
-
Thank you, you are great. I'll try to apply all your advice.
Just one more question: if I find backlinks that aren't mine, what can I do to remove them?
-
Arianna-
Bummer. Anybody who calls themselves an "Italian girl SEO" deserves some help.
I have found a couple of helpful articles and information sources over the last few months, as I have had to help new and existing clients with some of these same issues.
First of all, don't complain about what Google is doing. This is where the marketplace is going, and it's like complaining about bad weather: just let it rain.
Second, you have to find those links. You need to do a site audit / backlinks risk assessment. I have copied and pasted a great "down and dirty guide" on checking for low-quality links and doing a basic site audit. This is from a blog post by fellow SEOmoz'er Modesto Siotos. Here is the link:
http://www.seomoz.org/blog/how-to-check-which-links-can-harm-your-sites-rankings
Here is the copy and paste: "
The Right Time For a Backlinks Risk Assessment
Carrying out a backlinks audit in order to identify the percentage of low-quality backlinks would be a good starting point. A manual, thorough assessment would only be possible for relatively small websites as it is much easier to gather and analyse backlinks data – for bigger sites with thousands of backlinks that would be pointless. The following process expands on Richard Baxter's solution on 'How to check for low quality links', and I hope it makes it more complete.
- Identify as many linking root domains as possible using various backlinks data sources.
- Check the ToolBar PageRank (TBPR) for all linking root domains and pay attention to the TBPR distribution
- Work out the percentage of linking root domains that have been deindexed
- Check social metrics distribution (optional)
- Repeat steps 2, 3, and 4 periodically (e.g. weekly, monthly) and check for the following:
- A spike towards the low end of the TBPR distribution
- An increasing number of deindexed linking root domains on a weekly/monthly basis
- Unchanged numbers of social metrics, remaining at very low levels"
END OF CUT AND PASTED INFO****************************
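To make steps 2-4 above concrete, here is a minimal sketch of how you might summarize the audit once you have exported your linking root domains from your backlink tools of choice. The input format (a list of records with `domain`, `tbpr`, and `indexed` fields) is an assumption for illustration, not anything the post prescribes; the example domains are made up.

```python
# Hypothetical sketch: summarize a backlinks risk assessment export.
# Assumes you've already gathered, per linking root domain, its ToolBar
# PageRank (tbpr) and whether it is still indexed in Google (indexed).
from collections import Counter

def audit_root_domains(rows):
    """rows: list of dicts like {"domain": str, "tbpr": int, "indexed": bool}.

    Returns the TBPR distribution, the percentage of deindexed root
    domains, and the list of deindexed domains to investigate.
    """
    tbpr_dist = Counter(r["tbpr"] for r in rows)
    deindexed = [r["domain"] for r in rows if not r["indexed"]]
    pct_deindexed = 100.0 * len(deindexed) / len(rows) if rows else 0.0
    return tbpr_dist, pct_deindexed, deindexed

# Made-up example data:
rows = [
    {"domain": "example-blog.com", "tbpr": 0, "indexed": False},
    {"domain": "news-site.com", "tbpr": 4, "indexed": True},
    {"domain": "spammy-dir.net", "tbpr": 0, "indexed": False},
]
dist, pct, gone = audit_root_domains(rows)
print(dict(dist))  # prints {0: 2, 4: 1} -- a spike at TBPR 0 is a warning sign
print(gone)        # prints ['example-blog.com', 'spammy-dir.net']
```

Run this weekly or monthly as the post suggests and compare the numbers: a growing pile of TBPR-0 or deindexed domains is exactly the trend the steps above tell you to watch for.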
Third, here is another post from a fellow SEOmoz'er who does a good job of simplifying the process and providing the cold hard facts and options. It's entitled "6 Ways to Recover from Bad Links".
http://www.seomoz.org/blog/6-ways-to-recover-from-bad-links
I hope this information helps you. There are no quick and easy fixes. If you don't want to lose the business, then you need to spend the time and make it happen. If this has helped you, please make sure you show me some Italian love and give me the thumbs UP!!!!!!!!
Ciao