Hit by Panda - Google Disavow Help
-
Hi,
I hope you can help me.
A website I manage has been hit hard by the Panda update. I am really struggling to understand what counts as a spammy link. The site used to be on page 1 for "fancy dress"; now it isn't visible for that term at all, and it has dropped for most other terms as well.
I have looked into what might have gone wrong: I have removed several links, used the disavow tool two or three times, and submitted reconsideration requests, but each time Google informs me that it is still detecting unnatural links.
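In case it's useful context, the disavow files I've been uploading are plain text in Google's format, along these lines (made-up domains for illustration, not our actual list):

```text
# Lines starting with # are comments and are ignored by Google
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
domain:paid-links-example.net
# Or disavow one specific URL:
http://link-farm-example.org/page?id=123
```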
Could somebody please take a look at the link profile of www.partydomain.co.uk, using "fancy dress" as an example term, and point out the kinds of links you think Google might not like?
It would also be good to hear from anybody with contacts in the UK who could help.
Thanks,
Adam
-
To further stress EGOL's point: if this is related to unnatural link detection, it is a Penguin problem, not a Panda one. That may sound like mere semantics, but the distinction really matters when you're searching for information and help on this.
It honestly should be fairly easy to spot the spammy links. For example, you sell fancy dress costumes, so if your website has links all over underground hip hop forums, they are probably unnatural.
What paid link services have you used? Can you contact them for submission reports? Most will provide these on request.
--just a couple of thoughts
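If it helps, here's a rough sketch of how you could triage a backlink export (from OSE or Webmaster Tools) for off-topic or over-optimised links. The column names and keyword lists are assumptions; swap in whatever your export actually contains:

```python
import csv

# Assumed export columns: "source_url" and "anchor_text".
# Adjust the field names to match your OSE / Webmaster Tools CSV.
MONEY_ANCHORS = {"fancy dress", "cheap fancy dress", "fancy dress costumes"}
OFF_TOPIC_HINTS = ("forum", "blogspot", "article-directory", "bookmark")

def flag_suspects(path):
    """Return (url, anchor) pairs that look like unnatural links."""
    suspects = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["source_url"].lower()
            anchor = row["anchor_text"].strip().lower()
            # Exact-match commercial anchors, or links from obviously
            # off-topic places, are classic unnatural-link footprints.
            if anchor in MONEY_ANCHORS or any(h in url for h in OFF_TOPIC_HINTS):
                suspects.append((url, anchor))
    return suspects

if __name__ == "__main__":
    for url, anchor in flag_suspects("backlinks.csv"):
        print(f"{url}  [{anchor}]")
```

It won't catch everything, but it gives you a shortlist to review by hand instead of eyeballing thousands of rows.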
-
A website I manage has been hit hard by the Panda update. I am really struggling to understand what counts as a spammy link.
Ohhhh... Panda problems are usually caused by thin content and duplicate content. Penguin problems and unnatural link penalties result from spammy links.
You might want to read the new YouMoz article on these topics by Marie Haynes to get Panda and Penguin straight... and also to learn how to handle some spammy link problems.
Lots of people - even pro SEOs - don't understand the difference between Panda, Penguin, and unnatural links.
Read the article so you don't do damage through improper actions.
-
Hey Adam,
This doc is a little old but still useful. You need two things: your link profile exported from OSE (Open Site Explorer) and this doc http://seogadget.com/categorising-your-links/ (download the Excel file at the bottom).
The process they show for copying data from OSE into the SEOgadget sheet has changed, so see the comments at the bottom, or leave a message here if you get stuck.
Once you have the data in, you'll see a big list of Panda losers/winners etc... there's a good chance those spammy links come from domains in the Panda losers section.
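If the spreadsheet gives you trouble, you can get a similar overview yourself by grouping the OSE export by linking domain, so the biggest (and often spammiest) sources float to the top. A minimal sketch, again assuming a CSV export with a source_url column:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def domains_by_link_count(path):
    """Count how many exported backlinks come from each host."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Reduce each linking URL to its host, e.g. "forum.example.com"
            counts[urlparse(row["source_url"]).netloc.lower()] += 1
    return counts

if __name__ == "__main__":
    # Domains sending hundreds of sitewide links are worth reviewing first.
    for host, n in domains_by_link_count("backlinks.csv").most_common(20):
        print(f"{n:6d}  {host}")
```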
Also, are you getting any messages in Webmaster Tools?
Adam