Should you bother with an "impacts links" manual action?
-
I have a couple of sites with these, and I have done a lot of work to get the links removed, but there seems to be little if any benefit from doing this. In fact, sites where we have done nothing after these penalties seem to be doing better than ones where we did the link removal and the reconsideration request.
Google says: "If you don't control the links pointing to your site, no action is required on your part. From Google's perspective, the links already won't count in ranking. However, if possible, you may wish to remove any artificial links to your site and, if you're able to get the artificial links removed, submit a reconsideration request. If we determine that the links to your site are no longer in violation of our guidelines, we'll revoke the manual action."
I would guess a lot of people with this penalty don't even know they have it, and it sounds like leaving it alone really doesn't hurt your site.
It seems to me that simply ignoring this and building better links and higher-quality content should improve your rankings more than worrying about getting all these links removed or disavowed.
What are your thoughts? Is it worth trying to get this manual action removed?
-
Hey Dave,
It's unfortunate but very much true. I have experimented with this on dozens of sites:
- Deleting thousands of artificial links
- Disavowing links
- Fixing anchor text
but recovery never measured up to expectations. With a couple of sites (hit by Penguin 2.1) I tried something different, never removing backlinks:
- Just build more quality links
and recovery was far better.
I feel that deleting even spammy links causes a loss of link equity, which drags things down further for a while.
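For anyone going the removal/disavow route anyway, the file Google accepts is plain text: `#` comment lines, one `domain:` rule or full URL per line, uploaded via Search Console. A minimal sketch of assembling one in Python (all domain names and URLs here are made up for illustration):

```python
# Hypothetical sketch: build a Google disavow file from lists of
# spammy referring domains and individual spammy URLs.
spammy_domains = ["spam-directory.example", "paid-links.example"]
spammy_urls = ["http://blog.example/comment-spam?page=3"]

lines = ["# Disavow file prepared for reconsideration request"]
lines += [f"domain:{d}" for d in spammy_domains]  # disavow entire domains
lines += spammy_urls                              # disavow single URLs

disavow_txt = "\n".join(lines)
print(disavow_txt)
```

A `domain:` rule covers every link from that domain, which is usually safer than chasing individual URLs on a site you consider spammy.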
Related Questions
-
Rel="prev" / "next"
Hi guys, The tech department implemented rel="prev" and rel="next" on this website a long time ago.
Intermediate & Advanced SEO | AdenaSEO
We also added a self-referencing canonical tag to each page. We're talking about the following situation: https://bit.ly/2H3HpRD. However, we still see a lot of the paginated pages in the SERPs.
Is this just a case of rel="prev" and "next" being hints rather than directives to Google?
And, in this specific case, of Google deciding not only to show the first page in the SERPs, but most of the paginated pages as well? Please let me know what you think. Regards,
Tom
-
Social Links through Link Shorteners. Do they count?
We use link shortener services like Bitly, Goo.gl, etc. Does a post using such shortened links count as a social signal, or should we post the complete website URL pointing to each page when posting on social sites? Secondly, should we write a new description when posting on social sites, or just copy and paste a few lines of the original post?
Intermediate & Advanced SEO | welcomecure
-
Pages with rel "next"/"prev" still crawling as duplicate?
Howdy! I have a site whose paginated pages are being flagged as "duplicate content pages," though it's really just pagination. The rel next/prev is in place and done correctly, but RogerBot and Google are reporting duplicate content and duplicate page titles & metas, respectively. The only thing I can think of is that we have a canonical pointing back at the URL you are on (e.g. /collections/all?page=15); we do not have a view-all option right now and would not feel comfortable recommending one, given the speed implications and the size of their catalog. Any experience or recommendations here? Something to be worried about?
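For context, the usual pattern for a paginated series (absent a view-all page) is exactly this: each page canonicalizes to itself, not to page 1, alongside the prev/next annotations. A hypothetical sketch of the `<head>` for one page (the URLs are illustrative, not the poster's actual markup):

```html
<!-- Hypothetical head for page 15 of a paginated collection.
     The page self-canonicalizes and links its neighbours;
     rel="prev"/"next" are hints to Google, not directives. -->
<link rel="canonical" href="https://shop.example/collections/all?page=15">
<link rel="prev" href="https://shop.example/collections/all?page=14">
<link rel="next" href="https://shop.example/collections/all?page=16">
```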
Intermediate & Advanced SEO | paul-bold
-
Change domain whilst under a partial manual links penalty
Hi there,
We're currently under a manual penalty for unnatural links to our domain and have been working on fixing that, but our first reconsideration request was rejected, so we're doing a second round of link removals.
The issue is that we were planning to change our domain before the SSL certificate expires in a couple of weeks and renew the certificate for the new domain. We're unsure whether to stop working on the reconsideration request, change the domain, and wait until the manual penalty moves to the new domain before continuing the link removal.
Alternatively, we could use the domain change to select which links are 301'd to the new site and leave the bad links behind, in the hope that the manual penalty wouldn't be applied to the new domain.
Any thoughts or advice would be appreciated.
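The selective-301 idea in that last paragraph could look something like this in Apache config on the old domain (all domains and paths here are hypothetical): redirect only the URLs with clean link profiles and leave the penalized ones without any redirect.

```apache
# Hypothetical .htaccess on the old domain during the move.
# Clean pages get a permanent (301) redirect to the new domain:
RedirectPermanent /products https://www.new-domain.example/products
RedirectPermanent /about    https://www.new-domain.example/about
# Pages targeted by the unnatural links get no redirect at all,
# so those links (and, hopefully, the penalty) stay behind.
```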
Intermediate & Advanced SEO | Ham1979
-
Google Manual Action (Manual Penalty): Unnatural Inbound Links
Dear friends, I just got two "Unnatural inbound links" notifications from Google via Google Webmaster Tools: the first for the WWW version of our site and the second for the non-WWW version. My question: should I send two identical reconsideration requests, one for WWW and one for non-WWW, or treat them as different sites? Thank you, Claudio
Intermediate & Advanced SEO | SharewarePros
-
Migrated our site to HubSpot at the end of March; went from page 3 on Google to non-existent. Beyond frustrated... HELP PLEASE ("www.vortexpartswashers.com")
At the end of March we migrated our site over to HubSpot. We went from page 3 on Google to non-existent, though we're still found on page 2 of Yahoo and Bing under the same keywords, "parts washers". Beyond frustrated... HELP PLEASE: www.vortexpartswashers.com
Intermediate & Advanced SEO | mhart
-
What URL should I link to?
Hi everybody, after some discussion I decided to keep my page on the old domain for better SEO rankings; however, the new third-level domain sounds better: poltronafraubrescia.zenucchi.it. The question is: I'm going to receive a high-value link, and I don't know if I should link directly to the old address (www.zenucchi.it/ITA/poltrona-frau-brescia.it), where the page is located, or to the new one with a 301 redirect to the previous. What's best? Second question: is there a way to keep the page at this address (www.zenucchi.it/ITA/poltrona-frau-brescia.it) but show poltronafraubrescia.zenucchi.it as the URL? Thank you, Guido
Intermediate & Advanced SEO | guidoboem
-
"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
The Problem: I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee, but their business is truly local, with a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content), apart from a few branding guidelines. Over time I've developed a website template with a high lead-conversion rate, and I've rolled this website out to three or four dozen clients. Each client has exclusivity in their region/metro area. Lately my white-hat link-building strategies have not been yielding the results they were a year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies. I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality-vs-quantity dynamics).
Questions (assuming general on-page optimization and linking factors are equal):
- Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique content variables)?
- If I choose to differentiate each client's website, how much differentiation makes sense? Specifically: even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as "the same website"?
- Are images as important as copy in differentiating content?
- From a machine or algorithm perspective evaluating unique content, would strategies be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code)?
Considerations: My understanding of Google's "duplicate content" dynamics is that they mainly apply to de-duping search results at a query-specific level and choosing which result to show from a pool of duplicates. My clients' search terms most often contain client-specific city and state names. Despite the "original content" mantra, I believe my clients, local businesses that have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target searches; it is in this spirit that I ask these questions, not to game Google with malicious intent. In an ideal world each of my clients would have their own unique website developed, but these are Main St. business owners balancing solutions with economics, and I'm trying to provide them with scalable solutions. Thank you! I am new to this community; thanks for any thoughts, discussion, and comments!
Intermediate & Advanced SEO | localizedseo