Can I disavow links on a 301'd website?
-
So we are performing link removal for a client on his old website (A), which is being 301-redirected to his new website (B). We have identified toxic links pointing to site A and are getting them removed; once that's complete, we will undo the current 301, verify a new GWT account for website A, and then submit the disavow file.
We would then like to reapply the 301 redirect to site B while we wait for Google to process the disavow file, the logic being that we can retain some of site B's current rankings while the disavow is processed against site A.
Has anyone had experience with this method? I foresee some potential issues here but am interested to hear what others think. Thanks!
-
I tend to agree with Federico's concerns. If the 301 transfers a penalty, the impact could be long-term, and it could be harder to rescue site B. The short-term ranking gains may not be worth it.
Google hasn't been clear on how this operates with 301 redirects. John's suggestion to disavow on both sites seems safe. Worst case, it's wasted effort, but it's not much effort (once you've built one file, building two is easy). Still, you've got to wait for that to process, and if the algorithmic penalty is something like Penguin, then you'd have to wait for a data refresh. This could take months, so I'd be really hesitant to risk site B until you've cleaned up the mess.
Once the disavow is in place for site A, reinstating the 301 should be fairly safe, but it does depend on the extent of the penalty. The risk/reward trade-off is definitely a "devil is in the details" sort of situation.
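If you do eventually reinstate the redirect, one practical detail: GWT verification by the HTML-file method only holds as long as the verification file keeps returning a 200, and Google rechecks it periodically. Below is a minimal Apache sketch, not a definitive setup, that keeps site A verified while everything else 301s; it assumes mod_rewrite is enabled, and both the verification filename and the site B domain are placeholders:

    RewriteEngine On
    # Keep the (hypothetical) GWT verification file reachable
    RewriteCond %{REQUEST_URI} !^/google1234567890abcdef\.html$
    # Send everything else to site B permanently
    RewriteRule ^(.*)$ http://www.site-b.example/$1 [R=301,L]

That way the GWT property for site A stays verified, so you can keep monitoring it while the disavow is processed.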
-
Well, you're right: manual penalties are easier to fix, although sites with manual penalties usually fall under an algorithmic penalty too.
Steps I'd suggest:
- Don't reinstate the redirect.
- Do some cleaning, extensive cleaning.
- Use site A only as a redirect for users, not for crawlers or rankings: disallow site A in robots.txt and 302-redirect the domain to site B (see the sketch after this list).
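A minimal sketch of that last step, assuming Apache with mod_rewrite; the domain names are placeholders:

    # robots.txt on site A: keep crawlers off the old domain
    User-agent: *
    Disallow: /

    # .htaccess on site A: temporary (302) redirect for visitors;
    # robots.txt stays reachable so crawlers can still read the Disallow
    RewriteEngine On
    RewriteCond %{REQUEST_URI} !^/robots\.txt$
    RewriteRule ^(.*)$ http://www.site-b.example/$1 [R=302,L]

Note that a URL blocked by robots.txt won't be crawled at all, so Googlebot never sees the 302 for those URLs; the redirect then works for users only, which is exactly the intent here.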
Hope that helps!
-
No, this is an algorithmic penalty. I wish it were manual; that would be easier to figure out.
-
But did you get any MANUAL penalty on A or B?
-
The problem is that, despite the algorithmic penalty, site A appears to be pushing heavy authority to site B and keeping decent rankings for some very competitive terms that we otherwise would not rank for with site B. If I remove the 301, I fully expect all current rankings to drop, which I am trying to avoid.
We're doing link removal now, but plan on having to use the disavow tool once we have a few removal requests out to webmasters. I actually got an answer on this from John Mueller at Google in the Technical SEO community on G+.
John Mueller
"I would think about the final state you want to be in and just do that. If you want to do a domain move, then 301 and keep them. If you do a domain move + disavow links, then submit the file for both domains. This process will take quite some time (maybe even a year), so you don't want to play with it incrementally: just find out what you want in the end and set that up." -
Hey Chris,
Did site A or B receive a manual penalty?
Any penalty on A, which is 301'd to B, will ultimately pass through to B, so I would suggest removing the 301 ASAP. Then clean up the A domain until it's clean (if it's a manual action, until it's revoked), and only then think about putting the 301 back.
Removing a manual penalty can be a long process; it took us a year and four reconsideration requests to get ours revoked. We had to use the disavow file as a machete: we disavowed almost our entire link profile, keeping only the domains we knew carried good links. Everything else was disavowed with the "domain:" operator to make sure no individual link was missed.
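For anyone who hasn't built one: the disavow file is a plain UTF-8 .txt upload with one entry per line, where a "domain:" line covers every link from that domain. A short hypothetical example (all domains are placeholders):

    # Disavow file for site A; lines starting with # are comments
    # Blanket-disavow entire domains:
    domain:spammy-directory.example
    domain:paid-links.example
    # Individual URLs can also be listed:
    http://blog-network.example/some-page.html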