Disavow File and SSL Conversion Question
-
Moz Community,
So we have a website that we are moving to SSL. It has been 4 years since we submitted our disavow file to Google via GWT. Since we are moving to SSL, I understand Google looks at this as a new site. We therefore decided to go through our backlinks and realized that many of the domains we are currently disavowing are no longer active (after 4 years this is expected).
Given that, is it OK to create a new disavow file for the new GWT profile (the SSL version of our site)? And is it OK if the new disavow file doesn't include URLs we previously disavowed for the non-HTTPS version?
We also found that some links in the old disavow file shouldn't have been disavowed in the first place, and there are new links we want to disavow as well.
Thanks
QL
-
Hi. I think mememax gave a very good answer.
The only thing I would submit for consideration is that making too many changes at one time can be hard to track later. When we did the switch to HTTPS, I was super paranoid we would screw something up and lose rankings, so I chose to leave the disavow file exactly the same. It turned out the switch was not as bad as I thought, and we didn't see any noticeable effect on rankings. Later, once I was convinced the HTTPS switch was not a factor, I could modify the disavow file. I also left the old domains from years ago in there, for the reasons mememax points out.
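If you do eventually go back and reassess those old entries, one quick way to triage them is to check which of the disavowed domains still resolve at all. Something along these lines would do it (a rough sketch using only Python's standard library; the disavow.txt filename is a placeholder, and it only looks at domain: lines, not full-URL entries):

```python
import socket

# Rough triage: which disavowed domains still resolve in DNS?
# "disavow.txt" is a placeholder filename; point it at your own export.
def check_disavowed_domains(path="disavow.txt"):
    live, dead = [], []
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip comments, blank lines, and full-URL entries
            if not line or line.startswith("#") or not line.startswith("domain:"):
                continue
            domain = line[len("domain:"):].strip()
            try:
                socket.gethostbyname(domain)   # DNS record still exists
                live.append(domain)
            except socket.gaierror:
                dead.append(domain)            # no DNS record found
    return live, dead

if __name__ == "__main__":
    live, dead = check_disavowed_domains()
    print(f"{len(live)} domains still resolve, {len(dead)} look dead")
```

Even so, a failed lookup today doesn't guarantee the domain stays dead, which is part of why I left the old ones in.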
Good Luck!
-
Hi QuickLearner,
You are actually raising a very interesting point. As for the disavow, to be extra safe you have to disavow links pointing to the current site as well as the ones pointing to any other property you own that is 301 redirecting to it.
Remember that the disavow file should include all URLs/domains pointing to your site that you were not able to get removed yourself or after trying to contact the webmaster. Based on this:
- In the HTTP property, you should disavow all the links pointing to the HTTP site that you have marked as spammy.
- Since you're going to make many changes to the disavow file anyway, it may be a good moment to re-analyze which links you want to include vs. which you want to remove. Just make sure you're doing it carefully.
- The HTTPS property's disavow file should contain all the links from the HTTP file plus the new ones pointing to the HTTPS site. Again, obviously only the links you want to disavow (there's a rough example of what that combined file looks like right after this list).
- Even though sites that have expired could safely be removed, since they're not linking to your site anymore, in the past I have always kept them, for two reasons:
  - Sometimes Google's index is not very up to date, especially with tiny, low-quality sites, which these may well be. The site may have disappeared, but if Google hasn't dropped it from the index, it still counts as a link to your site.
  - You never know the real reason that site is returning a 4XX or 5XX, so in case it reappears I would just keep it in the file. It's like an IP blacklist: I don't know whether that IP is still in use, but I keep it there just in case.
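To make that concrete, the combined file for the HTTPS property is just Google's usual plain-text disavow format: one entry per line, domain: for whole domains, full URLs for individual pages, and # for comments. A rough sketch with made-up domains:

```text
# Disavow file for the HTTPS property (carried over from the HTTP file)
domain:spammy-directory-example.com
domain:low-quality-links-example.net

# Individual URL rather than the whole domain
http://old-article-farm-example.com/some-page.html

# New spammy links found after the HTTPS move
domain:another-junk-example.org
```

Upload it as a UTF-8 .txt file to the HTTPS property in the disavow tool, and keep a dated copy so you can track what changed between versions.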
I hope this helps you!