Disavow File and SSL Conversion Question
-
Moz Community,
So we have a website that we are moving to SSL. It has been four years since we submitted our disavow file to Google via GWT. Since Google treats the move to HTTPS as a new site, we decided to go through our backlinks, and we realized that many of the domains we are currently disavowing are no longer active (after four years this is expected).
Therefore, is it OK to create a new disavow file under the new GWT profile (the SSL version of our site)? Also, is it OK if the new disavow file doesn't include URLs we previously disavowed with the non-HTTPS version?
Some links in the old disavow file were disavowed when they shouldn't have been. Moreover, we found new links we want to disavow as well.
Thanks
QL
-
Hi. I think mememax gave a very good answer.
The only thing I would submit for consideration is that making too many changes at one time can be hard to track later. When we did the switch to HTTPS, I was super paranoid we would screw something up and lose rankings, so I chose to leave the disavow file exactly the same. It turned out the switch was not as bad as I feared, and we didn't see any noticeable effect on rankings. Later, once I was convinced the HTTPS switch was not a factor, I could modify the disavow file. I also left the old domains from years ago in there, for the reasons mememax points out.
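When you do get around to pruning, a quick script can tell you which disavowed domains still resolve. Here is a rough sketch in Python (the filename is made up, and a failed DNS lookup alone isn't conclusive, for the reasons mememax gives below):

```python
import socket

# Rough sketch: flag disavow entries whose domains no longer resolve.
# "disavow.txt" is a made-up filename; entries use Google's disavow
# format ("domain:example.com" or a full URL, "#" for comments).
with open("disavow.txt") as f:
    entries = [line.strip() for line in f
               if line.strip() and not line.lstrip().startswith("#")]

for entry in entries:
    # Pull a bare hostname out of either entry style.
    if entry.startswith("domain:"):
        host = entry[len("domain:"):]
    elif "//" in entry:
        host = entry.split("/")[2]
    else:
        host = entry.split("/")[0]
    try:
        socket.gethostbyname(host)
        print(f"{host}: still resolves")
    except socket.gaierror:
        print(f"{host}: no longer resolves")
```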
Good Luck!
-
Hi QuickLearner,
You are actually raising a very interesting point. As for the disavow file: to be extra safe, you should disavow links pointing to the current site as well as links pointing to any other property you own that 301s to it.
Remember that the disavow file should include all URLs/domains pointing to your site that you were not able to get removed, either by yourself or after contacting the webmaster. Based on this:
- in your HTTP property, you should disavow all the links you marked as spammy that point to the HTTP site
- since you're going to make many changes to the disavow file anyway, it may be a good moment to reanalyze which links to keep in it versus which to drop. Just make sure you're doing it carefully.
- the HTTPS property's disavow file should contain all the entries from the HTTP file, plus any new links pointing to the HTTPS site. Again, only the links you actually want to disavow, obviously (see the sketch after this list).
- even though sites that have expired could in principle be safely removed, since they're no longer linking to your site, I have always kept them in the past, for two reasons:
- sometimes Google's index is not very up to date, especially with tiny, low-quality sites, which these may well be. The site may have disappeared, but if Google hasn't dropped it yet, it still counts as a link to your site
- you never know the real reason a site is returning a 4xx or 5xx, so in case it reappears I would just keep it in the file. It's like an IP blacklist: I don't know whether a given IP is still in use, but I keep it there just in case.
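To make the carry-over concrete, here is a minimal sketch of what the HTTPS property's file might look like (the domains are placeholders; the format itself, with "#" comments, whole domains via "domain:", and individual URLs, is Google's standard disavow format):

```text
# Carried over from the HTTP disavow file
domain:spammy-directory-example.com
domain:expired-but-kept-example.com
http://link-farm-example.com/widgets/page1.html

# New spammy links found pointing at the HTTPS site
domain:fresh-scraper-example.com
```

Note how the expired domain stays in the file, per the reasoning above.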
I hope this helps you!
Related Questions
-
Question on Indexing, Hreflang tag, Canonical
Dear All, I have a question. We have a client (pharma) who has a prescription medicine approved only in the US, and only one global site at .com, which is accessed by their entire target audience all over the world.
For the rest of the world, we can create a replica of the home page (which currently features that drug), minus any mention of the medicine, and set an IP filter so that non-US traffic sees the duplicate of the home page. The question is how best to tackle this semi-duplicate page. Noindex probably won't do, because that would block the page for the non-US geography. Hreflang probably won't work here either, because we are not dealing with different languages; we are dealing with the same language (English) but different geographies. Canonical might be the best way to go? I wanted to get insight from the experts. Thanks,
Suparno (for Jeff)
Intermediate & Advanced SEO | jrohwer
-
Links from SWF files widely distributed?
Hello, I just realised that Google is listing as backlinks the links from SWF games that we created and distributed widely. We never used this as a link-building method, but since we create the games and give them to other sites for free, we added a link back to our site in case a user who played the game wants to visit us. I am worried that this could be interpreted as black-hat SEO and could be hurting our ranking badly. Has anyone had this kind of issue? How do you think we should tackle it? Could this be affecting our site? Thanks for your help on this, guys 😉
Intermediate & Advanced SEO | drimlike
-
XML Sitemap Questions For Big Site
Hey Guys, I have a few questions about XML sitemaps. For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, while Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep updating your profile sitemaps using third-party software (sitemapwriter)? If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this? Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it? Thank you!
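On the "automatically add new profiles" point, a rough sketch of the usual approach: regenerate the profile sitemap from your user store on a schedule. The user list, URL pattern, and output path below are all placeholders:

```python
# Sketch: regenerate a profile sitemap from whatever store holds your
# users. In practice this would run on a cron job and query a database;
# the list, URL pattern, and output path here are placeholders.
profiles = [
    {"username": "alice", "indexable": True},
    {"username": "bob", "indexable": False},  # opted out of indexing
]

entries = "".join(
    f"  <url><loc>https://www.example.com/profile/{p['username']}</loc></url>\n"
    for p in profiles
    if p["indexable"]
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries
    + "</urlset>\n"
)

with open("sitemap-profiles.xml", "w") as f:
    f.write(sitemap)
```

Dropping an opted-out profile from the sitemap alone won't deindex it, though; a noindex robots meta tag on the profile page itself is the usual mechanism for that.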
Intermediate & Advanced SEO | keywordwizzard
-
Uncontrollable Spammy Backlinks - Disavow or Not?
Hey Mozzers, I have run a few different backlink reports, and I noticed that one of my sites has an incredible number of spammy backlinks. These were not built by a prior SEO; they are simply spammy links that were scraped and inserted on terrible sites, forums, directories, etc., 100% outside our control. The anchor text includes everything from the domain itself to "live sex" and "victoria's secret coupons". There are probably 700 or so of these backlinks from around 150-200 domains. I have read that one should contact the webmaster and use disavow as a last resort, but I am not sure if that advice is only for spammy link-building techniques, which we have no history of using. Is this normal? What is the best way to handle this? Is it likely that these are affecting the site's rankings at the moment? The sheer number of spammy links drastically skews the ratio of quality backlinks to spammy backlinks. This is so frustrating...
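If disavowing does turn out to be the right call here, building the file from a backlink export is straightforward. A rough sketch, assuming you've already filtered the export down to the bad links and that the linking URL sits in the first CSV column (filenames and column layout are assumptions; exports vary by tool):

```python
import csv
from urllib.parse import urlparse

# Hypothetical input: a CSV of spammy backlinks with the linking URL
# in column 0. Collapse them to unique domains and emit disavow entries.
domains = set()
with open("spammy-backlinks.csv", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue
        host = urlparse(row[0]).netloc
        if host:  # skips header rows and malformed lines
            domains.add(host)

with open("disavow.txt", "w") as f:
    f.write("# Spammy domains we never built and cannot get removed\n")
    for domain in sorted(domains):
        f.write(f"domain:{domain}\n")
```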
Intermediate & Advanced SEO | evan89
-
Question regarding an error URL when checking in the Open Site Explorer tool
Hello friends, my website's home URL and inner page URLs show an error when checked in the Open Site Explorer tool from SEOMoz. For a website such as www.abc.com, it says: "Oh Hey! It looks like that URL redirects to www.abc.com/error.aspx?aspxerrorpath=/default.aspx. Would you like to see data for that URL instead?" May I know why the URL shows this result when checking the backlink report in the tool? May I know on what basis the tool evaluates the website URL? And will this affect the Google SERPs for this website? Thanks
Intermediate & Advanced SEO | zco_seo
-
Site Structure Question
Hi All, I have a question about site structure. I currently have a website where everything is hosted at the root of the domain. See the example below: site.com/men, site.com/men-shorts, site.com/men-shorts-[product name]. I want to change the structure to site.com/men/shorts/[product-name]. I have asked a couple of SEOs; some agree with me that the structure needs to be changed, and some say that as long as I indicate the structure with internal links and breadcrumbs, the URL structure doesn't matter... What do you guys think? Many thanks, Carlos
Intermediate & Advanced SEO | Carlos-R
-
Canonical or 301 redirect, that is the question?
So my site has duplicate content issues because of the index.html version and the www and non-www versions of the site. What's the best way to deal with this without htaccess? Is it a 301 redirect, or a canonical tag, or both?
Intermediate & Advanced SEO | bronxpad
-
Advanced Question on Synonym Variation Pages!
Hi, this is quite an advanced question, so I'll go through it in detail; please bear with me! I launched the new version of our website exactly a week ago, and all the key metrics are moving in the right direction: pages/visit +5%, time on site +25%, bounce rate down 1%. I work in an industry where our primary keyword has four synonyms and our long-tail keywords are location related. So, as an example, I have primary synonyms like: Holiday, Vacation, Break, Trip (not actually these, but they are good enough as an example). Add pluralised versions and you have eight in total. So my long-tail keywords are like:
Las Vegas Vacation / Las Vegas Vacations
Las Vegas Holiday / Las Vegas Holidays
Las Vegas Trip / Las Vegas Trips
Las Vegas Break / Las Vegas Breaks
All these synonyms effectively mean the same thing, so my thinking on the new website was to target each of these synonyms with its own unique page and optimise the meta and page titles to those exact words. To make these pages truly unique, I had a bunch of copywriters write about 600 unique words for every long-tail synonym (well over 750,000 words in total!). So at this point I have my page "Las Vegas Holidays" with 600 unique words of content, "Las Vegas Vacations" with 600 words of unique content, etc. The problem is that when users search for these words, their primary goal is not to read 600 words of content on "Las Vegas Holidays"; their primary goal is to get a list of Las Vegas holidays that they can search, view and purchase (they may want to read 600 words of content, but it is not their primary goal). So this puts me in a dilemma: I need to show the nuts and bolts (i.e. the actual holidays in Las Vegas) to the customer as the primary content on any synonym page they land on, but to make sure these pages are unique I also need the unique content on the page. So here's what I did: on every synonym version of the page I display the exact same information, but each page has an "Information" link, which on click pops up a layer containing my unique content for that page. To further optimise with exact anchors in this content pop-up, I cross-linked the synonym pages (totally naturally), i.e. on my "Las Vegas Holidays" page the content may include the words "Las Vegas Breaks", and these would be linked to the "Las Vegas Breaks" synonym page. In theory I don't think there is anything wrong with what I am doing in the eyes of the customer, but I have a big concern that this may well look "fishy" to search engines, i.e. the pages are almost identical to the user except for this information pop-up layer of unique content, titles and meta. We know that Google at least can tell exactly what the user sees when they land on a page (from their "Preview") and can distinguish between visible and hidden text. So even though I think we are making the perfect page for the user (they get the list of vacations etc. as the primary content, and can read the information if they want by clicking a button), I am concerned that search engines will say: hold on a minute, there are loads of pages here that are identical except for a chunk of text that is not visible to the user by default (even though it is visible if they click the "Information" button), and this content cross-links to a load of almost identical pages doing the same thing. Today I checked our rankings, and we have taken a fair whack from Google. I'm not overly concerned at the moment, as I expected big fluctuations in rankings for the first few weeks, but I'd be a lot more confident if they were fluctuating in the right direction! So what do I do? As far as I can see, my options break down as follows.
Content display:
1/. Keep it as it is, and hope the search engines don't see it as spammy. Even though I think what we are doing is best for the customer experience, I'm concerned the search engines won't agree.
2/. On every synonym page, below the list of products, packages etc. that the customer wants to see, display the unique content as a block of subtext that is visible by default. This, however, could make the page a bit ugly.
3/. Display a visible snippet of the unique content below the packages, with a "more" button that expands the rest of the content, i.e. a partly visible layer. This is slightly better for display, but I'm still only displaying a portion of the content, and the rest will still be flagged as "hidden" by default to the search engines.
Cross-linking within the content:
1/. Keep it as it is, where synonym keywords link to the synonym version of the page.
2/. Alter it so that every synonym keyword links to the "primary" synonym version of the page, e.g. if "Las Vegas Holidays" is my main keyword, then the keyword "Las Vegas Vacations" would not link to my "Las Vegas Vacations" page as it does now, but to my "Las Vegas Holidays" page.
I apologise for the in-depth question, but it requires a lot of explanation to get across clearly. I would be grateful for any of your thoughts. Many thanks in advance.
Intermediate & Advanced SEO | James77