Moving to TLS and the disavow file
-
I'm considering the move to TLS/SSL and will obviously be setting up the HTTPS version of the site in Search Console. Do I need to re-upload the disavow file that was generated before the move?
I look forward to your response.
-
I appreciate the comprehensive article. However, may I kindly point out that my question was about the disavow file in Google Search Console, not the implementation of HTTPS itself.
-
1. Get and Install Certificates
Buy a 2048-bit, SHA-2 TLS/SSL certificate from a Certificate Authority (CA)
Generate a private key and a certificate signing request (CSR) so that the CA can issue a signed certificate (see the sketch after this list)
Send the CA what they need (the certificate signing request, which contains your public key)
Install the issued certificate, along with any intermediate certificates, on your servers
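For illustration, a private key and CSR can be generated with OpenSSL along these lines; the domain in the file names is a placeholder, and your CA's own instructions take precedence:

```
# Generate a new 2048-bit RSA private key and a SHA-256 signed CSR in one step
openssl req -new -newkey rsa:2048 -sha256 -nodes \
  -keyout example.com.key -out example.com.csr
```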
2. Enable HTTPS on Your Servers
Configure your server for HTTPS. Check out these configuration tips for popular servers.
Test that everything is functioning properly using an external testing tool. Here’s a good one. (A quick local sanity check is also sketched after this list.)
Set a reminder to update your secure certificate before it expires.
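Alongside an external scanner, a quick local check of the certificate and chain can be done with OpenSSL (the hostname is a placeholder):

```
# Print the certificate chain, protocol and cipher negotiated for this host
openssl s_client -connect example.com:443 -servername example.com < /dev/null
```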
3. Code & Configuration Changes
Update site content to request HTTPS resources
Update internal links to point to HTTPS pages, or consider making internal links relative
Use protocol-relative URIs for resources that must work on both schemes (see the example after this list)
Add a self-referencing rel=canonical tag to every page, pointing to its HTTPS URL (see the example after this list)
Change all ad calls to work with HTTPS
Update any internal tools, such as Optimizely or CrazyEgg, to work with HTTPS
Update legacy redirects to eliminate chained redirects (see note below)
Update OpenGraph, Schema, semantic markup, etc. to point to HTTPS
Update social sharing buttons to preserve share counts
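As a small illustrative sketch (the domains and paths are placeholders), a protocol-relative resource reference and a self-referencing canonical tag look like this:

```
<!-- Protocol-relative URI: the browser requests it over whichever scheme the page used -->
<script src="//cdn.example.com/js/app.js"></script>

<!-- Self-referencing canonical pointing at the HTTPS version of the current page -->
<link rel="canonical" href="https://www.example.com/widgets/" />
```

If a resource is known to be available over HTTPS, referencing the https:// URL directly is equally valid and removes any ambiguity.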
4. Robots.txt, XML Sitemaps, Search Console and Analytics
Create and verify a new property for the HTTPS site in Google Search Console
Create a new XML sitemap file that points to your HTTPS URLs and upload it to the new property in Search Console
Create a new robots.txt file for the HTTPS site and copy over all existing rules. Include a Sitemap link to the new HTTPS XML sitemap.
Remove all rules from the HTTP robots.txt file, except for the Sitemap link, and leave it in place. This is to encourage bots to crawl and follow all redirects. (Sample files are sketched after the note below.)
Copy any existing disavow file and upload it to the new HTTPS property in Search Console
Note: Don’t use the “Change of Address” feature in Google Search Console. That’s used for migrations to new domains.
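As a rough sketch of the end state (the Disallow rule and sitemap file names are placeholders for whatever your site already uses):

```
# robots.txt served on the HTTPS site: existing rules copied over, plus the new sitemap link
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

```
# robots.txt left in place on the HTTP site: all rules removed, only the legacy sitemap link remains
Sitemap: http://www.example.com/sitemap.xml
```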
5. Redirect HTTP to HTTPS
Deploy the redirect code
Redirect HTTP to HTTPS on IIS (7.x and higher)
Redirect HTTP to HTTPS on Apache
Redirect HTTP to HTTPS on Nginx
Include exceptions to any global redirect directives for your existing robots.txt and XML sitemap files (see the Apache sketch after this list)
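As one hedged example for Apache (the IIS and Nginx equivalents differ in syntax but follow the same pattern; the file names are placeholders), a global 301 to HTTPS with exceptions for the legacy robots.txt and XML sitemap might look like this:

```
RewriteEngine On
# Only rewrite requests that arrived over plain HTTP
RewriteCond %{HTTPS} off
# ...but leave the legacy robots.txt and XML sitemap reachable over HTTP
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
# Permanent redirect to the same host and path on HTTPS
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```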
6. Follow-Up (after the release)
Use a tool, like SSL Check, to scan your site for non-secure content
Check HTTPS redirects and legacy redirects to ensure they work correctly. Check for long redirect chains using a tool that captures the header responses (I like Redirect Path by Ayima); a quick command-line check is also sketched after this list. Check for proper redirect functionality from both www and non-www, with and without trailing slashes, etc.
Use the “Fetch as Google” tool and submit your home page and other key pages to speed up the indexing process. I use the “Crawl this URL and its direct links” option.
Monitor the Index Status report in Search Console. The HTTP property should eventually go to zero, and the HTTPS should increase. Take this a step further by calculating the indexation rates of each XML sitemap and monitor them over time.
Monitor the Crawl Errors report in Search Console and address errors, as appropriate
When most new (HTTPS) URLs are already indexed, remove the legacy sitemap link from robots.txt
Update incoming links that are within your control to point to HTTPS (e.g., links to your site from social media profiles)
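In addition to a browser plugin, a quick command-line look at a redirect chain can be had with curl (the URL is a placeholder):

```
# Follow redirects with HEAD requests and print each hop's status line and Location header
curl -sIL http://www.example.com/old-page/ | grep -iE "^(HTTP|location)"
```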
7. Turn on Strict Transport Security (HSTS)
Once you’re absolutely sure the entire site is working over HTTPS, use HSTS to improve performance by ensuring the browser “remembers” to send all requests to your site over HTTPS, based on a policy you set. Keep in mind that this means your site will only use HTTPS, so make sure it works! (source). A sample header is sketched below.
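As a hedged sketch for Apache (with mod_headers enabled; other servers have equivalent directives), the policy is set with a single response header. Starting with a short max-age and raising it only once everything is verified is a sensible precaution, because browsers will refuse plain HTTP for the entire period:

```
# Instruct browsers to use HTTPS for this host (and subdomains) for roughly six months
Header always set Strict-Transport-Security "max-age=15768000; includeSubDomains"
```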
Here is another well-written guide for you.
-
Hey there,
You absolutely do need to. One of the biggest mistakes people make is not migrating their disavow file when they switch domains or move to HTTPS.
Sean
Related Questions
-
Where does rel=canonical go? One file that manages sort order, view, filters, etc...
Where do I put the rel=canonical when the search.cfm (using URL re-write) page is the one and only page, just using URL parameters to control sort, filter, view, etc.? Do I just put the rel=canonical at the top of the search.cfm page? The duplicate content issues I am getting are: https://www.domain.com/tx/austin/ https://www.domain.com/tx/austin/?d=25&h=&s=r&t=&v=l&a= Just want to be clear since Moz Pro is picking up both URLs but it's only really one file, search.cfm. Thanks in advance for your help.
Technical SEO | | ErnieB0 -
Http to https - Copy Disavow?
If the switch is made from http to https (with 301 redirects from http to https) should the disavow file be copied over in GWT so it is also uploaded against the https as well as the http version?
Technical SEO | | twitime0 -
301 Redirect keep html files on server?
Hello, just one quick question which came up in the discussion here: http://moz.com/community/q/take-a-good-amount-of-existing-landing-pages-offline-because-of-low-traffic-cannibalism-and-thin-content When I do 301 redirects where I put together content from two pages, should I keep the page/HTML which redirects on the server? Or should I delete it? Or does it make no difference at all?
Technical SEO | | _Heiko_0 -
Moved a site and changed URL structures: Looking for help with pay
Hi Gents and Ladies

Before I get started, here is the website in question: www.moldinspectiontesting.ca. I apologize in advance if I miss any important or necessary details. This might actually seem like several disjointed thoughts. It is very late where I am and I am very exhausted. Now on to this monster of a post.

**The background story:** My programmer and I recently moved the website from a standalone CMS to WordPress. The owners of the site/company were having major issues with their old SEO/designer at the time. They felt very abused and taken by this person (which I agree they were - financially, emotionally and more). They wanted to wash their hands of the old SEO/designer completely. They sought someone out to do a minor redesign (the old site did look very dated) and transfer all of their copy as affordably as possible. We took the job on. I have my own strengths with SEO, but on this one I am a little out of my element. Read on to find out what that is.

**Here are some of the issues, what we did and a little more history:** The old site had a terribly unclean URL structure, as most of it was machine written. The owners would make changes to one central location/page and the old CMS would then generate hundreds of service area pages that used long, parameter-heavy URLs (along with duplicate content). We could not duplicate this URL structure during the transfer and went with a simple, clean structure. Here is an example of how we modified the URLs... Old: http://www.moldinspectiontesting.ca/service_area/index.cfm?for=Greater Toronto Area New: http://www.moldinspectiontesting.ca/toronto My programmer took to writing 301 redirects and URL rewrites (.htaccess) for all their service area pages (which tally in the hundreds). As I hinted at above, the site also suffers from an overwhelming amount of duplicate copy, which we are very slowly modifying so that it becomes unique. It's also currently suffering from a tremendous amount of keyword cannibalization. This is also a result of the old SEO's work, which we had to transfer without fixing first (a hosting renewal deadline with the old SEO/designer forced us to get the site up and running in a very, very short window). We are currently working on both of these issues now. SERPs have been swinging violently since the transfer, and understandably so. Changes have cause and effect. I am a bit perplexed, though. Pages are indexed one day and ranking very well locally, and then apparently de-indexed the next. It might be worth noting that they had some de-indexing problems in the months prior to meeting us. I suspect this was in large part due to the duplicate copy. The ranking pages (on a URL basis) are also changing up. We will see a clean URL rank and then drop one week, and then an unclean version rank and drop off the next (for the same city, same web search). Sometimes they rank alongside each other. The terms they want to rank for are very easy to rank on because they are so geographically targeted. The competition is slim in many cases. This time last year, they were having one of the best years in the company's 20+ year history (prior to being de-indexed).

**On to the questions:** What should we do to reduce the loss in these ranked pages? With the actions we took, can I expect the old unclean URLs to drop off over time and the clean URLs to pick up the ranks? Where would you start in helping this site? Is there anything obvious we have missed?
I planned on starting with new keyword research to diversify what they rank on and then following that up with fresh copy across the board. If you are well versed with this type of problem/situation (URL changes, index/de-index status, analyzing these things, etc.), I would love to pick your brain or even bring you on board to work with us (paid).
Technical SEO | | mattylac0 -
Robots.txt file
How do I get Google to stop indexing my old pages and start indexing my new pages, even months down the line? Do I need to install a robots.txt file on each page?
Technical SEO | | gimes0 -
IMPACTS on Moving Online Store to New Platform
Hi, I have this online store, http://www.filtrationmontreal.com/, built on osCommerce. This platform is NOT SEO friendly and I am expecting to move it to a new platform, probably http://www.americommerce.com/ Can I limit the SEO impact when moving this store? The domain will be the same, but every other URL on the site will change. I only know the basics of SEO, and only copy HTML code when I need to, so my knowledge is limited. Using Google Analytics, I have the most popular page URLs visited on the current store. To limit impact, I thought I should copy the most popular page URLs of the current store and do a 301 redirect on the store platform. Do you think this might limit the impact? Would there be another way to limit SEO impact on Google search results? I don't want to be penalized if the search ends up on a 401 page. Can anybody help me? Thank you, BigBlaze
Technical SEO | | BigBlaze2050 -
.lbi file - SEO friendly or not?
Up until yesterday afternoon I had never heard of a .lbi file. It turns out it is a library file used by Adobe Dreamweaver. From what I can tell it works like a client-side include, but I am unsure of the technology behind it. The issue: when running through a recent SEO audit for a new client, I found these .lbi files being used all over their site for site-wide callouts and even navigation. When viewing this content through Firebug or in the browser you can see the executed HTML content, but when viewing the source or the page in seo-browser.com the content is nowhere to be seen. So my thought is this is not SEO friendly and is the same as displaying content in any client-side script like JavaScript or jQuery. Any feedback or thoughts on this subject would be awesome, especially if anyone has used these previously. Unfortunately I cannot share the client site, but I would be more than happy to answer any questions if more detail is needed. Thanks in advance - Kyle
Technical SEO | | kchandler0
Moving Duplicate Sites
Apologies in advance for the complexity. My client, company A, has purchased company B in the same industry, with A and B having separate domains. The current hosting arrangement combines registrar and hosting functions in one account so as to allow both domains to point to a common folder, with the result that identical content is displayed for both A and B. The current site is kind of an amalgam of A and B. Company A has decided to rebrand and completely absorb company B. The problem is that link value overwhelmingly favours B over A. The current (only) hosting package is Windows, and I am creating a new site and moving them to Linux with another hosting company. I can use 301s for A, but not for B, as it is a separate domain and currently shares a hosting package with A. How can I best preserve the link juice that domain B has? The only conclusion I can come up with is to set up separate Linux hosting for B, which will allow for the use of 301s. Does anyone have a better idea?
Technical SEO | | waynekolenchuk0