Correct Indexing problem
-
I recently redirected an old site to a new site. All the URLs were the same except the domain. When I redirected them, I failed to realize the new site had HTTPS enabled on all pages. I have noticed that Google is now indexing both the HTTP and HTTPS versions of pages in the results. How can I fix this? I am going to submit a sitemap, but I don't know if there is more I can do to get this fixed faster.
-
Okay, I may have understood your original post differently than you meant.
So the case is: you have HTTPS enabled, but Google is indexing both the HTTP and HTTPS pages, and you want it to index only the HTTP version. You are also running a cart or checkout that is HTTPS-only; those pages are not likely to be relevant to Google, so I would recommend blocking them with robots.txt.
I would recommend coding an IF statement to deal with the duplicate indexing (HTTPS and HTTP) and setting up a robots.txt file to prevent crawling of pages that have no search value and are there for customer use only.
Something like this would work in PHP (here, echoing a canonical link tag on HTTPS pages that points at the HTTP version you want indexed):
<?php
// On HTTPS requests, emit a canonical tag pointing search engines
// at the http:// version of the current URL.
if ( isset($_SERVER['HTTPS']) && strtolower($_SERVER['HTTPS']) == 'on' ) {
    echo '<link rel="canonical" href="http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . '">' . "\n";
}
?>
I'm not sure of the equivalent code in ASP since I rarely use Windows servers, but you should be able to find it with a quick Google search.
Then set up your robots.txt to block all URLs that are specific to personal data, like this example:
User-agent: *
Disallow: /catalog/account.php
Disallow: /catalog/account_edit.php
Disallow: /catalog/account_history.php
Disallow: /catalog/account_history_info.php
Disallow: /catalog/account_password.php
Disallow: /catalog/add_checkout_success.php
Disallow: /catalog/address_book.php
Disallow: /catalog/address_book_process.php
Disallow: /catalog/checkout_confirmation.php
Disallow: /catalog/checkout_payment.php
Disallow: /catalog/checkout_process.php
Disallow: /catalog/checkout_shipping.php
Disallow: /catalog/checkout_shipping_address.php
Disallow: /catalog/checkout_success.php
Disallow: /catalog/cookie_usage.php
Disallow: /catalog/create_account.php
I hope that helps.
Don
-
My site should be running HTTP on all pages except the checkout. Would this work the opposite way of what you have written, so that I can make a rule allowing HTTPS only on the checkout?
Thanks
Jared
-
If your site is running on HTTPS only, then a simple edit to your .htaccess file will correctly redirect (301) any request for an HTTP page to the corresponding HTTPS page.
Sample Code:
# Any request that did not arrive over HTTPS gets a 301 redirect
# to the same host and path on HTTPS
RewriteCond %{HTTPS} !=on
RewriteRule .* https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
There are several ways to handle this, so you may also benefit from searching for ".htaccess 301 redirect http to https".
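For your setup (HTTP everywhere except the checkout), the same technique works in reverse with an exception for the secure pages. A minimal .htaccess sketch, assuming Apache with mod_rewrite and that your checkout lives under /checkout/ (adjust the path to match your cart):
RewriteEngine On
# Force HTTPS on the checkout pages
RewriteCond %{HTTPS} !=on
RewriteRule ^checkout/ https://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
# Send every other HTTPS request back to plain HTTP
RewriteCond %{HTTPS} =on
RewriteCond %{REQUEST_URI} !^/checkout/
RewriteRule .* http://%{SERVER_NAME}%{REQUEST_URI} [R=301,L]
With both rules in place, each URL resolves on exactly one scheme, so Google only has one version of each page to index.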
Hope that helps.