How can I prevent Google and other search engines from crawling my secure (https:) pages?
-
Let me know your thoughts guys. Thanks in advance!
-
Your best bet is to place a meta noindex tag on each secure page. If it's only a few pages, you could just add it by hand. If it's many, you should be able to check each page's protocol with whatever server-side language you're using, and add the tag dynamically on all secure pages.
If you use robots.txt to exclude the pages, Google can still show them in search results, with the description below them that reads, "A description for this result is not available because of this site's robots.txt – learn more." Personally, I don't care for that.
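The dynamic approach can be sketched in a few lines of Python (a hypothetical helper, assuming your framework exposes the request's scheme; the same logic works in any server-side language):

```python
def robots_meta_tag(scheme: str) -> str:
    """Return a noindex meta tag for pages served over HTTPS, else an empty string."""
    if scheme.lower() == "https":
        return '<meta name="robots" content="noindex">'
    return ""

# Emit the tag in each page's <head>; only secure pages get it.
print(robots_meta_tag("https"))  # secure page: tag is emitted
print(robots_meta_tag("http"))   # plain page: nothing is emitted
```

Calling this from the template that renders your `<head>` keeps the tag off HTTP pages automatically, so crawlers can still index the non-secure versions.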
-
Hi there, blocking the HTTPS version of your pages from being crawled by search engines is a bit tricky. You may need to serve a separate robots.txt file for HTTPS requests.
This article explains the process in more detail:
http://www.seoworkers.com/seo-articles-tutorials/robots-and-https.html
Hope it helps.
Best,
Devanur Rafi
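On Apache, one common way to serve a separate robots.txt for HTTPS requests is a mod_rewrite rule keyed on the SSL port. A sketch (assuming mod_rewrite is enabled and that you create a `robots_ssl.txt` file yourself; names and details may differ from the linked article):

```apache
RewriteEngine on
# Requests on port 443 (HTTPS) get robots_ssl.txt instead of robots.txt
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```

The `robots_ssl.txt` file would then disallow all crawling (`User-agent: *` / `Disallow: /`), while the regular robots.txt served over HTTP stays open.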
-
Hi esiow,
You have a choice: place a robots.txt file in the root folder of your website, or, if you're blocking individual pages, use the meta robots tag. See these pages for more information: http://moz.com/learn/seo/robotstxt and https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag?csw=1
I hope that helps,
Peter
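For reference, a root-level robots.txt that blocks all compliant crawlers from every page would look like this (bear in mind that robots.txt blocks crawling, not indexing, so blocked URLs can still appear in results):

```
# robots.txt in the site root: block all compliant crawlers
User-agent: *
Disallow: /
```

The per-page alternative is adding `<meta name="robots" content="noindex">` inside each page's `<head>`, which does keep the page out of the index once it is recrawled.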