Question #1: Does Google index https:// pages? I thought they didn't because....
-
Generally, the difference between https:// and http:// is that the "s" (stands for secure, I think) is usually reserved for payment pages and other similar pages that search engines aren't supposed to index (like any page where private data is stored).
The site that all of my questions revolve around is built with Volusion (I'm used to WordPress), and I keep finding problems like this one.
The site was hardcoded so that all MENU internal links (which were 90% of our internal links) led to **https://**www.example.com/example-page/
instead of
**http://**www.example.com/example-page/
To double-check that this was causing a loss in link juice, I jumped over to OSE (Open Site Explorer).
Sure enough, the internal links were not being counted; only the links that were manually created and set to NOT include the httpS:// were showing up.
So if OSE wasn't counting the links, then based on the general ideology behind secure HTTP access, that would imply that no link juice is being passed...
Right??
Thanks for your time. Screenshots are available if necessary, but OSE has already been updated since then and the new internal links ARE STILL NOT being counted.
The question is: is this a Volusion problem?
Should I switch to Wordpress?
Here's the site URL (please excuse the design; it's pretty ugly considering how basic Volusion is compared to WordPress)
-
Hi Tyler
Looks like the duplicate title tags are largely from empty pages like these:
http://www.uncommonthread.com/008-Pink-Ice-p/14410008.htm
http://www.uncommonthread.com/001-Gold-p/14410001.htm
http://www.uncommonthread.com/019-Copper-p/14410019.htm
http://www.uncommonthread.com/027-Electric-Blue-p/14410027.htm
Even though these pages are somewhat unique, the content is definitely "thin" and having a lot of pages like this typically isn't good for rankings.
Ideally, you would list small product variations on the same page, or have several similar product pages canonical to a master page. Generally, if you don't have a minimum of 200 words of good editorial content, Google might consider the page duplicate.
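For the canonical approach, each variation page would carry an absolute canonical tag pointing at the master page. The master-page URL below is hypothetical, just to show the shape of the tag:

```html
<!-- In the <head> of a variation page such as /008-Pink-Ice-p/14410008.htm -->
<!-- The href below is a made-up master-page URL; point it at your real master product page -->
<link rel="canonical" href="http://www.uncommonthread.com/silk-thread-p/master.htm" />
```

The key detail is that the URL is absolute (full protocol and domain), so there's no ambiguity between the http and https versions.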
I don't see any reason why switching to http should cause much of a problem if you passed everything through a 301 redirect. To be honest, it's typical for rankings to fluctuate frequently, so it could be a million things.
If I look at the text-only cache of the page you sent: http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com&strip=1
... it looks pretty similar. If it were my page, I'd probably try to include more descriptive text on the page, richer descriptions, etc.
Hope this helps!
-
Wow. What an awesome answer.
I honestly can't thank you enough for taking the time to answer so thoroughly.
I decided to go ahead and fix the https:// and change it to http://
Weird results here: traffic went down by 5.5% compared to the month before I posted this thread.
I noticed an increase in duplicate title tags (about 700-1,000 of them) in my SEOmoz account.
Could that be likely to be the reason for the decrease? Or is it just because I shouldn't have made such a drastic site-wide change like that?
I am attempting to give unique title tags and HTML titles to all of the product pages that are causing the increase in duplicate titles.
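To chip away at a duplicate-title list like that, a quick script can group URLs by their title text and surface only the titles shared by more than one page. The URLs and titles below are made-up examples; in practice you'd feed it a crawl export:

```python
from collections import defaultdict

def find_duplicate_titles(page_titles):
    """Group URLs by their <title> text; return only titles used on more than one page."""
    groups = defaultdict(list)
    for url, title in page_titles.items():
        # Normalize so "Silk Thread" and "silk thread " count as the same title
        groups[title.strip().lower()].append(url)
    return {t: urls for t, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl output: URL -> title tag text
pages = {
    "/008-Pink-Ice-p/14410008.htm": "Silk Thread",
    "/001-Gold-p/14410001.htm": "Silk Thread",
    "/019-Copper-p/14410019.htm": "Copper Silk Thread",
}
print(find_duplicate_titles(pages))
# {'silk thread': ['/008-Pink-Ice-p/14410008.htm', '/001-Gold-p/14410001.htm']}
```

Every group that comes back is a cluster of pages needing either unique titles or a canonical to one master page.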
I'm also in a slight predicament because the site's owner hired another company to do some "optimization" around October 23rd.
Since then, they have made some changes I consider spammy, but some results have shown up (a 20%+ increase starting around Jan 1st, capping on the day I made the https:// change). I can't get her to agree that we should invest in building a social following, making better content, blogging more often, etc. I also think we should move the blog into a subfolder on the domain.
I compared the web cache you showed me to a WordPress site that I built, and the difference really was pretty shocking:
http://webcache.googleusercontent.com/search?q=cache:http://www.ontracparts.com
What's the difference as far as rankings and search engines are concerned?
-
Hi Tyler,
Great question! In fact, it's a common misconception that Google doesn't index https. In truth, these days Google appears to index most https pages just fine.
If we do a site operator Google search for https on your site, we get something like this:
site:uncommonthread.com/ inurl:https
This returns 165 URLs on your site with the https protocol.
But... these URLs don't show up in OSE because, at this time, the Linkscape crawler can't crawl https. When it was originally built, Google still didn't index https, so https support wasn't needed. This should be fixed in just a few months, and you should start seeing those https results in there. The good news is that OSE is completely separate from Google and doesn't influence your rankings in any way.
Now for the bad news....
Whenever you have https, you want to make sure you only have ONE version of the url, so that https either redirects (via 301) to the http version, or vice versa. Otherwise Google might index both versions. For example, both of these URLs resolve on your site:
https://www.uncommonthread.com/kb_results.asp?ID=5
http://www.uncommonthread.com/kb_results.asp?ID=5
The solution is to either 301 redirect one to the other, or put an absolute canonical tag on both pages that points to one version (an absolute canonical means the tag contains the full URL, including the http or https protocol).
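As a sketch of the 301 approach: Volusion runs on ASP/IIS, so if the site's web.config were accessible and the IIS URL Rewrite module were available (both assumptions — hosted Volusion may not expose this, in which case their support would have to set it up), a rule forcing https URLs back to http might look like:

```xml
<!-- web.config fragment; assumes the IIS URL Rewrite module is installed -->
<rewrite>
  <rules>
    <rule name="https-to-http" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <!-- Only fire when the request came in over https -->
        <add input="{HTTPS}" pattern="on" />
      </conditions>
      <!-- redirectType="Permanent" issues the 301 -->
      <action type="Redirect" url="http://www.uncommonthread.com/{R:1}"
              redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

In practice you'd carve out exceptions for pages that genuinely need https (checkout, login) so those aren't bounced back to http.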
That said, I don't see any evidence that Google has indexed both URL versions of your site (at least not like Dunkin' Donuts).
Should You Switch to WordPress?
Based simply on the https issue, switching to WordPress isn't necessary. But WordPress does offer other advantages, and it's generally a very SEO-friendly platform.
That said, there may be other reasons to consider switching away from your current CMS.
For example, consider Google's Text-only cache of your homepage: http://webcache.googleusercontent.com/search?q=cache:http://www.uncommonthread.com/default.asp&strip=1
See how barren it is? Without taking a deep dive, it's possible the structure and technology employed by your CMS are causing indexing/crawling issues, and considerable technical effort may be required to make it SEO friendly. I can't give you a definite answer either way, but it's something to think about.
Hope this helps!