Is parallax scrolling, when used with the "hash bang" technique, good for SEO or not?
-
Hello friends,
One of my client's websites, http://chakracentral.com/, uses parallax scrolling, with most of the URLs containing a hash ("#"). Please see a few sample URLs below:
http://chakracentral.com/#panelBlock4 (service page)
http://chakracentral.com/#panelBlock3 (about-us page)
I am planning to use the "hash bang" technique on this website so that Google can read all the internal pages (the ones behind "#" fragments) under the current site architecture, as the client is not comfortable changing it.
Reference: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started#2-set-up-your-server-to-handle-requests-for-urls-that-contain-escaped_fragment
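To make the plan concrete, here is a minimal sketch of the server-side handling I have in mind, assuming we switch the URLs from "#panelBlock4" to "#!panelBlock4" (the scheme only maps "#!" fragments to _escaped_fragment_ requests) and pre-render an HTML snapshot of each panel. The Express setup, routes, and file names below are illustrative, not our real stack:

```javascript
// Sketch of the escaped_fragment handling from Google's AJAX crawling
// guide. Googlebot requests /?_escaped_fragment_=panelBlock4 in place
// of /#!panelBlock4 and indexes whatever static HTML comes back.
const express = require('express');
const path = require('path');

const app = express();

// Map each fragment to a pre-rendered HTML snapshot (hypothetical files).
const snapshots = {
  panelBlock3: 'snapshots/about-us.html',
  panelBlock4: 'snapshots/services.html',
};

app.get('/', (req, res, next) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment === undefined) return next(); // normal visitors get the parallax page

  const file = snapshots[fragment];
  if (!file) return res.status(404).send('No snapshot for this fragment');
  res.sendFile(path.join(__dirname, file)); // the crawler gets static HTML
});

app.listen(3000);
```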
But the problem I am facing is that many industry experts do not consider parallax websites (even with the hash bang technique) good for SEO, especially on mobile devices. See some references below:
http://searchengineland.com/the-perils-of-parallax-design-for-seo-164919
https://moz.com/blog/parallax-scrolling-websites-and-seo-a-collection-of-solutions-and-examples
So please find my queries below, for which I need help:
1. Would it be good to use the "hash bang" technique on this website and perform SEO to improve rankings on desktop as well as mobile devices?
2. Is the "hash bang" technique for a parallax scrolling website good only for desktop and not recommended for mobile, meaning we should have a separate mobile version of the website (without parallax scrolling) for mobile SEO?
3. Or is the parallax scrolling technique (even with "hash bang") simply not good for SEO on either desktop or mobile, and should it be avoided if we want an SEO-friendly website?
4. Are there any issues with Google Analytics tracking for such a website?
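For context on question 4, my understanding is that Google Analytics ignores the "#" fragment by default, so every panel would be recorded as the same pageview. A rough sketch of the usual workaround, assuming the classic analytics.js snippet is already installed (the path mapping is illustrative):

```javascript
// Sketch only: report each hash change as a virtual pageview, since
// analytics.js treats every "#" variation of a URL as the same page.
window.addEventListener('hashchange', () => {
  // "#panelBlock4" (or "#!panelBlock4") becomes "/panelBlock4".
  const virtualPath = '/' + window.location.hash.replace(/^#!?/, '');
  ga('set', 'page', virtualPath);
  ga('send', 'pageview');
});
```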
Regards,
Sarmad Javed
-
Hello Sarmad,
I don't recommend having a single-page website on the front end. You can handle the loading of content however you like (JavaScript, lazy loading, dynamic serving) as long as each section/page has a different URL. A hash fragment is not a new URL; it is a named anchor that points somewhere else on the same page. So with this technique you would only have a single page indexed in Google, which leaves very little room to target different topics.
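For example, if they want to keep the one-page parallax front end, the History API can give each section a real, crawlable URL instead of a bare fragment. Here is a rough sketch; the link markup, section IDs, and paths are made up, and the server would also need to return real content at /services, /about-us, etc. for crawlers and direct visits:

```javascript
// Sketch: replace "#panelBlock4"-style anchors with real paths via the
// History API. Assumes nav links shaped like:
//   <a href="/services" data-section="panelBlock4">Services</a>
document.querySelectorAll('a[data-section]').forEach((link) => {
  link.addEventListener('click', (event) => {
    event.preventDefault();
    // Record a real URL in the address bar without reloading the page.
    history.pushState({ section: link.dataset.section }, '', link.getAttribute('href'));
    document.getElementById(link.dataset.section)
      .scrollIntoView({ behavior: 'smooth' });
  });
});

// Restore the right section when the user navigates back/forward.
window.addEventListener('popstate', (event) => {
  if (event.state && event.state.section) {
    document.getElementById(event.state.section).scrollIntoView();
  }
});
```

That way /services and /about-us exist as distinct, indexable URLs while the scrolling effect stays intact for visitors.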
Lots of sites use the parallax design style, but have multiple pages. Parallax and single-page are not necessarily synonymous. The most common use on a multi-page site would be to tell a story on a landing page or the home page.
More concerning to me right now is that it looks like they're putting their clients' sites on subdomains of their own domain, which are fully indexable by Google: https://goo.gl/z9dSDl.
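If those subdomains are just hosted copies of the clients' sites, one common fix is a noindex header at the server level. A sketch in Express; the subdomain pattern here is hypothetical:

```javascript
const express = require('express');
const app = express();

// Sketch: keep client/staging subdomains out of Google's index with an
// X-Robots-Tag header. "agency-example.com" stands in for the real domain.
app.use((req, res, next) => {
  if (req.hostname.endsWith('.agency-example.com')) {
    res.set('X-Robots-Tag', 'noindex, nofollow');
  }
  next();
});
```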
Related Questions
-
How do .com and .me affect SEO?
Hi, we have a project at https://www.shipwaves.me/ and https://www.shipwaves.com/. If we do SEO for the .me domain, will it show results as early as a .com domain would? I'm not sure which category .me domains actually fall into. If we follow the same strategy we use for .com domains, will the results be the same or not? Also, is there any additional strategy needed to make SEO progress a little faster for the .me domain? Guys, please share your thoughts on this.
Algorithm Updates | LayaPaul -
Anyone suspect that a site's total page count affects SEO?
I've been trying to find out the underlying reason why so many websites rank higher than mine despite seemingly having far worse links. I've spent a lot of time researching and have read through all the general advice about what could possibly be hurting my site's SEO, from page speed to h1 tags to broken links, and all the various on-page optimization points, so the issue here isn't obvious. Looking at my competitors, they seem to have far more pages on their sites than mine does. My site currently has 20 pages or so, and most of my competitors are well into the hundreds, so I'm wondering if this could be part of the issue. I know Google has never officially said that page count matters, but does anyone suspect that page count matters for SEO, and that competing sites with more total pages might have an advantage SEO-wise?
Algorithm Updates | ButtaC -
Using brand value for SEO: Can we use a keyword with the brand name?
Hi Moz community, I am curious to know this. Let's say a company has brand value: it is popular enough that it gets mentioned across the internet and social media by brand name alone, without its service or industry keyword. Now, if the company starts promoting itself with a keyword alongside its brand name, will that help it rank for that keyword? For example, Moz is already famous; now they want to rank for "SEO" and related keywords, so they start calling themselves "Moz SEO" on the internet. Will this help them rank for the keyword "SEO"? My ultimate question is: will using a primary keyword along with the brand name help in ranking for that primary keyword? Thanks
Algorithm Updates | vtmoz -
Is it bad from an SEO perspective that cached AMP pages are hosted on domains other than the original publisher's?
Hello Moz, I am thinking about starting to utilize AMP for some of my website. I've been researching the AMP situation for the better part of a year, and I am still unclear on a few things. What I am primarily concerned with, in terms of AMP and SEO, is whether or not the original publisher gets credit for the traffic to a cached AMP page that is hosted elsewhere. I can see the possible issues with this from an SEO perspective, and I am pretty sure I have read that other SEOs are unhappy about this particular aspect of AMP. On the AMP project FAQ page you can find this, but there is very little explanation: "Do publishers receive credit for the traffic from a measurement perspective? Yes, an AMP file is the same as the rest of your site – this space is the publisher's canvas."
So, let's say you have an AMP page on your website example.com: example.com/amp_document.html. A cached copy is served with a URL format similar to this: https://google.com/amp/example.com/amp_document.html. How does the original publisher get credit for the traffic? Is it because there is a canonical tag from the AMP version to the original HTML version? Also, while I am at it, how does an AMP page actually get into Google's AMP Cache (or any other cache)? Does Google crawl the original HTML page, find the AMP version, and then just decide to cache it from there? Are there any other issues with this that I should be aware of? Thanks
Algorithm Updates | Brian_Dowd -
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? SEL (Search Engine Land) recently said that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a Domain Authority of 30 (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good, and we're upgrading to a new CMS (hooray!). In doing so, we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assume we have 500 products and 100 categories; that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation; we can see that both on our DEV site and out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more incurs a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here? So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus -
What was the biggest challenge you faced as an SEO in 2012?
As an SEO (in-house, freelance, consultant, agency, entrepreneur), what was the biggest challenge you faced in 2012? Please be as specific as you can, and let us all know what you are doing to overcome this challenge in 2013. For me personally, I would have to say the biggest challenge I had to deal with was Google+ Local. Obviously Google is putting a lot into G+L, but it has been so messy, and at times I have just thrown my arms up in the air, especially when it comes to multi-state locations and losing reviews.
Algorithm Updates | clarktbell -
Local SEO: How to handle multiple businesses at the same address
I have a client who shares the same address and suite number with multiple businesses. What should be done to optimize their website and citations for local SEO? Is this a huge issue? What should we do so their rankings aren't affected? Will changes take a long time to take effect? Thanks
Algorithm Updates | caeevans -
Has anyone started using schema.org?
On the 3rd of June 2011, Google announced that they are going to start using schema.org markup. Do you think this will change the way search engines find content? From briefly looking at schema.org, I'm concerned that the proposed tags could just turn into another keyword meta tag and be abused. Have you started using these tags yet, and have you noticed a difference?
Algorithm Updates | Seaward-Group