URLs with Hashtags - Does Google Index Them?
-
Hi there,
I have a potential issue with a site whereby all pages are dynamically populated using JavaScript. An example URL on the site would be www.example.com/#!/category/product.
I have read lots of conflicting information on the web: some say Google will ignore everything after the hash; others say that Google will now index everything after it.
Does anybody have any conclusive information about this? Any links to Google or Matt Cutts as confirmation would be brilliant.
P.S. I am aware of the potential issue of duplicate content, but I can assure you that has been dealt with. I am only concerned with whether Google will index full URLs that contain hashes.
Thanks all!
Mark
-
Hi All,
It looks like Google has set up a nice dev site and FAQ page to go over the options here, especially when using AJAX and hash fragments to link to hidden content: https://developers.google.com/webmasters/ajax-crawling/docs/faq#whereinresults.
It looks as if Google will be able to index the content of the entire page (hidden and initially shown) and not create a separate URL if you use a ! after the # (the "hashbang"). I'd read up on that FAQ page, and play with site: commands on the Google dev site.
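For anyone who wants to see the mechanics, here is a rough sketch (in TypeScript) of the URL mapping that FAQ describes: the crawler swaps the #! for an _escaped_fragment_ query parameter when it fetches the page. This is an illustration of the scheme, not Google's actual crawler code.

// Rough sketch of the mapping in Google's AJAX crawling scheme:
// the crawler replaces "#!" with an "_escaped_fragment_" query
// parameter when it requests the page from the server.
function toEscapedFragmentUrl(prettyUrl: string): string {
  const bang = prettyUrl.indexOf("#!");
  if (bang === -1) return prettyUrl; // no hashbang, nothing to map
  const base = prettyUrl.slice(0, bang);
  const fragment = prettyUrl.slice(bang + 2);
  const sep = base.includes("?") ? "&" : "?";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}

console.log(toEscapedFragmentUrl("http://www.example.com/#!/category/product"));
// "http://www.example.com/?_escaped_fragment_=%2Fcategory%2Fproduct"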
-
Thankfully Webmaster World was able to provide some decent information, for those of you who have arrived here looking for a similar answer.
There is something called the "hashbang" which makes JavaScript pages crawlable. Hashbang refers to hash (#) plus bang (!), so an example would be example.com/#!/page-1.
Here's a great place to read more, understand and learn to implement:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=174992
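To make that concrete, here is a hypothetical sketch of the server side of the scheme, assuming a Node/Express app. renderSnapshot is a made-up placeholder for however the site actually produces a pre-rendered HTML snapshot (e.g. a headless browser).

// When Googlebot requests ?_escaped_fragment_=/page-1, serve a
// pre-rendered HTML snapshot; otherwise serve the normal JS app shell.
import express from "express";

const app = express();

app.get("/", (req, res) => {
  const fragment = req.query["_escaped_fragment_"];
  if (typeof fragment === "string") {
    res.send(renderSnapshot(fragment)); // e.g. fragment = "/page-1"
  } else {
    res.sendFile("index.html", { root: __dirname });
  }
});

function renderSnapshot(fragment: string): string {
  // Placeholder: a real implementation would return the fully
  // rendered HTML for the state identified by the fragment.
  return `<html><body>Snapshot for ${fragment}</body></html>`;
}

app.listen(3000);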
Cheers all!
-
Here's an example of a # URL which has not been indexed.
http://dulas.org.uk/hydro-info.cfm#specification_installation
Unlike the site I am working on, this site 'hides' content from the user until they click on a particular tab. All of the original code is in the source for http://dulas.org.uk/hydro-info.cfm but is only shown to the user if they activate the particular piece of JavaScript by clicking on a tab.
The site I am working on is different: it loads content via JavaScript, but each load essentially behaves like a new page. The content is not present in the source until you click on something, at which point new content loads and the old content disappears.
Perhaps Google will be able to see that these # pages function much like normal pages, loading completely new content and getting rid of the old, and may therefore index them if I submit them in a sitemap. However, I'd like to hear from somebody who can tell me they have done this and had success!
Thanks,
Mark
-
Hi Lee,
Thanks for your response. My concern is that # URLs tend to send users to a particular location on a page, rather than to a new page itself. Because of that, some things I have read suggest that Google has adapted to ignore anything after a # in order to avoid indexing an enormous amount of duplicate content. It's strange that there is so much conflicting info out there!
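As I understand it, the browser never even sends the fragment to the server; it exists only client-side. You can see this by parsing a URL in the browser console:

// The fragment ("hash") is parsed client-side and is not part of the
// HTTP request the browser sends to the server.
const url = new URL("http://www.example.com/#!/category/product");
console.log(url.pathname); // "/"
console.log(url.hash);     // "#!/category/product"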
Cheers,
Mark
-
Hi Mark, although I don't have any conclusive evidence, I would say that Google does index hashtag URLs.
Think of it this way: when you link within a page using an anchor (#), Google sees the '#' and non-'#' URLs as unique URLs, so logically this does suggest that they index the full URL.
Hope that helps, Lee.
Related Questions
-
Infinite Scroll and URL Changing
Hi, so my website is having an issue with indexing. Much like other sports sites such as ESPN or MLB, my site changes the URL as you go down the page: if you open a news article and continue scrolling, you move on to another news article. I believe that this is creating errors in Search Console, with articles being flagged as "too long". I don't know how to keep this infinite scroll and URL changing, which increases my pageviews, and still eliminate the errors. Can someone help?
Web Design | mattdinbrooklyn
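The URL swapping described above is usually done with the History API. A minimal sketch, assuming each article element carries a hypothetical data-url attribute holding its canonical address:

// When an article scrolls into view, update the address bar with
// history.replaceState so the visible article and the URL stay in sync.
const scrollObserver = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const articleUrl = (entry.target as HTMLElement).dataset.url;
      if (articleUrl) history.replaceState(null, "", articleUrl);
    }
  }
}, { threshold: 0.5 });

document.querySelectorAll("article[data-url]").forEach((el) => scrollObserver.observe(el));
-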
Https pages indexed but all web pages are http - please can you offer some help?
Dear Moz Community, please could you see what you think and offer some definite steps or advice. I contacted the host provider and his initial thought was that WordPress was causing the https problem: e.g. when an https version of a page is called, things like videos and media don't always show up. An SSL certificate attached to a website can allow pages to load over https. The host said that there is no active configured SSL, it's just waiting as part of the hosting package just in case, but I found that the SSL certificate is still showing up during a crawl.
It's important to eliminate the https problem before external backlinks link to any of the unwanted https pages that are currently indexed. Luckily I haven't started any intense backlinking work yet, and any links I have posted have all been the http version. I checked a few more URLs to see if it's necessary to create a permanent redirect from https to http. For example, I tried requesting domain.co.uk using the https:// prefix, and the https page loaded instead of redirecting automatically to the http version. I know that if I am automatically redirected to the http:// version of the page, then that is the way it should be: search engines and visitors will stay on the http version of the site and not get lost anywhere in https. This also helps to eliminate duplicate content and to preserve link juice. What are your thoughts regarding that?
As I understand it, most server configurations should redirect by default when https isn't configured, and from my experience I've seen cases where pages requested via https return the default server page, a 404 error, or duplicate content. So I'm confused as to where to take this. One suggestion would be to disable all https, since there is no need to have any traces of SSL when the site is crawled. I don't want to enable https in the htaccess only to then create an https-to-http rewrite rule; https shouldn't even be a crawlable function of the site at all. Something like:
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
Or disable the SSL completely for now until it becomes a necessity for the website. I would really welcome your thoughts as I'm really stuck as to what to do for the best, short term and long term. Kind regards
Web Design | SEOguy1
-
Is there a way to redirect URLs with a hash-bang (#!) format?
Hi Moz, I'm trying to redirect www.site.com/locations/#!city to www.site.com/locations/city. This seems difficult because anything after the hash character in the URL never makes it to the server, so it cannot be parsed for rewriting. Is there an SEO-friendly way to implement these redirects? Thanks for reading!
Web Design | DA2013
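Since the fragment never reaches the server, a hashbang redirect has to happen client-side. A minimal sketch using the asker's /locations/ example; note this is a JavaScript hop rather than a true 301, so a rel=canonical on the target pages is a sensible companion:

// Client-side hashbang redirect: the server cannot see #!city, so
// script on /locations/ rewrites it to the clean path.
if (window.location.hash.startsWith("#!")) {
  const target = window.location.pathname.replace(/\/$/, "")
    + "/" + window.location.hash.slice(2);
  window.location.replace(target); // replace() avoids a history entry
}
-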
Does Google count the domain name in its 115-character "ideal" URL length?
I've been following various threads having to do with URL length and Google's happiness therewith and have yet to find an answer to the question posed in the title. Some answers and discussions have come close, but none I've found have addressed this with any specificity. Here are four hypothetical URLs of varying lengths and configurations:
EXAMPLE ONE: my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (115 characters)
EXAMPLE TWO: sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (126 characters)
EXAMPLE THREE: www.sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (130 characters)
EXAMPLE FOUR: http://www.sample.com/my-big-widgets-are-the-best-widgets-in-the-world-and-come-in-many-vibrant-and-unique-colors-and-configurations.html (137 characters)
Assuming the examples contain appropriate keywords and are linked to appropriate anchor text (etc.), how would Google look upon each? All I've been able to garner thus far is that URLs should be as short as possible while still containing and contextualizing keywords. I have 500+ URLs to review for the company I work for and could use some guidance; yes, I know I should test, but testing is problematical in the extreme; I look to the collective/accumulated wisdom of the MozVerse for help. Thanks.
Web Design | RScime25
-
How to make Address Text Clickable for Google Map Link for Mobile Device
How do I make the address text on the site a clickable link for mobile devices?
Web Design | bozzie311
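One common approach, sketched below: wrap the address text in a link to Google Maps, which mobile devices hand off to their maps app. The "store-address" element id here is hypothetical.

// Turn an element containing a plain-text address into a clickable
// Google Maps link.
const addressEl = document.getElementById("store-address");
if (addressEl) {
  const address = addressEl.textContent ?? "";
  const mapsLink = document.createElement("a");
  mapsLink.href = "https://maps.google.com/?q=" + encodeURIComponent(address);
  mapsLink.textContent = address;
  addressEl.replaceChildren(mapsLink);
}
-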
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs: http://www.twago.es/expert/Diseño-Web/Diseño-Web. However, when I simply copy-paste a URL that contains a special character, it is automatically translated and encoded. For example, http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone, when written out longhand, appears as http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone.
My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (even for Spanish/German sites these are often written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most of our competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query or not. But we seem to think that if we were doing the right thing, then why does everyone else do it differently?
My second question is the same, but focusing on the use of capital letters in our URL structure.
NOTE: When we do a broken link check with some link tools (such as Xenu), the URLs that contain the special characters in Spanish are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | wdziedzic
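As background on the automatic translation the asker describes: browsers percent-encode non-ASCII path characters as UTF-8, and both forms address the same resource. A quick illustration:

// Non-ASCII characters in a URL are UTF-8 percent-encoded on the wire.
console.log(encodeURIComponent("Diseño-Web"));      // "Dise%C3%B1o-Web"
console.log(decodeURIComponent("Dise%C3%B1o-Web")); // "Diseño-Web"
-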
Custom URLs with Bigcommerce Issue (Is it worth it?)
We're building out a store in Bigcommerce, which for all intents and purposes is perfect for SEO, except that you cannot change the URLs to be custom. My question is: does this kill the SEO value of Bigcommerce, despite everything else being great? For example, the URLs for a category page would be something like www.mysite.com/categories/keyword, and the product URLs are pulled in by product name, so product URLs could be something like www.mysite.com/products/Product-Description-Long-223.html (notice the words are capitalized and there is no way to remove the trailing .html). I could go with Interspire (the licensed version of Bigcommerce) or Magento so I can custom-edit this stuff. But then it's a lot more work for my employees on the build-out.
Web Design | iAnalyst.com
-
IP block in Google
Our office has a number of people performing analysis and research on keyword positions, volume, competition, etc. We have one external static IP address. We installed the static IP so we can filter out our visits in Google Analytics. However, by 10 AM we get impossible CAPTCHAs or even get blocked in Google. Do you have any experience with such an issue? Any solutions you can recommend? Any help would be appreciated!
Web Design | Partouter