URL with a # but no ! being indexed
-
Given that it contains a #, how is Google able to index this URL?
It was my understanding that Google can't handle a # properly unless it's paired with a ! (i.e. a hashbang, #!).
site:http://www.rtl.nl/xl/#/home returns nothing, but:
site:http://www.rtl.nl/xl returns http://www.rtl.nl/xl/#/home in the result set
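For context on the # vs. #! distinction in the question: under Google's old (now-deprecated) AJAX crawling scheme, a #! URL was mapped to an `_escaped_fragment_` query parameter that the crawler could fetch, while a plain # fragment was simply ignored. A minimal sketch of that mapping (the helper name is mine, not part of any API):

```python
from urllib.parse import urlsplit, urlunsplit, quote

def escaped_fragment_url(url):
    """Map a #! (hashbang) URL to its _escaped_fragment_ form, per
    Google's deprecated AJAX crawling scheme. Plain # fragments are
    left alone, since crawlers just drop them."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url
    # Percent-encode the fragment value (minus the leading "!")
    frag = quote(parts.fragment[1:], safe="")
    sep = "&" if parts.query else ""
    query = parts.query + sep + "_escaped_fragment_=" + frag
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

print(escaped_fragment_url("http://www.rtl.nl/xl/#!/home"))
# prints: http://www.rtl.nl/xl/?_escaped_fragment_=%2Fhome
```

Since rtl.nl uses a plain # (no !), this mapping never applies to it, which is exactly why the question arises.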
-
Thanks Cyrus, that makes a lot of sense - one of those strange intricacies!
-
The clue here is when you search for Google's cached version of the page:
http://webcache.googleusercontent.com/search?q=cache:http://www.rtl.nl/xl/#/home
...which shows that Google associates this page with the higher-level URL (without the hash fragment), i.e. http://www.rtl.nl/xl/
This is totally consistent with the way Google typically handles hash fragments (as opposed to hashbangs, #!). In other words, Google ignores everything after the hash for indexing purposes, but may still display it in search results. John Mueller of Google explained this on a very old webmaster forum thread:
"There are some cases where we're experimenting with showing them in the snippet (as in Colin's example), to help users to find parts of a page quicker."
So I think something like that is happening here. Google displays the URL for certain queries, but it really associates the content with the higher-level page, and doesn't index/cache the fragment URL as its own separate page.
Hope this makes sense! Thanks for the great question.
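The behavior described above, where everything after the # is dropped when Google decides which page to associate content with, can be sketched with Python's standard `urllib.parse` (the function name here is illustrative, not any real crawler's API):

```python
from urllib.parse import urldefrag

def index_key(url):
    """Return the URL a crawler would associate content with:
    the fragment (everything after #) is dropped for indexing,
    even though the full URL may still be shown in snippets."""
    base, fragment = urldefrag(url)
    return base

print(index_key("http://www.rtl.nl/xl/#/home"))  # prints: http://www.rtl.nl/xl/
print(index_key("http://www.rtl.nl/xl/"))        # prints: http://www.rtl.nl/xl/
```

Both URLs reduce to the same key, which matches the cached-page observation: the fragment version is just another view of http://www.rtl.nl/xl/.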
-
# fragments are used to refer to various sections of a page, showing one section while hiding the others, which creates the feel of a "menu" in parallax designs.
Using # refers to an internal section within a page, NOT a separate URL or HTML/PHP file.
Since crawlers index only URLs, these kinds of menus won't get indexed as separate pages.
Google is capable of handling these properly as sitelinks (showing the most-clicked # sections) of the page.
Regards,
Raj