How do search engines treat URLs that end in hashtags?
-
How do search engines treat URLs that end in hashtags? For example, www.domain.com/abc#xyz.
-
Hashtags can serve multiple purposes in URLs, but the only one I have personal experience with is tracking. Two examples of services that can track URLs via hashtags are Tynt and AddThis.
You add a code snippet to your site in much the same way that you add Google Analytics. From that point forward, a unique hashtag is appended to your URLs, so any time someone copies and pastes one of them, the pasted link contains that tag.
When a user clicks the link, you can identify the specific source via Tynt or AddThis. Both services are free and offer nice reporting options. Tynt is more experienced in this area, but AddThis also offers other social sharing tools (Facebook, Twitter, Google +1, etc.).
I used Tynt until AddThis added the feature, at which point I switched to AddThis.
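For illustration, here is a minimal sketch of how this style of copy tracking can work in the browser. This is the general technique, not Tynt's or AddThis's actual code, and the "#copy-..." fragment format is invented for the example:

```typescript
// Minimal sketch of fragment-based copy tracking (the general technique,
// not Tynt's or AddThis's actual code). When a visitor copies text, append
// the page URL plus a unique fragment to the clipboard, so pasted links
// can be attributed when they are clicked later.
document.addEventListener("copy", (event: ClipboardEvent) => {
  const selection = document.getSelection()?.toString() ?? "";
  if (!selection || !event.clipboardData) return;

  // Hypothetical tracking token; a real service would generate and log this.
  const token = Math.random().toString(36).slice(2, 10);
  const trackedUrl = `${location.href.split("#")[0]}#copy-${token}`;

  event.clipboardData.setData("text/plain", `${selection}\n\nRead more: ${trackedUrl}`);
  event.preventDefault(); // use the modified payload instead of the default copy
});
```

When someone pastes the link elsewhere and a visitor clicks it, the service's script reads the fragment on page load and records where the click came from.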
-
Yes
Feel free to try it with this post. Copy the URL, add a hashtag and some characters after it, and you will land on this exact same post. The hashtag offers a tracking ability but otherwise does not change which page the URL points to.
http://www.seomoz.org/q/how-do-search-engines-treat-urls-that-end-in-hashtags#sgdgfsdf
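As a quick way to see why, note that the fragment never reaches the server. A small sketch using the standard URL API (available in modern browsers and Node):

```typescript
// The fragment lives in .hash and is not sent with the request; stripping it
// yields the resource that is actually fetched.
const url = new URL(
  "http://www.seomoz.org/q/how-do-search-engines-treat-urls-that-end-in-hashtags#sgdgfsdf"
);

console.log(url.hash);                  // "#sgdgfsdf"
console.log(url.origin + url.pathname); // the same post, fragment removed
```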
-
So, if someone links to www.domain.com/abc#xyz and www.domain.com/abc#nz, then it is the same as linking twice to www.domain.com/abc?
Thanks!
-
Google truncates the URL at the hash mark, so using your example, it treats both URLs as www.domain.com/abc.
While every search engine is its own private company and can do as it wishes, it seems reasonable that most, if not all, follow Google's approach.
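In other words, indexing effectively normalizes the URL by dropping the fragment. A minimal sketch of that truncation, again using the standard URL API:

```typescript
// Normalize a URL the way the answer above describes: drop everything from
// the "#" onward and keep scheme, host, path, and query unchanged.
function stripFragment(rawUrl: string): string {
  const url = new URL(rawUrl);
  url.hash = ""; // remove the fragment
  return url.toString();
}

console.log(stripFragment("http://www.domain.com/abc#xyz")); // http://www.domain.com/abc
console.log(stripFragment("http://www.domain.com/abc#nz"));  // http://www.domain.com/abc
```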
Related Questions
-
Should search pages be indexed?
Hey guys, I've always believed that search pages should be no-indexed but now I'm wondering if there is an argument to index them? Appreciate any thoughts!
Technical SEO | RebekahVP -
URL has caps, but canonical does not. Now what?
Hi, Just started working with a site that has the occasional URL with a capital letter, but the URL in the canonical tag is lower case. Neither, when entered in a browser, resolves to the other. It's a Shopify site. What do you think I should do?
Technical SEO | 94501 -
Does an Apostrophe affect searches?
Does Google differentiate between keyphrase structures such as Mens Sunglasses and Men's Sunglasses? I.e., does the inclusion/exclusion of an apostrophe make any difference when optimising your main keyword/phrase for a page? Keyword Explorer appears to give different results, i.e. no data for Men's Sunglasses, but data appears for Mens sunglasses. So if I optimise my page to include the apostrophe, will it screw the potential success for that page? Thanks 🙂 Bob
Technical SEO | SushiUK -
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can:
- Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file)
- Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them)
Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | timfrick -
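For anyone comfortable with a little scripting, a minimal same-domain crawler along these lines covers the basic requirement in the question above. This is a sketch, not a recommendation of a specific tool; it assumes Node 18+ (for the global fetch), and example.com is a placeholder:

```typescript
// Minimal same-domain crawler sketch (Node 18+, which provides global fetch).
// It follows links breadth-first and collects every URL it discovers,
// including .pdf/.doc hrefs. A stand-in for the kind of tool asked about,
// not a production crawler (no robots.txt, no rate limiting, naive HTML parsing).
async function crawl(startUrl: string, limit = 1000): Promise<string[]> {
  const origin = new URL(startUrl).origin;
  const seen = new Set<string>([startUrl]);
  const queue: string[] = [startUrl];

  while (queue.length > 0 && seen.size < limit) {
    const current = queue.shift()!;
    let html: string;
    try {
      const res = await fetch(current);
      // Only parse HTML; PDFs and docs are still recorded via the links to them.
      if (!(res.headers.get("content-type") ?? "").includes("text/html")) continue;
      html = await res.text();
    } catch {
      continue; // skip pages that fail to load
    }
    for (const match of html.matchAll(/href="([^"]+)"/g)) {
      try {
        const link = new URL(match[1], current);
        link.hash = ""; // fragments don't identify separate resources
        const href = link.toString();
        if (href.startsWith(origin) && !seen.has(href)) {
          seen.add(href);
          queue.push(href);
        }
      } catch {
        // ignore malformed hrefs
      }
    }
  }
  return [...seen];
}

// Usage (example.com is a placeholder):
crawl("https://www.example.com").then((urls) => console.log(urls.join("\n")));
```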
How does a search engine bot navigate past a .PDF link?
We have a large number of product pages that contain links to a .pdf of the technical specs for that product. These are all set up to open in a new window when the end user clicks. If these pages are being crawled, and a bot follows the link for the .pdf, is there any way for that bot to continue to crawl the site, or does it get stuck on that dangling page because it doesn't contain any links back to the site (it's a .pdf) and the "back" button doesn't work because the page opened in a new window? If this situation effectively stops the bot in its tracks and it can't crawl any further, what's the best way to fix this?
1. Add a rel="nofollow" attribute
2. Don't open the link in a new window so the back button remains functional
3. Both 1 and 2
4. Create specs on the page instead of relying on a .pdf
Here's an example page: http://www.ccisolutions.com/StoreFront/product/mackie-cfx12-mkii-compact-mixer - the technical spec .pdf is located under the "Downloads" tab [the content is all on one page in the source code - the tabs are just a design element]. Thoughts and suggestions would be greatly appreciated. Dana
Technical SEO | danatanseo -
Landing Page URL Structure
We are finally setting up landing pages to support our PPC campaigns. There has been some debate internally about the URL structure. Originally we were planning on URLs like:
domain.com/california
domain.com/florida
domain.com/ny
I would prefer to have the URLs for each state inside a "state" folder like:
domain.com/state/california
domain.com/state/florida
domain.com/state/ny
I like having the folders and pages for each state under a parent folder to keep the root folder as clean as possible. Having a folder or file for each state in the root will be very messy. Before you scream URL rewriting :-), our current site is still running under Classic ASP, which doesn't support URL rewriting. We tried to use Helicon Tech's ISAPI rewrite module for IIS but had to remove it because of too many configuration issues. Next year, when our move to MVC is complete, we will use URL rewriting. So the question for now: is there any advantage or disadvantage to one URL structure over the other?
Technical SEO | briankb -
Duplicate Content and URL Capitalization
I have multiple URLs that SEOMoz is reporting as duplicate content. The reason is that there are characters in the URL that may, or may not, be capitalized depending on user input. A couple examples are:
www.househitz.com/Pennsylvania/Houses-for-sale
www.househitz.com/Pennsylvania/houses-for-sale
www.househitz.com/Pennsylvania/Houses-for-rent
www.househitz.com/Pennsylvania/houses-for-rent
There are currently thousands of instances of this on the site. Is this something I should spend effort to try and resolve (may not be minor effort), or should I just ignore it and move on?
Technical SEO | Jom -
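A sketch of the usual direction for the capitalization question above: collapse case by lower-casing the path, then 301-redirect any mixed-case request to the result. The canonicalCase helper below is a hypothetical name, for illustration only:

```typescript
// Lower-case the path so mixed-case variants collapse to one canonical URL.
// Host names are already case-insensitive; only the path needs lowering.
// In practice you would 301-redirect any request whose path differs from this.
function canonicalCase(rawUrl: string): string {
  const url = new URL(rawUrl);
  url.pathname = url.pathname.toLowerCase();
  return url.toString();
}

console.log(canonicalCase("http://www.househitz.com/Pennsylvania/Houses-for-sale"));
// -> http://www.househitz.com/pennsylvania/houses-for-sale
```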
Google News URL Format
Hi, We are currently redesigning our gaming website (www.totallygn.com) and one of our main goals is to get listed by Google News in the future. Looking at the Google News URL requirements: "The URL for each article must contain a unique number consisting of at least three digits." How does this affect the URL structure? I was planning on using a format such as www.totallygn.com/xbox-360/360-reviews/fifa-12-review. How would that compare to something like www.totallygn.com/xbox-360/360-reviews/fifa-12-review234? Thanks in advance for your help
Technical SEO | WalesDragon
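One common way to satisfy that rule is to append a unique numeric article ID to the slug. A minimal sketch, where newsUrl and articleId are hypothetical names invented for the example:

```typescript
// Build an article URL that satisfies the "unique number of at least three
// digits" guideline by appending a zero-padded CMS article ID to the slug.
function newsUrl(base: string, slug: string, articleId: number): string {
  const id = String(articleId).padStart(3, "0"); // pad so small IDs still have 3 digits
  return `${base}/${slug}-${id}`;
}

console.log(newsUrl("http://www.totallygn.com/xbox-360/360-reviews", "fifa-12-review", 234));
// -> http://www.totallygn.com/xbox-360/360-reviews/fifa-12-review-234
```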