Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of PageRank because we thought a JavaScript redirect was the only way! But other sites have been able to do this – examples and details inside
-
Hi Moz,
Here's more info on our problem, and thanks for reading!
- We’re trying to create 301 redirects for 44 pages on site.com.
- We’re having trouble 301 redirecting these pages, possibly because they are AJAX and have hashbangs in the URLs.
- These are locations pages. The old locations URLs are in the following format: www.site.com/locations/#!new-york and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york
- We have not been able to create these redirects using Yoast WordPress SEO plugin v.1.5.3.2.
- The CMS is WordPress version 3.9.1
- The reason we want to 301 redirect these pages is that we have created new pages to replace them, and we want to pass PageRank from the old pages to the new. A 301 redirect is the ideal way to pass PageRank.
- Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
-
The solution I came up with was:
- Create a list of all the source URLs you have, and all the destination URLs you want
- Create all the destination URL pages
- Work out what the Ugly versions of all hashbang (pretty) URLs should be and record them (ref: https://developers.google.com/webmasters/ajax-crawling/docs/specification)
- Implement 301 Redirects for the Ugly URLs (see the .htaccess sketch below)
- Deploy a Sitemap with Pretty URLs
- Submit Your Sitemap to Google Webmaster Tools
- Wait for Google to re-index all your pages
- Check that the new URL(s) show up in Google search results too
- Clean up – Remove the pretty URLs from the sitemap
Job done!
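To make the redirect step concrete, here's a minimal sketch of what it could look like in Apache .htaccess, assuming the ugly version of www.site.com/locations/#!new-york is www.site.com/locations/?_escaped_fragment_=new-york per the spec linked above (the path and slug are illustrative, so adapt the pattern to your own URLs):

```
# Minimal sketch, assuming Apache with mod_rewrite enabled and that the ugly
# URL for /locations/#!new-york is /locations/?_escaped_fragment_=new-york.
RewriteEngine On
# Match the _escaped_fragment_ query string Googlebot requests for hashbang pages
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.+)$
# %1 is the captured slug; the trailing "?" drops the old query string from the target
RewriteRule ^locations/?$ /locations/%1? [R=301,L]
```

Because the rule captures the slug, a single rule along these lines can cover all 44 location pages at once rather than needing one redirect per page.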
I created a detailed page on this with examples on my blog at www.thedriversgarage.com/web-technology/redirecting-hashbang-urls-wix-urls/
Disclaimer - Make your own enquiries and do your own tests. I'm a pragmatist; I really don't care if this complies with standards. It worked for me and that's all I cared about. Google, etc. may process this stuff differently in the future. Do your own tests.
-
I would like to point out that Twitter is using JavaScript redirects, not server-side redirects. If you disable JavaScript and try that URL, it will load the homepage/your Twitter feed and the URL will stay the same.
The second URL doesn't seem to be redirecting properly; at least for me, it just 301 redirects back to itself.
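To illustrate the JavaScript-redirect point above, a client-side hashbang redirect is roughly a script like this (a hedged sketch, not Twitter's actual code; the /locations/ path is illustrative):

```
// Rough sketch of a client-side hashbang redirect (not Twitter's actual code).
// The #! part of a URL never reaches the server, so only JavaScript can read it.
(function () {
  var hash = window.location.hash;        // e.g. "#!new-york"
  if (hash.indexOf('#!') === 0) {
    var slug = hash.slice(2);             // "new-york"
    // Send the visitor to the new pretty URL; search engines treat this as a
    // JavaScript redirect, not a 301, so it does not pass PageRank the same way.
    window.location.replace('/locations/' + slug);
  }
})();
```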
-
That's not true. Google is able to crawl and index properly set-up AJAX-based pages like the one in question. Bing, on the other hand, is not able to do so, or at least it wasn't last time I checked.
-
That will teach me to skim read

Perhaps trying a different 301 plugin will help? Alternatively, you can pretty much redirect anything from within .htaccess.
This page on Webmaster World might be worth reading.
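For what it's worth, a plain path-to-path 301 in .htaccess looks like the sketch below (paths illustrative). The catch is that no server-side rule can match the #! part of a URL, because browsers never send the fragment to the server:

```
# Hedged sketch with illustrative paths: an ordinary path-to-path 301 in .htaccess.
Redirect 301 /locations/old-page /locations/new-york
# A rule like "Redirect 301 /locations/#!new-york /locations/new-york" cannot work,
# because browsers never send the #! fragment to the server.
```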
-Andy
-
Thanks for the responses!
@Kevin: Our main concern here is getting back that lost PageRank, since JavaScript redirects don't pass PageRank. We used http://www.internetofficer.com/seo-tool/redirect-check/ and SEO Tools for Excel to check whether the hashbang URL examples were using 301 redirects.
The correct URLs are:
http://twitter.com/#!RobOusbey
http://www.sherrilltree.com/Saddles/#!Saddles
@iNetSEO: These pages were indexed by Google before somehow, I suspect using escaped_fragment? The hashbang URLs would show up in search results.
-
With the JavaScript option, people who bookmarked the page will get redirected.
-
The hash tag means that the page won't be indexed by Google and will therefore carry no PageRank. It is as if it is invisible. Just launch the new pages, because Google will never have seen the current ones.
-Andy
-
I may be wrong, but I don't believe you can do this via a 301 redirect. How did you know the examples used a 301 redirect? The examples provided may have used JavaScript to do it (that may not be the best option, but I can't think of any other).
Related Questions
-
410 or 301 after URL update?
Hi there, A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (of course, I'm sure there are thousands more it's not showing us!). The issue is a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed and I can't stop that... but as you can imagine, I'm keen to fix as many as humanly possible. I don't want to go mad with 301s - but for external links in, this seems like the best solution? On the other hand, Google is reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that this page really doesn't exist? Essentially I guess I'm asking: how many 301s are too many before they affect our DA? And what's the best solution for dealing with mass 404 errors - many of which aren't attached or linked to from any other pages anymore? Thanks for any insights 🙂
Intermediate & Advanced SEO | Fubra
-
Should I include URLs that are 301'd or only include 200 status URLs in my sitemap.xml?
I'm not sure if I should be including old URLs (content) that are being redirected (301) to new URLs (content) in my sitemap.xml. Does anyone know if it is best to include or leave out 301'd URLs in an XML sitemap?
Intermediate & Advanced SEO | Jonathan.Smith
-
Splitting and moving site to two domains - How to redirect
I have a client who is going to split their retail and wholesale business and rebrand the retail biz. So let’s say they are going to move everything from currentdomain.com to either retaildomain.com or wholesaledomain.com. The most important business for them is the retail site, so they want to pass on as much ranking power as they can from currentdomain.com to retaildomain.com. I see two choices here: We can 301 redirect all of currentdomain.com to retaildomain.com, and then redirect any wholesale pages to wholesaledomain.com. The advantage is that we can use GSC’s change of address tool to report the change to Google. The downside is that there is a redirect chain (2 hops) to wholesaledomain.com. Would this confuse Google? Or we can 301 redirect page by page from currentdomain.com to the appropriate page on either new site. This means no redirect chains but it also means that we can’t use GSC’s change of address tool. Which would you do and why? And is there another option that I'm missing? I appreciate any insights you can share.
Intermediate & Advanced SEO | rich.owings
-
URL mapping for site migration
Hi all! I'm currently working on a migration for a large e-commerce site. The old one has around 2.5k URLs, the new one 7.5k. I now need to sort out the redirects from one to the other. This is proving pretty tricky, as the URL structure has changed site-wide. There don't seem to be any consistent rules either, so using regex doesn't really work. By and large, the copy appears to be the same though. Does anybody know of a tool I can crawl the sites with that will export the crawled URL and related copy into a spreadsheet? That way I can crawl both sites and compare the copy to match them up. Thanks!
Intermediate & Advanced SEO | Blink-SEO
-
What is the best way to handle special characters in URLs
What is the best way to handle special characters? We have some URLs that use special characters, and when a sitemap is generated using Xenu it changes the characters to something different. Do we need to physically change the URL back to display the correct character? Example: URL: http://petstreetmall.com/Feeding-&-Watering/361.html Sitemap Link: http://www.petstreetmall.com/Feeding-%26-Watering/361.html
Intermediate & Advanced SEO | WebRiverGroup
-
Is 301 redirect suggested on pagination pages
Hi - Due to pagination, the default page of the site is appearing at two URLs: /sub-url/?page=1 and /sub-url. Is a 301 the recommended solution for these pagination URLs? Also, is it required to create a separate title and meta description for every pagination page? We are asking specifically in the context of our discounts and offers section: http://www.mycarhelpline.com/index.php?option=com_offers&view=list&Itemid=9
Intermediate & Advanced SEO | Modi
-
How to stop Google crawling after 301 redirect?
I have removed all pages from my old website and set up 301 redirects to the new website. I have also verified the old website with Google Webmaster Tools' HTML verification file, which enables me to track all data and the existence of pages in Google search for my old website. I assumed that Google would stop crawling and de-index all pages after the 301 redirects, because I set them up 3 months ago. Now, however, I'm still able to see Googlebot activity on my old website with the help of Google Webmaster Tools. How is this possible, and how can Google crawl removed pages? You can see the following images to know more about it: First & Second
Intermediate & Advanced SEO | CommercePundit
-
Can penalties be passed via 301 redirect?
I have a well established domain that's been hit with some penalties. It hasn't been nuked off the map, just downgraded, especially on short-tail, one word type queries. I'm planning on redirecting this domain to another well established domain. The domains already have a history of lots of interlinking and are very similar from a subject matter standpoint. I feel that the penalized domain has been hit with an "over-optimization" of link anchor text penalty (I'm hoping it's algorithmic, but it could be manual). My question is if anyone has ever heard of a penalty like this being transferred to another domain through a 301 redirect. My hope is that the penalty just puts a cap on how much juice the redirect can pass, rather than transferring the penalty to the other domain itself. Any thoughts on this?
Intermediate & Advanced SEO | SEOMG