Special characters in URL
-
Hello everybody,
My question focuses on special characters in URLs. I am working on a website that uses a lot of special entities in its URLs. For instance:
www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1.html
I am about to create 301 redirect rules to send all these URLs to clean ones, i.e.:
www.mydomain.com/mykeyword1-mykeyword2%2C-1%2Cpage1
would become:
www.mydomain.com/mykeyword1-mykeyword.html
I just wanted to know if anybody has already done this kind of "cleanup" and if I could expect a positive boost or not.
Thanks
-
Thanks for your answers,
Yes, I will use a regex, because it is much simpler to write one rule for thousands of URLs than one specific rule for each URL. -
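A minimal .htaccess sketch of that single-regex approach (assuming Apache with mod_rewrite and the example URL pattern from the question; the pattern would need adjusting to the site's real URLs):

```apache
RewriteEngine On
# mod_rewrite matches against the percent-decoded path, so the encoded
# comma (%2C) appears here as a literal "," in the pattern.
# Strip everything from the first comma onward, including the .html suffix:
# /mykeyword1-mykeyword2,-1,page1.html  ->  /mykeyword1-mykeyword2
RewriteRule ^([^,]+),.*\.html$ /$1 [R=301,L]
```

One rule like this covers thousands of product URLs, which is exactly why the regex route beats per-URL redirects here.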
If it's an ecommerce site, then 301s may take forever. I agree with Shaun about the mod_rewrites.
Related Questions
-
URL Indexed But Not Submitted to Sitemap
Hi guys, In Google's webmaster tool it says that the URL has been indexed but not submitted to the sitemap. Is it necessary that the URL be submitted to the sitemap if it has already been indexed? Appreciate your help with this. Mark
Technical SEO | marktheshark100
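Worth noting: a sitemap is a discovery hint, not an indexing requirement, so an indexed URL that is missing from the sitemap is not an error — though listing it is still good practice. A minimal sitemap entry looks like this (example.com and the path are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/already-indexed-page</loc>
  </url>
</urlset>
```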
Changing site URL structure
Hey everybody, I'm looking for a bit of advice. A few weeks ago Google sent me an email saying all pages with any text input on them need to switch to HTTPS. This is no problem; I was slowly switching the site to HTTPS anyway using 301 redirects. However, my site also has a language subfolder in the URL: mysite.com/en/, mysite.com/ru/, etc. Due to poor work on my part, the translations of the site haven't been updated in a long time and lots of the pages are in English even on the Russian version. So I'm thinking of removing this URL structure and just having mysite.com. My plan is to 301 all requests to HTTPS and remove the language subfolder in the URL at the same time. So far the HTTPS switching hasn't changed my rankings. Am I more at risk of losing my rankings by doing this? Thanks!
Technical SEO | Ruhol0
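One way to do the HTTPS switch and the subfolder removal without chaining two redirects is a single set of rules. A hedged .htaccess sketch, assuming Apache with mod_rewrite and the mysite.com placeholder from the question:

```apache
RewriteEngine On
# Drop the /en/ or /ru/ prefix and force HTTPS in one 301 hop
RewriteRule ^(?:en|ru)(?:/(.*))?$ https://mysite.com/$1 [R=301,L]
# Force HTTPS for every other URL
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://mysite.com/$1 [R=301,L]
```

Doing both changes in one hop means crawlers never see a chain of two 301s, which is gentler on the existing rankings.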
Localizing URLs Path - Hreflang
Hello, This is a simple question regarding how URLs should be managed for proper results with hreflang tags. Right now, we have a website in English and German, and the hreflang tags are working properly. This is how we currently have it: https://www.memoq.com/ https://de.memoq.com/ But we will soon change the way we localize our website, moving away from the subdomain structure. There is the possibility of localizing the URL paths, but I was wondering if the hreflang tag would still work in that case. The new structure would look something like: https://www.memoq.com/why-memoq https://www.memoq.com/de/warum-memoQ So my question is: if we localize the keywords in the path of the URL, will the tags still work? Or do the paths need to be in the same language as the English version? Thanks!
Technical SEO | Kilgray1
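The hreflang annotations pair pages explicitly by URL, so the slugs do not need to match across languages; translated paths like /de/warum-memoq are fine. A sketch of the tags both pages would carry under the planned structure from the question:

```html
<!-- Placed on both https://www.memoq.com/why-memoq and /de/warum-memoq -->
<link rel="alternate" hreflang="en" href="https://www.memoq.com/why-memoq" />
<link rel="alternate" hreflang="de" href="https://www.memoq.com/de/warum-memoq" />
<link rel="alternate" hreflang="x-default" href="https://www.memoq.com/why-memoq" />
```

The key requirement is that the annotations are reciprocal: each language version lists itself and all its alternates with the same set of tags.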
Language Specific Characters in URLs for
Hi People, would really appreciate your advice, as we are debating best practice and advice seems very subjective depending on whether we are talking to our dev or SEO team. We are developing a website aimed at the South American market with content entirely in Spanish. This is our first international site, so our experience is limited. Should we be using Spanish characters (such as www.xyz.com/contáctanos) in URLs, or should we use ASCII character replacements? What are the pros and cons for SEO and usability? Would really be great to get advice from the Moz community and make me look good at the same time, as it was my suggestion 🙂 Nick
Technical SEO | nickspiteri0
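Whichever you choose, it helps to know that browsers and crawlers transmit non-ASCII path characters percent-encoded as UTF-8, so the accented form and its encoded form are the same URL on the wire. A quick illustration with Python's standard library:

```python
from urllib.parse import quote, unquote

# "á" travels over HTTP as its UTF-8 bytes, percent-encoded
encoded = quote("/contáctanos")
print(encoded)           # /cont%C3%A1ctanos
print(unquote(encoded))  # /contáctanos
```

The practical trade-off is that the encoded form is what shows up in server logs and some shared links, which is part of why many sites opt for ASCII slugs.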
Capitals URLs to Non Capitals...
Hi, I am working on a website which has both capitalized and lowercase URLs, which will generate duplicate content, and I know it is better to use all lowercase. The problem is that the page authority is better for the capitalized versions, and I was wondering: will it negatively impact SEO if we 301 redirect the uppercase URLs to their lowercase counterparts? Thanks.
Technical SEO | J_Sinclair0
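Redirecting the uppercase versions with a 301 is the standard fix here, and most of the page authority should follow the redirect. A common Apache sketch, assuming mod_rewrite (note that RewriteMap must be declared in the server or vhost config, not in .htaccess):

```apache
# Server/vhost config: define an internal lowercasing map
RewriteMap lc int:tolower

# Then in the site config or .htaccess:
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ /${lc:$1} [R=301,L]
```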
Single URL not indexed
Hi everyone! Some days ago, I noticed that one of our URLs (http://www.access.de/karriereplanung/webinare) is no longer in the Google index. We never had any form of penalty, link warning etc. Our traffic by Google is constantly growing every month. This single page does not have an external link pointing to it - only internal links. The page has been indexed all the time. The HTTP status code is 200, there is no noindex or something in the code. I submitted the URL on GWMT to let Google send it to the index. It was crawled successfully by Google, sent to the index 5 days ago - nothing happened, still not indexed. Do you have any suggestions why this page is no longer indexed? It is well linked internally and one click away from the home page. There is still the PR of 5 showing, I always thought that pages with PR are indexed.......
Technical SEO | | accessKellyOCG0 -
Friendly URLS (SEO urls)
Hello, I own an eCommerce site with more than 5k products. Product URLs look like: www.site.com/index.php?route=product/product&path=61_87&product_id=266 I'm thinking about making them SEO-friendly: site.com/category/product-brand Here is my question: will I lose rankings by making that change? It's very important for me to know. Thank you very much!
Technical SEO | matiw0
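Rankings generally survive this change if every old query-string URL returns a 301 to its new friendly equivalent. The id-to-slug mapping has to come from the store's catalogue; a hedged Apache sketch using a RewriteMap text file (the map name and file path here are made up for illustration):

```apache
# Server/vhost config: a plain text file mapping product_id -> new slug,
# one per line, e.g. "266 category/product-brand"
RewriteMap productslug txt:/etc/apache2/product-slugs.txt

# Then for the site:
RewriteEngine On
# Match the old query-string product URL and 301 it to the mapped path;
# the trailing "?" drops the old query string from the target.
RewriteCond %{QUERY_STRING} (?:^|&)product_id=(\d+)
RewriteRule ^index\.php$ /${productslug:%1}? [R=301,L]
```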
Spider Indexed Disallowed URLs
Hi there, In order to reduce the huge amount of duplicate content and titles for a client, we disallowed all spiders for some areas of the site in August via the robots.txt file. This was followed by a huge decrease in errors in our SEOmoz crawl report, which, of course, made us satisfied. In the meanwhile, we haven't changed anything in the back-end, robots.txt file, FTP, website or anything. But our crawl report came in this November and all of a sudden all the errors were back. We checked the errors and noticed URLs that are definitely disallowed. The disallowing of these URLs is also verified by Google Webmaster Tools, other robots.txt checkers, and when we search for a disallowed URL in Google, it says that it's blocked for spiders. Where did these errors come from? Was it the SEOmoz spider that broke through our disallow rules or something? You can see the drop and the increase in errors in the attached image. Thanks in advance.
Technical SEO | ooseoo
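Worth noting for anyone hitting this: Disallow in robots.txt blocks crawling, not indexing, so URLs that were already indexed can linger in reports and search results long after the block. A blanket block like the one described might look like this (the path is a placeholder):

```
User-agent: *
Disallow: /duplicate-area/
```

To actually drop such pages from the index, they need to be crawlable and serve a noindex, or be removed via a Webmaster Tools removal request.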