Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Will I lose SEO if I rename my URLs to be more keyword-friendly?
-
A good SEO practice is to have your keywords in your URLs, so I am thinking of doing some optimization and changing my URLs to use more effective keywords. I am using Shopify, and there is an option (a checkbox) you can tick while changing a URL (e.g. for a category, a product, or a blog post) that creates a redirect from the old URL to the new one.
Is this good practice? Is it risky and could I lose SEO, or will it help me rank higher because my URLs will contain better keywords?
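For anyone scripting the renames outside Shopify's checkbox, a minimal sketch of turning a product or category title into a keyword-friendly URL handle. The rules below are an illustration only, not Shopify's exact slug algorithm:

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a title into a lowercase, hyphen-separated URL handle (illustrative rules)."""
    # Strip accents and any non-ASCII characters.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Drop apostrophes so "Men's" becomes "mens" rather than "men-s".
    text = text.replace("'", "")
    # Lowercase; collapse every other non-alphanumeric run into a single hyphen.
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

print(slugify("Men's Running Shoes 2024"))  # -> mens-running-shoes-2024
```

Whatever tooling generates the new handles, the key point from this thread still applies: every old URL must end up 301-redirected to its new one.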
-
I will check those guides and see how I can work on optimizations. Thank you.
-
Hi Spiros
Below are two guides that may assist with your URL structure. It appears the site is not ranking for its target keywords, so a restructure may be beneficial. Ideally, the target customer query appears in the URL structure. Shopify has limitations, though, so you may need to work within them.
https://www.hostgator.com/blog/best-url-structure-seo/
https://moz.com/blog/15-seo-best-practices-for-structuring-urls
The next step will be 301'ing each old page to its new page.
Hope that helps.
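If many URLs are moving, the 301s can be prepared in bulk. A small sketch that builds a two-column redirect CSV; the "Redirect from"/"Redirect to" header names follow Shopify's URL-redirect import format as I understand it, so verify against your store's admin before uploading:

```python
import csv
import io

def build_redirect_csv(pairs: list[tuple[str, str]]) -> str:
    """Build an old-path -> new-path CSV in the assumed Shopify import format."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["Redirect from", "Redirect to"])  # assumed header names
    for old_path, new_path in pairs:
        writer.writerow([old_path, new_path])
    return buf.getvalue()

print(build_redirect_csv([
    ("/collections/old-shoes", "/collections/mens-running-shoes"),
]))
```

After importing, spot-check a few old URLs and confirm they return a 301 status pointing at the new location, not a 302 or a 404.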
-
If my pages were ranking in the top 20 for my keywords, would I be using tools like Moz? So no, they are not ranking in the top 20, and that is what I want to achieve...
-
Spiros
The first rule of SEO is to do no harm. Sometimes when you optimise a site per "best practice", you lose rankings. It is important to protect the keywords that matter that you already rank for.
So an initial step is an audit.
Perhaps I will reframe: is the page whose URL you want to change ranking in the top 20 positions for the keyword you are targeting?
Regards
-
Obviously, I want to change the URL to optimize for a better keyword, not to fix broken links. So are you saying it will hurt SEO? I am using Moz tools because SEO is not 100% optimized on my site, so I don't think my pages are ranking well. And ranking well is relative to the specific keyword you want a page to rank for. An "expert" from Moz told me it can take 5-10 months to see results from SEO actions. Then this is not measurable... I can't fix something today (e.g. a URL) and wait 10 months! I need a more specific answer!
-
Hi
If it is not broken, you do not fix it. So the first question is: how is the site ranking today for the customer query/queries you are targeting? If you are ranking well, then you likely should not touch the URLs.
Regards