Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should I avoid duplicate url keywords?
-
I'm curious to know: can having a keyword repeat in the URL cause any penalties?
For example
xyzroofing.com/commercial-roofing
xyzroofing.com/roofing-repairs
My competitors with the highest rankings seem to be doing it without any trouble but I'm wondering if there is a better way.
Also
One of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
-
Thank you, Boyd
-
There is no penalty for using the same keyword twice in a URL, especially if it's part of your domain name.
There are many examples of sites with a subfolder containing the same keyword as their domain name that have no problem ranking, including your competition:
- runningwarehouse.com/mens-running-shoes.html ranks #2 for 'running shoes'
- seo.com/seo ranks #5 for 'professional seo'
- overthetopseo.com/professional-seo-services-what-to-expect/ ranks #2 for 'professional seo' (in fact, only 3 URLs that rank for that phrase don't repeat the term 'seo' in their URL.)
- contentmarketinginstitute.com/what-is-content-marketing ranks #1 for 'content marketing'
- etc.
**Ranking the correct page:**
Whenever you have an issue with the wrong page ranking better than the one you want, you just need to work on tweaking your onsite optimization for those pages. (And you may have to continue building more links to the page you want to rank.)
Here is a list of things that I'd make some test changes to: (Keep in mind that you can always revert things back if a test makes rankings go down.)
- Test different title tags on the two pages, making one less optimized for the keyword and the other more optimized.
- Add more copy to the page you want to rank.
- Do an internal link audit. Make sure that any time you link from one page to another with a specific keyword as the anchor text, the link points to the page you want to rank for that phrase.
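For the internal link audit step above, here's a minimal sketch of how you might collect anchor-text-to-URL pairs from a page and flag anchor texts that point to more than one URL (a common sign of inconsistent internal linking). It uses only the Python standard library; the sample HTML is a made-up illustration, not real site markup.

```python
from collections import defaultdict
from html.parser import HTMLParser

class AnchorCollector(HTMLParser):
    """Collects (anchor text, href) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.anchors = []  # list of (text, href) tuples

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.anchors.append(("".join(self._text).strip(), self._href))
            self._href = None

def anchor_map(html):
    """Map each anchor text (lowercased) to the set of URLs it links to.
    An anchor text that links to more than one URL is worth reviewing."""
    parser = AnchorCollector()
    parser.feed(html)
    targets = defaultdict(set)
    for text, href in parser.anchors:
        if text and href:
            targets[text.lower()].add(href)
    return targets

# Hypothetical sample page: the same anchor text links to two different URLs.
page = """
<a href="/commercial-roofing">commercial roofing</a>
<a href="/">commercial roofing</a>
<a href="/roofing-repairs">roofing repairs</a>
"""
for text, urls in anchor_map(page).items():
    if len(urls) > 1:
        print(f"'{text}' points to multiple URLs: {sorted(urls)}")
# → 'commercial roofing' points to multiple URLs: ['/', '/commercial-roofing']
```

In practice you'd feed this each page's HTML as you crawl the site, then consolidate the conflicting anchors onto the page you want to rank.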
After you make a change, you need to wait until Google re-crawls that page and sees the update (which can sometimes take a few days or more), and then check your rankings to see whether there was any movement.
Boyd