Should I avoid duplicate URL keywords?
-
I'm curious to know: can having a keyword repeat in the URL cause any penalties? For example:
xyzroofing.com/commercial-roofing
xyzroofing.com/roofing-repairs
My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way.
Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
-
Thank you, Boyd
-
There is no penalty for using the same keyword twice in a URL, especially if it's part of your domain name.
There are many examples of sites with a subfolder that contains the same keyword as their domain name that have no problem ranking, including your competition:
- runningwarehouse.com/mens-running-shoes.html ranks #2 for 'running shoes'
- seo.com/seo ranks #5 for 'professional seo'
- overthetopseo.com/professional-seo-services-what-to-expect/ ranks #2 for 'professional seo' (in fact, only three URLs that rank for that phrase don't repeat the term 'seo' in their URL)
- contentmarketinginstitute.com/what-is-content-marketing ranks #1 for 'content marketing'
- etc.
**Ranking the correct page:**
Whenever you have an issue with the wrong page ranking better than the one you want, work on tweaking your on-site optimization for those pages. (And you may have to keep building links to the page you want to rank.)
Here is a list of things I'd make test changes to (keep in mind that you can always revert a change if a test makes rankings drop):
- Test different title tags on the two pages, making one less optimized for the keyword and the other more optimized.
- Add more copy to the page you want to rank.
- Do an internal link audit. Make sure that anytime you link from one page to another with a specific keyword as the anchor text, the link points to the page you want to rank for that phrase (a rough sketch of how to script this check is below).
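For a larger site, you can script that audit rather than clicking through every page. Here's a minimal sketch, assuming Python with the `requests` and `beautifulsoup4` packages installed; the domain and keyword are the hypothetical ones from your question:

```python
# Minimal internal-link-audit sketch. Assumes `requests` and
# `beautifulsoup4` are installed; the domain and keyword below are
# hypothetical placeholders from the question.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://xyzroofing.com"  # hypothetical domain

def audit_internal_links(start_url, max_pages=200):
    """Crawl the site and record (source page, anchor text, target URL)."""
    domain = urlparse(start_url).netloc
    seen, queue, links = set(), deque([start_url]), []
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(url, a["href"]).split("#")[0]
            if urlparse(target).netloc != domain:
                continue  # external links don't matter for this audit
            links.append((url, a.get_text(strip=True), target))
            queue.append(target)
    return links

# Flag anchors that use the keyword but point somewhere other than the
# page you want to rank for that phrase.
for source, anchor, target in audit_internal_links(SITE):
    if "commercial roofing" in anchor.lower() and "/commercial-roofing" not in target:
        print(f"{source}: '{anchor}' -> {target}")
```

Every line it prints is an internal link whose anchor text targets the phrase but points at the wrong page, which is exactly what you'd want to fix.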
After you make a change, wait until Google re-caches the page and sees the update (which can sometimes take a few days or more), then check your rankings to see whether there was any movement.
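Rather than guessing when Google has seen the update, you can check the last crawl date directly. Here's a sketch using the Search Console URL Inspection API; it assumes your site is a verified Search Console property and that you already have an OAuth access token with the right scope (the token and URLs below are placeholders):

```python
# Sketch: query Google's Search Console URL Inspection API for the last
# crawl time of a page. The token, property, and page URL are placeholders;
# you'd need a verified property and OAuth credentials with the
# webmasters scope for this to work.
import requests

ACCESS_TOKEN = "ya29.your-oauth-token"                  # placeholder
SITE_URL = "https://xyzroofing.com/"                    # hypothetical verified property
PAGE_URL = "https://xyzroofing.com/commercial-roofing"  # the page you changed

resp = requests.post(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL},
    timeout=10,
)
resp.raise_for_status()
status = resp.json()["inspectionResult"]["indexStatusResult"]
print("Last crawl:", status.get("lastCrawlTime"))  # e.g. "2024-01-05T08:19:31Z"
print("Coverage:", status.get("coverageState"))
```

If `lastCrawlTime` is still earlier than the date of your change, any ranking check is premature.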
Boyd