Creating pages as exact match URLs - good, or an over-optimization indicator?
-
We all know that exact match domains are no longer getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have experience with, or know whether, that also applies to having an exact match URL for a page (not the domain)?
Example:
keyword: cars that start with A
Which way is better when creating your pages on a site whose domain is not an exact match:
www.sample.com/cars-that-start-with-a/ with "cars that start with A" as the H1
or
www.sample.com/starts-with-a/ which again has "cars that start with A" as the H1
Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters of the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here in the Moz community can help out. Thanks so much!
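For illustration, the two slug schemes above can be generated programmatically for every letter of the alphabet. A minimal Python sketch, using the example domain and patterns from the question (nothing here is a real site):

```python
import string

DOMAIN = "www.sample.com"  # example domain from the question

def exact_match_urls():
    """Scheme 1: slug contains the full target keyword, e.g. /cars-that-start-with-a/."""
    return [f"{DOMAIN}/cars-that-start-with-{c}/" for c in string.ascii_lowercase]

def short_urls():
    """Scheme 2: shorter slug without the head term, e.g. /starts-with-a/."""
    return [f"{DOMAIN}/starts-with-{c}/" for c in string.ascii_lowercase]

print(exact_match_urls()[0])  # www.sample.com/cars-that-start-with-a/
print(short_urls()[2])        # www.sample.com/starts-with-c/
```

Either scheme yields 26 pages; the only difference is whether the head term "cars" is repeated in every slug.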
-
Hi Curtis,
Thanks for your reply. Well to be more specific the domain would be:
freecarfinder.com/cars-that-start-with-a/
The domain is new, so it has no authority whatsoever. The domain is not that long, but it's not really short either. The content on the page is pretty thin: the exact keyword that's in the URL appears in the H1 and twice in a small piece of text that explains how to use the page to search for results.
Totally agree that best practice is to test it out. I do see that our competition is using /starts-with/a and ranking really well with it. Maybe the best option is to create half of the pages with the exact keyword in the URL and half with /starts-with-a/, and see which one performs better?
-
Unless your domain is already really strong on car keywords, I would include "cars" in the URL, assuming the URL doesn't get too long. Although Google is moving away from exact match toward semantic search, it seems to be happening slowly, and we have certainly seen ranking improvements from having some exact matches. So as long as you don't have the exact same phrase in every place on the page, there isn't much danger. However, the best practice is to test and learn: make the change and see if it improves the ranking.
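One way to sanity-check the "don't use the exact same phrase everywhere" advice is to count how many on-page elements contain the exact keyword phrase. A hypothetical sketch (not a Moz tool; the page values are made up):

```python
def exact_phrase_count(phrase, elements):
    """Count how many on-page elements contain the exact keyword phrase.

    Spaces are normalized to hyphens so URL slugs and visible text compare alike.
    """
    target = phrase.lower().replace(" ", "-")
    hits = 0
    for element in elements:
        normalized = element.lower().replace(" ", "-")
        if target in normalized:
            hits += 1
    return hits

# Hypothetical page: exact phrase in URL, title, and H1.
page = {
    "url": "www.sample.com/cars-that-start-with-a/",
    "title": "Cars That Start With A | Sample",
    "h1": "Cars that start with A",
}
n = exact_phrase_count("cars that start with a", page.values())
print(n)  # 3 -- the exact phrase in all three places may read as over-optimized
```

A count of 3 out of 3 checked elements is the "exact same phrase in all places" situation the answer warns about; varying one of them reduces it.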
Hope that helps. Let me know if you need anything more.
Related Questions
-
Site built with ~80% of URLs canonicalized - what is the impact on visibility?
Hey Everyone, I represent an international wall decorations store where the customer can freely choose a pattern to be printed on a given material, from among a few million patterns. Due to the extremely large number of potential URL combinations, we have struggled with too many URL addresses for months now (Search Console notifications). So we finally decided to reduce the number of indexable products with the canonical tag. Based on user behavior, our business needs, and monthly search volume data, we selected the 8 most representative of our 40 product categories and made them canonical for the rest. For example: if we chose 'Canvas prints' as our main product category, then every 'Framed canvas' product URL points its rel=canonical tag toward its equivalent URL within the 'Canvas prints' category. We applied the same logic to the other categories (so the "Vinyl wall mural - Wild horses running" URL points its rel=canonical tag to the "Wall mural - Wild horses running" URL, etc.). In terms of Googlebot interpretation, there are really tiny differences between those product URLs, so merging them with rel=canonical seems like a valid use. But we need to keep those canonicalized URLs for users' needs, so we can't remove them from the store, and noindex does not seem like a good option either. However, we're concerned about our SEO visibility - if we make those changes, our site will consist of ~80% canonicalized URLs (47.5 of 60 million). In your experience, do you have advice on how we should handle that issue? Regards
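The category mapping described above (every 'Framed canvas' URL pointing its rel=canonical to its 'Canvas prints' equivalent) amounts to a slug rewrite. A hedged sketch with invented slugs and domain, assuming a `/<category>/<pattern>/` URL structure:

```python
# Hypothetical mapping of non-canonical categories to their canonical one,
# mirroring the examples in the question.
CANONICAL_CATEGORY = {
    "framed-canvas": "canvas-prints",
    "vinyl-wall-mural": "wall-mural",
}

def canonical_url(url):
    """Return the rel=canonical target for a product URL.

    Canonical-category URLs map to themselves (they are self-canonical).
    """
    domain, category, pattern = url.strip("/").split("/")
    category = CANONICAL_CATEGORY.get(category, category)
    return f"{domain}/{category}/{pattern}/"

print(canonical_url("example-store.com/framed-canvas/wild-horses-running/"))
# example-store.com/canvas-prints/wild-horses-running/
```

The key property is that the rule is deterministic: every "equivalent" URL in a secondary category resolves to exactly one URL in its primary category, which is what makes rel=canonical a plausible fit here.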
Can the H1 and meta title be exactly the same?
I've heard from some SEOs that the H1 and meta title shouldn't be exactly the same. Why? Both of them describe what is ON the page, right? Why would it be spammy? Is it?
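For auditing purposes, the "exactly the same" question can be made concrete by classifying the relationship between the two fields. A purely illustrative sketch (the example strings are made up):

```python
def title_h1_relation(title, h1):
    """Classify how a meta title and an H1 relate, ignoring case and edge whitespace."""
    t, h = title.strip().lower(), h1.strip().lower()
    if t == h:
        return "identical"
    if h in t or t in h:
        return "overlapping"  # e.g. title = H1 plus a brand suffix
    return "distinct"

print(title_h1_relation("Cars That Start With A | Sample", "Cars that start with A"))
# overlapping
```

The common pattern the question alludes to is "overlapping": the title reuses the H1 but adds a brand or qualifier, rather than being character-for-character identical.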
Does this URL need rewriting?
Hello, Does this URL need to be rewritten? http://www.nlpca.com/DCweb/modelingwithnlparticleandreas.html Bob
Google is giving one of my competitors a quasi page 1 monopoly, how can I complain?
Hi, When you search for "business plan software" on google.co.uk, 7 of the first 11 results are from 1 company selling 2 products, see below:
#1. Government site (related to "business plan" but not to "business plan software")
#2. Product 1 from Palo Alto Software (livePlan)
#3. bplan.co.uk: content site of Palo Alto Software (relevant to "business plan" but only relevant to "business plan software" because it is featuring and linking to their Product 1 and Product 2 sites)
#4. Same site as #3 but different url
#5. Palo Alto Software Product 2 (Business Plan Pro) page on Palo Alto Software .co.uk corporate site
#6. Same result as #5 but different url (the features page)
#7. Palo Alto Software Product 2 (Business Plan Pro) local site
#8, #9 and #10 are ok
#11. Same as #3 but the .com version instead of the .co.uk
This seems wrong to me, as it creates an illusion of choice for the customer (especially because they use different sites) whereas in reality the results showcase only 2 products. Only 1 of Palo Alto Software's competitors is present on page 1 of the search results (the rest are on pages 2 and 3). Have some of you experienced a similar issue in a different sector? What would be the best way to point it out to Google? Thanks in advance. Guillaume
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with super-thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and if these pages aren't viewable by users and/or don't get any traffic), is it smart to completely remove them (404)? Or is there any valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? If you noindex them and then remove them from the sitemap, can Google still recrawl them and recognize the noindex tag on its own?
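The first step the question describes (noindexing pagination pages while they are still crawlable) can be sketched as a simple templating rule. This is a hypothetical rule for illustration only; it assumes, as the question does, that pages beyond page 1 are the thin ones:

```python
def robots_meta(page_number):
    """Hypothetical rule: noindex pagination pages beyond page 1,
    but keep them followable so crawlers still traverse their links."""
    if page_number > 1:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta(1))  # page 1 stays indexable
print(robots_meta(7))  # deep pagination pages carry noindex
```

Note that for Google to see the noindex at all, the page must remain crawlable (not blocked in robots.txt), which is why the question about keeping the URLs discoverable, e.g. via the sitemap, matters.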
New sub-domain launches thousands of local pages - is it hurting the main domain?
Would greatly appreciate some opinions on this scenario. A domain cruising along for years, top 1-3 rankings for nearly all top non-branded terms and a stronghold for branded searches. Sitelinks prominently shown with branded searches, and always ranked #1 for most variations of the brand name. Then, a sub-domain launched with over 80,000 local pages - these pages are 90-95% similar, with only the city and/or state changing to make them appear like unique local pages. Not an uncommon technique, but worrisome in a post-Panda/Penguin world. These pages are surprisingly NOT flagged as duplicate content by the SEOMoz crawler in my campaigns. Additionally, around that same time a very aggressive, almost entirely branded paid search campaign was launched that moved 20% of the clicks previously going to the main domain organically over to PPC. My concern is this: shortly after the launch of over 80k "local" pages on the sub-domain and the cannibalization of organic clicks by PPC, we saw the consistency of sitelinks drop from 6-packs to 3 sitelinks, if they showed at all, including some sub-domains in the sitelinks (including the newly launched one) that had never been there before. There's not a clear answer here, I'm sure, but what are the experts' thoughts on this: did a massive launch of highly duplicate pages, coupled with a significant decrease in organic CTR for branded terms, harm the authority of the main domain (which is only a few dozen pages), causing fewer sitelinks and a weaker domain? Or is all this a coincidence? Or caused by something else we aren't seeing? Thanks for your thoughts!
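The "90-95% similar" claim above is something you can approximate yourself by diffing two of the local pages' texts. A minimal sketch using Python's standard library (the page texts are invented stand-ins for two city pages):

```python
from difflib import SequenceMatcher

def page_similarity(a, b):
    """Rough near-duplicate score between two page texts, from 0.0 to 1.0."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical boilerplate local pages where only the city name changes.
austin = "Find the best widget installers in Austin, TX. Our Austin team is ready."
dallas = "Find the best widget installers in Dallas, TX. Our Dallas team is ready."

score = page_similarity(austin, dallas)
print(score > 0.8)  # True -- only the city name differs
```

Scores this high across tens of thousands of pages are exactly the pattern that post-Panda duplicate-content concerns are about, whether or not a given crawler flags it.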
Can a Page Title be all UPPER CASE?
My client wants to use UPPER CASE for all his page titles. Is this okay? Does Google react badly to it?
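If the concern is readability in the SERP rather than a ranking penalty, one option is to normalize shouty titles at template time. A tiny illustrative sketch (the title string is made up):

```python
def normalize_title(title):
    """If a page title is ALL CAPS, rewrite it in title case; otherwise leave it alone."""
    if title.isupper():
        return title.title()
    return title

print(normalize_title("CHEAP LAMINATE FLOORING"))  # Cheap Laminate Flooring
print(normalize_title("Cheap Laminate Flooring"))  # Cheap Laminate Flooring
```

Note `str.title()` is a blunt instrument (it capitalizes every word, including acronyms and small words like "of"), so a real template would likely want an exception list.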
What do you think of our new category page?
Hey Mozzers! We have come up with a new layout design for a category page and would love to have your opinion on it, specifically from an SEO perspective. Here is our current page: http://www.builddirect.com/Laminate-Flooring.aspx Our new page (pending approval): http://www.builddirect.com/testing/laminate-flooring/index.html Just to brief you on the key differences between the old and new layouts: The left text-link menu is removed in the new layout
The new layout looks funny with JS disabled - a long vertical line-up of products (perhaps important keywords/content in the new layout appears way down the page?)
A lot of 'clunk' has been removed (bits of text, links, images, etc.). Thanks for checking this out.