Targeting multiple keywords with the index page
-
Quick keyword question....
I just started working with a client that is ranking fairly well for a number of keywords with his index page. Right now he has a bunch of duplicate titles, descriptions, etc. across the entire site. There are 5 different keywords in the title of the index page alone.
I am wondering if it's OK to target 3 different keywords with the index page, or if I should cut it down to 1. Think blue widget, red widget, and widget making machines. I want each of the individual keywords to improve, but I don't want to lose what I have either.
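To make that concrete, here is roughly what I mean (hypothetical title tags, using placeholder names):

```html
<!-- Targeting three keywords in one title, as things stand now -->
<title>Blue Widgets, Red Widgets & Widget Making Machines | Example Co</title>

<!-- Versus cutting it down to a single focus keyword -->
<title>Blue Widgets | Example Co</title>
```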
Any ideas?
THANKS!!!!
-
Yeah, those are some good points. 3 of the keywords are in the top 20, the other 2 are in the top 50. I was thinking about clipping the lower-performing keywords to focus more on the other 3.
The 3 that are in the top 20 have around 25k exact searches per month, but are a little broad. I think I am going to leave it for now but keep an eye on it.... I will watch it over the next few weeks as I get started working on the site. Hopefully, as I become more familiar with the site/niche, the answer will become clearer.
Thanks for the valuable information!
-
EGOL,
This is a great addition. Jason, please read what EGOL has here, as it elucidates what I should have spelled out along with my advice to talk with the client. My point in talking with them was to discover how these keywords convert for them and the impact that has on their business.
EGOL, thanks again.
-
There are 5 different keywords in the title of the index page alone. I am wondering if it's OK to target 3 different keywords with the index page, or if I should cut it down to 1.
Wait!
Five, Three or One?
Before you change anything you have to look at current performance.
There are a few sites that have the ability to support five keywords in the title tag.... many more that can support three.... and one could very well be a wholesale retreat.
So, before you retreat from five to three be sure that you are not going to be damaging current traffic.
Remember that you said....
I just started working with a client that is ranking fairly well for a number of keywords with his index page.
It sounds like this site is doing fantastic. Don't retreat from what is working - unless you are moving to more profitable ground.
How are these converting?
SEO does not have hard and fast rules about the number of keywords that a page can attack. You attack as many as your power and your smarts and your competitive situation allow....
.... and keep in mind that some keywords are a lot more valuable than others.
-
The main page is currently PR 3 on an aged domain. I think I will talk it over with the client to see what he wants to do. I may just leave the 3 keywords in the main page for now. Thanks for the input!
-
Thanks for the response! I really appreciate it. I think I will have a talk with the client to see what he wants to do.
-
For new or relatively "weaker" websites, Google often shows only the homepage in the SERPs instead of the product pages, because the homepage has the highest Page Authority. As the product pages become "stronger", Google will trust those as well and show them in the SERPs.
If you're worried about losing the current homepage performance, then keep the 3 keywords in the homepage title if those products are very important for the client, and create an individual page for each of the three products. After 6-10 months, when the product pages' authority is high enough, Google will probably show the product pages instead of the homepage, because they have more relevant content and more relevant links.
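A rough sketch of that structure, reusing the widget examples from the question (hypothetical URLs and titles):

```html
<!-- Homepage: keep the three established keywords for now -->
<title>Blue Widgets, Red Widgets & Widget Making Machines | Example Co</title>

<!-- New product pages, each focused on a single keyword: -->

<!-- example.com/blue-widgets/ -->
<title>Blue Widgets | Example Co</title>

<!-- example.com/red-widgets/ -->
<title>Red Widgets | Example Co</title>

<!-- example.com/widget-making-machines/ -->
<title>Widget Making Machines | Example Co</title>
```

That way the homepage keeps its current rankings while the focused pages build up authority of their own.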
-
Jason,
Client is the key word here. First, you have to talk with them and make them aware of the pluses and minuses of the change. Let them decide is my advice. If it were mine, I would not change the index page at first, and I would plan out which pages I am going to build for the given terms. So, if the most crucial keyword on the index page is red widget, it stays, and I start looking at how to set up other pages while maximizing red widget where it is.
Obviously, over time the chances of maintaining or growing a ranking are higher with a specific page (assuming the same effort, etc.). If you do link building and send it all to the index page, that will be more of an issue. So, your linking should be specific to the page you wish to rank.
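For example, a keyword-specific link should point at the page built for that term rather than at the index page (hypothetical markup):

```html
<!-- Anchor text and target matched to the term you want to rank -->
<a href="https://example.com/red-widgets/">red widgets</a>

<!-- Rather than funneling every link to the homepage -->
<a href="https://example.com/">red widgets</a>
```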
Remember, as you start to make changes, chances are you will drop before you begin to climb. If the client is aware of that going in, you have an ally. No, there is no hard rule on when the increase will happen.
I hope this helps clarify things,
Robert