Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Is it ok to repeat a (focus) keyword used on a previous page, on a new page?
-
I am cataloguing the pages on our website in terms of which focus keyword has been used on each page. I've noticed that some pages repeat the same keyword/term.
I've heard that this isn't really good practice, because it's like giving Google conflicting information: pages with the same keyword will end up competing against each other. Is that correct?
If so, is the alternative to use a variety of longer, more specific keywords instead?
If not - meaning it's OK to repeat the keyword on different pages - is there a recommended maximum number of times we should repeat it across pages?
Still new-ish to SEO, so any help is much appreciated!
V.
-
We like to think of all pages written around a specific topic as a content silo. Many of these pages will certainly include the same keywords. The key is to choose which page is the "head" of the silo and should rank for the main phrases assigned to that silo. Then you can use all the other pages in the silo to link internally back to the main page with the proper anchor text, thereby helping the main (and correct) page rank for the keyword.
To sum up, you might end up with many pages that all include a specific keyword, but you internally link all of them to the main page using the keyword as the anchor text. That effectively tells Google that, across your site, the main page is the most relevant one for that keyword.
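As a rough illustration (the URL and keyword below are hypothetical placeholders, not taken from this thread), a supporting page in such a silo might link back to its head page like this:
  <!-- Supporting page in a hypothetical "blue widgets" silo -->
  <p>
    These accessories are designed to fit our full range of
    <!-- keyword-rich anchor text pointing at the silo's head page -->
    <a href="/blue-widgets/">blue widgets</a>.
  </p>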
-
The pages will compete against each other under normal circumstances, but that's not necessarily an awful thing. For example, maybe your older page only achieved positions 16-30 for the keyword, while the new page might achieve a higher ranking. Unless you pit them against each other, how will you know which is best?
Stopping newer pages from competing with old rankings doesn't give a magical bonus to the old page and make it rank higher. Unless you're absolutely certain that the old page should be the 'definitive' landing page for the keyword, a bit of friendly competition doesn't usually hurt much.
The pages that really contend for your rankings are those on other websites. Good luck emailing all of those webmasters to complain that they're using your keywords.

Sometimes, under very specific circumstances, keyword cannibalisation can come into play and cause problems. But 90% of the time it's just not that big of a deal.
The bigger issue is that if you write loads of pages with the same focus keyword, you're NOT writing about new keywords. And if you're not doing that, how will you increase your footprint? It's often more lucrative to cover other, newer material than to re-hash old stuff.
The worst you tend to get is rankings that stay largely in the same place, while the ranking URL jumps around as Google tries to decide which page to rank (before eventually settling on one).
IMO, the worst part about keyword cannibalisation is not the fall-out from it (which is usually minimal) - it's the WASTED time that could have gone into new topics to attract new visitors. Always be expanding.
Related Questions
-
What are the best page titles for sub-domain pages?
Hi Moz community, Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally we mention the "primary keyword" and "brand name" on every page of the website. Can we do the same on all pages of the sub-domains to increase the website's authority for this primary keyword in Google? Or will it end up having a negative impact if Google treats the same keyword and brand name being mentioned on every page - on the main website and on all sub-domain pages - as duplicate content? Thanks
Intermediate & Advanced SEO | vtmoz
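As an editorial illustration of the pattern described in the question above (the keyword, brand, and page names here are hypothetical placeholders), a title tag combining a primary keyword with a brand name might look like this:
  <head>
    <!-- Page-specific topic first, then the primary keyword and the brand name -->
    <title>Pricing Plans | Blue Widgets | ExampleBrand</title>
  </head>
-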
Replacing keywords with synonyms: will it increase the risk of a Google keyword-stuffing penalty?
I have a page which is already ranking pretty well for a relatively competitive keyword. Google also ranks us on the first page for a synonym of the keyword we optimized the page for (even though the synonym does not appear on our page). I am now considering replacing some occurrences of the keyword on the page with different synonyms, in the hope that our ranking may further improve for those synonyms. However, I am concerned that Google may penalize me for keyword stuffing if I use a wide range of synonyms of one keyword on the page. My plan is only to replace some occurrences of the keyword with synonyms. I am a bit nervous here since the page is already ranking quite well in a competitive niche. Any thoughts?
Intermediate & Advanced SEO | lcourse
-
New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
We recently launched a redesign/redevelopment of a site but failed to put 301 redirects in place for the old URLs. It's been about 2 months. Is it too late to even bother worrying about it at this point? The site has seen a notable decrease in traffic/visits, perhaps due to this issue. I assume that once the search engines get an error on a URL, they will stop displaying it in search results after a period of time. I'm just not sure if they will try to re-crawl those old URLs at some point, and if so, it may be worth having those 301 redirects in place. Thank you.
Intermediate & Advanced SEO | BrandBuilder
-
Is a 404, then a meta refresh 301 to the home page OK for SEO?
Hi Mozzers, I have a client that had a lot of soft 404s we wanted to tidy up - basically everything was going to the homepage. I recommended they implement proper 404s with a custom 404 page, and 301 redirect any URLs that really should point to another page. What they have actually done is implement a 404 (without the custom 404 page) and then, after a short delay, redirect (meta refresh) to the homepage. I understand why they want to do this - they don't want to lose the traffic - but is this a problem for SEO and the index? Or will Google treat it as a hard 404 anyway? Many thanks
Intermediate & Advanced SEO | Chammy
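As a hedged sketch of the setup described above (not the client's actual markup), a page served with an HTTP 404 status that then meta-refreshes to the homepage might look roughly like this; note that the refresh happens client-side, so it is not a true 301:
  <!-- Page served by the web server with a 404 status code -->
  <head>
    <!-- After a 3-second delay the browser is sent to the homepage (placeholder URL) -->
    <meta http-equiv="refresh" content="3; url=https://www.example.com/">
  </head>
  <body>
    <p>Sorry, that page no longer exists. Taking you back to the homepage...</p>
  </body>
-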
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches certain valid patterns, we serve a script which may then detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but instead redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying in the Google index longer than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
Intermediate & Advanced SEO | lcourse
-
Using a canonical URL to point to an external page
I was wondering if I can use a canonical URL that points to a page residing on an external site? So a page like www.site1.com/whatever.html would have a canonical link in its header pointing to www.site2.com/whatever.html. Thanks.
Intermediate & Advanced SEO | llamb
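For illustration only, a sketch of the cross-site canonical described in the question above, using the question's example URLs (https scheme assumed):
  <!-- In the <head> of https://www.site1.com/whatever.html -->
  <link rel="canonical" href="https://www.site2.com/whatever.html">
-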
How long does it take for a page to show up in Google results after removing noindex from it?
Hi folks, A client of mine created a new page and used meta robots noindex to keep the page out of the results while they weren't ready to launch it. The problem is that somehow Google crawled the page, and now, after removing the meta robots noindex, the page does not show up in the results. We've tried crawling it using Fetch as Googlebot and then submitting it using the button that appears. We've included the page in sitemap.xml and also used the old Google submit-URL tool at https://www.google.com/webmasters/tools/submit-url. Does anyone know how long it will take for Google to show the page AFTER removing meta robots noindex from it? Are there any reliable references for this? I did not find any Google video/post about it. I know that it will appear after some days, but I'd like to have a good reference for the future. Thanks.
Intermediate & Advanced SEO | fabioricotta-84038
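For reference, the meta robots noindex tag discussed above looks like this (a generic sketch, not the client's actual markup); once it is removed, Google still has to recrawl the page before it can reappear in the results:
  <head>
    <!-- While this tag is present, search engines are asked not to index the page -->
    <meta name="robots" content="noindex">
  </head>
-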
All page files in the root? Or should we use directories?
We have thousands of pages on our website - news articles, forum topics, download pages, etc. - and at present they all reside in the root of the domain /. For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would become:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we gain any SEO benefit from using directories? Could our current system perhaps mean too many files in the root, flagging it as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, which would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/? Just looking for some clarity on our problem! Thank you for your help, guys!
Intermediate & Advanced SEO | Peter264