Ranking For Synonyms Without Creating Duplicate Content
-
We have two keywords that are synonyms we really need to rank for, as they are pretty much interchangeable terms. We will refer to them as Synonym A and Synonym B.
Our site ranks very well for Synonym A but not for Synonym B. Both terms carry the same meaning, but the search results are very different. We actively optimize for Synonym A because it has the higher search volume of the two. We had hoped that Synonym B would earn similar rankings because the terms are so similar, but that did not pan out for us.
We have lots of content that uses Synonym A predominantly and some that uses Synonym B. We know that good content around Synonym B would help, but we fear it may be seen as duplicate if we create a piece that's "Top 10 Synonym B" when we already have that piece for Synonym A. We also don't want to make too many changes to our existing content, for fear we may lose our great ranking for Synonym A.
Has anyone run into this issue before, or does anyone have any ideas of things we can do to increase our position for Synonym B?
-
There are a hundred different ways to do this, but my favorite approach is typically to work the synonym into the same copy without seeming spammy.
For example, if my primary keyword is "GMO" and my very literal synonym is "Genetically Modified Organism", then I'd try to work both variations into the copy:
<title>GMO Dangers - Knowing the Risks of Genetically Modified Organisms</title>
Here's a great article that goes into depth on the advantages of incorporating multiple variants into your SEO targeting: http://cognitiveseo.com/blog/5370/941-traffic-increase-exploiting-the-synonyms-seo-ranking-technique/
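As a rough way to sanity-check this approach, here's a small script (a sketch; the sample copy and synonym list are made up for illustration) that counts how often each variant appears in a page's visible text, so you can confirm both terms are represented without either one being stuffed:

```python
import re

def variant_counts(text, variants):
    """Count case-insensitive, whole-phrase occurrences of each variant.

    A trailing 's?' tolerates simple plurals ("Organisms" still counts).
    """
    counts = {}
    for v in variants:
        pattern = r"\b" + re.escape(v) + r"s?\b"
        counts[v] = len(re.findall(pattern, text, flags=re.IGNORECASE))
    return counts

copy = ("GMO Dangers - Knowing the Risks of Genetically Modified Organisms. "
        "A GMO, or genetically modified organism, is...")
print(variant_counts(copy, ["GMO", "Genetically Modified Organism"]))
```

Running this over a draft before publishing makes it easy to spot pages where one synonym never appears at all.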
-
Hi,
Here are the steps I take to target synonym keywords and improve rankings for both of them:
1. Use both keywords in the meta title, e.g. {keyword one - keyword two | modifier (cheap, buy, best)}.
2. Use the keyword with the higher search volume in the H1 tag.
3. Place both keywords in the page content, and keep the content reasonably long (at least 500 words).
4. Use both keywords as anchor text when building links, appending modifiers for anchor-text variation.
This is based on my personal experience. I hope it helps in your case too, though I can't be sure without knowing your keywords, page content, link-building methods, and so on.
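To make the meta-title pattern in the first step concrete, here's a minimal sketch (the keyword and modifier values are hypothetical) that assembles a title from both synonyms plus a modifier and warns when it exceeds a typical ~60-character display limit:

```python
def build_meta_title(primary, synonym, modifier, limit=60):
    """Assemble '<primary> - <synonym> | <modifier>' and flag overlong titles."""
    title = f"{primary} - {synonym} | {modifier}"
    if len(title) > limit:
        # Overlong titles tend to get truncated in search results.
        print(f"Warning: title is {len(title)} chars (limit {limit})")
    return title

print(build_meta_title("GMO Dangers",
                       "Genetically Modified Organism Risks",
                       "Best Guide"))
```

With these example values the title comes out at 62 characters, so the sketch prints a truncation warning; trimming the modifier would bring it under the limit.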
Thanks