Google's stance on LSI keywords?
-
Hi all,
The suggestions that appear while you type a query, along with the suggested keywords at the bottom of the search results page, are often referred to as LSI keywords. I've been watching the LSI keywords related to our industry for years, and Google has now suddenly changed them. I wonder why that would be. I can see that competitors have started using those LSI keywords widely; is that why Google changed them?
Thanks
-
It is currently very important to use LSI keywords in your content. If you use them in the right context, Google will reward your content and rank it higher than your competitors'.
-
Hi vtmoz,
It's very common for Google to change its algorithm in response to user trends. The thing to remember is that users have different forms of intent when they use certain words or phrases, and Google is constantly altering its ranking methods to reflect this.
While LSI keywords tend to remain static, there can be alterations and shifts within the industry or in how people observe and interact with it. However, these changes would not necessarily occur because companies begin using them more frequently.
More likely, Google is dipping into its reservoir of big data and finding that queries leading to certain pages were not producing user satisfaction (i.e., bounce metrics were high on pages previously identified as supplying relevance via LSI keywords). It therefore made a change to better reflect what users were searching for (and interacting with) from the SERPs.
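As an aside, rather than eyeballing these suggestion shifts from memory, you can snapshot them periodically and compare. Here's a minimal Python sketch that pulls Google's autocomplete suggestions for a seed term. Note the caveat: this hits the unofficial suggest endpoint, so the URL, the `client=firefox` parameter, and the response shape are undocumented and could change or be rate-limited at any time.

```python
import json
import urllib.parse
import urllib.request

SUGGEST_BASE = "https://suggestqueries.google.com/complete/search"

def suggest_url(query: str, client: str = "firefox") -> str:
    """Build the (unofficial) autocomplete request URL for a seed query."""
    params = urllib.parse.urlencode({"client": client, "q": query})
    return f"{SUGGEST_BASE}?{params}"

def fetch_suggestions(query: str) -> list[str]:
    """Fetch the current autocomplete suggestions for a seed query.

    With client=firefox the endpoint has historically returned a JSON
    array shaped like [query, [suggestion, suggestion, ...]], but this
    is not a documented contract.
    """
    with urllib.request.urlopen(suggest_url(query), timeout=10) as resp:
        payload = json.loads(resp.read().decode("utf-8"))
    return payload[1]

# Example (requires network access):
# for phrase in fetch_suggestions("home security"):
#     print(phrase)
```

Saving the output of a run like this once a week gives you a concrete record of when the suggestions actually changed, instead of relying on recollection.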
A couple of questions to ask:
-
Has anything changed within your industry that would cause an LSI keyword shift (new products, new competitors, new rules and regulations, etc.)?
-
Is there a pattern in terms of the keywords that have changed? Is it industry-wide or a specific segment?
-
Are there new ways users may be interacting with the industry? New queries being used?
-
What is the impact on your rankings for general terms related to those LSI terms/phrases?
Based on the answers to these questions, you can better identify whether it's a shift from Google altering the LSI algorithm for your industry, or simply an indicator of a developing industry. My guess is the latter.
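If you want to check the second question (whether there's a pattern in which keywords changed) systematically, diffing two saved suggestion snapshots makes the shift explicit. A quick sketch, using made-up snapshot data for illustration:

```python
def keyword_shift(old: list[str], new: list[str]) -> dict[str, list[str]]:
    """Compare two suggestion snapshots: what dropped out, what appeared,
    and what stayed the same between two points in time."""
    old_set, new_set = set(old), set(new)
    return {
        "dropped": sorted(old_set - new_set),
        "added": sorted(new_set - old_set),
        "kept": sorted(old_set & new_set),
    }

# Hypothetical snapshots taken a month apart:
before = ["home security systems", "home security cameras", "home security monitoring"]
after = ["home security systems", "smart home security", "home security monitoring"]

shift = keyword_shift(before, after)
print(shift["dropped"])  # suggestions Google no longer shows
print(shift["added"])    # newly surfaced suggestions
```

If the "added" column clusters around one segment or one new query style, that points toward a developing industry rather than an algorithm change aimed at you.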
Hope this helps - feel free to reach out any time if you need clarification or just want to chat!
Thanks,
Rob