Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Keywords are indexed on the home page
-
Hello everyone,
For one of our websites, we have optimized pages for many keywords. However, every keyword seems to rank on the home page rather than on the page we optimized for it. This occurs on only one of our many websites. Does anyone know the cause of this issue, and how to solve it?
Thank you.
-
No, I wouldn't say that would cause such issues.
Your pages should get indexed eventually, as they are in your sitemaps (at least the ones I checked), so I am not surprised you're not seeing issues in reports.
But tools like Moz will often struggle to give more strategic advice (not that we aren't working on it!), and in this case, if these pages are a priority, you need to link to them prominently within your site: this is the most reliable way to ensure rapid indexing.
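If you want to audit that advice at scale, a rough sketch like the one below can flag priority pages that the homepage never links to. This is a hypothetical helper using only Python's standard library, not a Moz tool; the example paths are the poster's URLs used for illustration.

```python
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    """Collects href values from every <a> tag in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Normalize trailing slashes so /foo/ and /foo compare equal.
                    self.links.add(value.rstrip("/"))


def missing_internal_links(homepage_html, target_paths):
    """Return the target paths that the homepage does not link to."""
    parser = LinkCollector()
    parser.feed(homepage_html)
    return [p for p in target_paths if p.rstrip("/") not in parser.links]
```

Fetch the homepage HTML however you like, then pass it in with the list of pages you expect to be prominently linked; anything returned is a candidate for adding to the main navigation.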
-
@tom-capper Hey Tom, thanks a lot for all your support so far.
What I have noticed in other campaigns too, is that our websites have CLS (cumulative layout shift) problems. This is the only issue that I found in the account for Brekken. Do you know by any chance if this is something that causes pages to not get indexed correctly?
Thanks!
-
@ginovdw said in Keywords are indexed on the home page:
It seems like some of those simply aren't indexed.

As above, I recommend you investigate in Google Search Console for a clearer idea of why a page isn't indexed, but I notice that /motorbootcharter-lemmer is listed in your sitemaps, so Google will probably index it eventually.
If you want Google to better understand the value of these pages, consider including them in your main navigation, or linking from the homepage.
-
@ginovdw Have you been able to confirm that those other pages are crawled, indexed, and correctly rendered by Google?
For example, what happens if you inspect them in Google Search Console?
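Alongside inspecting individual URLs in Search Console, it's worth confirming that the sitemap itself lists every page you care about. A minimal sketch, assuming a standard sitemap using the sitemaps.org protocol namespace (the function names are illustrative, not from any particular tool):

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def sitemap_urls(sitemap_xml):
    """Extract the set of <loc> URLs from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}


def urls_missing_from_sitemap(sitemap_xml, urls):
    """Return the URLs from `urls` that the sitemap does not list."""
    present = sitemap_urls(sitemap_xml)
    return [u for u in urls if u not in present]
```

Any URL this reports as missing is one Google has to discover through links alone, which makes the internal-linking advice above even more important for that page.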
-
So, an example of one of our websites is below:
We optimized the following keywords on the following pages:
Motorbootcharter Lemmer: https://brekken.nl/motorbootcharter-lemmer
Jachtverhuur Lemmer: https://brekken.nl/jachtverhuur-lemmer
Bootverhuur IJsselmeer: https://brekken.nl/bootverhuur-ijsselmeer
Boot verhuur in Friesland: https://brekken.nl/boot-verhuur-in-friesland
Boot huren IJsselmeer: https://brekken.nl/boot-huren-ijsselmeer
Yet, they all rank on the home page of our website. Some of these words are not even mentioned on our home page, or only once. I just don't see why these pages don't rank for their respective keywords, since this works for many of our other websites.
-
@ginovdw Indeed, please give more information; otherwise it's not entirely clear what you are comparing against and what you want to rank for.
-
@tom-capper Thanks for your response! It's exactly what you mentioned. We have many pages optimized for those terms, but they all rank on our homepage.
-
@ginovdw Heya
Could you explain a little more what you're running into?
For example, when you say that keywords are indexed on the homepage, do you mean that your homepage is ranking for all terms, even though you have optimized other pages for those terms?