Any idea why pages are not being indexed?
-
Hi Everyone,
One section on our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so I thought it was strange. Here is an example of one:
https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html
If you search for a chunk of text from the page, it is not found in Google. There are no issues in Bing/Yahoo, only Google. Do you think it takes a submission to Search Console?
Jeff
-
So I am testing removing some of the restrictions in the robots.txt file to see if that helps, as I still can't get the page indexed.
-
Yeah...it's very close to what I have. I also checked other websites I own with the same category structure and robots.txt file...no issues.
I even checked other subcats on www.moregems.com, and no issues. It seems to be all the pages under "GEMSTONES" that are not being indexed. Any thoughts there?
-
I usually write custom robots.txt files for Magento sites, but I did find a good example to use. Check out this site: https://www.magikcommerce.com/blog/set-up-robots-txt-in-magento/
I would edit anything that doesn't fit your site.
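For reference, here is a trimmed-down sketch of the kind of Magento robots.txt that guide recommends. The directory names below are common Magento defaults, but treat them as illustrative; edit anything that doesn't match your store's setup before deploying:

```text
User-agent: *
# Magento system directories (no indexable content)
Disallow: /app/
Disallow: /includes/
Disallow: /lib/
Disallow: /var/
# Session-specific pages
Disallow: /checkout/
Disallow: /customer/
# On-site search result pages (thin/duplicate content)
Disallow: /catalogsearch/

Sitemap: https://www.moregems.com/sitemap.xml
```

The key thing is that category and subcategory paths (like /loose-cut-gemstones/) should not match any Disallow rule.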
Hope this helps!
-
So I submitted https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html and fetched it in Google Search Console a few hours ago, and it is still not being indexed. I don't see any issues in the robots.txt file. Any thoughts?
-
Hi Nicholas,
I asked him this as well, but do you have any resources for a "good" Magento specific robots.txt file? I want to try updating it, as it has been the same for about 7 years.
The strange thing is the deeper product pages are indexed, but not the subcats.
-
Hi Christian,
Do you have any resources for a recommended Magento robots.txt file? I added this probably 6-7 years ago, and have not updated it since. I can definitely try that.
Jeff
-
Hi Jeff,
In addition to Christian's recommendation (which I would do first), use Google Search Console's Fetch & Render tool to submit your non-indexed pages to Google's index. Sometimes this tool in GSC gets them indexed immediately.
It is not uncommon for deep links, or internal pages of internal pages, to not be indexed immediately. It is definitely important to use new pages to link out to other pages on your website, and if possible, go in and link to your new pages from older (already indexed) pages on your website.
-
Hey Jeff,
I just ran a quick scan of the site, and it looks like you have a lot of links, pages, and directories being blocked by your robots.txt file: https://www.moregems.com/robots.txt
I would make sure the pages you want to be indexed by search engines are not being blocked in your robots.txt.
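One quick way to verify this is to run the robots.txt rules through a parser and test the exact URL that isn't being indexed. Here is a minimal Python sketch using the standard library's urllib.robotparser; the Disallow rules below are placeholders, so paste in the real contents of https://www.moregems.com/robots.txt before drawing any conclusions:

```python
from urllib import robotparser

# Placeholder rules -- replace with the real contents of
# https://www.moregems.com/robots.txt before testing for real.
ROBOTS_TXT = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The subcategory page that is not being indexed:
url = "https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html"
print(rp.can_fetch("Googlebot", url))  # True with these rules, since no Disallow matches the path
```

If this prints False for a page you want indexed, some rule in the file is blocking it.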