Any idea why pages are not being indexed?
-
Hi Everyone,
One section of our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so I thought it was strange. Here is an example of one:
https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html
If you take a chunk of text from the page, it is not found in Google. No issues in Bing/Yahoo, only Google. Do you think it takes a submission through Search Console?
Jeff
-
So I am testing removing some of the restrictions in the robots.txt file to see if that helps, as I still can't get the page indexed.
-
Yeah...it's very close to what I have. I also checked other websites I own with the same category structure and robots.txt file...no issues.
I even checked other subcats on www.moregems.com, and no issues. It seems to be all the pages under "GEMSTONES" that are not being indexed. Any thoughts there?
-
I usually write custom robots.txt files for Magento sites. But I did find a good example to use. Check out this site: https://www.magikcommerce.com/blog/set-up-robots-txt-in-magento/
I would edit anything that doesn't fit your site.
Hope this helps!
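For reference, a minimal Magento-style robots.txt tends to look something like the sketch below. The specific paths and parameters are assumptions based on a typical Magento install, not a drop-in file — verify each directive against your own site before using it, and make sure none of them touch category paths you want indexed.

```
# Minimal Magento-style robots.txt sketch (paths are assumptions -- verify against your install)
User-agent: *
# Block Magento application/system directories, not catalog pages
Disallow: /app/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /var/
# Block session and sort/filter parameters that create duplicate content
Disallow: /*?SID=
Disallow: /*?dir=
Disallow: /*?limit=
# Never blanket-block catalog paths like /loose-cut-gemstones/
Sitemap: https://www.moregems.com/sitemap.xml
```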
-
So I submitted https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html and fetched it in Google Search Console a few hours ago. Still not being indexed. I don't see any issues in the robots.txt file. Any thoughts?
-
Hi Nicholas,
I asked him this as well, but do you have any resources for a "good" Magento-specific robots.txt file? I want to try updating it, as it has been the same for about 7 years.
The strange thing is the deeper product pages are indexed, but not the subcats.
-
Hi Christian,
Do you have any resources for a recommended Magento robots.txt file? I added this probably 6-7 years ago, and have not updated it since. I can definitely try that.
Jeff
-
Hi Jeff,
In addition to Christian's recommendation (which I would do first), use Google Search Console's Fetch & Render tool to submit your non-indexed pages to Google's index. Sometimes this tool in GSC will get them indexed immediately.
It is not uncommon for deep links, or internal pages of internal pages, to not be indexed immediately. It is definitely important to have new pages link out to other pages on your website, and if possible, go in and link to your new pages from older (already indexed) pages on your website.
-
Hey Jeff,
I just ran a quick scan of the site, and it looks like you have a lot of links, pages, and directories being blocked by your robots.txt file: https://www.moregems.com/robots.txt
I would make sure the pages you want to be indexed by search engines are not being blocked in your robots.txt.
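One quick way to check that is with Python's standard-library robots.txt parser — it applies the same Disallow matching rules a crawler would. The rules below are simplified stand-ins for illustration, not the site's actual file; paste in the real robots.txt contents and the URLs you care about.

```python
# Check specific URLs against robots.txt rules using only the
# Python standard library. The Disallow lines are placeholder
# examples -- replace them with your site's real robots.txt.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /catalogsearch/
Disallow: /checkout/
""".strip()

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

urls = [
    "https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html",
    "https://www.moregems.com/catalogsearch/result/?q=prasiolite",
]
for url in urls:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", verdict)
```

If a page you want indexed comes back "blocked", that rule is the first thing to fix; if everything comes back "allowed", the problem likely lies elsewhere (noindex tags, canonicals, or crawl discovery).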