Why will Google not index my pages?
-
About 6 weeks ago we moved a subcategory out to become a main category, using all the same content. We also removed hundreds of old products and replaced them with new variation listings to remove duplicate-content issues.
The problem is Google will not index 12 critical pages, and our rankings have slumped for the keywords in those categories.
What can I do to entice Google to index these pages?
-
In Google Webmaster Tools, the sitemap report shows 14 URLs in our category listings as not indexed. However, I have had trouble identifying which URLs they are.
-
It looks like that page and the category pages have been indexed:
site:towelsrus.co.uk/Towels/Hand-Towels/prodlist_ct472.htm
site:towelsrus.co.uk/Towels/Face-Cloths-And-Flannels/prodlist_ct471.htm
site:towelsrus.co.uk/Towels/catlist_fnct561.htm
The cache view of the above pages is the same as the current versions of the pages on your site.
Are you confusing pages not being indexed with pages not ranking?
-
OK, I think the fatal mistake is that these category pages are identical to the ones they replaced, and I guess those original pages have not been removed from Google's index.
How do I get out of this hole and get these critical pages indexed? Would fetching as Googlebot realistically do anything?
-
You can't entice them, but you can make sure they get indexed properly.
Make sure they are being crawled and there are no errors on the pages (no duplicate title tags, etc.). You can check this with the campaign tool in SEOmoz.
Make sure you build your site with a logical link structure; each page should be reachable from at least one static text link.
Make sure that your sitemap has them listed correctly.
Make sure they are original content and not duplicated somewhere else on the internet or on another of your sites.
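The on-page checks above (duplicate title tags, stray noindex tags) can be sketched as a quick audit in Python. This is only a minimal illustration using the standard library; the sample URLs and HTML below are hypothetical, and the regexes are deliberately simplified (a real audit would use a proper HTML parser and fetch the live pages):

```python
import re
from collections import defaultdict

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.I | re.S)
# Simplified: assumes name="robots" appears before content="...noindex..."
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex', re.I)

def page_title(html):
    """Extract the <title> text, or None if the page has no title tag."""
    m = TITLE_RE.search(html)
    return m.group(1).strip() if m else None

def has_noindex(html):
    """True if the page carries a robots meta tag with noindex."""
    return bool(NOINDEX_RE.search(html))

def duplicate_titles(pages):
    """pages: {url: html}. Returns {title: [urls]} for titles used more than once."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        t = page_title(html)
        if t:
            by_title[t].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

Feeding it the fetched HTML of the category pages would flag any pair sharing a title tag, one of the most common reasons a near-duplicate category page gets skipped.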
This should get you in good shape. Hope this helps.
Mark
-
Fraser, one thing you can try (if you haven't already) is the "Fetch as Google" feature in Webmaster Tools.
Once you've entered the URL of your page and Google retrieves the data, you have the option to submit that URL to the index.
I've done this a couple of times and it seems to work - of course, Google might have been ready to index the pages anyway, but who knows!
-
Yes, there are 12 pages within this category, all category pages.
-
Hi Fraser,
Do you have a link to one of the pages?
Related Questions
-
Do uncrawled but indexed pages affect SEO?
It's a well-known fact that too much thin content can hurt your SEO, but what about when you disallow Google from crawling some places and it indexes some of them anyway (no title, no description, just the link)? I am building a Shopify store, and it's impossible to change the robots.txt on Shopify; they disallow, for example, the cart: Disallow: /cart. But all my pages link there, so Google has the uncrawled cart in its index, along with many other uncrawled URLs. Can this hurt my SEO, or is trying to remove them from the index just a waste of time? I can't change anything in the robots.txt, but I could try to nofollow those internal links. What do you think?
Intermediate & Advanced SEO | cuarto7150
-
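The situation described, a Disallow rule keeping Googlebot from crawling a URL that internal links still point at, can be checked locally with Python's standard-library robots.txt parser. A minimal sketch; example.com and the URL list are made-up stand-ins, and only the Disallow: /cart rule is taken from the question:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return the subset of urls that the given robots.txt text disallows for `agent`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]

# Shopify's robots.txt disallows the cart path, per the question above.
robots_txt = "User-agent: *\nDisallow: /cart\n"
urls = [
    "https://example.com/cart",
    "https://example.com/products/hand-towel",
]
print(blocked_urls(robots_txt, urls))  # → ['https://example.com/cart']
```

A blocked URL can still end up in the index as a bare, URL-only entry when enough pages link to it, which is exactly the symptom the question describes.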
22 pages, 7 indexed
So I submitted my sitemap to Google twice this week. The first time everything was just peachy, but when I went back to do it again, Google had only indexed 7 out of 22. The website is www.theinboundspot.com. My Moz campaign shows no issues and Google Webmaster Tools shows none. Should I just resubmit it?
Intermediate & Advanced SEO | theinboundspot1
-
Will a Google manual action affect all new links, too?
I have had a Google manual action (Unnatural links to your site; affects: all) that was spurred on by a PRWeb press release where publishers took it upon themselves to remove the embedded "nofollow" tags on links. I have been spending the past few weeks cleaning things up and have submitted a second pass at a reconsideration request. In the meantime, I have been creating new content, boosting social activity, guest blogging and working with other publishers to generate more natural inbound links. My question is this: knowing that this manual action affects "all," are the new links that I am building being negatively tainted as well? When the penalty is lifted, will they regain their strength? Is there any hope of my rankings improving while the penalty is in effect?
Intermediate & Advanced SEO | barberm1
-
Can Google index PDFs with Flash?
Does anyone know if Google can index PDFs with Flash embedded? I would assume that the regular Flash recommendations are still valid, even when embedded in another document. I would also assume there is a list of the file types and versions which Google can index with the Search Appliance, but I was not able to find one. Does anyone have a link or a list?
Intermediate & Advanced SEO | andreas.wpv0
-
Can links indexed by Google's "link:" operator be bad? Or is this more like a good example shown by Google?
Can links indexed by Google's "link:" operator be bad, or is this like a set of good examples shown by Google? We are cleaning up our links after Penguin and don't know what to do with these ones. Some of them do not look like quality links.
Intermediate & Advanced SEO | bele0
-
Previously ranked #1 in Google; web page has a 301 / URL rewrite and is indexed, but now not showing for keyword searches?
Two web pages on my website previously ranked well in Google, consistently in the top 3 places for 6+ months, but when the site was modified, these two pages, which previously ended in .php, had their page names changed to the keyword to improve further (or so I thought). Since then the pages don't rank at all for those search terms in Google. I used Google Webmaster Tools to remove the previous pages from the cache and search results, resubmitted a sitemap, and, where possible, fixed links to the new pages from other sites. On previous advice I purchased links, web directories, social and articles etc. pointing to the new pages, but so far nothing. It's been almost 5 months, and it's very frustrating, as these two pages previously ranked well and, as landing pages, ended in conversions. This problem only appears in Google; the pages still rank well in Bing and Yahoo. Google has the pages indexed if I search by URL, but they never show under any search term they should, despite being heavily optimised for certain terms. I've spoken to my developers and they are stumped too; they've now added this code to the affected page(s) to see if it helps:
Header("HTTP/1.1 301 Moved Permanently");
$newurl = SITE_URL . $seo;
Header("Location: $newurl");
Can Google still index a web page but refuse to show it in search results? All other pages on my site rank well; just these two, which were once called something different, have caused issues. Any advice? Any ideas? Have I missed something? I'm at a loss.
Intermediate & Advanced SEO | seanclc
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers,
I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, because:
1. Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
2. Bad user experience
3. The search pages are (probably) stealing rankings from our real landing pages
4. Webmaster notification: "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer!
Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen0
-
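One subtlety worth keeping in mind with the question above: a meta noindex tag can only take effect if Googlebot is allowed to crawl the page and see the tag, so pairing noindex with a robots.txt Disallow for the same URLs is self-defeating. A minimal stdlib sketch of that interaction; the URLs and robots.txt rules here are hypothetical, and the noindex check is deliberately simplified:

```python
import re
from urllib.robotparser import RobotFileParser

NOINDEX_RE = re.compile(r'<meta[^>]+noindex', re.I)  # simplified check

def noindex_takes_effect(url, robots_txt, html, agent="Googlebot"):
    """A noindex tag only works if the crawler may fetch the page and see the tag."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url) and bool(NOINDEX_RE.search(html))

html = '<head><meta name="robots" content="noindex, follow"></head>'
# Blocked by robots.txt: Google never fetches the page, so the tag is invisible.
print(noindex_takes_effect("https://example.com/search?q=towels",
                           "User-agent: *\nDisallow: /search\n", html))  # → False
# Crawlable: the tag is visible and the page can be dropped from the index.
print(noindex_takes_effect("https://example.com/search?q=towels",
                           "User-agent: *\nDisallow:\n", html))  # → True
```

This is why the meta-tag route in the question requires leaving the search pages crawlable until they have dropped out of the index.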
Google indexing issue?
Hey guys, after a lot of hard work, we finally fixed the problem on our site where meta descriptions didn't seem to show in Google, as well as noindex,follow on tag pages. Here's my question: in our source code, I am seeing both meta descriptions on pages and posts, as well as noindex,follow on tag pages; however, Google is still showing the old results, and the tags are also still showing in Google search after about 36 hours. Is it just a matter of time now, or is something else wrong?
Intermediate & Advanced SEO | ttb0