Can too many pages hurt crawling and ranking?
-
Hi,
I work for a local yellow pages site in Belgium. Over the last few months we introduced a successful technique to boost SEO traffic: we created over 150k new pages, all targeting specific keywords and all containing unique content, along with a site architecture that lets Google find these pages through crawling, XML sitemaps, and so on. All signals (traffic, indexation of the XML sitemaps, rankings, etc.) are positive. So far so good.
We are able to build more unique pages quickly, and I wonder how Google will react to this kind of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of content being added, even unique content?
Please advise.
-
Hi,
I don't believe having too many pages will hurt crawling and ranking. On the contrary, having a lot of pages gives crawl bots more pages to crawl, and when someone searches for keywords related to your pages, your pages have more chances to show up.
The only two problems I see with having this many pages are:
-
First, are all of these pages truly unique? With this many pages, it will be hard to manage them and keep track of whether every one is unique. If you don't have unique pages and have a lot of duplicates, that will hurt your rankings.
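As a rough illustration (not the poster's actual setup: the URLs and page bodies below are made up), one way to spot-check uniqueness at this scale is to fingerprint a normalized copy of each page's main text and look for collisions. Exact hashing only catches identical copies, not near-duplicates:

```python
import hashlib

# Hypothetical page bodies keyed by URL; in practice you would pull these
# from a CMS export or a crawl of your own site.
pages = {
    "/restaurants/antwerp": "Unique body text for Antwerp restaurants ...",
    "/restaurants/ghent": "Unique body text for Ghent restaurants ...",
    "/plumbers/antwerp": "Unique body text for Antwerp restaurants ...",  # accidental copy
}

seen = {}
for url, body in pages.items():
    # Normalize whitespace and case so trivial differences do not hide copies.
    normalized = " ".join(body.lower().split())
    fingerprint = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
    if fingerprint in seen:
        print(f"Possible duplicate: {url} matches {seen[fingerprint]}")
    else:
        seen[fingerprint] = url
```

Catching near-duplicates (templated pages with only a city name swapped, for example) would need something like shingling or SimHash rather than an exact hash.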
-
The second problem: are you interlinking all of your pages, and can the bots crawl all of them? You will need a good internal linking system that directs bots to the different pages so they can all be crawled. As mentioned above, a large number of pages is difficult to manage, so can you really interlink all of them? One solution is submitting a sitemap, but I am not sure Google will index everything; I had a problem with Google indexing only 4% of my sitemap and still can't find a solution. A minimal sitemap index sketch for a site this size follows.
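For reference, the sitemaps.org protocol caps a single sitemap file at 50,000 URLs, so a site with 150k+ pages would normally submit a sitemap index pointing at several child sitemaps. The filenames and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap holds up to 50,000 URLs. -->
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-1.xml</loc>
    <lastmod>2013-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-2.xml</loc>
    <lastmod>2013-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

Splitting child sitemaps by site section also makes it easier to see in Webmaster Tools which part of the site is suffering from low indexation.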
Hope this helps!
-
This is really just speculation...
It sounds like you're solid on the on-page and site-architecture side. I would expect crawling and indexation to slow down, though, if your off-site signals don't keep up. By this I mean that Google might see that you're doing everything right on your end, but notice over time that you're not creating content many people care to link to or share, and stop spending crawl resources on you.
Related Questions
-
Increase in pages crawled per day
What does it mean when GWT abruptly jumps from 15k to 30k pages crawled per day? I'm used to seeing spikes (an average of around 10k, with 50k pages crawled a couple of times per month), but in this case, 10 days ago it moved from 15k to 30k per day and has stayed there. I know it's a good sign: the crawler is crawling more pages per day, so it's picking up changes more often. But I have no idea why it's doing it. What positive signals usually lead Google's crawler to increase the number of pages crawled per day? Anyone know?
-
Advice on improving ecommerce product detail pages to rank better in Google search results
Hi all, I run an ecommerce website (not a highly ranked site), and I want to try to improve the product detail pages. To do this, I am first going to focus on one page (this one: http://goo.gl/eS62SU). If I type the product code directly into google.co.uk, I am on the 8th page of results (see https://www.google.co.uk/#q=hac-hfw2220r-z&start=70), which is poor to say the least. I see this kind of thing for a lot of my products. So, over the next month or two, I want to see if I can get this one page moving up the rankings purely with on-page optimisation. I would like to ask two things: 1. Is there anything that jumps out at you as to why that product detail page could NOT ever rank well, i.e. some code or page setup that prevents Google from ranking it? 2. Any advice you could give that might improve that page's rankings for its product code? FYI: I cannot change the dynamic URL; I only have control over things such as the product name / summary / features / spec. Any advice welcome.
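One hedged baseline to check first (placeholder markup, not the poster's actual page): for a query that is just a product code, the code should appear in the plain crawlable HTML itself, in the title, heading, and body copy, not only inside an image or a script-injected spec table:

```html
<!-- Placeholder markup: real product name and description must come from your catalogue. -->
<title>HAC-HFW2220R-Z - Product Name | Your Store</title>
<meta name="description" content="Short unique description mentioning HAC-HFW2220R-Z.">
<h1>HAC-HFW2220R-Z Product Name</h1>
<p>Visible body copy that also mentions the product code HAC-HFW2220R-Z.</p>
```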
-
All other things equal, do server-rendered websites rank higher than JavaScript web apps that follow the AJAX Crawling Spec?
I instinctively feel that server-rendered websites should rank higher, since Google doesn't truly know that the content it's getting from an AJAX site is what the user is seeing, and Google isn't exactly sure of the page load time (and thus the user experience). I can't find any evidence that would prove this, however. A website like Monocle.io uses pushState, loads fast, has good page titles, etc., but it is a JavaScript single-page application. Does it make any difference?
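For context, the AJAX Crawling Spec referenced here works by having a pushState page opt in with a meta tag; Googlebot then requests an `_escaped_fragment_` version of each URL, and the server is expected to answer with a pre-rendered HTML snapshot. A minimal sketch with a placeholder URL:

```html
<!-- Served in the <head> of the client-side page: opts a pushState app
     into the AJAX crawling scheme. -->
<meta name="fragment" content="!">

<!-- Googlebot then fetches the snapshot form of each URL, e.g.
     http://example.com/page?_escaped_fragment_=
     and the server must respond with fully rendered HTML. -->
```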
-
Is PageRank lost through a 301 redirect?
Hi everyone. I'd really appreciate your help with this one 🙂 I've just watched Matt Cutts's video "What percentage of PageRank is lost through a 301 redirect?" and I am confused. I had taken it to mean that a redirect would always lose you PageRank, but watching it again I am not so sure. He says that the amount of PageRank lost through a 301 redirect is the same as through any other link. Does this mean that no PageRank at all is lost during site migrations? Or is it the case that PageRank would first be lost from the original link, and then more PageRank would be lost from any subsequent redirects? watch?v=Filv4pP-1nw
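For readers setting up such a migration, this is how 301s are typically declared in an Apache .htaccess file; a minimal sketch using mod_alias directives with placeholder paths:

```apache
# Permanently redirect a single moved page.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Or redirect a whole moved directory, preserving the rest of the path.
RedirectMatch 301 ^/old-section/(.*)$ http://www.example.com/new-section/$1
```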
-
Indexed pages and current pages - Big difference?
Our website shows ~22k pages in the sitemap, but ~56k show as indexed on Google through the "site:" command. Firstly, how much attention should we be paying to the discrepancy? If we should be worried, what's the best way to find the cause of the difference? Canonical tags are set across the domain, so we can't really figure out whether we've got a problem or not.
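One hedged way to localize a discrepancy like this is to slice the site: operator by section and compare each count against the corresponding sitemap entries; the domain, path, and parameter below are placeholders:

```
site:example.com                    total indexed pages
site:example.com/products/          indexed pages within one section
site:example.com inurl:sessionid    indexed URLs carrying a suspect parameter
```

A count that balloons only when a parameter is included usually points at duplicate URL variants being indexed alongside the canonical ones.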
-
Htaccess redirects: how many can I have, and do they slow the site down?
I have had to redesign my site this year (www.in2town.co.uk) because my hosting company made a huge mistake while trying to update the Joomla on my site, which resulted in me losing thousands of pages and links. I have put some of the old URLs in my .htaccess file, but what I would like to know is how many old URLs I can have in my .htaccess file, as I am unsure how to use it properly. My idea was to take some of the lost URLs, put them in my .htaccess file, and have them pointing to similar pages. Not sure if this is a good idea or not. I think I have lost a few hundred good links, but I would like to know if these URLs in the .htaccess file would slow down my Joomla site. Any advice would be great.
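On the performance question: Apache evaluates .htaccess rules on every request, so a few hundred Redirect 301 lines are normally harmless, but for thousands of entries a RewriteMap held in the main server config scales better (RewriteMap is not permitted inside .htaccess). A hedged sketch with placeholder paths:

```apache
# In httpd.conf / the virtualhost (RewriteMap does NOT work in .htaccess):
RewriteEngine On
RewriteMap redirects "txt:/etc/apache2/redirects.txt"
RewriteCond ${redirects:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]

# /etc/apache2/redirects.txt, one "old-path new-url" pair per line:
#   /old-article.html   http://www.in2town.co.uk/new-article
#   /old-section/page   http://www.in2town.co.uk/new-section/page
```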
-
Why is the number of crawled pages so low?
Hi, my website is www.theprinterdepo.com and I have been on SEOmoz Pro for 2 months. When it started, it crawled 10,000 pages; then I modified robots.txt to disallow some specific parameters in the pages to be crawled. We have about 3,500 products, so the number of crawled pages should be close to that number. The last crawl shows only 1,700. What should I do?
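For reference, blocking parameters in robots.txt generally relies on wildcard patterns, and a pattern that is too broad can also hide real product pages from the crawler; a sketch with a made-up parameter name:

```
User-agent: *
# Hypothetical parameter: blocks any URL containing ?sort= or &sort=.
Disallow: /*?sort=
Disallow: /*&sort=
# Careful: the next line (left commented out) would block EVERY URL
# with a query string, which could explain a sudden drop in crawled pages.
# Disallow: /*?
```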
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We knew we would see some rank drop, but not one this big. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions. What we changed is the homepage structure: it is more user-friendly, with much better organized information, and we have a slider presenting our main services. 80% of the content on the homepage sits inside that slideshow and 3 tabs, but all of these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool, and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I did not worry until now, when I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ...". One more weird thing (not sure if this is important or not): although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, and we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also, the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but far fewer than we had with the old website. Any advice would be much appreciated, thank you!
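One hedged pattern that sidesteps this problem entirely is progressive enhancement: ship the slider and tab content in the initial HTML and let JavaScript only add the show/hide behaviour, so the copy stays readable with scripts disabled. A minimal sketch with placeholder copy:

```html
<!-- Content is present in the HTML whether or not JS runs. -->
<div class="slides">
  <section id="slide-1">
    <h2>Service one</h2>
    <p>Full descriptive copy lives here in the markup...</p>
  </section>
  <section id="slide-2">
    <h2>Service two</h2>
    <p>...so crawlers and no-JS users can still read it.</p>
  </section>
</div>
<script>
  // Enhancement only: hide and cycle the sections as slides when JS
  // is available. Without JS, they render as normal page content.
</script>
```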