PR Dilution and Number of Pages Indexed
-
Hi Mozzers,
My client is really pushing for me to get thousands, if not millions of pages indexed through the use of long-tail keywords.
I know that I can probably get quite a few of them into Google, but will this dilute the PR on my site?
These pages would be worthwhile in that if anyone actually visits them, there is a solid chance they will convert to a lead due to the nature of the long-tail keywords.
My suggestion is to run all the keywords for these thousands of pages through AdWords to check the query volume, and only create pages for the terms that actually receive searches.
What do you guys think?
I know that the content needs to have value and can't be scraped/low-quality and pulling these pages out of my butt won't end well, but I need solid evidence to make a case either for or against it to my clients.
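That filtering step could be sketched roughly like this — a minimal example assuming you've exported your keyword list (e.g. from the AdWords Keyword Planner) to rows of dicts. The column names ("Keyword", "Avg. monthly searches") and the helper name are assumptions; adjust them to match your actual export.

```python
# Hypothetical sketch: keep only keywords with enough monthly searches
# to justify building a page. Column names are assumptions and will
# vary with your export format.

def keywords_worth_building(rows, min_monthly_searches=1):
    """Return keywords whose average monthly searches meet the threshold."""
    keepers = []
    for row in rows:
        try:
            # Volumes often export as strings like "1,200"
            volume = int(str(row["Avg. monthly searches"]).replace(",", ""))
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than guessing
        if volume >= min_monthly_searches:
            keepers.append(row["Keyword"])
    return keepers

rows = [
    {"Keyword": "blue widget 42mm stainless", "Avg. monthly searches": "30"},
    {"Keyword": "widget flange adapter kit", "Avg. monthly searches": "0"},
]
print(keywords_worth_building(rows))  # → ['blue widget 42mm stainless']
```

With a real export you'd feed this from `csv.DictReader` instead of a hard-coded list, but the cut-off logic is the same.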
-
Ah, e-commerce product pages - that makes more sense!
-
Thanks, Doug.
I know that these pages will have a solid conversion rate. The long tail keywords are for product pages on an e-commerce site.
After posting, I took a look at the competitors' sites in the industry, and most of them have 150k - 300k of these similar product pages indexed. This client only has about 40k, so I think we will go ahead and try to beef it up.
-
OK, I get what he's thinking, but there's another problem in addition to diluting or cannibalising your money keywords/head terms, or running into crawl problems etc.
These pages also need to do more than sit there getting traffic for long-tail keywords. They also need to support the goals of your business/website and get people to engage with it, and writing content (thousands of pages) that'll do this is a tough ask!
It's not just about the volume of the traffic - but the relevance and intent of that traffic and what ultimately that traffic is worth to you or your client.
If visitors click through to your content and bounce straight back to the search results, then you're just wasting your time and money. ( http://moz.com/blog/solving-the-pogo-stick-problem-whiteboard-friday )
Take a look at the engagement metrics for your current long-tail keywords. What's the bounce rate / page depth for these visitors? Are any of them actually likely to convert?
Don't know if that'll help persuade your client.... good luck!
-
Hi, Travis-
I wouldn't be concerned with "diluting" any PageRank that you already have, if that's your question. You may be setting up new pages in competition with existing pages, though, if they're going after the same terms. The longer the existing pages have been out there unmodified, the easier it may be for a new page to outrank them, all other things being equal.
Unfortunately, without seeing the site, it's impossible to even render a guesstimate. For instance, how's your crawl budget? If it's limited, QDF could possibly allow newer pages to push older pages into the shadows. You might check out this post: http://www.algohunters.com/building-solid-index-presence-optimizing-crawl-budget/