Questions created by DrewProZ
-
Large site with content silos - best practices for deep indexing of silo content
Thanks in advance for any advice/links/discussion. This honestly might be a scenario where we need to do some A/B testing.

We have a massive (5 million page) content silo that is the basis for our long-tail search strategy. Organic search traffic hits our individual "product" pages, and we've divided our silo with a parent category and then secondarily with a field (so we can cross-link to other content silos using the same parent/field categorizations). We don't anticipate or expect top-level category pages to receive organic traffic - most people are searching for the individual/specific product (long tail). We're not trying to rank or get traffic for searches of all products in "category X"; others are competing and spending a lot in that area (head). The intent of the site structure/taxonomy is to make it easier for bots/crawlers to get deeper into our content silos. We've built the pages for humans, but included the link structure/taxonomy to assist crawlers.

So here's my question on best practices: how to handle categories with 1,000+ pages of pagination. In our most popular product categories, there might be hundreds of thousands of products in a single category. My top-level hub page for a category looks like www.mysite/categoryA, and the page shows 50 products with pagination from 1 to 1,000+. Currently we're using rel=next for pagination, and for pages like www.mysite/categoryA?page=6 we make each page reference itself as canonical (not the first/top page, www.mysite/categoryA).

Our goal is deep crawl/indexation of our silo. I use Screaming Frog and an SEOMoz campaign crawl to sample (the site takes a week or more to fully crawl), and in each of these tools it looks like crawlers have gotten a bit bogged down in large categories with lots of pagination. For example, rather than crawling multiple categories or fields to reach multiple product pages, some bots will hit all 1,000 (rel=next) pages of a single category.
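The pagination setup described above (rel=next links plus a self-referential canonical on each paginated page) can be sketched as a small tag generator. The `?page=N` URL pattern comes from the question; the function name and everything else here are illustrative, not any real library's API.

```python
# Sketch of the pagination markup described in the question: each
# paginated category page canonicalizes to itself and links its
# neighbors with rel="prev"/rel="next". All names are illustrative.

def pagination_tags(base_url: str, page: int, last_page: int) -> list[str]:
    """Return the <head> link tags for one paginated category page."""
    url = base_url if page == 1 else f"{base_url}?page={page}"
    # Self-referential canonical, as described in the question
    # (pages 2+ do NOT canonicalize to the first page).
    tags = [f'<link rel="canonical" href="{url}">']
    if page > 1:
        prev = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags


if __name__ == "__main__":
    for tag in pagination_tags("https://www.mysite.com/categoryA", 6, 1000):
        print(tag)
```

Note that page 1 is the clean category URL with no `?page=` parameter, so page 2's rel="prev" points at the hub page itself.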
I don't want to waste crawl budget walking through 1,000 pages of a single category instead of discovering/crawling more categories, and I can't find a consensus on how to approach the issue. I can't have a page that lists "all" - there's just too much, so we're going to need pagination. I'm not worried about category pagination pages cannibalizing traffic, as I don't expect any (should I make pages 2-1,000 noindex and have them canonically reference the main/first page in the category?). Should I worry about crawlers going deep into the pagination of one category versus getting to more top-level categories? Thanks!
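To make the crawl-budget trade-off above concrete, here is a toy back-of-envelope model. All the numbers are assumptions for illustration, not measurements from the site: with a fixed number of fetches per crawl cycle, every extra paginated page a bot walks is a fetch not spent reaching another category.

```python
# Toy crawl-budget model (all numbers are illustrative assumptions):
# a bot with a fixed fetch budget either walks deep into one
# category's pagination or spreads across many category hubs.

def categories_reached(budget: int, pagination_depth: int) -> int:
    """How many category hubs a crawler touches if it exhausts
    `pagination_depth` paginated pages per category before moving on."""
    return budget // pagination_depth

budget = 10_000  # fetches per crawl cycle (assumed)

deep = categories_reached(budget, 1_000)  # bot walks all 1,000 pages
shallow = categories_reached(budget, 50)  # bot stops after 50 pages

print(deep)     # 10 categories reached
print(shallow)  # 200 categories reached
```

Under these assumed numbers, capping how deep bots walk the pagination (or routing them to categories via other links) covers 20x more of the taxonomy per crawl cycle, which is the heart of the question.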
Moz Pro | | DrewProZ1 -
Will adding 1M (legitimate/correct) internal backlinks to an orphan page trip algo penalty?
We have a massive long-tail, user-generated gamification strategy that has worked really well. Because of that success we haven't been paying enough attention to SEO, and in looking around we caught some glaring issues.

The section of our site that serves the long tail goes from overview page > first classification > sub-classification > specific long-tail term page. It looks like we were relying on Google to crawl/use forms to go from our overview page to the first classification, BUT those resulting pages were orphaned - so www.mysite.com/product/category_1 defaulted back to the search page, creating duplicate issues. www.mysite.com/product/category_1, www.mysite.com/product/category_2, and www.mysite.com/product/category_3 all had duplicate content, as they all reverted to the overview page. It's clear we need to build an actual breadcrumb trail and proper site taxonomy/linkage.

I want to do this on just this one area first, but it's a big section with over 3M indexed "specific long-tail term pages". I want to add a simple breadcrumb trail in a sub-navigation menu, but doing so will literally create millions of new internal backlinks from specific term pages to their sub- and parent-category pages. Although we're missing the intermediary category breadcrumbs, we did have a breadcrumb coming back to the main overview page - but it was tagged nofollow. So now I'm contemplating adding millions of (proper) internal backlinks and removing a nofollow tag from another million internal backlinks.

All of this seems in line with "best practices", but what I have not been able to determine is whether there is a proper/better way to roll these changes out so as to not trigger an algorithm penalty. I am also reticent about making too many changes too quickly, but these are SEO 101 basics that need to be rectified. Is it a mistake to make good improvements too quickly? Thanks!
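A breadcrumb trail like the one proposed above is commonly paired with schema.org BreadcrumbList JSON-LD so crawlers can read the hierarchy explicitly. This sketch builds that markup for the overview > classification > sub-classification hierarchy described in the question; the trail names and URLs are hypothetical placeholders.

```python
import json

# Sketch of schema.org BreadcrumbList JSON-LD for the hierarchy
# described in the question. All names and URLs below are
# hypothetical placeholders, not the site's real taxonomy.

def breadcrumb_jsonld(trail: list[tuple[str, str]]) -> str:
    """trail is an ordered list of (name, url) pairs, overview first."""
    items = [
        {
            "@type": "ListItem",
            "position": i,       # 1-based position in the trail
            "name": name,
            "item": url,
        }
        for i, (name, url) in enumerate(trail, start=1)
    ]
    data = {
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    }
    return json.dumps(data, indent=2)


trail = [
    ("Products", "https://www.mysite.com/product"),
    ("Category 1", "https://www.mysite.com/product/category_1"),
    ("Sub-category A", "https://www.mysite.com/product/category_1/sub_a"),
]
print(breadcrumb_jsonld(trail))
```

The visible breadcrumb links in the sub-navigation are what create the followed internal links the question is about; the JSON-LD only describes the same trail in machine-readable form.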
On-Page Optimization | | DrewProZ1