Diluting your authority: does adding pages dilute the rankings of other pages?
-
I'm looking after a site that has around 400 pages. All of these pages rank pretty well for the KWs they are targeting.
My question is: if we add another 400 pages without doing any link-building work (holding DA constant), 1) would the rankings of those 400 previously strong pages diminish? And 2) would the new pages rank less and less well as more and more of them are created?
-
If the new pages are fantastic content, this will only help.
They will likely attract more inbound links if done well.
I would focus on ensuring that the link real estate (taxonomy, navigation) doesn't change in this case if you want to retain the rankings you have.
If the new pages are subcategories/children of the current pages, link down to them from those pages, and link back up to the current pages in the breadcrumb (which will further reinforce those rankings).
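As an illustrative sketch of that "link back up in the breadcrumb" idea: a child page can mark its trail up with schema.org's BreadcrumbList vocabulary, which search engines document for breadcrumb rich results. The URLs and category names below are hypothetical examples, not anything from the site in question:

```html
<!-- Hypothetical breadcrumb on a child (subcategory/product) page.
     Each crumb links back up to a parent page that already ranks,
     passing internal link equity to it. -->
<nav aria-label="Breadcrumb">
  <ol itemscope itemtype="https://schema.org/BreadcrumbList">
    <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
      <a itemprop="item" href="/dog-collars/"><span itemprop="name">Dog Collars</span></a>
      <meta itemprop="position" content="1" />
    </li>
    <li itemprop="itemListElement" itemscope itemtype="https://schema.org/ListItem">
      <a itemprop="item" href="/dog-collars/red/"><span itemprop="name">Red Dog Collars</span></a>
      <meta itemprop="position" content="2" />
    </li>
  </ol>
</nav>
```

The markup itself is optional; the point is the plain `<a href>` links back to the parent pages, which crawlers follow regardless of the structured-data annotations.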
Hope this helps!
-
I wouldn't worry about it. Add the 400 pages and make sure they have good content. Don't submit thin content. If you don't add those 400 new product pages, you will miss out on new opportunities. Make sure you are NOT just copying the descriptions from the manufacturer, for duplicate-content reasons.
-
Hi
Actually, the 400 new pages would not be similar content at all. The site sells loads of products, and these are just static product pages: all completely unique products that are not targeting the same KWs.
So, in reality, creating new content will not "thin out" the page authority of the already existing pages?
-
Hi
There are two ways of looking at what you're saying:
1. Adding new pages that target the same or similar keywords as the ranking pages will dilute those rankings, as Google will no longer have one focused page to rank.
2. Adding new pages that don't target the same or similar keywords will help the rankings of the current pages, as Google loves fresh content.
With the above in mind, if you're going to create new content, don't go for 400 new pages all at once. Try to create "second-tier" pages that support some of the "first-tier" pages that are already ranking.
For example, say one of your ranked pages is a product page selling red dog collars, ranking for the keyword "red dog collars". You could create a new page called "How to correctly fasten a collar to your dog" which discusses the product in general terms, then link this new page to your ranked product page, achieving two goals in one go!
Hope the above makes sense and helps clarify your question!