Should I noindex user-created fundraising pages?
-
Hello Moz community!
I work for a nonprofit where users are able to create their own fundraising pages on the website for supporters to directly donate. Some of them are rarely used, others get updated frequently by the host. There are likely a ton of these on our site. Moz crawl says we have ~54K pages, and when I do a "site:[url]" search on Google, 90% of the first 100 results are fundraising pages.
These are not controlled by our staff members, but I'm wondering if meta noindexing these pages could have a big effect on our SEO rankings. Has anyone tried anything similar or know if this strategy could have legs for our site?
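For anyone following along, the "meta noindex" in question is just a robots meta tag added to the head of the fundraising page template — a minimal sketch:

```html
<!-- Placed in the <head> of each fundraising page template. -->
<!-- "noindex, follow" asks Google to drop the page from its index while
     still following the page's links, so internal linking isn't cut off. -->
<meta name="robots" content="noindex, follow">
```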
My only concern is that users might no longer be able to find their fundraising pages through the Google CSE implemented on our website.
Any insight you fine folks could provide would be greatly appreciated!
-
I'd tread very carefully here, as thing #1 and thing #2 seem to contradict each other at face value. You're right that Google can send traffic to a site in ways other than keywords, but it's not the norm.
The next thing I'd look at is how you're tracking keyword rankings. Is it an online, cloud-based rank tracker that relies on you specifying all of (and all of the right) keywords to track? Most of those trackers follow between 50 and 300 keywords (daily or weekly), but it's not uncommon for sites like this to have 10,000+ keywords contributing traffic. If they're not all in the tracker, you're looking at a bad sample.
Connect Google Search Console to Google Analytics, let it run for a few weeks, then analyse the 'search query' data from within Google Analytics (which can be done once it's all hooked up). GSC usually only lets you export 1K keywords (sometimes more), but GA will give you 5K, and that's much better for this kind of analysis. You might be surprised to find those pages rank for more keywords than you thought: maybe hundreds of little ones instead of a few big ones.
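To make that concrete, here's a rough sketch of the analysis step — counting how many distinct queries each landing page actually surfaces for. It assumes a CSV export of the hooked-up search query data with hypothetical column names (`query`, `landing_page`, `clicks`); adjust them to whatever your export actually uses.

```python
import csv
from collections import defaultdict

def queries_per_page(csv_path):
    """Count distinct search queries per landing page from a query export.

    Assumes columns: query, landing_page, clicks (hypothetical names --
    rename to match your actual export).
    """
    pages = defaultdict(set)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            pages[row["landing_page"]].add(row["query"])
    # Sort pages by how many distinct queries they rank for, highest first.
    return sorted(((p, len(qs)) for p, qs in pages.items()),
                  key=lambda t: t[1], reverse=True)
```

Running this over a few weeks of data will show whether the fundraising pages are quietly accumulating lots of small long-tail queries.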
-
Effectdigital is right in looking at your analytics and backlinks to help make this decision.
In the Moz case study we referenced earlier, they were getting rid of pages that didn't provide value to anyone. Those pages probably didn't have any links pointing to them either, so it made sense to get rid of them.
Since your pages are providing value (it seems) and you're getting 1/3 of your traffic through those pages, we would tread carefully on meta noindexing them.
You might consider meta noindexing only the group of them that hasn't brought in any traffic this whole year and that has no links pointing to it. That way, you won't lose any existing traffic you're getting, but you can see whether the trimming helps your site's overall traffic and rankings.
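A rough way to build that candidate list — a sketch, assuming you can export per-page sessions (from analytics) and per-page link counts (from a link tool) into simple dicts keyed by URL; those exports and names are hypothetical, not a specific tool's API:

```python
def noindex_candidates(sessions_by_url, links_by_url, fundraiser_urls):
    """Return fundraiser URLs with zero sessions this year AND zero links
    pointing at them -- the only group it might be safe to noindex.

    sessions_by_url / links_by_url: hypothetical exports keyed by URL.
    A URL missing from either export is treated as zero.
    """
    return [url for url in fundraiser_urls
            if sessions_by_url.get(url, 0) == 0
            and links_by_url.get(url, 0) == 0]
```

Anything with even a little traffic or a single backlink stays indexed under this rule, which keeps the experiment low-risk.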
-
Appreciate the word of caution. I'm relatively new and am looking for well-rounded opinions about the repercussions such a massive move could have for our site. In response:
Thing #1: We don't have many fundraising pages that rank highly for keywords, as we're still working on strengthening our regular site pages to improve their SERP results. I was mainly wondering whether the glut of fundraising pages could be harming those results. Some certainly have duplicate content, but that's beyond our control, and I'm not sure whether it could be significantly harming our results. Any thoughts on that?
Thing #2: Great call on checking the data. YTD, nearly 1/3 of our user sessions have landed on one of these fundraising pages. I'm guessing that's likely either hosts using Google to find their page and then logging in, or friends searching for it on Google and then navigating there to donate. We do still have a Google Custom Search Engine on our site, so presumably people could find the pages that way?
If you have any additional opinions or feedback given what I detailed above, I'd very much appreciate it!
-
Be VERY careful
Thing #1) Just because you stop Google crawling and indexing some pages doesn't mean it will send that same traffic (the keywords pointing to those pages) to other URLs on your site. Google may decide that your other URLs don't satisfy the specific queries connecting with the fundraiser URLs.
Thing #2) CHECK. Go into Google Analytics and actually check what percentage of your Google traffic (and overall traffic) comes in specifically through these URLs. If it's 2-3%, no big deal. If most of your traffic lands on these pages, noindexing them all could be the single biggest mistake you ever make.
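The check itself is simple arithmetic: of all landing-page sessions, what share landed on a fundraiser URL? A sketch, assuming a landing-page export as (path, sessions) pairs and that fundraiser pages share a `/fundraise/` URL prefix (a hypothetical pattern — swap in whatever your URLs actually look like):

```python
def fundraiser_traffic_share(landing_sessions, prefix="/fundraise/"):
    """Percentage of sessions that landed on a fundraiser URL.

    landing_sessions: iterable of (path, sessions) pairs, e.g. from a
    landing-page report export. Returns 0.0 if there are no sessions.
    """
    total = fundraiser = 0
    for path, sessions in landing_sessions:
        total += sessions
        if path.startswith(prefix):
            fundraiser += sessions
    return 100.0 * fundraiser / total if total else 0.0
```

If that number comes back in the single digits, a noindex experiment is cheap; if it's a third of your traffic, it isn't.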
Blog posts and articles are fun, but they're no substitute for checking your own real, actual, factual data. Always, always do that.
-
Thanks! I've been wondering about it for a while and actually stumbled upon this very article today, which prompted the question.
-
Britney Muller, with Moz, did just that when she meta noindexed over 70,000 low-quality profile pages created by users. As a result, Moz saw an almost 9% increase in organic users the following month, along with a 13.7% year-over-year lift in organic traffic.
You can read all about it or watch the interview about it here: https://www.getcredo.com/britney-muller/
We think it's worth a try for sure.