Fresh Content: Still As Important?
-
We have an internal debate that perhaps y'all can help us resolve.
In the past, "freshness" of content has been important, correct? (Google's QDF, for example.) Up to now, when we build a site with the intent to SEO it, we build the core pages with the expectation that we will be adding more pages as the project progresses, thus satisfying the "fresh content" factor.
But a client has proposed that we completely build the site out up front, with all the pages we hope to rank, getting the full bang for our buck at launch. The expectation is that traffic soars right off.
Now the client says that he has been doing this for years and has not been affected by any algo changes (although we have not seen proof of this from him).
So our question is this: Is it better to provide a website full of fresh content at the beginning of the project, for a jumpstart on traffic, and then leave the site alone (for the most part),
or
Is it better to have core pages of fresh content at the start, and build out new pages from there, so the website remains fresh every month?
And can you prove your argument? (We need cold hard facts to be convinced.)
-
EGOL, a big-time member on these forums, posted years ago that there will be a day when the only things a search engine truly judges a website on are keywords and content. Now, I'm not entirely sure I'm completely on board with that (I'm about 95%), but I do agree that content, especially after the recent SE updates, has shifted back into power.
My father owns a business making educational materials for people with mild to severe autism. He is very successful, but he personally doesn't have the time or energy to write a daily blog, and unfortunately doesn't trust anybody to ghostwrite for him.
So we came up with an alternative: a combo of original content mixed with educational reports, interesting studies, and every now and then a strange, funny story from The Onion. We would post at least one original piece a week (two if we could) and fill in everything else from there. I also made a few bullying infographics for his business to post and share on social media. It wasn't always keyword-heavy content, but as long as it was content worth sharing, it got us a lot of links.
At the end of the day, when I have to make a call on how Google is doing something, I remind myself that Google is in the business of making money. They do that by providing the best, most accurate, human, natural, semantic, organic, perfect-because-I-am-a-snowflake result. Google, in my opinion, will take how current a website is into account.
Content is King.
-
This is our thought as well. A continuous feed of fresh content is a better approach than a one-off. This is how we've been doing it, but we're really interested in knowing if others have tried the other approach with any lasting sustainability in traffic or rankings. (We kind of doubt it, but would love to see proof that it works.)
-
QDF is aimed at hot/current topics, right? So while it might be important for a news site or a celebrity gossip site, I don't think it will be relevant for every site.
You mentioned that the client proposed building the site with "all the pages you hope to rank for," which means the topic is restricted and there is a limit to what you can write about the subject. And to launch the site with this approach, you need to get all the content ready up front, which might take some time.
A much more sensible approach would be to launch the site with a reasonable amount of content and then add the rest when possible. That way you can start the link-building and social-sharing process early.
I don't think that just launching a site with lots of fresh content will give you a jump start in traffic, but I'm interested to see if anyone has had success with this method.
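For anyone who wants the "cold hard facts" side of this, one low-effort test is to line up a site's publishing cadence against its organic traffic curve and see whether the two move together. Below is a minimal sketch (Python, standard library only) that counts pages per month from a site's XML sitemap, assuming the site publishes a standard sitemaps.org sitemap with lastmod dates; the sitemap URL is just a placeholder. Pair the output with a monthly organic-visits export from your analytics package and you have a crude before/after comparison for either approach.
```python
# Minimal sketch: chart a site's publishing cadence from its XML sitemap.
# Assumes a standard sitemaps.org sitemap that includes <lastmod> dates;
# the sitemap URL below is a placeholder, not a real endpoint.
from collections import Counter
from urllib.request import urlopen
from xml.etree import ElementTree

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def monthly_cadence(sitemap_url: str) -> Counter:
    """Count sitemap URLs by the year-month of their <lastmod> date."""
    tree = ElementTree.parse(urlopen(sitemap_url))
    months = Counter()
    for url in tree.findall("sm:url", NS):
        lastmod = url.find("sm:lastmod", NS)
        if lastmod is not None and lastmod.text:
            months[lastmod.text[:7]] += 1  # "2013-04" from "2013-04-17T09:00:00Z"
    return months

if __name__ == "__main__":
    # Compare this month-by-month count against organic visits
    # exported from your analytics package for the same months.
    for month, count in sorted(monthly_cadence(SITEMAP_URL).items()):
        print(month, count)
```
If publishing stops and organic traffic holds steady for months afterward, that's evidence for the build-it-all-up-front approach; if traffic decays when the cadence does, that's evidence for continuous fresh content.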
-
Fresh content is definitely important, and while you may get a boost at the start, you'll quickly lose it if you're not putting up new content.