Is Fresh Content Still As Important?
-
We have an internal debate that perhaps y'all can help us resolve.
In the past, "freshness" of content has been important, correct? (Google's QDF, for example.) In the past (and up to the present), when we build a site with the intent to SEO it, we build the core pages with the expectation that we will add more pages as the project progresses, thus satisfying the "fresh content" factor.
But a client has proposed to us that we completely build the site out up front, with all the pages you hope to rank, getting the upfront bang for your buck. The expectation is that traffic soars right off.
Now the client says that he has been doing this for years and has not been affected by any algo changes (although we have not seen proof of this from him).
So our question is this: Is it better to launch a website full of fresh content at the beginning of the project, for a jumpstart on traffic, and then leave the site alone (for the most part)?
or
Is it better to have core pages of fresh content at the start and build out new pages from there, so the website stays fresh every month?
And can you prove your argument? (We need cold hard facts to be convinced.)
-
EGOL, a big-time member on these forums, posted years ago that there will be a day when the only things a search engine truly judges a website on are keywords and content. Now, I'm not entirely sure I'm completely on board with that (I'm about 95%), but I do agree that content, especially after the recent SE updates, has shifted back into power.
My father owns a business; we make educational materials for people with mild to severe autism. He is very successful, but he personally doesn't have the time or energy to write a daily blog, and unfortunately doesn't trust anybody to ghostwrite for him.
So we came up with an alternative: a combo of original content mixed with educational reports, interesting studies, and every now and then a strange, funny story from The Onion. We would post at least one original piece a week, two if we could, and fill in everything else from there. I made a few bullying infographics for his business to post and share on social media. Now, it wasn't always keyword-heavy content, but as long as it was content worth sharing, it got us a lot of links.
At the end of the day, when I have to make a judgment on how Google is doing something, I try to remind myself that Google is in the business of making money. They do that by providing the best, most accurate, human, natural, semantic, organic, perfect-because-I-am-a-snowflake result. Google, in my opinion, takes how current a website is into account.
Content is King.
-
This is our thought as well. A continuous feed of fresh content is a better approach than a one-off. This is how we've been doing it, but we're really interested in knowing if others have tried the other approach with any lasting sustainability in traffic or rankings. (We kind of doubt it, but would love to see proof that it works.)
-
QDF is aimed at hot/current topics, right? So while it might be important for a news site or a celebrity gossip site, I don't think it's relevant for every site.
You mentioned that the client proposed building the site with "all the pages you hope to rank for," which means the topic is restricted and there is a limit to what you can write about the subject. But to launch the site with this approach, you need to get all the content ready up front, and that might take some time.
A much more sensible approach would be to launch the site with a reasonable amount of content and then add the rest when possible. That way you can start the link-building and social-sharing process early.
I don't think launching a site with lots of fresh content will, by itself, give you a jump start in traffic, but I'm interested to see if anyone has had success with this method.
-
Fresh content is definitely important, and while you may get a boost at the start, you'll quickly lose it if you're not putting up new content.