How do I make a content calendar to increase my rank for a keyword?
-
I've watched more than a few seminars on building a content calendar. Now I'm curious what I would need to do to increase ranking for a specific keyword in local SEO.
Let's say I wanted to help a client increase their rank for "used trucks in Buffalo, NY." Would I regularly publish blog posts about used trucks?
Thanks!
-
"What questions do people call in about?"
Good idea Keri.
You can also come up with a list of what a consumer who is shopping for a specific model of truck will type, such as:
Common problems of [model]
Easy fixes for issues with [model]
Consumer reviews for [model]
Publisher reviews for [model]
Things like that. As Egol stated, you can easily waste a TON of time blogging about stuff no one will read or care about. Do a few searches yourself. I'm sure if you spent 10 minutes searching around you could find some topics worth covering.
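The template list above can be expanded mechanically before you sit down to write. Here's a minimal sketch of that idea in Python; the model names and city are placeholders, not a recommendation, so swap in whatever inventory you actually carry:

```python
# Brainstorm helper: expand question templates across truck models.
# Model names and city below are hypothetical examples only.
templates = [
    "common problems of {model}",
    "easy fixes for issues with {model}",
    "consumer reviews for {model}",
    "{model} used price in {city}",
]

models = ["Ford F-150", "Chevy Silverado", "Ram 1500"]
city = "Buffalo, NY"

# Cross every template with every model to get a raw topic list.
ideas = [t.format(model=m, city=city) for t in templates for m in models]
for idea in ideas:
    print(idea)
```

Each line the script prints is a candidate search query you can sanity-check with a few real searches before committing a blog post to it.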
-
To find things to write about that won't waste your time, spend a moment thinking about what will save you time. What questions do you get on the sales floor? What questions do you get emails about? What questions do people call in about? That's what your customers want to know.
-
"Let's say I wanted to help them increase their rank for used trucks in Buffalo, NY."
If they have a sales lot within Buffalo then using local search is the best place to start.
"Would I regularly publish blog posts about used trucks?"
This is a good way to waste a lot of time. I only use the content attack when I have topics to write about that are of very high interest, and you have to do a great job. There are tons and tons of articles about used trucks on the web already, and most blog posts about them will be lost in the forest. If you are not producing content that stands out, you might be wasting your time.