How Long Does It Take Content Strategy to Improve SEO?
-
After 6 months of effort with an SEO provider, the results of our campaign have been minimal. We are in the process of reevaluating our efforts to cut costs and improve ROI. Our site is for a commercial real estate brokerage in New York City.
Which of these options would have the best shot of producing results in the near term:
-Create a keyword matrix and optimize pages for specific terms. Maybe optimize 50 pages.
-Add content to "thin" pages. Rewrite 150-250 listing and building pages.
-Audit user interface and adjust the design of forms and pages to improve conversions.
-Link building campaign to improve the link profile of a site with few links (most of them low quality).
I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh". In fact, I have been told the best bet is to improve the user interface, since it is becoming increasingly difficult to improve rankings.
Any thoughts? Thanks, Alan
-
What questions do clients and potential clients have about offices? If they called your company, would you be able to tell them more than 400 words over the phone? Try putting some of that information on your page.
-
Hi Alan,
Can't say your SEO company is completely wrong, as most SEOs normally start by making sure the on-site/technical SEO is set up properly. Otherwise, any effort you put into creating off-site attention and trust is wasted.
-
Hi Jane:
Great observation regarding listing pages on real estate sites! It is very difficult for me to add unique content for hundreds of listings. There are only so many ways to describe an office or loft space. Writing more than 150-200 words per listing is difficult since the product is generic. So what is the best way around this? My SEO provider suggested I "noindex" these pages, but I am concerned that Google would view 300 noindexed pages on a 650-page site as suspect, as if I were trying to hide something.
My SEO provider believes we were hit by Penguin 1.0 in April 2012. Traffic dropped about 60% at that point and partially recovered in October, when there was a Google update. It has dropped again in the last two months, with the drop accelerating after an upgrade of the site launched in June.
I think my SEO firm is discouraging further SEO because I am maxed out financially after spending more than $25,000 on SEO, coding, and design in the last 8 months with nothing to show for it. They may think I hold them accountable (I do; I expect some results eventually), so they are not encouraging me to move forward. I find it very discouraging that after such a major effort there are no results. Maybe results could be achieved, but I would need to budget $50,000-$100,000 to get some momentum, and that is out of the question unfortunately.
-
Hi Martijn:
Thanks for the response!! A few questions:
-What do I do about real estate listing pages? It is difficult to create a 400-word description for an office. I guess I could beef up the content to maybe 150 words, but the language will be similar.
-We had many toxic links removed in the last 60 days, so maybe that is causing the drop. Which would be a faster way to create quality links: posting high-quality content on our blog or soliciting links? I am very surprised my SEO company did not go down this road initially.
Thanks for your assistance. Alan
-
One of my first replies to you was to get a list of all of those pages and go through them one by one to find out where they are coming from.
Were they made by you or company staff, were they spawned as tags or pagination or something else by a content management system, or were they made by a hacker posting Ugg boots on your site?
If you want to know where these pages are coming from, you've got to spend the time and do some manual work.
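Going one by one can be sped up with a rough first pass that buckets URLs by the pattern that likely generated them. A minimal sketch (the URL patterns and example URLs below are invented for illustration, not taken from the actual site):

```python
from urllib.parse import urlparse
from collections import Counter

def classify(url):
    """Rough bucket for where a URL likely came from; patterns are assumptions."""
    parts = urlparse(url)
    if "page=" in parts.query or "/page/" in parts.path:
        return "pagination"
    if parts.path.startswith("/tag/"):
        return "cms-tags"
    if parts.path.startswith("/listings/"):
        return "listings"
    return "other"

# Stand-in list; in practice, export the indexed URLs from GWT or a crawl.
urls = [
    "http://example.com/listings/123-broadway",
    "http://example.com/tag/office-space",
    "http://example.com/blog?page=7",
    "http://example.com/about",
]
print(Counter(classify(u) for u in urls))
```

Whatever lands in the "other" bucket is what still needs the manual, one-by-one look.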
-
Hi EGOL:
There is certainly a verifiable, correct answer; the question is how to obtain it. I tend to agree with you that some technical issue is going on. My SEO company may or may not have their own agenda, or may have overlooked this. The jump from 675 to 851 pages in the Google index in early June is really suspicious. The sitemap only contains 635 pages, so it seems technical. The question is what do I attempt to fix first, and at what cost.
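One way to pin down that 635-vs-675-vs-851 discrepancy is to count the sitemap's URLs directly and subtract them from whatever Google reports; the remainder is the set of pages that got in some other way. A minimal sketch (the sitemap XML here is a made-up stand-in for the real file):

```python
import xml.etree.ElementTree as ET

# Stand-in for the real sitemap.xml fetched from the site.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
  <url><loc>http://example.com/listings/123-broadway</loc></url>
  <url><loc>http://example.com/buildings/one-penn-plaza</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
sitemap_urls = {loc.text for loc in root.findall(".//sm:url/sm:loc", ns)}

indexed_count = 851  # the figure Google reported in this case
print(f"sitemap URLs: {len(sitemap_urls)}, indexed: {indexed_count}, "
      f"unexplained: {indexed_count - len(sitemap_urls)}")
```

Pulling the actual indexed URLs (via `site:` queries or GWT exports) and diffing them against the sitemap set would then show exactly which pages are unaccounted for.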
In any case, thanks again!!! Alan
-
with all due respect, my SEO provider (MOZ approved) disagrees with you.
That's OK. Lots of SEO providers disagree with me.
You can believe whatever answer you like the best, or whichever one you think is right.
-
-
Hi Egol:
I understand, but I am trying to prioritize what needs to be done first. I have spent about $40,000 in the last 8 months on a series of SEO audits, coding, and wireframes without any improvement, implementing all the suggestions made in the SEO audits. I don't want to continue to waste resources without generating results.
Regarding technical issues, with all due respect, my SEO provider (MOZ approved) disagrees with you. An excerpt from their message to me yesterday is below. Based on their suggestions it may make sense to focus on content, but I am not sure, which is why I have posted to this forum. Sorry if my questions overlap at times, but this is so technical that it is better to be safe than sorry. I see that you respond often to these posts and I really appreciate the time you take to respond in such a thorough manner. -
XXXX is not overly concerned that technical/indexation issues are at the heart of the SEO issues. We are seeing conflicting indexation numbers from Google (see screenshot) showing 442 results with an "omitted results" message from Google. Additionally, GWT reports conflicting index numbers depending on which report you use: some at 843, others in the 539 range. Also, keep in mind that there has yet to be a Penguin update, and therefore Google may not be accounting for some of the positive things that were done in past months.
-
Your developers appear to be handling the noindexing of listings pages correctly, but we did find the following indexed pages:
-
Subdomain listings.nyc-officespace-leader.com has 37 pages indexed.
-
There are other irrelevant pages indexed on the www subdomain (20-30+), such as:
-
The /listings/search? pages are out of Google's index (this is a good thing).
-
Per our recommendations from our last meeting, you should probably noindex,follow the /listings/ pages that have 'stock' (duplicate) content, do not drive much traffic, and are otherwise low quality. We outlined this strategy in the deck we provided at our last meeting.
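That triage rule can be sketched as a simple decision function. The word-count and traffic thresholds below are illustrative assumptions, not the provider's actual criteria; the point is that `noindex,follow` drops the thin page from the index while still letting link equity flow through it:

```python
def robots_directive(word_count, monthly_visits, min_words=150, min_visits=10):
    """Return the meta-robots value for a listing page.

    Thresholds are hypothetical; tune them to the site's own data.
    """
    if word_count < min_words and monthly_visits < min_visits:
        return "noindex,follow"
    return "index,follow"

print(robots_directive(word_count=80, monthly_visits=2))    # thin and low traffic
print(robots_directive(word_count=400, monthly_visits=50))  # worth keeping indexed
```

The returned value would go into each listing template's `<meta name="robots" content="...">` tag, so pages that later gain content or traffic automatically flip back to indexable.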
-
Most of your building pages have 80%+ bounce rates. This is obviously a content issue (e.g. no listings for people to view/click on the building page), and it is also an SEO issue (Google does take into account bounces from your pages back to the SERPs).
-
Traffic dropped by about 10-15% on 5/19 when Panda 4 came out (see attached screenshot showing drop in Google/Organic traffic on May 20 -- Panda update was May 19). The loss in traffic is probably from lower rankings on some extremely high converting keywords, which would explain the general drop in leads/inquiries/etc. Bottom line, you need to improve content. However, as we've said in the past, there is no guarantee with just a content marketing based approach and it could take months to recover. Thus, content improvement via conversion optimization is the best approach in order to generate more immediate business and still provide longer term organic traffic growth. We could supplement traffic with PPC to generate more immediate traffic, but this is not worth doing until the website is improved since it would likely be wasted money.
-
-
I am only going to say that you should read the feedback given to some of your previous questions before you go much farther. Your site has technical issues and content issues. It could have Penguin issues. I don't know.
If a site is performing at 50% efficiency then you are only going to get 50% benefit out of any content, SEO, marketing, etc. that you put into it.
The smart spend is to fix the issues or start over.
-
Hi there,
I would really like to do something about links, but have been told this will have no effect until the next "Penguin refresh".
This is only true if you are currently under a Penguin-related penalty. That penalty won't be lifted until the algorithm update is rolled out, but if you are not currently under this type of penalty, link development will help your site's authority and, most likely, rankings. Back in 2007, you'd have needed to wait a few months to see improvements from any type of SEO work, but results usually come in a little quicker nowadays. That said, when I worked for an agency, we used three months as the minimum time frame to judge progress and make adjustments. Still, you'd see changes sooner than that in most cases.
If you do not have a good idea of your target keywords and which pages should be ranking for those keywords, I would put that in place, i.e. the matrix. I would have really thought the SEO agency would have done this early on, to be honest, even just for reporting purposes.
If you think you have thin pages, this can be an issue with Google's Panda algorithm, whose purpose is in part to devalue websites with too much thin, "useless" content. Unfortunately, it hit real estate websites particularly hard in some cases due to the nature of the industry. If you have a feed of listings, the content of that feed doesn't differ much property-to-property, and is duplicated extensively if those properties are also shared via a feed with aggregators or partner agents. As such, fleshing out these pages and sections can be very useful, so I would put this in as an important set as far as on-page work goes, after you have identified your primary keywords and developed a content plan to optimise for them.
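Before rewriting hundreds of listings, it helps to know which ones are actually near-duplicates of each other. A rough pass with the standard library's `difflib` can flag the worst pairs (the URLs and descriptions below are invented examples, and the 0.7 threshold is an assumption to tune):

```python
from difflib import SequenceMatcher
from itertools import combinations

listings = {
    "/listings/suite-500": "Bright office space with open floor plan near Penn Station.",
    "/listings/suite-600": "Bright office space with open floor plan near Grand Central.",
    "/listings/loft-12": "Raw loft with exposed brick, freight elevator, and high ceilings.",
}

# Flag pairs whose text is more than ~70% similar -- prime candidates for rewriting.
for (url_a, text_a), (url_b, text_b) in combinations(listings.items(), 2):
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    if ratio > 0.7:
        print(f"{url_a} vs {url_b}: {ratio:.0%} similar")
```

Pages that cluster together this way are the ones Panda is most likely to treat as duplicated feed content, so they are where unique copy pays off first.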
Make sure you're really creating good content for those primary keywords, however - not "doorway pages" or content with no purpose besides attracting search bots and clicks.
The user interface testing for conversions is absolutely essential but is not an SEO item per se. Conversion rate optimisation can increase sales by hundreds of percentage points very quickly and if you have good internal or external resource to do it, do it now! That said, if SEOs are telling you to do this because "it's too hard to improve rankings" and not because it's an important part of their overall service, I would take this as somewhat of a red flag. It's not too difficult or impossible to improve rankings unless you are penalised or can't afford to invest in SEO (meaning you'd be unlikely to afford good CRO too). I am a big fan of using good SEO and CRO together - the results in terms of improved revenue can be quite astounding.
Cheers,
Jane
-
Hi Alan,
Really hard to say. I would look into two issues. First, adding content to the 'thin' pages: having 150-200 pages with thin content probably already indicates there could be more pages that need to be either excluded from Google or fixed at a later stage.
Besides that, I would focus on auditing the user interface well before you start doing anything else on SEO. Otherwise you could end up losing customers you could have had if your site's conversion rate were up.
Btw, the claim that links will not have any effect until the next Penguin update is just bullshit.
Hope this helps!
-
Rankings can be improved within a short while (2 months isn't a stretch), and good links are still king. There just isn't a better way at the moment for Google to figure out whether one page is superior to another without relying on them as a huge signal. You can add all the content you want, but without links your site will stand still, or you might get the occasional long-tail organic visit.
I say do the basics before you try anything too difficult. Get your keywords figured out and apply them in the appropriate format. Ranking a page for more than two keywords and their slight variants has proven super difficult, and I'd almost go as far as saying that going beyond two per page is unrealistic. Once you know your keywords, apply on-site SEO best practices; use Moz's on-page grader if you have questions about best practices. Also, don't forget about internal linking as you're optimizing pages.
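The keyword matrix mentioned above can be as simple as a page-to-keywords mapping with the two-keywords-per-page cap enforced programmatically (the page URLs and keywords here are invented examples, not real research):

```python
keyword_matrix = {
    "/office-space/midtown": ["midtown office space", "midtown office rental"],
    "/office-space/downtown": ["downtown office space", "financial district offices"],
    "/lofts/soho": ["soho loft space", "soho commercial loft"],
}

# Enforce the rules of thumb: no page targets more than two primary keywords,
# and no keyword is assigned to two competing pages (keyword cannibalization).
assert all(len(kws) <= 2 for kws in keyword_matrix.values())
all_kws = [kw for kws in keyword_matrix.values() for kw in kws]
assert len(all_kws) == len(set(all_kws)), "keyword assigned to multiple pages"
print(f"{len(keyword_matrix)} pages, {len(all_kws)} target keywords")
```

Keeping the matrix in one place like this also makes rank reporting straightforward: each page is judged only against the keywords it was assigned.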