600+ Visitors a day after 6 months, can you do it?
-
Since the Penguin update, the clients of the company I work for have gradually been losing traffic and money. No one (except for me) has noticed this yet and connected the dots.
Yesterday we were all called in for a bollocking, and the manager asked the head of our department whether he would be confident of getting 600+ visitors a day to an 'average website' that has just started up, to which he replied 'yes'.
Since I started here back in February, not a single new client has gained that many visitors (many have not reached even 25% of that figure), which, in the post-Panda and post-Penguin world, I find completely understandable.
When I first started here they were using SEO 'tactics' that people employed 5+ years ago, and they didn't even use exact match keyword data. I have had a few talks with them about how SEO has changed over the last few years, and they still don't seem to understand that it is now significantly more difficult to gain traffic through SEO than it once was.
If you were asked the same question, thinking about the 'average' client you might get, would you be confident enough to guarantee that at the 6 month mark they would be getting 600+ visitors a day?
-
Egol is correct. This doesn't sound like a company I'd want to be working for, let alone paying money to look after my online interests. One good point of this company is you, in that you were wise enough to instantly spot the flaws in what the manager asked and in what your head of department promised, no doubt in a panic.
This puts unrealistic pressure not only on the manager but also on the employees reporting to them. All in all, I'd take Egol's advice and bail.
It sounds like it's all too desperate, based on a lack of knowledge. Things aren't going right, so they pluck a figure out of the air and promise to hit it. Instead, they should be asking the team for a brainstorming session to pool ideas and generate some realistic goals to achieve.
Hitting 600+ could be possible; as mentioned, if the search volume is there, then why not? But how will it be achieved, where will the visits come from, and will they add value to the business(es)?
This could be an opportunity for you within the company to demonstrate your understanding of the situation and current landscape and prove your worth. Or seriously consider moving on.
-
Visitors is an SEO stat that your company can show to the client as proof of effort, but doesn't necessarily mean much in terms of the customer's business. Fewer highly targeted visitors can provide a better return than a large number of poorly targeted visitors. I'm working with a company now that had fairly high site traffic as a result of a PPC campaign, but almost none of these converted into customers. Their traffic is now about half of what it was, but consistently generates leads for them.
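To make that point concrete, here's a back-of-the-envelope sketch. Every number below is invented for illustration, not taken from the case above:

```python
# Back-of-the-envelope comparison of traffic quantity vs. quality.
# All figures are hypothetical.

def monthly_lead_value(visitors_per_day, conversion_rate, value_per_lead):
    """Rough monthly value generated by a traffic stream."""
    return visitors_per_day * 30 * conversion_rate * value_per_lead

# Broad, poorly targeted traffic: lots of visits, almost none convert.
broad = monthly_lead_value(visitors_per_day=600, conversion_rate=0.001, value_per_lead=50)

# Targeted traffic: half the visits, far better conversion.
targeted = monthly_lead_value(visitors_per_day=300, conversion_rate=0.02, value_per_lead=50)

print(broad)     # 900.0
print(targeted)  # 9000.0
```

Halving the traffic while multiplying the conversion rate leaves the business an order of magnitude better off, which is exactly the trade the poster describes.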
-
The simple answer to this question is yes, you can get 600+ visitors to your site. The real issue is the type of traffic and the method you employ to get it. You need to look at your industry, level of competition, your current position, etc., more like a SWOT analysis.
If you fly blind, you might attract traffic that is not related to your product or service, get a high bounce rate, and deliver zero benefit to the business.
It's all about having a plan and executing it very well.
-
You really can't commit to raw numbers like that. (Even if "visitors" was a good way to measure success on its own!)
People forget that the pool of real prospects out there is not "the entire internet!" Not everyone is looking for your offerings, and things like geography and seasonality get in the way...
Even once you've narrowed down your range of realistic prospects, you need to factor in the competition!
But - does the client really need 600+ visits a day? I've worked with some who will tell you that a few extra relevant visitors a month is all it takes to drive through another sale - and that more than justifies the creation of a new piece of content or outreach activity.
Without understanding the cost of acquisition, the conversion rates, and the conversion/goal value, how can the customer know that they're getting value for money?
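As a sketch of the arithmetic being pointed at here (every figure below is a made-up placeholder), the value-for-money question reduces to comparing cost per acquisition against what a customer is actually worth:

```python
# Sketch: is a retainer paying for itself? All inputs are hypothetical.

def cost_per_acquisition(monthly_fee, visits, conversion_rate):
    """Fee divided across the customers the traffic actually produces."""
    customers = visits * conversion_rate
    return monthly_fee / customers if customers else float("inf")

fee = 1000.0       # hypothetical monthly retainer
visits = 600 * 30  # 600 visits/day for a month
conv = 0.01        # assume 1% of visits become customers

cpa = cost_per_acquisition(fee, visits, conv)
print(round(cpa, 2))  # 5.56
```

If a customer is worth less than that cost per acquisition, even hitting the 600-a-day target loses the client money; with zero conversions the cost per acquisition is effectively infinite, however impressive the traffic chart looks.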
Does your head of department have 600+ friends he can call on?
-
If you were asked about the same question, thinking about the 'average' client you might get, would you be confident enough to guarantee that at the 6 month mark they would be getting 600+ visitors a day?
If I was allowed to "pick" the topic of the site and worked on that site full time, I am pretty confident I could do it.
However, if the client was a plumber or dentist in Bugtussle, WV then I don't think that there are 600 relevant queries in a month so the answer to your question in that case is "it's impossible". Now, we could slap up content to deliver lots of irrelevant, low-quality traffic, but for genuine traffic that is relevant to the business owner, it just isn't there.
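That demand ceiling is easy to sanity-check with simple arithmetic. The keywords and monthly search volumes below are invented for illustration:

```python
# Sanity check: local search demand caps achievable relevant traffic.
# Keywords and monthly volumes are hypothetical.

monthly_searches = {
    "plumber smalltown": 90,
    "emergency plumber smalltown": 40,
    "drain cleaning smalltown": 30,
}

total_monthly = sum(monthly_searches.values())
ctr = 0.30  # generous click share, even assuming the #1 ranking

best_case_daily = total_monthly * ctr / 30
print(round(best_case_daily, 1))  # 1.6
```

Even ranking first for every relevant local query, the best case here is a couple of relevant visits a day; no amount of SEO effort conjures demand that doesn't exist.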
If I was working at this place I would be looking for another job. They don't understand the core of their business. They are going to crash and burn. They don't know what they don't know. Bail out!
-
Doesn't this totally depend on the industry the client is in? 600+ visitors a day is a lot of visitors and some areas may not even get 600 'Relevant' searches a day.
-
I can understand your pain. It seems most online marketers are oblivious to the fact that traditional link building no longer works. Since they don't know anything other than spammy link building, they are adamant about following those clichéd techniques, even at the expense of losing visitors and courting a penalty.
Yes, it is next to impossible for a website to get 600+ visitors per day unless it is producing share-worthy content or has some great tools up its sleeve. This will take a really long time, and the client needs to spend money to build a great website.
You need to talk to the client. Make them see how things have changed and why it is impossible for you to reach that target. A tough call indeed.