Shoemaker with ugly shoes: Agency site performing badly, what's our best bet?
-
Hi everyone,
We're a web agency and our site, www.axialdev.com, is not performing well. We get very little traffic from relevant keywords. Local competitors with worse On-Page Grader scores and very few backlinks outrank us. For example, we're 17th for the keyword "agence web sherbrooke" on Google.ca in French.
Background info:
- In the past, we included 3 keyword-rich links in the footer of every site we made (hundreds of sites by now). We're working to remove those links from poor sites and to use a single nofollow link on our best sites (see the sketch after this list).
- Since this is ongoing and we know we won't be able to remove everything, our link profile looks bad in Open Site Explorer (OSE).
- We have a lot of sites on our IP C-block, some of them poor quality.
- We've never received a manual penalty. Still, we've disavowed links as a precaution after running Link Detox.
- We receive a lot of traffic via our blog, where we used to post technical articles about Drupal, Node.js, plugins, etc. These visits don't drive business.
- Only a third of our organic visits come from Canada.
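To illustrate the footer change (the anchor text and markup here are made-up examples, not our exact code):
Before: <a href="http://www.axialdev.com/">agence web sherbrooke</a>
After: <a href="http://www.axialdev.com/" rel="nofollow">Site by Axial</a>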
What are our options?
- Change domain and delete the current one?
- Disallow the blog except for a few good articles, hoping it helps Google understand what we really do (see the robots.txt sketch after this list).
- Keep donating to AdWords?
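For the blog option, we had something like this in mind (a sketch; the paths are placeholders for our real URLs). As far as we know, Googlebot resolves Allow vs. Disallow by the longest matching rule, so the specific articles would stay crawlable:
User-agent: *
Disallow: /blog/
Allow: /blog/one-good-article/
Allow: /blog/another-good-article/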
Any help greatly appreciated!
Thanks!
-
Ahh, I get it now: redirect every URL from the old site to its homepage. Makes sense!
For point 2), I meant using the URL Removal tool to de-index the whole site, but that would no longer be needed if I apply the suggestion above.
Thanks a bunch!
-
Yep. The site isn't done. Every time we try to finish it, another couple of referrals come in.
Regarding "non-google sanction duplicate content" that's just my way with words. You have a French version of the site and an English version of the site. Without proper hreflang usage, that is duplicate content.
-
Well spotted, Travis!
-
ABSOLUTELY do NOT 301 anything from the old site to the new site...or you risk transferring the penalty!
I'm not sure what Google will do if you disallow via robots.txt AND 301. Most likely this is safe: Google will remove the old site from the index and ignore the 301s. But I think there's some risk that Google will read the pages anyway, see the 301s, and perhaps transfer the penalty.
Deleting the domain in Webmaster Tools will have no effect, other than preventing you from seeing what Google thinks about the old domain :-/. Google will continue to index the old domain, follow redirected links, see duplicate content, etc.
-
Hello / Bonjour.
It looks like you might have an awful lot of duplicate content (e.g. category pages, date archives) on the site. I'd try getting rid of that before deciding to switch domains.
-
Hi Travis, thanks for your response.
I swear those hreflangs were OK not long ago!
We'll fix them up, thanks!
Can you give an example of "non-Google-sanctioned duplicate content"?
The robots.txt file seems OK, even though it's a bit verbose. I'll ask someone to trim it down. (By the way, I was curious about PhysVisible's robots.txt, but it looks like you're disallowing everything. Thought I'd let you know!)
Thanks again!
-
Merci Michael!
Can you elaborate on "Keep the old site running, but 301 redirect all of the pages to the home page..."? Should every URL on www.oldsite.com redirect to the homepage of www.newsite.com?
We had these options in mind. What do you think of them?
1) Disallow the old site via robots.txt and map every URL with a 301, so users still reach the right page while Googlebot won't follow those redirects (to be tested, but it seems logical), and/or...
2) Delete the whole old domain in GWT.
Thanks for your time!
-
Full disclosure: I've been studying hreflang/rel=alternate for the glorious day when someone wants, and will pay for, a solid Spanish translation. That day has not come. But I wanted to be prepared. So here goes:
Your English pages are pointing the canonical at the French pages. No nationalized form of English is specified in the hreflang alternate. If your English-speaking audience is Canadian, put en-ca in the empty quotes after hreflang=. Example from /en:
rel="alternate" hreflang="" href="http://www.axialdev.com/en/" />
All of your canonicals point to the fr-ca version of the pages. For the en-ca pages, they should point to the en-ca pages.
I grew up in Michigan. I have quite a few Canadian friends. The only thing that's different about spoken Canadian English is the pronunciation of 'about' and they tend toward en-gb in spelling. But you should use en-ca anyway.
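Filled in, the English page's annotations would look something like this (a sketch: I'm assuming French lives at the root and English under /en/, per the example above; swap in the real URLs):
<link rel="canonical" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="en-ca" href="http://www.axialdev.com/en/" />
<link rel="alternate" hreflang="fr-ca" href="http://www.axialdev.com/" />
The French page would mirror this, with its canonical pointing at the fr-ca URL and the same two alternates.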
Yep, you have a lot of site-wide links. That is true. That may be part of the problem. But right now, you have a lot of non-Google-sanctioned duplicate content.
The site also has one of the most involved robots.txt files I've seen in a month or so. It may not be a good idea to address any old user agent (*) without giving it a directive. Check the end of the file.
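If the intent of that trailing record is "everyone else may crawl everything", give it an explicit empty Disallow, which blocks nothing (a minimal sketch):
User-agent: *
Disallow: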
A site should not nose-dive within the space of a couple of weeks without extensive search engine manipulation or serious on-page issues. Your site has been live for seven years, so it's better to suspect on-page issues first.
-
Bonjour! (I lived in Montreal for 6 years :-).
I do a lot of penalty recovery work, and you're in the same situation as a number of my clients: probably an algorithmic penalty, and you've disavowed links, but... no Penguin update for a year.
The next Penguin data update is most likely very soon, judging from Matt's mutterings at SMX Advanced. It's been almost a year since the last one. Your disavows won't take effect until there IS a data update.
I would wait for the data update and see if you recover rankings for the 3 terms you had in your footer links from client sites. If you do, then great, continue on...
If not, then I'd be inclined to start a new domain and move your content from your old site (and blog) to the new site, WITHOUT 301-redirecting. Keep the old site running, but 301 redirect all of its pages to its home page... you want Google to successfully fetch all of those blog pages with the great content, but find they've permanently moved to your home page, where that content no longer exists. This way your new site's content will not be seen as duplicate by Google. (If you just 404 the pages, Google will presume the content is still as it was before it 404'd... FOR MONTHS.)
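On Apache, that's a one-line mod_alias rule (a sketch, assuming the old site runs Apache; www.oldsite.com is a placeholder for the old domain):
# Old site's .htaccess or vhost config: send every path to the home page
RedirectMatch 301 ^/(.+)$ http://www.oldsite.com/
The (.+) requires at least one character after the slash, so the home page itself isn't redirected into a loop.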
It's worth going through all of the backlinks for the old site, seeing which ones are from healthy sites, and manually asking those webmasters if they'd kindly update their links to point to your new site.