Not sure which way to go or what to do?
-
Hi there,
I have been a Pro member of SEOmoz for a while now, but this is my first question in the forum. Although I have looked through so much helpful information already, I was wondering if someone could give me some further advice and guidance.
I have a 3-year-old ecommerce website, personalisedmugs.co.uk, which until May 2012 had some excellent growth. We then lost around 50% of our traffic due to reduced organic rankings in Google, and we noticed a further drop in September.
From researching the issue, I believe these drops were caused by the Penguin and EMD updates.
Since these updates we have:
- Stopped working with a company in India that had been looking after our SEO for 18 months
- Redeveloped/redesigned the website and upgraded the software version
- Constantly refreshed the website with content, as we always have done
- Modified internal anchor text (it did seem keyword-rich)
My next step, before giving up, is to check the links coming into the website.
Is anybody able to help me with our links or point me in the right direction? I have no idea where to start or what to do now.
Someone may see something really obvious so any help or guidance is greatly appreciated to assist me in gaining some UK organic rankings back.
Kind Regards,
Mark
-
Wow, thank you ever so much for this awesome response, Cyrus! There is a lot of information here for me to work through; I appreciate your time.
In response to your question: we did receive an unnatural links warning back in 2011. Being quite naive about this type of warning, we stupidly ignored it.
Once again many thanks,
Mark
-
Howdy,
Most of the major Google updates in the past 12 months have targeted backlinks more than anything else. So while it's always good to look at on-site issues and improve your site in every way, a thorough backlink audit is almost always prudent when you experience a major rankings drop - especially when the drop coincides with known Google updates.
Looking at your links in Open Site Explorer, we find several types of links that seem "suspicious" or "unnatural":
- http://personalisedtshirts.blogspot.com/ (sitewide, exact match anchor text - I highly suspect this is part of a blog network)
- http://www.wokietokie.blog.seo-catalog.com/Shopping/Gifts/ (penalized, deindexed directory)
- http://ww.w.b3directory.com/Business_and_Economy/Marketing_and_Advertising/?p=8 (low quality directory)
Unfortunately, these are the types of links targeted by Google, and we often see them lead to penalties and/or devaluation.
Have you received any messages/warnings in Google Webmaster Tools? That would be a good place to start. Typically, the best way to recover from this is to perform a complete link audit, then go through the arduous task of trying to get the links cleaned up, submit reconsideration requests, and possibly use the disavow tool.
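For reference, the disavow file Google accepts is plain text: one full URL or one `domain:` entry per line, with `#` lines treated as comments. A minimal sketch of generating one (the domains and URLs below are made-up placeholders, not findings from the audit above):

```python
# Sketch: build the text of a Google disavow file from audit results.
# The bad_urls / bad_domains values are hypothetical placeholders.
bad_urls = [
    "http://spammy-directory.example/listing?p=8",
    "http://linknetwork.example/post/blue-widgets",
]
bad_domains = ["blog-network.example"]

def build_disavow(urls, domains):
    """Return disavow-file text: comments start with '#',
    whole domains use 'domain:', individual pages are full URLs."""
    lines = ["# Disavow file generated after manual link audit"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += sorted(urls)
    return "\n".join(lines) + "\n"

disavow_text = build_disavow(bad_urls, bad_domains)
print(disavow_text)
```

Disavowing a whole domain is usually safer than listing individual URLs, since spammy sites often link from many pages.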
I've been through it myself with several client sites and it can be a lengthy process the first time you do it. Some tips to consider:
1. What links to look at? First of all, John Mueller of Google recommends starting with the links listed in Google Webmaster Tools, as these are the links most likely to affect your rankings: http://www.youtube.com/watch?v=UX...
2. Third Party Tools: A couple of companies have developed tools to help identify bad links.
- SEOGadget Link Analyser - http://tools.seogadget.co.uk/ (has a great API - highly recommend this site)
- Link Detox - http://www.linkdetox.com/ (haven't used it, but I've heard decent things)
3. Manual Review - In the end, your own eye is the best tool you have. You want to look for:
- Links with keyword-rich, optimized anchor text
- Comment-signature and forum-signature links (these are different from more legitimate forum links)
- Sitewide links, such as in the sidebar or footer
- Obviously paid or suspicious-looking links on low-quality sites
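The manual-review criteria above can be turned into rough heuristics for a first pass over a big link list. A sketch under stated assumptions - the keyword list and the quality threshold are illustrative, not fixed rules:

```python
# Sketch: flag links that match the manual-review criteria listed above.
# MONEY_KEYWORDS and the quality threshold are illustrative assumptions.
MONEY_KEYWORDS = {"personalised mugs", "photo mugs"}

def flag_link(anchor_text, is_sitewide, page_quality_score):
    """Return a list of reasons a link looks suspicious (empty list = looks fine).
    page_quality_score is assumed to be a 0-100 third-party metric."""
    flags = []
    if anchor_text.lower() in MONEY_KEYWORDS:
        flags.append("exact-match anchor text")
    if is_sitewide:
        flags.append("sitewide placement")
    if page_quality_score < 20:
        flags.append("low-quality page")
    return flags

print(flag_link("personalised mugs", True, 10))
```

Anything the heuristics flag still needs a human look - this only prioritizes the review queue, it doesn't replace it.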
Our friend Paddy Moogan wrote a great guide on how to do this step by step. You should check it out. I also made a video about earning high quality links. It might be worth a look: http://www.seomoz.org/blog/high-value-tactics-futureproof-link-building-whiteboard-friday
It usually takes Google 2-4 weeks to reply to reconsideration requests. In my experience they like to see both a manual effort to clean links, and I've also had success supplementing this with the disavow tool.
Lots of information here. Hope it helps. Best of luck getting back on track!
-
Thanks for your answer. How do I work out which ones are spammy and which aren't?
-
Hi Mark,
Go ahead and check all the links pointing to your website and start removing the spammy ones.
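One practical way to start is to group your link export by linking domain: a single domain supplying a large share of your links usually points to a sitewide (sidebar or footer) placement worth reviewing first. A minimal sketch, assuming a plain list of linking URLs (the sample list reuses domains mentioned earlier in this thread purely as examples):

```python
# Sketch: summarize a backlink export by linking domain.
from collections import Counter
from urllib.parse import urlparse

sample_links = [
    "http://personalisedtshirts.blogspot.com/page-1",
    "http://personalisedtshirts.blogspot.com/page-2",
    "http://www.b3directory.com/Business_and_Economy/?p=8",
]

def domain_counts(links):
    """Count links per domain, most-linked domains first."""
    counts = Counter(urlparse(link).netloc for link in links)
    return counts.most_common()

print(domain_counts(sample_links))
```

On a real export the domains at the top of this list are the ones to eyeball first against the criteria Cyrus listed above.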