My ranking dropped 3 pages on 18 November 2012
-
Hi there,
My site's rankings dropped suddenly today for my main keywords, such as "security companies in london" and "security services in london", from the first page to the 4th-5th page. These keywords were ranking on the homepage: http://www.armstrongsecurity.co.uk/
Other keywords ranking from internal pages, such as http://www.armstrongsecurity.co.uk/security-services/event-security-london.html, also got hit slightly and dropped a couple of listings for "event security london" and "event security companies london". The same slight hit happened to the main keywords on this page: http://www.armstrongsecurity.co.uk/bodyguard-for-hire-london.html
Can anyone help me get the rankings back? My site's authority is around 60, which is far better than most of the sites now ranking above me. These are the problems I understand so far:
- a keyword-rich anchor text link profile for my main keywords
- over-optimised pages
Let me know if you find anything suspicious on my site that I can fix, either on-site or in my link profile. Looking forward to your help.
Thanks,
Gill
-
The site we are working with experienced a recent drop as well.
-
SEO5,
Is there a specific tool you used to find the over-optimization, or was it a spot check? I am dealing with a similar situation. Our website was on the first page of Google for the keyword "plumbing" for years and just dropped to position 81. The only things we have been doing are the things that SEOmoz.org suggests, like adding schema.org information and adding more user-qualified content (no specific keyword targeting, really).
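As an aside, the schema.org information mentioned above is commonly added as a JSON-LD block in the page head. A minimal sketch for a local plumbing business; every name, URL, phone number and address below is a made-up placeholder, not the poster's actual data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Co.",
  "url": "https://www.example-plumbing.co.uk/",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "EC1A 1AA",
    "addressCountry": "GB"
  }
}
</script>
```

Structured data of this kind describes the page to search engines; it is not itself a ranking lever, so it is unlikely to explain a drop like the one described.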
-
http://www.armstrongsecurity.co.uk/bodyguard-for-hire-london.html is way too over-optimized for "bodyguard services" etc. Also clean up the heading tags - there should only be one H1 per page.
-
Just looking at these pages, you might be over-optimizing them for your targeted keywords: I can see that the targeted keywords have been used extensively all over the pages.
Clean the pages up with new title tags, header tags, image alt tags and internal links. Also check the site's backlink profile and use the disavow tool to flag any bad links.
Related Questions
-
Should I submit a sitemap for a site with dynamic pages?
I have a coupon website (http://couponeasy.com). Being a coupon website, my content is always changing (new coupons are added and expired deals are removed automatically). I wish to create a sitemap, but I realised there is not much point in creating a sitemap for all pages, as they will be removed sooner or later and/or are canonical. I have about 8-9 pages which are static, and hence I can include them in a sitemap. Now the question is: if I create the sitemap for these 9 pages and submit it to Google Webmaster Tools, will the Google crawlers stop indexing the other pages? NOTE: I need to create the sitemap for getting expanded sitelinks.
White Hat / Black Hat SEO | shopperlocal_DM
-
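For reference, a sitemap covering just those 8-9 static pages is a small XML file along these lines (the `/about/` path is a hypothetical example page, not necessarily one of the site's real URLs). Under the sitemaps protocol a sitemap is a hint, not an exclusion list, so pages left out of it can still be crawled and indexed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://couponeasy.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://couponeasy.com/about/</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```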
Why a drop in certain keyword but not others?
Hi. I am looking into a potential client's SEO issues and the reasons why their rankings for specific keywords have dropped while others stayed the same or even improved. They received a manual spam action that was revoked after some disavowing and so on. In May of this year they noticed a huge drop in more generic terms: the main keyword, which was ranked 6th, dropped to 35th, yet other keywords rose slightly. I have noticed this issue for another potential client as well. They ranked 1st for their brand name, then received a spam action that was revoked. Now they do not rank for their brand name, but do for other long-tail keywords. Any ideas on the best way to investigate this, root out the issue, and build to improve rank for the more generic keywords? Thanks
White Hat / Black Hat SEO | YNWA
-
All pages going through 302 redirect - bad?
So, our web development company did something I don't agree with and I need a second opinion. Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302 redirect to a geodetection script, and then back to the geotargeted version of the page. E.g.: www.example.com/category 302 redirects to www.example.com/geodetect.php?ip=ip_address, and that page then 302 redirects back to either www.example.com/category or www.example.com/geo/category for the geo-targeted version. **So all of our pages - thousands - go through a double 302 redirect.** It's fairly invisible to the user, and a 302 is more appropriate than a 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with it. Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Or is this perfectly fine?
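One redirect-free alternative, sketched on the assumption that the site runs Apache with the legacy mod_geoip module available (which exposes a `GEOIP_COUNTRY_CODE` environment variable per request - verify the directive names against your actual stack): rewrite UK visitors to the geo copy internally, with no client-visible redirect at all.

```apache
# mod_geoip sets GEOIP_COUNTRY_CODE on each request
GeoIPEnable On

RewriteEngine On
# Internal rewrite ([L], not [R]): UK visitors get the /geo/ copy,
# but the URL in the browser and the crawled URL never change
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^GB$
RewriteCond %{REQUEST_URI} !^/geo/
RewriteRule ^(.*)$ /geo/$1 [L]
```

Note that serving different content by IP has its own caveats (Googlebot crawls largely from US addresses, so it would mostly see the default version), so this is a design sketch to discuss with the developers, not a drop-in fix.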
White Hat / Black Hat SEO | dholowiski
-
Page not being indexed or crawled and no idea why!
Hi everyone, There are a few pages on our website that aren't being indexed right now on Google and I'm not quite sure why. A little background: we are an IT training and management training company and we have locations/classrooms around the US. To better our search rankings and overall visibility, we made some changes to the on-page content, URL structure, etc. Let's take our Washington DC location as an example. The old address was http://www2.learningtree.com/htfu/location.aspx?id=uswd44 and the new one is http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training
All of the SEO changes aren't live yet, so just bear with me. My question is really about why the first URL is still being indexed and crawled and showing fine in the search results while the second one (which we want to show) is not. The changes have been live for around a month now - plenty of time to at least be indexed. In fact, we don't want the first URL to show anymore; we'd like the second URL type to show across the board. Also, when I search Google for site:http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training I get a message that Google can't read the page because of the robots.txt file - but we have no robots.txt file.
I've been told by our web guys that the two pages are exactly the same, and that we've put in an order to have all the old links 301 redirected to the new ones. But I'm still perplexed as to why these pages are not being indexed or crawled - I've even manually submitted them in Webmaster Tools. So, why is Google still recognizing the old URLs, why are they still showing in the index/search results, and why is Google saying "A description for this result is not available because of this site's robots.txt"? Thanks in advance! Pedram
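One quick way to check whether a given robots.txt would actually block that URL is Python's built-in `urllib.robotparser`. A sketch below; the `Disallow` rule is a hypothetical example for illustration, not what the site actually serves - substitute whatever http://www2.learningtree.com/robots.txt really returns:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body -- replace with the file the site really serves
robots_txt = """User-agent: *
Disallow: /htfu/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

url = "http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training"
print(parser.can_fetch("Googlebot", url))  # False: blocked by the Disallow rule above
```

Since the "blocked by robots.txt" message can also come from a robots.txt served at a parent domain or an error response that Google treats as a block, fetching the live file and testing it this way is more reliable than assuming no file exists.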
White Hat / Black Hat SEO | CSawatzky
-
Should I redirect old pages
I have taken over SEO on a site. The previous people built thousands of pages with duplicate content. I am converting the site to WordPress and was wondering if I should take the time to 301 redirect all 10,000 or so duplicate-content pages. The 10,000 pages all have links back to different pages as well as to the homepage. Or should I just let them go to a 404 Page Not Found?
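If the redirects are worth doing, on a typical WordPress/Apache setup the legacy pages can be 301'd from `.htaccess` with the standard mod_alias directives. A sketch with made-up paths (mapping each old page to its closest new equivalent is generally preferable to pointing everything at the homepage):

```apache
# One-to-one mappings for old pages that have a clear new equivalent
Redirect 301 /old-widget-page.html /widgets/
Redirect 301 /old-service-page.html /services/

# Pattern-based catch-all for a legacy directory with no useful equivalents
RedirectMatch 301 ^/old-duplicates/.*$ /
```

In practice the decision usually hinges on whether the old pages have external links worth preserving; pages with no inbound links can often be left to 404 or 410 without much loss.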
White Hat / Black Hat SEO | Roots7
-
Link quality warning from GWT and drop in keyword ranking.
So last December we saw our hard work pay off as our Panda penalty was lifted and our traffic shot back up to pre-Panda levels. Then in February we received this note: "We've reviewed your site and we still see links to your site that violate our quality guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes." Since December we've lost position on 80% of our top 100 keywords. I've gone through our links and can't figure out what the problem may be. Maybe I'm not using OSE properly. We don't buy links, so I'm not sure what the problem is. If someone can walk me through using OSE to see what the problem may be, I would appreciate it. Our domain is http://bit.ly/rbkYkp
White Hat / Black Hat SEO | IanTheScot
-
How to rank internal pages?
Hello, I have a website about consoles. On the homepage are a few thoughts about what consoles are and a short history. The main attraction is the pages about the Xbox 360, PlayStation 3, Nintendo Wii and PS Vita. So, I want my homepage and my internal console pages to rank for "xbox 360", "playstation 3", etc., each on a separate page of course. Basically, I want to rank for brands. My main questions are: 1. How much link building should I do for my homepage, considering that I'm not really interested in ranking it as much as the internal pages? In percentages, how would it look? Random (stupid) example: 60% of links to the homepage, 10% to each internal page? 2. I guess I must also build links to the internal pages; otherwise they won't rank well on homepage links alone. 3. Considering the Penguin update, my main keyword should be around what % of the overall anchors to each internal page? Thank you very much for your help!
White Hat / Black Hat SEO | corodan
-
Pages For Products That Don't Exist Yet?
Hi, I have a client that makes accessories for other companies' popular consumer products. Their products rank on their own website for other companies' product names - for a made-up example, "2011 Super Widget" - combined with my client's product type, "Charger", so "Super Widget 2011 Charger" might be the type of term my client would rank for. Everybody knows the 2012 Super Widget will be out in some months, and then my client's company will offer the 2012 Super Widget Charger. What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages would have greater authority/age and rank, instead of being a little late to the party. The pages would be like "coming soon" pages, but still optimized for the main product search term. About the only negative I see is that they'll have a higher bounce rate/lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking. What do you think? Thanks!
White Hat / Black Hat SEO | 94501