Specific Page Penalty?
-
Having trouble figuring out why one of our pages is not ranking in the SERPs; the on-page optimisation looks decent to me.
I checked by using the gInfinity extension and searching for the page URL.
Can one page be penalised in Google (.ie / .com) while the rest of the website is not?
The (possibly) penalised page is showing in Google Places in the SERPs. I assume it would not show if it were penalised.
Would appreciate any advice.
Thanks
-
You may get no traffic from ranking #4 these days, especially on queries with a competitive paid portion of the SERP.
What I would do is stop assessing the "what if" scenarios and start focusing all your energy on acquiring those editorial-type links Grasshopper was talking about... right now! You'll get that ranking and secure it for long-term traffic.
-
There's no way to give an accurate time-scale answer to that question. If you're able to get editorial links from authoritative, trusted sites, you can see substantial movement within a week or two of the links being crawled. However, if your links are from lower-quality sites, or are weighted heavily toward devalued methods of link building (directories, reciprocal links, three-way links, etc.), engines may not give those links much weight, if any, no matter how long you wait.
-
It has ranked well previously, according to Rank Tracker on SEOmoz - it was ranking 4th last week, although I don't think that is correct.
Is it a reliable tool?
Organic traffic for the page's keywords shows no drop in 2012, and neither do page views for the page. If it were over-optimised, that would be noticeable in Google Analytics.
-
To me, the real clue would be whether or not the URL ranked well previously.
If it has not... you need more links. It is probably a page authority issue.
If it has... you may have over-optimized the anchor text; sitewide links will do this. You may rank well for a while, then you'll find yourself on page five shaking your fist at Google.
-
Hi Grasshopper,
I know the keywords I am trying to rank for are competitive.
I will take that into consideration and start working on these. How long does this take to show an effect in Google?
Thanks
-
Hi Ronan,
Since it passes tests 1, 2 and 4, I would say that #3 is the culprit. Having solid on-page optimization is great, but link authority is the name of the game for achieving rankings, especially if the keywords you're trying to rank for are competitive.
Run the Keyword Difficulty Tool against the keywords you're trying to rank for. I would expect that the URLs on page 1 of the SERPs all have significantly stronger, more trustworthy link profiles than your URL does.
If that's the case, all the standard advice applies - create a truly differentiated page that offers content / resources / tools above and beyond what your competitors offer, and market the hell out of it.
-
The page is indexed
-
Hi Grasshopper,
Thanks for your input. I have checked each one and they appear to be fine:
-
Yes
-
The text matches the true content on the page
-
The page itself does have few inbound links. Could it be this?
-
Appearing first
Despite the low number of inbound links, I wouldn't say this alone would cause the ranking issue, as the page is well optimised and similar to competitors.
-
Hi Ronan,
First, to your general question - yes, it is possible for one page of a domain to be penalized / filtered, while the rest of the domain is not. However, it seems extremely unlikely that the URL in question would rank in Google Places if it was penalized. There are a few things you want to check:
-
The first thing you want to check is whether or not the page is indexed and cached, which is a simple query [cache:mypage.com/this-page]. Does it return a result?
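As a side note, if the cache: query comes up empty, it's also worth ruling out an accidental noindex; that isn't something the cache query itself tells you. A minimal sketch in Python, assuming the requests and BeautifulSoup libraries, and using the placeholder URL from above:

```python
# Rough complementary check (not part of the cache: query itself): look at the
# HTTP status, X-Robots-Tag header, and meta robots tag. URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://mypage.com/this-page"  # placeholder - substitute the real page
resp = requests.get(url, timeout=10)

print("Status code:", resp.status_code)                          # should be 200
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

soup = BeautifulSoup(resp.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})
print("Meta robots:", meta.get("content") if meta else "not set")
```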
-
If so, in the gray banner across the top of the cached page, click on "Text-only version". Does the machine-readable text match the true content on the page? If you have large amounts of machine-readable text that are only visible to an engine, and not a user, that can trip an algorithmic spam filter. Also, look for off-topic words - sometimes sites get hacked and hackers inject all kinds of spammy garbage and links, which can also trip the filter.
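If you'd rather check this programmatically than eyeball the text-only cache, here's a rough sketch of the same idea, again assuming Python with requests and BeautifulSoup; the URL and the spam-word list are illustrative placeholders, not from this thread:

```python
# Pull the text an engine would read and screen it for obviously off-topic terms.
import requests
from bs4 import BeautifulSoup

url = "https://mypage.com/this-page"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

# Strip scripts and styles so only the readable text remains.
for tag in soup(["script", "style", "noscript"]):
    tag.decompose()
text = soup.get_text(separator=" ", strip=True)

# Very crude screen for the kind of junk injected by hacks -- extend as needed.
suspect_terms = ["viagra", "casino", "payday loan", "replica watches"]
hits = [t for t in suspect_terms if t in text.lower()]

print(text[:500])                       # does this match what a user actually sees?
print("Suspect terms found:", hits or "none")
```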
-
If the page is cached, and rendering the intended content, does it have sufficient link authority to rank for the terms you intend? It's quite possible that your page is in a competitive keyword space, and doesn't have enough juice to push past the competition.
-
If you want to see if it has enough juice to rank for anything at all, pick a sentence from the first paragraph of text and search for it enclosed in quotes, ["Some random sentence from my first paragraph here."] Is your URL the #1 result? It should be. If there are other sites that you've syndicated your content to, or that have scraped your content, and they are more authoritative than your site, it's possible that your URL isn't ranking because it's being (incorrectly) filtered out as duplicate content.
Hope that helps.
-
Is the URL no longer in Google's index at all?
Related Questions
-
What happens when we canonical to a page which has been redirected to another page? Google's response!
Hi all, I would like to know how Google responds in the different scenarios where we use canonicals and redirects for duplicate pages. Let's say A and B are duplicate pages with 95% of the same content, and C doesn't have the same content but is contextually similar and is the priority page we expect to rank. What happens if we canonical from A to B and set a redirect from B to C? What if both A and B are pointed to C with canonicals? What if A or B is deleted and the other one is canonicalised to C? Note: we can't noindex or 301 redirect, as they have their own visitors. This is more about showing the most relevant content to the audience and avoiding duplicate content in search results. Thanks
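Not an answer to the scenarios themselves, but before choosing between them it can help to audit what a crawler actually encounters when canonicals and redirects are chained. A rough Python sketch, assuming requests and BeautifulSoup; the URL is a hypothetical stand-in for page A:

```python
# Follow a URL hop by hop, printing status code, redirect target, and rel=canonical
# at each step, so chains like A -> B (canonical) -> C (301) become visible.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def trace_chain(url, max_hops=5):
    for _ in range(max_hops):
        resp = requests.get(url, allow_redirects=False, timeout=10)
        canonical = None
        if "text/html" in resp.headers.get("Content-Type", ""):
            link = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
            canonical = link.get("href") if link else None
        print(url, resp.status_code, "canonical:", canonical)
        if resp.status_code in (301, 302, 307, 308):
            url = urljoin(url, resp.headers["Location"])  # follow to the next hop
        else:
            break

trace_chain("https://www.example.com/page-a")  # hypothetical page A
```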
Algorithm Updates | vtmoz
-
Why is Page Authority dropping?
Hi, I'm trying to review pages which have previously ranked but in March dropped out completely. Some of these pages I can see have dropped to a Page Authority of 1; we haven't changed anything on these pages, so is there a reason why the authority has dropped? These pages only had around 8-10 Page Authority to begin with. I'm trying to identify why we have lost keywords, and if it has anything to do with the Google updates in March. Here are examples of the pages with drops:
http://www.key.co.uk/en/key/heavy-duty-shelving-1830x1830mm-blue-orange
http://www.key.co.uk/en/key/metal-feet-for-heavy-duty-steel-shelving
http://www.key.co.uk/en/key/health-and-safety-law-poster-a2
Thank you!
Algorithm Updates | BeckyKey
-
Can we ignore "broken links" without redirecting to "new pages"?
Let's say we have replaced www.website.com/page1 with www.website.com/page2. Do we need to redirect page1 to page2 even if page1 doesn't have any backlinks? If it's not a replacement, can we ignore a "lost page"? Many websites lose hundreds of pages periodically. What's Google's stance on this? If a website has replaced or lost hundreds of pages without reclaiming the old links by redirection, will that hurt?
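Not a ruling on whether Google minds lost pages, but a small sketch for taking stock of them: bulk-check what each retired URL returns now, so you know which already redirect and which just 404. Python with requests assumed; the first URL is the placeholder from the question, the second is hypothetical:

```python
# Check the current status of retired URLs (404 vs. redirect vs. still live).
import requests

old_urls = [
    "https://www.website.com/page1",            # placeholder from the question
    "https://www.website.com/some-lost-page",   # hypothetical lost page
]

for url in old_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {target}")
```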
Algorithm Updates | vtmoz
-
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I actually work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced: We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started. Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range and at different levels (mostly sitting around pages 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on pages 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good. Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They literally lost all of their keywords. Our client even dropped hundreds of places for their own brand name. They also lost all rankings for the super-low-competition, non-business terms they were ranking for. So, there's the problem. The keywords have not shown any sign of recovery at all yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it. There are no messages or warnings in GWT. The link profile is small but high quality. When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and seems far too severe. The site is technically sound. There are no duplicate content issues or plagiarised content. The site is being indexed fine. Moz gives the site a spam score of 1 (out of 11 (I think that's right)). The site is on an OK server, which hasn't been blacklisted or anything. We've tried everything we can to identify a problem. And that's where you guys come in. Any ideas? Anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
Recent on-page optimisation changes have had a negative impact!
Hi, I recently updated some page titles, H1 tags and on-page content, and overall search results have slipped following the first site crawl by Google, I assume. My question is: should I try to get back the rankings by testing and changing one thing at a time to see the impact right now, or should I wait a while for things to settle down once Google has crawled the site a few times, or will the subsequent crawls have no impact? Thanks, Ash
Algorithm Updates | AshShep1
-
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, a client site has had a very poor link legacy, stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow submission to Google have been done, however after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this and not many people appear to agree on whether this will work. Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, helping remove all legacy and all association with the spam-ridden .com. However, my main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first query. This could then cause duplicate content, given that this content pre-existed on another domain. I will implement a robots.txt file removing all of the .com site, as well as a noindex, nofollow, and I understand you can submit a site removal to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com; then, once it has been deindexed, the new .co.uk site will go live with the exact same content. So my question is whether Google will then completely forget that this content has ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience in the removal of a 301 redirect, detaching legacy and its success would also be very helpful! Thank you, Denver
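If you do go the disallow-everything robots.txt route on the old .com, it's worth verifying the live file actually blocks crawling before relying on it. A minimal sketch using only the Python standard library; the domain is a placeholder for the old .com:

```python
# Verify a live robots.txt really disallows the paths you expect.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder for the old .com domain
rp.read()

for path in ["/", "/blog/", "/any-old-page"]:
    allowed = rp.can_fetch("*", "https://www.example.com" + path)
    print(path, "crawlable:", allowed)   # expect False everywhere for a full block
```

One thing to weigh in that plan: crawlers can only read a noindex tag on pages they are allowed to fetch, so a blanket robots.txt block and on-page noindex tags pull in different directions.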
Algorithm Updates | ProdoDigital
-
When did Google include display results per page into their ranking algorithm?
It looks like the change took place approx. 1-2 weeks ago. Example: for a search for "business credit cards" with search settings at "never show instant results" and "50 results per page", the SERP has a total of 5 different domains in the top 10 (4 domains have multiple results). With the slider set at "10 results per page", there are 9 different domains, with only 1 having multiple results. I haven't seen any mention of this change; did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
Algorithm Updates | BrianCC
-
Google Places page images
Is there any real difference between uploading an image directly to your Google Places page and linking to an image from another site? I have heard that you get better results if you upload a photo to Photobucket, then to Insider Pages, and then post that link to your Google Places page. To me it just seems a bit odd to do things this way. I get that it's supposed to give you more backlinks, however I don't think it would necessarily be relevant or useful for the user. Any thoughts?
Algorithm Updates | christinarule