Specific Page Penalty?
-
Having trouble figuring out why one of our pages is not ranking in SERPs; on-page optimisation looks decent to me.
I checked by using the gInfinity extension and searching for the page URL.
Can one page be penalised by Google (.ie / .com) while the rest of the website is not?
The (possibly) penalised page is showing in Google Places in the SERPs. I assume it would not show if it were penalised.
Would appreciate any advice.
Thanks
-
You may get no traffic from ranking #4 these days, especially on queries with a competitive paid portion of the SERP.
What I would do is stop assessing the "what if" scenarios and start focusing all your energy on acquiring those editorial-type links grasshopper was talking about, right now! You'll get that ranking and secure it for long-term traffic.
-
There's no way to give an accurate time-scale answer to that question. If you're able to get editorial links from authoritative, trusted sites, you can see substantial movement within a week or two of the links being crawled. However, if your links are from lower-quality sites, or are weighted heavily toward devalued methods of link building (directories, reciprocal links, three-way links, etc.), engines may not give those links much weight, if any, no matter how long you wait.
-
It has ranked well previously, according to Rank Tracker on SEOmoz - it was ranking 4th last week, though I don't think that is correct.
Is it a reliable tool?
Organic traffic shows no drop for the page's keywords in 2012, nor do page views for the page. If it was over-optimised, this would be noticeable in Google Analytics.
-
To me, the true clue would be whether or not the URL ranked well previously.
If it has not, you need more links. It is probably a page authority issue.
If it has, you may have over-optimized the anchor text (sitewide links will do this). You may rank well for a while, then you'll find yourself on page five shaking your fist at Google.
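If you have an exported backlink list (from any tool), a quick way to eyeball anchor-text skew is to tally how often each anchor repeats - a profile dominated by one exact-match phrase is the pattern sitewide links create. A rough sketch, with a made-up data format (list of source/anchor pairs), not the output of any particular tool:

```python
from collections import Counter

def anchor_distribution(backlinks):
    """Share of each anchor text across a backlink list.

    backlinks: iterable of (source_url, anchor_text) pairs.
    Returns {anchor: fraction_of_all_links}, so you can spot
    an exact-match-heavy (over-optimized) profile at a glance.
    """
    counts = Counter(anchor.lower().strip() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: round(n / total, 2) for anchor, n in counts.items()}
```

If one commercial phrase accounts for the bulk of your anchors while competitors show a natural mix of brand, URL, and generic anchors, that's the over-optimization signal to dilute.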
-
Hi Grasshopper,
I know the keywords I am trying to rank for are competitive.
I will take that into consideration and start working on these. How long does this take to take effect in Google?
Thanks
-
Hi Ronan,
Since it passes tests 1, 2, and 4, I would say that #3 is the culprit. Having solid on-page optimization is great, but link authority is the name of the game for achieving ranking, especially if the keywords you're trying to rank for are competitive.
Run the Keyword Difficulty Tool against the keywords you're trying to rank for. I would expect that the URLs on page 1 of the SERPs all have significantly stronger, more trustworthy link profiles than your URL does.
If that's the case, all the standard advice applies - create a truly differentiated page that offers content / resources / tools above and beyond what your competitors offer, and market the hell out of it.
-
The page is indexed
-
Hi Grasshopper,
Thanks for your input. I have checked each one, and each appears to be fine:
-
Yes
-
Text is true content on the page
-
The page itself does have few inbound links. Could it be this?
-
Appearing first
Despite the low number of inbound links, I wouldn't say this alone would cause the ranking issue, as the page is well optimised and similar to competitors'.
-
Hi Ronan,
First, to your general question - yes, it is possible for one page of a domain to be penalized / filtered while the rest of the domain is not. However, it seems extremely unlikely that the URL in question would rank in Google Places if it were penalized. There are a few things you want to check:
-
The first thing you want to check is whether or not the page is indexed and cached, which is a simple query [cache:mypage.com/this-page]. Does it return a result?
-
If so, in the gray banner across the top of the cached page, click on "Text-only version". Does the machine-readable text match the true content on the page? If you have large amounts of machine-readable text that are only visible to an engine, and not a user, that can trip an algorithmic spam filter. Also, look for off-topic words - sometimes sites get hacked and hackers inject all kinds of spammy garbage and links, which can also trip the filter.
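One rough way to check this yourself is to extract only the text a user would actually see (skipping scripts, styles, and anything hidden with an inline display:none) and compare it to what you expect the page to say. This is just an illustrative sketch using Python's standard-library HTML parser - it assumes well-formed HTML and only catches inline-style hiding, not CSS-file or JavaScript hiding:

```python
from html.parser import HTMLParser

# Void tags have no closing tag, so they must not affect nesting depth.
VOID_TAGS = {"br", "img", "hr", "meta", "input", "link"}

class VisibleTextExtractor(HTMLParser):
    """Collects text a user would see, skipping <script>/<style>
    and any subtree hidden with an inline display:none."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip_depth = 0  # > 0 while inside a hidden subtree

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return
        style = dict(attrs).get("style", "").replace(" ", "")
        if tag in ("script", "style") or "display:none" in style:
            self.skip_depth += 1
        elif self.skip_depth:
            self.skip_depth += 1  # track nesting inside the hidden subtree

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

def visible_text(html):
    parser = VisibleTextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)
```

If the source HTML contains big runs of text that this kind of extraction keeps but you never see in a browser, that's exactly the hidden-text pattern (or hack injection) worth investigating.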
-
If the page is cached, and rendering the intended content, does it have sufficient link authority to rank for the terms you intend? It's quite possible that your page is in a competitive keyword space, and doesn't have enough juice to push past the competition.
-
If you want to see if it has enough juice to rank for anything at all, pick a sentence from the first paragraph of text and search for it enclosed in quotes: ["Some random sentence from my first paragraph here."] Is your URL the #1 result? It should be. If other sites that you've syndicated your content to, or that have scraped your content, are more authoritative than your site, it's possible that your URL isn't ranking because it's being (incorrectly) filtered out as duplicate content.
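If you do this check for many pages, you can script the query-building step. A small sketch - the sentence-splitting regex and URL shape are my own illustration, and since automated querying violates Google's terms, this only builds a link to paste into a browser:

```python
import re
from urllib.parse import quote_plus

def exact_phrase_query(paragraph):
    """Build a quoted exact-match Google search URL from the first
    sentence of a paragraph, for a manual duplicate-content check."""
    # Take everything up to the first sentence-ending punctuation mark.
    match = re.match(r"(.+?[.!?])(\s|$)", paragraph.strip())
    sentence = match.group(1) if match else paragraph.strip()
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')
```

Open the resulting URL and check whether your page is the #1 result for its own first sentence; if a syndication partner or scraper outranks you, the duplicate-content filter is the likely suspect.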
Hope that helps.
-
Is the URL no longer in Google's index at all?