How do you beat a Wikipedia article for the top spot on SERPs?
-
Hi Guys,
One of our clients has a good web site with lots of content that is ranked already on #2 for the top keyword (singular and plural) on Google UK. The keyword itself is a competitive one. The top spot is occupied by a wikipedia article that doesn't have much content in general. Can anyone come up with an advice what strategy we have to apply to outplace that article? Thanks!
-
Thank you guys!
-
Wikipedia can be really, really, really hard to beat... did I say really, really hard to beat?
Just keep working on your site as if Wikipedia were any other competitor. Build great content that gets liked and tweeted, stuff that engages your visitors.
There is no silver bullet for beating Wikipedia. You simply have to overpower them with the brute force of a great website.
Good luck.
-
I have seen some Wikipedia results where you pull your hair out, hehe, but the thing is the site has such high authority and so much internal link value.
To be honest, I have had experience outranking Wikipedia when dealing with big-brand websites, where you can target the site very precisely.
Really analyse the links the Wikipedia page has, and if content is not the problem, see what you may be missing.
Keep pushing fresh content and social signals too. If you can, encourage people to search for your website by name and drive a higher CTR on the SERP.
-
I wouldn't suspect so. Wikipedia is seen as an incredibly authoritative site with many high-quality links pointing to it, so its high rankings come down mainly to the site being so authoritative and huge.
Wikipedia fulfils many of the factors in the Periodic Table of SEO Ranking Factors at http://searchengineland.com/seotable. It's a difficult site to beat, though it can be and certainly is done.
Glad you like the suggestions; they will help you get there.
Regards
Simon
-
Thanks Simon, will try those. Do you think that Google applies different ranking factors when it comes to Wikipedia in general?
-
Hi Ivaylo
I shall share a few pointers with you here for consideration:
-
Perform an on-page analysis of the website to identify and resolve any issues that come up, such as too many on-page links, too many nofollowed inbound links, or problems with titles and descriptions (the SEOmoz toolset is great at helping with this).
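The kind of check that toolset automates can be sketched in plain Python with the standard library. A minimal example, where the link-count and length thresholds are illustrative rules of thumb (not official limits) and the sample HTML is made up:

```python
from html.parser import HTMLParser

# Minimal on-page audit: count links, flag nofollow, and sanity-check
# title/description lengths. Thresholds are rough rules of thumb.
class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = 0
        self.nofollow = 0
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self.links += 1
            if "nofollow" in (attrs.get("rel") or ""):
                self.nofollow += 1
        elif tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    p = OnPageAudit()
    p.feed(html)
    issues = []
    if p.links > 100:
        issues.append(f"{p.links} on-page links (consider trimming)")
    if not (10 <= len(p.title) <= 65):
        issues.append(f"title length {len(p.title)} chars")
    if not (50 <= len(p.description) <= 160):
        issues.append(f"meta description length {len(p.description)} chars")
    return issues

sample = ('<html><head><title>Hi</title>'
          '<meta name="description" content="Short."></head>'
          '<body><a href="/a">a</a></body></html>')
print(audit(sample))
```

Run against real pages, this sort of script surfaces the same classes of problem (thin titles, short descriptions, link bloat) that a commercial crawler reports.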
-
Research which valuable links point to the Wikipedia page and try to earn some of the same links for your client's site (new followed links from different reputable websites will help a lot). Also identify existing links where the anchor text could be improved.
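The link-gap part of this boils down to a set difference: referring domains that link to the Wikipedia page but not to your client. A minimal sketch, where the domain lists are placeholders standing in for exports from whatever backlink tool you use:

```python
# Link-gap sketch: domains linking to the Wikipedia page but not to the
# client's site are candidate outreach targets. Lists below are made up;
# in practice you would load them from a backlink-tool export.
wiki_refs = {"bbc.co.uk", "theguardian.com", "gov.uk", "example-blog.com"}
client_refs = {"example-blog.com", "partnersite.co.uk"}

gap = sorted(wiki_refs - client_refs)  # domains worth pursuing
print(gap)
```

Prioritise the gap list by each domain's authority before starting outreach.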
-
Keep the content fresh, relevant and interesting.
-
Depending on what your client's site offers, consider whether any tools or widgets could be developed to make the site more useful.
-
Consider building on the social aspect, such as engaging with people on Twitter, in forums, and via guest blogging, to attract more visitors and more sharing of your content.
Hope that helps,
Regards
Simon