Competitor's Dodgy Link Profile - How to Handle?
-
Hi All,
One of my competitors doesn't seem to mind collecting links with no relevance, and their over-optimisation of anchor text leaves me scratching my head as to how it's possible for them to still get the rankings they do.
My question is how do I handle them from a competitive point of view? Do I pay for a batch of poor links to topple this competitor? Or do I bide my time?
Thanks
-
I really don't think it is worth doing that!
What Istvan says is 100% correct: you will beat him in time. Why waste your time on your competitor's site rather than focusing on your own?
-
Oh, I understand... Well, I wouldn't. Instead, try to figure out why your website isn't ranking that well. Put together an SEO to-do list and check both on-site and off-site factors: are you doing enough to contribute to the quality of the site? Where are your "weak" spots, and how could you resolve them?
If I were in your situation, I would look at my own site first: am I really doing everything I can to rank? Then I would check competitors, but only to see their strong points and weaknesses. Apply their strong points to your website, make it even better, and try to avoid the mistakes they are making.
I hope this helped,
Istvan
-
No, I meant pointing them at the competitor's site.
-
Hey,
Focus on quality over quantity. Your competitor might be collecting links from every low-quality site out there, but if you focus on quality, with time you will beat him.
Gr.,
Istvan
P.S. If you are going to pay for poor-quality links... why would your website rank any better?
Related Questions
-
Does reciprocal linking carry any value?
No matter how much I research this one, there's no definite answer and a lot of contradiction. Basically, we're looking to launch an article on 24 expert interior design tips for 2015. Each tip is submitted by a different interior designer we have chosen, each with a reputable, trusted website. The main goal of this article is to generate various inbound links to our site from the designers, and it will help create engagement on social media. But if we're giving out links to these designers for their contributions, will the inbound links we receive in return carry little or no value because this is reciprocal linking? Some say this is okay as it's completely natural within blog posts; others say to avoid it, as it can be seen as an obsolete practice used to deceive Google. Does anyone have more information on this and how it should be carried out? Would it be better to link to their social media accounts rather than reciprocal linking? Thanks
Technical SEO | Jseddon92
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based, and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not the same: each has its own photos, description, and address. So why do they appear as duplicates? I would assume they are all too closely related. They are primarily lots for sale, and it looks like lazy agents with 4 or 5 lots input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load and you cannot change the content; you are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1500+ listings on our web site, 40 show as duplicates.
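One way to see why a crawler groups listings like these is to measure the textual overlap between the descriptions yourself. The sketch below uses word shingles and Jaccard similarity, a common near-duplicate heuristic; the listing texts are invented examples, not real IDX data, and the 0.9-style thresholds tools use will vary:

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical lot descriptions that differ by a single word.
listing_a = "Beautiful building lot for sale close to schools shopping and transit"
listing_b = "Beautiful building lot for sale close to parks shopping and transit"

print(round(jaccard(listing_a, listing_b), 2))  # prints 0.5
```

With copy-pasted agent descriptions the score approaches 1.0, which is why the tool flags them even though the photos and addresses differ: most duplicate-content checks look only at the text.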
Technical SEO | TIM_DOTCOM
-
How to Handle Subdomains with Irrelevant Content
Hi Everyone, My company is currently doing a redesign for a website and in the process of planning their 301 redirect strategy, I ran across several subdomains that aren't set up and are pointing to content on another website. The site is on a server that has a dedicated IP address that is shared with the other site. What should we do with these subdomains? Is it okay to 301 them to the homepage of the new site, even though the content is from another site? Should we try to set them up to go to the 404 page on the new site?
Technical SEO | PapercutInteractive
-
Should we rel=nofollow these links ?
On our website, we have a section of free to low-cost tools that could help small businesses increase their productivity without spending big bucks. For example, this is the page for online collaboration tools: http://www.bdc.ca/EN/solutions/smart_tech/tech_advice/free_low_cost_applications/Pages/online_collaboration_tools.aspx None of the companies pay anything to be on these lists; we actually do quite a lot of research to choose which should be listed there and which should not. Recently, one of the companies on our lists asked us to add rel=nofollow to the link to their website, because they had been targeted by a manual action on Google and want their link profile to be as clean as possible (probably too clean). My question is: should we add rel=nofollow to all these links? Thanks, Jean-François Monfette
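If you do decide to nofollow some (or all) of the outbound links, it helps to first audit which links on a page already carry rel="nofollow". A minimal sketch using Python's standard html.parser; the URLs are made-up placeholders, not the actual BDC pages:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect hrefs from <a> tags, split by whether they carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []     # links that pass link equity
        self.nofollowed = []   # links marked rel="nofollow"

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold several space-separated tokens, e.g. "nofollow noopener".
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

page = '''<a href="https://tool-one.example/">Tool One</a>
<a href="https://tool-two.example/" rel="nofollow">Tool Two</a>'''

audit = LinkAudit()
audit.feed(page)
print(audit.followed)    # prints ['https://tool-one.example/']
print(audit.nofollowed)  # prints ['https://tool-two.example/']
```

Running something like this over the tools pages gives you an inventory before changing anything, so the decision can be made per link rather than site-wide.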
Technical SEO | jfmonfette
-
Link Detox
Hey guys, I'm currently working on cleaning up our link profile and have been looking at several tools. Has anyone used Link Detox from http://www.linkresearchtools.com? Do you think it's worth investing in? Matthew
Technical SEO | EwanFisher
-
HTTP 301 or link?
We have a page on a website (let's call it ABC) which ranks very well on Google for a specific keyword, but this keyword is not the main activity of website ABC. For this reason we created website XYZ to offer the services related to the specific keyword. How should we redirect visitors from website ABC to website XYZ so that XYZ gets all the weight? Is it best to do an HTTP 301 from the specific page on site ABC, or to remove nearly all content related to the keyword from site ABC and create a link to website XYZ? Your advice is well appreciated.
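If you go the 301 route, the mechanics are just answering requests for the old URL with a 301 status and a Location header pointing at the new site; everything else (passing the weight) is on Google's side. A minimal sketch of that mapping, with hypothetical paths and domains standing in for ABC and XYZ:

```python
# Hypothetical mapping: the old keyword page on site ABC -> its new home on XYZ.
REDIRECTS = {
    "/keyword-service/": "https://www.site-xyz.example/keyword-service/",
}

def respond(path):
    """Return (status_code, headers) for a request path:
    301 with a Location header if the path is mapped, plain 200 otherwise."""
    target = REDIRECTS.get(path)
    if target:
        return 301, {"Location": target}
    return 200, {}

print(respond("/keyword-service/"))
# prints (301, {'Location': 'https://www.site-xyz.example/keyword-service/'})
```

In practice this would live in the web server config (e.g. a permanent-redirect rule) rather than application code, but the request/response shape is the same: one old URL, one 301, one target.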
Technical SEO | netbuilder
-
Which version of pages should I build links to?
I'm working on the site www.qualityauditor.co.uk which is built in Moonfruit. Moonfruit renders pages in Flash. Not ideal, I know, but it also automatically produces an HTML version of every page for those without Flash or Javascript, and for search engines. This HTML version is fairly well optimised for search engines, but sits on different URLs. For example, the page you're likely to see if browsing the site is at http://www.qualityauditor.co.uk/#/iso-9001-lead-auditor-course/4528742734 However, if you turn Javascript off you can see the HTML version of the page here: http://www.qualityauditor.co.uk/page/4528742734 Mostly, it's the second version of the URL which appears in the Google search results for a relevant query, but not always. Plus, in Google Webmaster Tools, fetching as Googlebot only shows page content for the first version of the URL; for the second version it returns a 302 redirect to the first version. I have two questions, really: Will these two versions of the page cause duplicate content issues? I suspect not, as the first version renders only in Flash. But will Google think the 302 redirect for people is cloaking? Which version of the URL should I be pointing new links to (bearing in mind the 302 redirect, which doesn't pass link juice)? The URLs I see in my browser and fetch content for as Googlebot, or those Google shows in the search results? Thanks folks, much appreciated! Eamon
Technical SEO | driftnetmedia
-
Canonical Link for Duplicate Content
A client of ours uses some unique keyword tracking for their landing pages, where they append certain metrics in a query string and pull that information out dynamically to learn more about their traffic (kind of like Google's UTM tracking). Nonetheless, these query strings are now being indexed as separate pages in Google and Yahoo, and are being flagged as duplicate content/title tags by the SEOmoz tools. For example: Base page: www.domain.com/page.html
Tracking: www.domain.com/page.html?keyword=keyword#source=source Now both of these are being indexed even though there is only one page. So I suggested placing a canonical link tag in the header pointing back to the base page, to start discrediting the tracking URLs. But this means that the base pages will be pointing to themselves as well; would that be an issue? Is there a better way to solve this without removing the query tracking altogether? Thanks - Kyle Chandler
Technical SEO | kchandler
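The canonical URL for each tracking variant can be derived mechanically by dropping the query string and fragment, which is exactly what the canonical link element should point at (a self-referencing canonical on the base page is fine). A sketch using Python's urllib.parse; a scheme is added to the example URL so it parses as a full URL, and note that the #source=source fragment never reaches the server anyway:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string and fragment from a URL, leaving the
    base page URL to use in rel="canonical"."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

tracked = "http://www.domain.com/page.html?keyword=keyword#source=source"
print(canonical_url(tracked))  # prints http://www.domain.com/page.html
```

Every tracking variant of a page then normalises to the same canonical target, so the analytics query strings can stay while the duplicates consolidate.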