Do you use your own blog networks?
-
Do you use a network of sites you own to link to your clients in your SEO efforts? I see so many SEO companies doing this from junk sites, with all their clients in the blogroll; it seems totally crazy. Yet this stuff seems to work. Do any of you do this, and if so, how do you keep it white hat?
-
Boy, you have to be careful with that approach. I've seen it done successfully, but the amount of work it takes to do it right is outrageous.
I do keep a list of link building contacts around to do the same sort of thing. If you can build a solid relationship with blog owners, many will eventually just give you the WordPress login and let you publish on your own. If you deliver really solid content as guest posts, I often see bloggers open up to more guest posts. At this point, I'm pretty confident I could get a hundred or so links up, no problem, using these contacts. I guess this is how I keep the same sort of approach white hat.
In the end, if you're constructing a link network for your own clients, it's far too easy to make a mistake. Maybe too many of the sites sit on the same host, or even use the same AdWords login. Google can pretty easily tell when the same person is involved in a large number of sites linking to one site, and if they catch on, you're pretty likely to get a penalty.
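If you do keep sites like that, it's worth at least auditing the most obvious footprint yourself. Here is a minimal sketch (the domain list is a made-up placeholder) that flags domains in a portfolio resolving to the same IP address:

```python
# Rough sketch: flag domains that resolve to the same IP address, one of
# the easiest footprints to spot. The domain list is a placeholder.
import socket
from collections import defaultdict

domains = ["example-blog-one.com", "example-blog-two.com", "example-blog-three.com"]

ip_to_domains = defaultdict(list)
for domain in domains:
    try:
        ip_to_domains[socket.gethostbyname(domain)].append(domain)
    except socket.gaierror:
        pass  # domain did not resolve; skip it

for ip, hosts in ip_to_domains.items():
    if len(hosts) > 1:
        print(f"Shared-host footprint on {ip}: {', '.join(hosts)}")
```

A clean result here proves little on its own; shared analytics accounts, WHOIS details, and ad logins leave the same sort of trail.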
Mike
Related Questions
-
Canonical & noindex used on the same page?
I have a real estate company, www.company.com, with approximately 400 agents. When an agent gets hired, we allow them to pick a URL, which we then register and manage: for example, www.AGENT1.com. We then 301 redirect that agent domain to a subdomain of our main site, so Agent1.com 301s to agent1.company.com. Each page on the agent subdomain carries a canonical back to the corresponding page on www.company.com; for example, agent1.company.com canonicalizes to www.company.com.
What happened is that Google indexed many URLs on the subdomains and seemed to ignore the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then crawled the main URL later. By that point, the two pages looked quite different from one another, so Google did not recognize/honor the canonical. For example: agent1.company.com/category1 gets crawled on day 1, and company.com/category1 gets crawled 5 days later. The content (recently listed properties for sale) on these category pages changes every day, so if Google crawled both the subdomain and the main domain on the same day, the content would look identical; if the URLs are crawled on different days, it will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site "fixed". Since blocking the subdomains, we have seen a small decrease in organic traffic from Google to our main site, whereas our Bing traffic has dropped almost 80%. After a couple of months, we now have the main site mostly "fixed", and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is being wasted by the robots.txt block.
Here is my question: if we put a robots noindex meta tag on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically, I want the link juice from the subdomains to pass to our main site, but I do not want the subdomain pages competing with our main site for a spot in the search results. Another thought I had was to place the noindex tag only on the category pages (the ones that change every day) and leave it off the property detail pages, which rarely change. Thank you in advance for any insight.
Intermediate & Advanced SEO | EasyStreet
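For a situation like this, a quick crawl of a few subdomain pages can confirm which tags they actually serve, before and after any change. A minimal sketch, assuming placeholder URLs and the requests and BeautifulSoup libraries:

```python
# Rough sketch: report the canonical URL and robots meta tag each page
# serves. The page list is a placeholder for the agent subdomain URLs.
import requests
from bs4 import BeautifulSoup

pages = [
    "http://agent1.company.com/category1",
    "http://agent1.company.com/category2",
]

for url in pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  canonical:", canonical["href"] if canonical else "none")
    print("  robots:   ", robots["content"] if robots else "none")
```

-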
Webmaster tools: which one do you use? Yandex Yay or Nay?
I usually verify websites in Google and Bing Webmaster Tools. How important is it to verify in Yandex Webmaster if Russia is not one of the targeted locations?
Intermediate & Advanced SEO | selectitaly
-
Are all duplicate content issues bad? (Blog article tags)
If so, how bad? We use tags on our blog, and this causes duplicate content issues. We don't use WordPress, but with such a widely used CMS having the same issue, it seems quite plausible that Google is smart enough to deal with duplicate content caused by blog article tags and not penalise it at all. It has been discussed here, and I'm ready to remove tags from our blog articles or monitor them closely to see how it affects our rankings. Before I do, can you give me some advice around this? Thanks,
Daniel.
Intermediate & Advanced SEO | Daniel_B
-
Should we use the rel-canonical tag?
We have a secure version of our site, as we often gather sensitive business information from our clients. Both the https and the http versions of our site have been indexed. Could it be a problem to have an http and an https version of our site indexed by Google? Is this seen as a duplicate site? If so, can it be resolved with a rel=canonical tag pointing to the http version? Thanks
Intermediate & Advanced SEO | annieplaskett
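One quick check for a situation like this is to see how each scheme responds. A minimal sketch with a placeholder hostname:

```python
# Rough sketch: check whether one scheme redirects to the other, or whether
# both serve a 200 (the duplicate-indexing case). Hostname is a placeholder.
import requests

for scheme in ("http", "https"):
    url = f"{scheme}://www.example.com/"
    resp = requests.get(url, timeout=10, allow_redirects=False)
    print(url, "->", resp.status_code, resp.headers.get("Location", ""))
```

If both schemes return 200 and neither declares a canonical, both versions can be indexed as duplicates.

-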
SEO-friendly blog
I've read somewhere that if you list too many links/articles on one page, Google doesn't crawl all of them. In fact, Google will only crawl up to 100 links/articles or so. Is that true? If so, how do I go about creating a page or blog that is SEO friendly and capable of being completely crawled by Google?
Intermediate & Advanced SEO | greenfoxone
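Whatever the exact threshold, it is easy to count how many links a page actually exposes. A minimal sketch with a placeholder URL:

```python
# Rough sketch: count the anchor links on a single page, to compare against
# the oft-quoted 100-links-per-page guideline. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/blog/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
links = soup.find_all("a", href=True)
print(f"{url} exposes {len(links)} links")
```

-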
Should I be submitting new blog entries to any directories?
We recently relaunched our blog, blog.towelsrus.co.uk, and this is personally a new venture for me. Are there sites I should be telling that we have new content (we have yet to publish any, but are close)? As it's a WordPress blog, are there any plugins that do this automatically?
Intermediate & Advanced SEO | Towelsrus
-
Is blogging enough to keep my site fresh?
I know that one of the things Google looks for is fresh content. My main site is a WordPress site where I have optimized about 12 static pages, and I also do blog posts. Assuming I continue to blog, do I need to give my pages a periodic "face lift", or is it enough to just keep blogging and add a static page from time to time? Paul
Intermediate & Advanced SEO | diogenes
-
Duplicate Content on Blog
I have a blog I'm setting up. I would like to have a mini about block on every page that gives very brief information about me and my blog, as well as a few links to the rest of the site and some social sharing options. I worry that this will get flagged as duplicate content, because a significant number of my pages will contain the same information at the top of the page, front and center. Is there anything I can do to address this? Is it as much of a concern as I'm making it? Should I look for a JavaScript/Ajax method for loading that content into the page dynamically only for normal browser pageviews? Any thoughts or help would be great.
Intermediate & Advanced SEO | grayloon