Best strategy for "product blocks" linking to sister site? Penguin Penalty?
-
Here is the scenario: we own several tennis-based websites and want to maximize traffic between them. Ideally we would have them ALL on one site/domain, but 2 of the 3 are a partnership we own 50% of, which is why they are on separate domains. The big question is: how do we link the "products" across the different websites without looking spammy? Here is the breakdown of sites:
Site1: Tennis Retail website --> about 1200 tennis products
Site2: Tennis team and league management site --> about 60k unique visitors/month
Site3: Tennis coaching tip website --> about 10k unique visitors/month
The interesting thing was that right after we launched the retail store (Site1), Google cranked up quickly and was sending upwards of 25k search impressions/day within the first 45 days. Orders kept trickling in, and the launch was doing well overall. Impressions peaked at about 60 days post-launch, then trickled down further and further, and are now at about 3k-5k/day. Many keyword phrases that were originally on page 1 (positions 6-10) are now on pages 3-8 instead.
The next step was to start putting "product blocks" (3 products per page) on Site2 and Site3 -- about 10k pages in total, with about 6 links per page off to the product pages (1 per product and 1 per category). We divided up about 100 different products to be displayed, which works out to roughly 2k links per product, depending on the page.
FYI, those original 10k pages on Site2 and Site3 already rank very well in Google and have been indexed for the past 2+ years. The most common word on the sites is "tennis", so the topics are very closely related.
Our rationale was that all the websites are tennis-related, and we figured links to the latest and greatest products would be useful to our audience. Pre-Penguin, we also figured this strategy would help us rank for these products when users search for them.
Since traffic has gone down and down and down from the peak 45 days ago, we are thinking Penguin doesn't like all these links. So what do we do now?
How do we fix it and make the Penguin happy? Here are a couple of my thoughts:
1. Remove the "category link" from each "product grouping", which would cut the link count by a third.
2. Place a "nofollow" on all the remaining product links. This would still let us get the user clicks while the user is on that page.
3. On the Site2 and Site3 homepages, feature 3 core products that change frequently (weekly) and showcase the latest products/deals. The thought is to NOT nofollow these links, since each is a homepage with only about 5 such links overall.
Heck, part of me has debated taking our top 1,000 pages (out of the 10k) and putting the links ONLY on those, distributing about 500 products across them. That would mean only 2 links per product, though still about 4k links in total. Still, I'm thinking #2 above could be better?
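The options above can be compared with a quick back-of-envelope tally of total followed cross-site links. A rough sketch, where all page and link figures are assumptions taken from this post (the post elsewhere estimates ~75k, so exact counts will vary):

```python
# Back-of-envelope comparison of followed cross-site link counts for the
# options discussed above. All page/link figures are rough assumptions.

def total_links(pages: int, links_per_page: int) -> int:
    """Total followed links created by a product-block rollout."""
    return pages * links_per_page

current = total_links(10_000, 6)    # ~60k followed links today
option_1 = total_links(10_000, 4)   # drop category links: cuts the count by a third
option_2 = 2 * 5                    # nofollow all inner-page links; ~5 followed links per homepage
top_1000 = total_links(1_000, 4)    # links on the top 1,000 pages only: ~4k

for label, count in [("current", current),
                     ("option 1 (no category links)", option_1),
                     ("option 2 (nofollow + homepages)", option_2),
                     ("top 1,000 pages only", top_1000)]:
    print(f"{label}: ~{count:,} followed links")
```

The spread between option 2 (a handful of followed homepage links) and everything else is large, which is worth keeping in mind when weighing the options.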
Any other thoughts would be great!
Thanks,
Jeremy
-
If you are saying you have 75k links cross-linking your domains, then I would agree. Product websites have been under attack for a while now. Tough one.
-
Thomas,
Thanks for your thoughts and good questions...
All 3 of the websites are hosted on separate dedicated Amazon AWS instances, each with its own independent IP address. It's possible Google could be picking up on something here, but I'm not sure.
Content is decent overall, but it needs more. With that many products, some of the content is the "default" manufacturer text, so we will need to beef it up and make the text more unique. As for product uniqueness, all the online tennis retailers sell the same products, so the key will be unique text, since the names and prices are all set by the manufacturers.
All of our product URLs are "search engine safe" and follow this pattern: /product/[mfg-name]/[product-name]/. I think we are in pretty good shape on URLs.
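For what it's worth, that URL pattern is easy to generate consistently. A minimal slug helper, purely illustrative of the pattern above (the function names are hypothetical, not from any particular platform):

```python
import re

def slugify(text: str) -> str:
    """Lowercase, replace runs of non-alphanumerics with hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def product_url(mfg: str, product: str) -> str:
    """Build a /product/[mfg-name]/[product-name]/ style URL."""
    return f"/product/{slugify(mfg)}/{slugify(product)}/"

print(product_url("Wilson", "ProStaff Six.One Tennis Racquet"))
# -> /product/wilson/prostaff-six-one-tennis-racquet/
```

Generating slugs from one canonical function like this avoids near-duplicate URLs for the same product, which matters once thousands of pages link in.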
Unfortunately, I think Google is making product pages harder to rank now, since there normally isn't much "content" to write about a Wilson ProStaff Six.One Tennis Racquet. My guess is this is also why they are charging for Google Merchant.
I am still thinking we should reduce the total number of followed links in our cross-linking. Right now, counting the product links from Site2 and Site3, it's about 75k. We'll also work on more unique content.
-
In my opinion, linking from sister sites can be done if the links are useful. Seomoz.org links to Mozcast.com; I link from my offsite blog to my main site. I believe that when a link is relevant, you can link it over. If the links are in footers, or added by a program that links every "keyword" mention to the other site, it looks bad. The best place for links is in the content, where they are relevant. Diversify the anchor text and make it more natural. Your hosting on these sites could also be affecting your linking. Are all the links from the same IP address? Are all of the links coming from within your 3 sites?
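One quick way to sanity-check anchor-text diversity is to tally the anchors from a link export (for example, a CSV download from Open Site Explorer). A rough sketch, assuming a two-column (anchor text, target URL) file; the file layout here is an assumption, so adjust the column index to match your export:

```python
import csv
from collections import Counter

def anchor_distribution(path: str) -> Counter:
    """Tally anchor text from a link export (assumed columns: anchor, target URL)."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row:
                counts[row[0].strip().lower()] += 1
    return counts

# Usage: flag anchors that make up an outsized share of all links.
# dist = anchor_distribution("links.csv")
# total = sum(dist.values())
# for anchor, n in dist.most_common(10):
#     print(f"{anchor}: {n} ({n / total:.1%})")
```

If one or two exact-match anchors dominate the distribution, that is the unnatural pattern to break up first.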
There are many other reasons your rankings could be dropping: duplicate content, thin content, bad links. For most product websites, the product description is either too thin or duplicated from every other description of that item online. How unique is your product content?
Whenever I approach a poorly performing website, I try to evaluate all the on-site elements first. Only after I've fixed the on-site elements do I look at linking problems.