Damage Control
-
So here's the deal: I HAVE to link my external website, www.ldnwicklesscandles.com, and any shopping/ecommerce sections of it, to my company-sponsored website, ukwicklessscents.scentsy.co.uk, so that all transactions go through that.
Why run a separate website, you might ask? Our company websites are extremely limited in the design and content we're allowed to add (blogs and other content that would help with long-tail searches), so most of us involved with the company build external sites and link them to our company sites.
Which means my link juice is spread over the two sites, because so much of the external site is linking out; in effect, I've got to work twice as hard.
Although my hands are really tied on a lot of this, I'm wondering if there's anything you might recommend to lessen the damage, so to speak. Maybe changing the navigational structure of my external website so I'm only linking out when absolutely necessary?
I've been reading about navigational structure, and it's mentioned that the home page should only link out to the most important pages. Would this work?
Or maybe none of this is something I should worry about? Some sites ranking highly for my keywords, like scentcity.com, use a left sidebar the way I do, with a main page full of links to the owner's company-sponsored site, and they seem to rank despite all this.
Any suggestions or advice would be appreciated. x
-
The more you can self-contain the links, the better. If "too many" of your links are outbound, that can become confusing or annoying to site visitors. The same applies to search engines trying to work out "what is this site about?" and "why are so many links pointing to this other site?"
So always be mindful of that, while maintaining the functionality you need.
It's a mess as you well know. Yet keeping the mess to a minimum is vital.
Ultimately, whatever the linking structure, results will come down to how well optimized the external site is. And the more confused the link structure is, the more other SEO work you'll need to do to compensate and keep the topical focus strong.
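If it helps to picture the idea, here's a minimal sketch; the page names are invented for illustration, not taken from your actual site. The navigation stays entirely internal, and the one necessary outbound link to the replicated company site is isolated and (optionally) tagged so it doesn't bleed link equity:

```html
<!-- Main navigation: every link stays on the external site itself -->
<nav>
  <a href="/">Home</a>
  <a href="/blog/">Blog</a>
  <a href="/scent-guides/">Scent Guides</a>
  <a href="/about/">About</a>
</nav>

<!-- The single outbound link, used only where a transaction must happen.
     rel="nofollow" asks search engines not to pass link equity through it. -->
<a href="https://ukwicklessscents.scentsy.co.uk/" rel="nofollow">Shop now</a>
```

Whether to add rel="nofollow" is a judgment call; it keeps the equity question simple for search engines, and it doesn't stop visitors from clicking through.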
Related Questions
-
Control indexed content on Wordpress hosted blog...
I have a client with a blog set up on a subdomain of their domain (example: blog.clientwebsite.com), and even though it loads at that subdomain, it's actually a WordPress-hosted blog. If I attempt to add a plugin like Yoast SEO, I get an error message. Their technical team says this is a brick wall for them, and they don't want to change how the blog is hosted. So my question is: on a subdomain blog like this, if I can't control what is in the sitemap with a plugin and can't manually add a sitemap because the content is being pulled from a WordPress-hosted install, what can I do to control what is in the index? I can't add an SEO plugin, I can't add a custom sitemap, and I can't add a robots.txt file. The blog is set up with domain mapping, so the content isn't actually there. What can I do to prevent tags, categories, author pages, archive pages, and other useless content from ending up in the search engines?
Technical SEO | ShawnW
-
Creating a Landing Page with a Separate Domain to Control Bounce Rate
I work with a unique situation where we have a site that gets tons of free traffic from internal free resources. We do make revenue from this traffic, but due to its nature, it has a high bounce rate. Data shows that once someone from this source does click a second page, they are engaged; visitors either bounce or click multiple pages. After testing various landing pages, I've determined that the best solution would be to create a landing page on a separate domain and hide it from the search engines (to prevent duplicate content and the appearance of link farming). The theory is that once visitors click through to the site, they will bounce at a lower rate and improve the stats of the website; the landing page would essentially filter out this bad traffic. My question is: how sound is this theory? Will it cause any issues with Google or any other search engines?
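For reference, hiding a landing page like that from search engines usually comes down to a robots meta tag; this is a sketch, assuming the page is plain HTML you control:

```html
<!-- In the <head> of the separate-domain landing page:
     keep it out of search indexes and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```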
Technical SEO | jhacker
-
Unused URL 'A' contains a frameset - can it damage the other site B?
Client has an old, unused site 'A' which I've discovered during my backlink research. It contains the source code below, which frames the client's 'proper' site B inside the old URL A in the browser address bar. Quick question: will Google penalise website B, which is the one I'm optimising? Should the client be using a redirect instead?

<frameset border='0' frameborder='0' framespacing='0'>
  <frame src="http://www.clientwebsite.co.ukB" frameborder="0" noresize="noresize" scrolling="yes">
  <noframes>Please go to http://www.clientwebsite.co.ukB</noframes>
</frameset>

Thanks, Lu.
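For comparison, the redirect alternative would be a one-line server rule on site A; the domain below is a placeholder, not the client's real URL:

```apache
# .htaccess on unused site A: a 301 tells search engines the site
# has permanently moved to site B, instead of framing site B's content
Redirect 301 / http://www.site-b-example.co.uk/
```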
Technical SEO | Webrevolve
-
Best practices for controlling link juice with site structure
I'm trying to do my best to control the link juice from my home page to the most important category landing pages on my client's e-commerce site. I have a couple of questions about how NOT to pass link juice to insignificant pages, and how best to pass juice to my most important pages. INSIGNIFICANT PAGES: How do you tag links so they don't pass juice to unimportant pages? For example, my client has a "Contact" page off of their home page. We aren't trying to drive traffic to the contact page, so I'm worried about the link juice from the home page being passed to it. Would you tag the Contact link with a "nofollow" so it doesn't pass the juice, but then include it in a sitemap so it still gets indexed? Are there best practices for this sort of thing?
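For what it's worth, the tagging being asked about is just a rel attribute on the anchor; the path here is illustrative:

```html
<!-- Home-page link to a low-priority page; rel="nofollow" asks
     search engines not to pass link equity through this link -->
<a href="/contact/" rel="nofollow">Contact</a>
```

Note that nofollow doesn't deindex the page; it can still be indexed via the sitemap or other links pointing to it.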
Technical SEO | Santaur
-
Could getting referral traffic from SEOmoz damage your rankings?
Good morning from OS grid reference SE404481. Having just read Google's disavow-links documentation (https://support.google.com/webmasters/bin/answer.py?hl=en&answer=2648487), something troubled me. Is it possible that you could damage the PageRank of a site by adding its URL in these posts? Put another way: if I added a URL pointing to a specific site, would Google's radar detect the source as SEO-related and penalise the site in some way? Any insights welcome 🙂
Technical SEO | Nightwing
-
Why is Google not picking up my META description? Google itself is populating the description. How can I control these search snippets?
Technical SEO | greyniumseo
-
How damaging is duplicate content in a forum?
Hey all! I hunted around for this in previous Q&A questions and didn't see anything. I'm just coming back to SEO after a few years out of the field and am preparing recommendations for our web dev team. We use custom-coded software for our forums, and it creates a giant swathe of duplicate content, since each post has its own link. For example: domain.com/forum/post_topic, domain.com/forum/post_topic/post1, domain.com/forum/post_topic/post2, and so on. However, since every page of the forum defaults to showing 20 posts, every single forum thread that's 20 posts long has 21 different pages with identical content. Now, our forum is all user-generated content and is not generally a source of much inbound traffic, with occasional exceptions, but I was curious whether having a mess of duplicate content in our forums could damage our ability to rank well in a different directory of the site. I've heard that Panda is really cracking down on duplicate content, and the last time I was current on SEO trends, rel="canonical" was the hot new thing everyone was talking about, so I've got a lot of catching up to do. Any guidance from the community would be much appreciated.
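For reference, the rel="canonical" approach mentioned would look like this on each per-post URL, using the example paths from the question:

```html
<!-- In the <head> of domain.com/forum/post_topic/post1, /post2, etc.:
     point search engines at the main thread URL as the canonical copy -->
<link rel="canonical" href="http://domain.com/forum/post_topic">
```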
Technical SEO | TheEnigmaticT
-
How can I prevent sh404SEF Anti-flood control from blocking SEOMoz?
I'm using sh404SEF on my Joomla 1.5 website. Last week, I activated the security functions of the tool, which include an anti-flood control feature. This morning, when I looked at my new crawl statistics in SEOMoz, I noticed a significant drop in the number of webpages crawled, and I'm attributing that to the security configuration I made earlier in the week. I'm looking for a way to prevent this from happening so the next crawl is accurate. I was thinking of using sh404SEF's "UserAgent white list" feature. Does SEOMoz have a UserAgent string that I could try adding to my white list? Is this what you guys recommend as a solution to this problem?
Technical SEO | JBradySD