Does link juice pass along the URL or the folders? 10yr old PR 6 site
-
We have a website that is ~10 years old and a PR 6. It has a bunch of legitimate links from .edu and .gov sites. Until now the owner has never blogged or added much content to the site. We have suggested that to grow his traffic organically he should add a WordPress blog and get aggressive with his content.
The IT guy is concerned about putting a wordpress blog on the same server as the main site because of security issues with WP. They have a bunch of credit card info on file.
So, would it be better to just put the blog on a subdomain like blog.mysite.com OR host the blog on another server but have the URL structure be mysite.com/blog?
We want to pass as much link juice to the main site as possible.
Any ideas?
-
This is very helpful information! I believe this is what the admin had proposed. I just wanted to double check with you guys.
I will have to check into the cc info. I am not sure exactly what they have.
Thanks!
-
hmmmm..... yeah I am not sure. I will check into that.
-
The reverse proxy capabilities of both Apache and IIS are designed to do exactly what you're trying to do, Jason. A reverse proxy allows you to host the WordPress installation on any server, then proxy it so that, to users, it appears to be served from yourdomain.com/blog.
You definitely want the new blog to sit at yoursite.com/blog if you want it to help the ranking value of the primary site.
Reverse proxies are not trivial to set up, but they're not that difficult for an experienced system administrator - especially in this case, since you are building the WordPress blog from scratch (far fewer redirection hassles).
As EGOL notes though - if you have actual credit card data stored, you'd better make sure it meets PCI compliance whether you do the reverse proxy or not. If you just mean you have PII (Personally Identifiable Information) like name, address, etc. on that server, then a reverse proxy can help keep potential WordPress security issues from compromising it.
Here's a Moz blog post/infographic on reverse proxies as a primer.
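To make that concrete, here's a minimal sketch of what the proxy rules might look like in Apache. This is hypothetical - it assumes mod_proxy and mod_proxy_http are enabled, and blog-backend.example.com is a placeholder for wherever the blog actually lives - so treat it as a starting point for your admin, not a finished config:

```apache
# Hypothetical sketch: serve a WordPress install hosted on a
# separate backend server as yoursite.com/blog.
# Assumes mod_proxy and mod_proxy_http are enabled;
# blog-backend.example.com is a placeholder hostname.
ProxyRequests Off
ProxyPreserveHost On

ProxyPass        /blog/ http://blog-backend.example.com/blog/
ProxyPassReverse /blog/ http://blog-backend.example.com/blog/
```

Note that WordPress itself also needs its Site Address set to yoursite.com/blog so the links and assets it generates point at the main domain rather than the backend server.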
Hope that helps?
Paul
-
Why do they have CC info on file? Are they PCI compliant?
I would get rid of the CC data or put it in the hands of a very secure service provider.
I would do that for security and so that I could place the blog in a folder on the primary domain.
-
If you can put the blog in a subdirectory such as www.mysite.com/blog, then that would be ideal because the link juice will be preserved on your site. If you put the blog in a subdomain like blog.mysite.com, then the search engines consider them to be two separate sites and thus the link juice is split between the two sites.