301s & Link Juice
-
So let's say we have a site with 0 PageRank (it's fairly new): few incoming links, nothing significant compared to other sites.
Now, from what I understand, link juice flows throughout the site. This site is a news site that writes sports previews, predictions, and so on. After a while, a game from 2 months ago gets 0 hits, 0 search queries; nobody cares. Wouldn't it make sense to take that kind of expired content and 301 it to a different page? That way the more relevant content gets the juice, giving it a better ranking...
Just wondering what everybody's thoughts are on this link juice thing, and what am I missing...
-
Lots of interesting ideas. Thank you, everyone.
-
A 301 simply redirects a request to a new URL. If the page has no external links, then a 301 will do nothing for you. If you don't want the page, delete it, remove any internal links, and you're done.
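Mechanically, a 301 is nothing more than an HTTP status line plus a `Location` header pointing at the new URL. A minimal sketch in Python's WSGI interface (the paths and domain here are hypothetical, just to illustrate an expired preview redirecting to a current page):

```python
def redirect_app(environ, start_response):
    """Minimal WSGI app: permanently redirect one expired article
    to a newer page; serve everything else normally."""
    # Hypothetical mapping: expired game preview -> current previews page.
    old_path = "/previews/bears-vs-packers-2012"
    new_url = "http://example.com/previews/this-week"
    if environ.get("PATH_INFO") == old_path:
        # The 301 status plus the Location header is the entire redirect.
        start_response("301 Moved Permanently", [("Location", new_url)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Regular page content"]
```

Any real CMS or web server can issue the same response with a rewrite rule; the point is only that a 301 carries no content itself, just a pointer.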
Each redirect leaks link juice.
If you have links pointing to page A, and you 301 page A to page B, then the link juice will go to page B, but a bit is lost along the way; in fact you lose it twice, once for the link and once for the 301 redirect. If the only links are internal, why not just link to page B in the first place? But I would not remove the page: all pages have some PageRank to start with, so the more pages on your site, the more PR, though also the more pages to share it across. With smart linking you can sculpt more of that PR onto the pages you want it on and less onto the ones you don't.
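The "lose it twice" point can be illustrated with a toy calculation. This is a deliberately simplified model (not Google's actual algorithm): assume each hop, whether a link or a 301, keeps only the usual PageRank damping factor of about 0.85 of the juice that arrives at it.

```python
DAMPING = 0.85  # the commonly cited PageRank damping factor

def juice_after_hops(initial_juice, hops):
    """Toy model: each hop (a link or a 301 redirect) passes on
    only DAMPING of the juice that reaches it."""
    juice = initial_juice
    for _ in range(hops):
        juice *= DAMPING
    return juice

# A link straight to page B: one hop.
direct = juice_after_hops(1.0, 1)        # 0.85
# A link to page A, which 301s to page B: two hops.
via_redirect = juice_after_hops(1.0, 2)  # 0.7225
```

Under these assumptions the redirected path delivers roughly 15% less than linking to page B directly, which is the whole argument for fixing internal links at the source rather than leaning on 301s.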
Read this simple explanation http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
-
When I first read your question, the first thing I thought of was just recycling the URLs each sports season...
For instance, each year the Bears play the Packers, so in 2013 you write up your prediction on the page mypredictionsite.com/bears-vs-packers.html, and the page hangs around until 2014, when you rewrite and republish it for that year's game. Be sure to use schema and other tags to assign a recent date for the page update, and put something big and bold at the top of the page so people know which season the prediction is for. (Maybe also a tally of how well you did predicting their previous matches.)
That way any links it picks up over the years are pertinent to the Bears playing the Packers, and that could help with ranking. Also, you don't have to keep track of a perpetually growing collection of 301s.
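A sketch of the kind of schema.org date markup mentioned above, generated here with Python's `json` module (the site URL, headline, and dates are made up for illustration):

```python
import json

# Hypothetical data for the recycled prediction page.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Bears vs. Packers: 2014 Prediction",
    "url": "http://mypredictionsite.com/bears-vs-packers.html",
    "datePublished": "2013-11-01",
    "dateModified": "2014-11-01",  # bumped each season when the page is rewritten
}

# This JSON-LD string would go in a script tag in the page's head.
json_ld = json.dumps(article, indent=2)
print(json_ld)
```

The key detail is `dateModified`: it gets updated every season when the page is rewritten, which signals freshness without changing the URL that has been collecting links.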
Just a thought...
-
For me, I would need to know that the links had variation. Say for pages A, B, C, D, and E you have a week or two in between them: page A runs on the 1st and picks up a couple of links, page B runs on the 15th and gets a couple, page C runs on the 30th and gets a couple, and so forth.
For the links to be truly helpful (at least past some threshold I could not pin down for you), they cannot be the same couple of links to each page. If they are varied, I can see this approach having validity, but if you keep getting links to the pages from the same site/page/person, I think it has to ring a spam bell at some point. PLEASE NOTE: I cannot point you to a source for this that I am aware of, so feel free to test it out. I am just going on gut here.
Thanks
-
Hi, thanks for your response. I agree with what you're saying. The thing is, though, my idea was to 301 all the articles that are no longer relevant to the "This week's previews" page, so basically all the old articles would be 301ed to one link. It wouldn't be a 301 to one page, then that page 301ing to another page, etc.
You know what I'm getting at? Sorry if I'm making it confusing.
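The "many old URLs, one destination" setup described above can be kept chain-free with a simple redirect map. A hypothetical sketch (made-up paths) that also flattens any accidental 301-to-301 chains, so every old article points directly at the final page:

```python
# Hypothetical map of expired article URLs to their 301 targets.
redirects = {
    "/previews/game-2012-01": "/previews/game-2012-02",  # accidental chain
    "/previews/game-2012-02": "/previews/this-week",
    "/previews/game-2012-03": "/previews/this-week",
}

def flatten(redirects):
    """Rewrite every entry to point at its final destination, so no
    request ever hops through more than one 301."""
    flat = {}
    for old in redirects:
        target = redirects[old]
        seen = {old}
        # Follow the chain until we reach a URL that is not itself redirected
        # (the seen-set guards against accidental redirect loops).
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flat[old] = target
    return flat

flat = flatten(redirects)
# After flattening, every old URL 301s straight to /previews/this-week.
```

Running something like this over the redirect table periodically keeps each old article exactly one hop from the current previews page.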
-
ravashjalil
With a news/sports site in particular, you are going to have continuous stories you are writing. When you start chaining one 301 to another to another to another... sooner or later it is going to look like THE SPAM CITY GAZETTE. You do not want a site like that. With a news or sports story, unless it is huge and it is your byline, etc., you are not going to have enough juice on any given page to really benefit you. Ultimately you would just be attempting to move juice from internal pages to other internal pages.
You are better served doing it the old-fashioned way: keep writing great content, archive the older stuff, and let the visitors do their bit. If a page gets a lot of links coming to it, you might want to leave it alone, as people seem to want to read that page, not get sent somewhere else. Best,
Robert
-
Check the page authority of that page. If you have created new content and the old page has some page authority, I would surely suggest redirecting it to the new one. It will pass link juice.