Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
-
We've been forced (for scalability) to completely restructure our website in terms of its page hierarchy.
For example, the old structure was:
country / city / city area
where we had about 3,500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in each city.
We needed to change the structure to be:
country / region / area / city / cityarea
So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we set about changing the physical on-page links too.
Unfortunately we still have a good 600 or 700 on-page links that point to the old pages and are caught by the 301 redirects, so we're slowly going through them to ensure each link goes to the new location directly (not via the 301).
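For anyone curious how we're flagging the stragglers, here's a minimal sketch - assuming Python with the requests and BeautifulSoup libraries, and with example.com plus a made-up city path standing in for our real URLs - that lists on-page links which still resolve via a 301:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "https://www.example.com"  # hypothetical stand-in for our domain

def links_via_redirect(page_url):
    """Return the internal links on page_url that answer with a 301."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    stale = []
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        if urlparse(target).netloc != urlparse(SITE).netloc:
            continue  # only interested in internal links
        # HEAD without following redirects, so a 301 shows up as a 301
        resp = requests.head(target, allow_redirects=False, timeout=10)
        if resp.status_code == 301:
            stale.append((target, resp.headers.get("Location")))
    return stale

# e.g. check one of the old-style country/city/cityarea pages (path is made up)
for link, destination in links_via_redirect(SITE + "/uk/london/soho"):
    print(f"update {link} -> {destination}")
```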
So my question is (sorry for the long waffle):
Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?
Thanks for any help anyone can give.
-
Thanks Everett - sorry about the delay in coming back to your response.
This 301 issue was one of the things we were worried about (along with a ton of others), so we can at least be a little reassured that we're progressing on all fronts and not leaving a gaping problem that will continue to dog us.
Cheers
W
-
I'm just going to answer your question directly. This was your question:
"Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?"
Short Answer: As long as you are working to update those internal links, and you have 301 redirects in place in the meantime, you should be fine.
Technically speaking, it is best practice to link directly to the page internally rather than relying on 301 redirects. Yes, it is true that a very small (very, VERY small, so as to be virtually undetectable) amount of pagerank is lost when redirecting, but it only becomes an issue when you begin adding redirect on top of redirect. Keeping your house clean, so to speak, by not relying on redirects to fix your broken internal links will keep this from happening - and that tiny pagerank loss is said to exist precisely to discourage webmasters from relying on redirects to fix broken internal links, if you believe Matt Cutts.
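If you ever want to check whether your redirects have started stacking, it's easy to spot programmatically. A rough sketch (Python with the requests library; the URL is hypothetical, not taken from your site):

```python
import requests

def redirect_chain(url):
    """Follow url to its final destination, returning every hop."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history lists each intermediate redirect response, in order
    hops = [(r.status_code, r.url) for r in resp.history]
    hops.append((resp.status_code, resp.url))
    return hops

chain = redirect_chain("https://www.example.com/uk/london/taxis")
if len(chain) > 2:  # more than one redirect before the final response
    print("Chained redirects - consider collapsing them into a single 301:")
for status, url in chain:
    print(status, url)
```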
With that said, you may indeed have many other issues to deal with, as do most sites that have a geotargeted, deep URL structure like the one you have outlined. Panda slammed a lot of sites like that pretty hard. But all of that is beyond the scope of this question.
I hope you find whatever is wrong and get your traffic back. Good luck!
-
Hi Chris
Thanks - I 'love' the loose MC videos - "it is - but it isn't an issue".
That was my gut feeling - that there may be a temporary loss of link juice, but that it would readjust after a period. Which means we have other issues.
Cheers
W
-
Thanks for your advice - I've amended the question so it is simpler to read. Sorry about that.
Well, that's what I thought - but anecdotal evidence (as well as past experience) is making me wonder whether we're losing a significant amount of link juice. We put the 301s in place about 6 or 7 months ago, so any loss of link juice between pages should have come back by now.
Maybe we have some other issues?
W
-
Agree with Chris, thumbs up. I would just add that "ideally" you would have manually gone through all the links ahead of time and had the 301s in place prior to launch. That way there is no downtime or confusion for Google over what it is supposed to do with these pages. If you think about it, you have 600 pages that are in limbo, and after a while Google will just say, "Well, I guess those pages are dead," and start to crawl them less often and eventually drop them.
I would make it a priority to go through those pages and set up the new 301s ASAP. Google will keep trying an old page for a while (a few months) whether it 404s or even if you have a 301. It knows that mistakes happen. So in the case of the 301, it will still crawl the old URL for a while even after it sees the 301 the first time, just to make sure that the 301 really is permanent. You have a bit of a grace period, so take advantage of it to get things cleaned up quickly.
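You can actually watch that grace period play out in your server logs. Here's a rough sketch - Python, assuming a standard combined-format access log, with the old-URL regex as a guess at your three-level structure - that lists Googlebot hits still landing on the old paths:

```python
import re

# crude pattern for a combined-format access log line
LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')
# the old three-level country/city/cityarea paths (pattern is a guess)
OLD_PATH = re.compile(r"^/[a-z-]+/[a-z-]+/[a-z-]+/?$")

with open("access.log") as log:  # hypothetical log file name
    for line in log:
        if "Googlebot" not in line:
            continue
        m = LOG_LINE.search(line)
        if m and OLD_PATH.match(m.group("path")):
            print(m.group("status"), m.group("path"))
```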
-
Hiya,
First off, let me post this video from Matt Cutts regarding 301 redirects: http://www.youtube.com/watch?v=Filv4pP-1nw
As long as the 301 is pointed towards either the same page or a page of equal value (content-wise), you should be good. Whilst going through them manually may lose you a bit of rank over time, at least you know you are directing to the correct pages.
Short answer:
Manual - short-term rank loss, long-term benefit
Auto - vice versa
Hope this helps
-
Hello,
I don't quite understand your question - if you are adding more category pages, you should end up with more pages, not fewer. Just make sure to 301 redirect every single old page and you shouldn't have a problem.
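"Every single old page" is worth verifying rather than assuming. A small sanity-check sketch (Python with the requests library; the CSV file name and layout are made up) that confirms each old URL 301s to the new URL you intended:

```python
import csv
import requests

# redirect_map.csv is a hypothetical file of rows: old_url,new_url
with open("redirect_map.csv") as f:
    for old_url, new_url in csv.reader(f):
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        if resp.status_code != 301:
            print(f"not a 301 ({resp.status_code}): {old_url}")
        elif location != new_url:
            print(f"301 to the wrong place: {old_url} -> {location}")
```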
I had to do something similar on one of my sites about 3 months ago, and I did lose pagerank on some pages, but rankings got better, so I wouldn't worry much about pagerank.
Cheers