Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
-
We've been forced (for scalability) to completely restructure our website's hierarchy.
For example, the old structure:
country / city / city area
where we had about 3,500 nicely interlinked pages for relevant things like taxis, hotels, apartments, etc. in that city.
We needed to change the structure to:
country / region / area / city / city area
So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to actually change the physical on-page links too.
Unfortunately we have left a good 600 or 700 on-page links that still point to the old pages and are only caught by the 301 redirects, so we're slowly going through them to ensure each link goes to the new location directly (not via the 301).
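(For anyone doing the same cleanup: finding the stragglers boils down to fetching each page and flagging any internal link that answers with a 301. A rough sketch of that check - purely illustrative, assuming Python with the requests and beautifulsoup4 libraries, and with example.com standing in for the real domain:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

SITE = "https://www.example.com"  # placeholder, not our real domain

def links_still_redirecting(page_url):
    # Fetch the page and pull out every anchor href
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if urlparse(link).netloc != urlparse(SITE).netloc:
            continue  # only internal links matter here
        # Don't follow redirects - we want the first status code only
        status = requests.head(link, allow_redirects=False, timeout=10).status_code
        if status == 301:
            print(page_url, "links via a 301 to", link)

links_still_redirecting(SITE + "/country/city/")

Run against each page in turn, that produces the worklist of links to repoint.)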
So my question is (sorry for the long waffle):
Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?
Thanks for any help anyone can give.
-
Thanks Everett - sorry about the delay in coming back to your response.
This 301 issue was one of the things we were worried about (along with a ton of others), so we can at least be a little self-assured that we're progressing on all fronts and not leaving a gaping problem that will continue to dog us.
Cheers
W
-
I'm just going to answer your question directly. This was your question:
"Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?"
Short Answer: As long as you are working to update those internal links, and you have 301 redirects in place in the meantime, you should be fine.
Technically speaking, it is best practice to link directly to the page internally rather than relying on 301 redirects. Yes, it is true that a very small (very, VERY small, so as to be virtually undetectable) amount of PageRank is lost when redirecting, but it only becomes an issue when you begin adding redirect on top of redirect. Keeping your house clean, so to speak, by not relying on redirects to fix your internal links will keep this from happening - and that tiny PageRank loss is said to exist precisely to discourage webmasters from relying on redirects to fix broken internal links, if you believe Matt Cutts.
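To make the "redirect on top of redirect" point concrete: you can walk a URL's redirect chain and count the hops; anything more than one hop is a chain worth flattening. A rough sketch of that check - again just illustrative, assuming Python with the requests library and a made-up URL:

import requests
from urllib.parse import urljoin

def redirect_chain(url, max_hops=10):
    # Follow redirects one hop at a time, recording each step
    hops = []
    while len(hops) < max_hops:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])
        hops.append((resp.status_code, url))
    return hops

# An old-structure URL should show exactly one 301, straight to the new page
print(redirect_chain("https://www.example.com/uk/london/soho-taxis/"))

A healthy migration prints a single-entry list for every old URL; two or more entries means a chain to collapse.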
With that said, you may indeed have many other issues to deal with, as do most sites that have a geotargeted, deep URL structure like the one you have outlined. Panda slammed a lot of sites like that pretty hard. But all of that is beyond the scope of this question.
I hope you find whatever is wrong and get your traffic back. Good luck!
-
Hi Chris
Thanks - I 'love' the loose Matt Cutts videos - "it is - but it isn't an issue".
That was my gut feeling - that there may be a temporary loss of link juice, but that it would readjust after a period. Which means we have other issues.
Cheers
W
-
Thanks for your advice - I've amended the question so it is simpler to read. Sorry about that.
Well, that's what I thought - but anecdotal evidence (as well as past experience) is making me wonder whether we're losing a significant amount of link juice. We put the 301s in place about 6 or 7 months ago, so any loss of link juice between pages should have come back by now.
Maybe we have some other issues?
W
-
Agree with Chris, thumbs up. I would just add that "ideally" you would have manually gone through all the links ahead of time and had the 301s in place prior to launch. That way there is no downtime or confusion for Google about what it is supposed to do with these pages. If you think about it, you have 600 pages in limbo, and after a while Google will just say, "well, I guess those pages are dead" and start to crawl them less often and eventually drop them.
I would make it a priority to go through those pages and set up the new 301s ASAP. Google will keep trying an old page for a while (a few months) if it 404s, or even if you have a 301. It knows that mistakes happen. So in the case of the 301, it will still crawl the old URL for a while even after it sees the 301 the first time, just to make sure that the 301 is really permanent. You have a bit of a grace period, so take advantage of it to get things cleaned up quickly.
-
Hiya,
First off, let me post this video from Matt Cutts regarding 301 redirects: http://www.youtube.com/watch?v=Filv4pP-1nw
As long as the 301 points to either the same page or a page of equal value (content-wise), you should be good. Whilst going through them manually may lose you a bit of rank in the short term, at least you know you are directing to the correct pages.
Short answer:
Manual - short-term rank loss, long-term benefit
Auto - vice versa
Hope this helps
-
Hello,
I don't quite understand your question. If you are adding more category pages, you should end up with more pages instead of fewer. Just make sure to 301 redirect every single old page and you shouldn't have a problem.
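"Every single old page" is easy to audit if you kept a list of old-to-new URLs from the migration. A hypothetical sketch of that audit - the mapping below is made up, and it assumes Python with the requests library:

import requests
from urllib.parse import urljoin

# Made-up old -> new mapping; substitute your real migration list
MAPPING = {
    "https://www.example.com/uk/london/soho/":
        "https://www.example.com/uk/greater-london/central/london/soho/",
}

for old, new in MAPPING.items():
    resp = requests.get(old, allow_redirects=False, timeout=10)
    target = urljoin(old, resp.headers.get("Location", ""))
    if resp.status_code != 301 or target != new:
        # Flags missing redirects, 302s, and redirects to the wrong page
        print("PROBLEM:", old, resp.status_code, target)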
I had to do something similar on one of my sites about 3 months ago, and I did lose PageRank on some pages, but rankings got better, so I wouldn't worry much about PageRank.
Cheers