How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
-
Hi
How long, approximately, does Google take to pass authority via a 301 from an old page to its new replacement page?
Does Moz Page Authority reflect this in its score once Google has passed it?
All best,
Dan
-
I wouldn't get too hung up on the Moz timeline, since our data is only correlated with what Google does and is built to model it more broadly. If Google has crawled/cached the 404 and the page really is no longer in the index, then that page should stop inheriting and passing link equity. It can get complicated, because sometimes 404s have inbound links and other issues tied to them that can confuse the crawlers. So, I'd say it's situational.
Moz (specifically, OSE) can help you determine what links still exist to those URLs, which really should guide whether you let them stay 404s or 301-redirect them to something relevant. The other part of the decision is simply whether something relevant exists. If you've clearly built a page to replace the old one, then 301-redirect it. If the old page is something that ceased to exist for a reason, then a 404 is probably fine, unless that old page had a ton of inbound links. In that case, the 404 has essentially cut off those links.
The problem is that those inbound links are still out there, so it's not that the authority has ceased to exist. It's that you've basically cut the pipe through which the authority flows.
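If it helps, a quick way to see which of those old URLs are still 404ing and where any existing redirects land is a small script. Here's a minimal sketch using Python's requests library; the URL list is just a placeholder, so swap in the URLs from your OSE export:

```python
import requests

# Hypothetical list of old URLs flagged in your OSE export / crawl report
old_urls = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/old-page-2",
]

for url in old_urls:
    # Follow redirects so we see both the original response and the final landing page
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.history:
        print(f"{url} -> {resp.url} (first hop: {resp.history[0].status_code})")
    else:
        print(f"{url} -> {resp.status_code} (no redirect in place)")
```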
-
No, I want to 301 a page that became a 404 after the devs missed redirecting it during a site migration where the URLs changed. Moz says a few of these 404 URLs still have authority. I know I should 301 them anyway, even if they don't, but are they likely to still have authority according to Google? In other words, how long does an old page/URL with authority retain it after it becomes a 404?
-
Sorry, I'm a little confused - are you 301'ing to a 404? I'm not really sure I understand the situation.
-
Thanks, Dr. Pete!
What I meant was: how long, approximately, do you have to set up a 301 for an old page resolving in a 404 before it loses its Page Authority?
If you leave it a month, say, will it have lost the PA, or will it retain it and still transfer the PA once you 301 it, say 4 weeks or more after it became a 404 page?
Or is it more likely that Moz Analytics hasn't updated yet and is still attributing an authority score to the URL, when in fact Google is likely to have dropped its authority since it has been a 404 for 4 weeks?
Cheers,
Dan
-
I think our data is getting refreshed every couple of weeks at this point, but I'm not sure whether a 404 will drop a page from MozScape/OSE right away. I suspect 301s may be updated more quickly, since a 404 could be a temporary issue. Once the target of a 301 gets passed the PA, the original page should lose it.
-
And how long does it take for the PA on an old page to 'expire' if it hasn't been transferred?
Say the devs missed some pages in the migration (4 weeks ago) which are now resolving in a 404, and the Moz dashboard is still reporting them as having PA.
Will that authority still transfer if we set up a 301 four weeks later?
I know we should 301 anyway, to reduce 404s, but I'm just interested in knowing whether the page's old authority expires at any point (which I'm sure it must do eventually).
Thanks,
Dan
-
Great, thanks Dr. Pete!
-
It can vary quite a bit. The page has to be recrawled/recached, which can take anywhere from hours to weeks, depending on how much authority the page has. That's usually the big delay. After that, Google may on occasion delay passing authority, but we don't have proof of that (there are just cases where it seems like they do).
If it's just a handful of pages, re-fetch them through Google Webmaster Tools. It never hurts to kick the crawlers.
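Before you re-fetch, it's also worth confirming that each new 301 really does land on the intended replacement. A rough sketch along the same lines; the old-to-new mapping here is made up, so use your own:

```python
import requests

# Hypothetical mapping of old (currently 404) URLs to their intended replacements
redirect_map = {
    "https://www.example.com/old-widgets": "https://www.example.com/widgets",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    # A proper permanent redirect should show a 301 first hop and end on the expected URL
    is_301 = bool(resp.history) and resp.history[0].status_code == 301
    lands_on_target = resp.url.rstrip("/") == expected.rstrip("/")
    label = "OK" if (is_301 and lands_on_target) else "CHECK"
    print(f"[{label}] {old_url} -> {resp.url} ({resp.status_code})")
```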
-
Thanks, Keri.
Any idea, though, approximately how long Google takes to update/pass authority from an old page to a new one via a 301?
-
Moz Page Authority is a separate metric. Sadly, we have no pipeline from Google where they tell us exactly what they think of a site. We update our metrics about every month, so it may take a couple of months to see authority reflected on a new page in Moz.
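If you want to spot-check PA yourself between index updates, you can pull it from the Moz Links API instead of waiting on the dashboard. A rough sketch, assuming the v2 url_metrics endpoint and the page_authority field in the response; please verify both against the current API docs and plug in your own credentials and URLs:

```python
import requests

# Assumed endpoint and field names -- confirm against the current Moz Links API docs
API_URL = "https://lsapi.seomoz.com/v2/url_metrics"
ACCESS_ID = "your-access-id"     # placeholder credentials
SECRET_KEY = "your-secret-key"

targets = [
    "https://www.example.com/old-page",  # the URL that 404'd
    "https://www.example.com/new-page",  # its 301 target
]

resp = requests.post(API_URL, json={"targets": targets},
                     auth=(ACCESS_ID, SECRET_KEY), timeout=10)
resp.raise_for_status()

for result in resp.json().get("results", []):
    print(result.get("page"), "PA:", result.get("page_authority"))
```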