Ahrefs - What Causes a Drastic Loss in Referring Pages?
-
While I was doing research on UK flower companies I noticed that one particular domain had great rankings (top 3), but has since slid quite a bit, down to page two. After investigating further I noticed that they had a drastic loss of referring pages, but an increase in total referring domains. See this screenshot from Ahrefs.
I took a look at their historical rankings (got them from the original SEO provider's portfolio) and compared it to the Wayback Machine. There did not seem to be any drastic changes in the site structure. My question is: what would cause such a dramatic loss in total referring pages while showing a dramatic increase in referring domains? It appears that the SEO company was trying to rebound from the loss of links, though.
Any thoughts on why this might happen?
-
The Google Disavow tool doesn't work like that. It won't actually remove links from any pages. It is basically just a signal to Google that you want those links to be treated as nofollowed. Ahrefs would have no clue whether something has been disavowed or not.
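For reference, the disavow file itself is just a plain-text list of URLs and domain: entries uploaded through Search Console, with lines starting with # treated as comments. A minimal sketch of building one - the domains and URLs below are made-up placeholders, not anything from the flower site in question:

```python
# Build a Google disavow file: one URL or "domain:" entry per line;
# lines beginning with "#" are comments ignored by Google.
spammy_domains = ["spam-directory.example", "link-farm.example"]    # hypothetical
spammy_urls = ["http://random-blog.example/paid-links-page.html"]   # hypothetical

lines = ["# Links reviewed before upload"]
lines += [f"domain:{d}" for d in spammy_domains]
lines += spammy_urls

with open("disavow.txt", "w") as handle:
    handle.write("\n".join(lines) + "\n")
```

Uploading (or later emptying) a file like this changes nothing in Ahrefs' crawl data, which is the point made above.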
-
Hi,
One thing that comes to mind is using the Disavow tool. The UK Flower company might have used the Disavow tool to remove all the referring pages that they thought were spam.
-
First thing that comes to mind is that maybe the site had a lot of site-wide links before. If it had 5,000 or 10,000 links coming from one single domain and that website went down, that would be a huge loss of referring pages in a short amount of time. Maybe they were in web directories and asked to be removed, while simultaneously attempting to build some high-quality backlinks from a wider set of referring domains?
It's all speculation of course, but plausible.
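To put rough, made-up numbers on that idea (purely illustrative, not the actual site's backlink data): dropping one domain that linked from every page can erase thousands of referring pages while barely denting referring domains, and a handful of new single-page links is enough to push the referring-domains count up at the same time.

```python
# Illustrative only: each entry maps a referring domain to its number of linking pages.
before = {
    "directory-with-sitewide-footer-link.example": 8000,  # one site-wide link
    "florist-blog-a.example": 3,
    "florist-blog-b.example": 2,
}
after = {
    "florist-blog-a.example": 3,
    "florist-blog-b.example": 2,
    "news-site.example": 1,       # new one-page links picked up
    "industry-forum.example": 1,  # while trying to rebound
    "magazine.example": 1,
}

for label, profile in (("before", before), ("after", after)):
    print(f"{label}: {len(profile)} referring domains, "
          f"{sum(profile.values())} referring pages")
# before: 3 referring domains, 8005 referring pages
# after:  5 referring domains, 8 referring pages
```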
Related Questions
-
Is it necessary to have unique H1s for pages in a pagination series (i.e. a blog)?
A content issue that we're experiencing is duplicate H1s across pages in a pagination series (i.e. a blog). Does each page within the pagination need a unique H1 tag, or, since each page has unique content (different blog snippets on each page), is it safe to disregard this? Any insight would be appreciated. Thanks!
-
Product pages - should the meta description match our product description?
Hi, I am currently adding new products to my website and was wondering, should I use our product description (which is keyword optimised) in the meta description for SEO purposes? Or would this be picked up by Google as duplicate content? Thanks in advance.
-
Is it okay to have "No Response" pages?
Hi all, I can see some "No Response" pages which give an error message ("Site cannot be reached") or keep loading but never finish. I got this list from the Screaming Frog spider tool. Do we need to fix these or ignore them? Thanks
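If it helps, one way to decide is to re-request those URLs outside the crawler and see whether they fail consistently (worth fixing or removing) or only failed during the crawl (safe to ignore). A rough sketch, assuming the "No Response" report has been exported to a plain list - the URLs and timeout are placeholders:

```python
import requests

# Placeholder list exported from the crawler's "No Response" report.
urls = [
    "https://www.example.com/page-a",
    "https://www.example.com/page-b",
]

for url in urls:
    try:
        response = requests.get(url, timeout=10, allow_redirects=True)
        print(url, "->", response.status_code)  # reachable after all
    except requests.RequestException as exc:
        # Consistent timeouts or connection errors here suggest a real problem
        # (dead page, firewall, or the server blocking the crawler).
        print(url, "-> no response:", type(exc).__name__)
```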
-
US domain pages showing up in Google UK SERP
Hi, Our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago other domains were added - US (.us), IE (.ie), EU (.eu) & AU (.com.au). Last year in July, we noticed that a few .us URLs were showing up in UK SERPs, and we realized the sitemap for the .us site was incorrectly referring to the UK (.com) site, so we corrected that and the .us URLs stopped appearing in the SERP. Not sure if this actually fixed the issue or whether it was coincidental. However, in the last couple of weeks more than three .us URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; they are far below the UK ones. Has anyone noticed similar behaviour and/or could anyone please help me troubleshoot this issue? Thanks in advance, R
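One thing worth checking, beyond PA, is whether the regional sites declare reciprocal hreflang annotations: they are the signal Google uses to decide which country version to show, and without them the .com and .us pages compete on ordinary ranking signals alone. A minimal sketch of what the tags look like for one page - the hostnames below are placeholders, not the real domains:

```python
# Generate reciprocal hreflang <link> tags for one page that exists on each
# regional domain. Every regional version should carry the full set,
# including a reference to itself.
versions = {
    "en-gb": "https://www.example.com/",     # UK site on the .com
    "en-us": "https://www.example.us/",
    "en-ie": "https://www.example.ie/",
    "en-au": "https://www.example.com.au/",
}

for lang, url in versions.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
```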
-
How to find which keywords bring traffic to a particular page on my website?
I have been using Google Analytics and the SEOmoz tools for a while now. I know which are my top landing pages and some of the keywords which bring me traffic. But I don't know the top searched keywords for my website, as these are "not provided" by Google Analytics. More importantly, I want to know which keywords are directing traffic to a particular page on my website. Can anyone help?
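(Since this was asked, Google has exposed per-page query data in Search Console, and it can also be pulled programmatically. A rough sketch against the Search Analytics API, assuming a verified property and service-account credentials are already set up; the property URL, page URL, dates, and file path are placeholders.)

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file; the account must have access to the property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Queries (keywords) that drove impressions and clicks to one specific page.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "page",
                "operator": "equals",
                "expression": "https://www.example.com/landing-page/",
            }]
        }],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```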
-
Does Google index non-public pages, i.e. members-only logged-in pages?
Hi, I was trying to locate resources on how much the Google bot indexes in order to qualify a 'good' site on their engine. For example, our site has many pages that are associated with logged-in users and not available to the public until they acquire a login username and password. Although those pages show up in Google Analytics, they should not be made public in the Google index, which is what is happening. In light of Google trying to qualify a site according to how 'engaged' a user is on the site, I would think that the activity on those member pages is very important. Can anyone offer suggestions on how Google treats those pages, since we are planning to do further SEO optimization of them? Thanks
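On the "should not be in the index" part: the usual fix is to serve a noindex directive on the members-only URLs (a robots meta tag in the HTML or an X-Robots-Tag response header) rather than blocking them in robots.txt, so Google can still fetch them and see the directive. A hedged sketch of the header approach - Flask and the /members/ path are placeholder assumptions, not the poster's actual stack:

```python
from flask import Flask, Response, request

app = Flask(__name__)

@app.after_request
def noindex_member_pages(response: Response) -> Response:
    # Hypothetical rule: anything under /members/ is login-gated, so tell
    # crawlers not to index it even if they discover the URL.
    if request.path.startswith("/members/"):
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/members/dashboard")
def dashboard():
    return "Members-only content"  # placeholder page
```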
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages.

Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of that I can have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beaten it?

(Of course the next thing is Roger might look at Google results and start crawling them too, LOL)

P.S. The reason I am not asking this question in the Google forums is that others have asked this question many times and nobody at Google has bothered to answer, over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
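One approach that can work without any access to the CDN side, assuming the CDN simply passes the origin's response headers through, is to send a canonical Link header from the origin: the same header then gets served on the CDN hostname, and Google honours rel="canonical" sent as an HTTP header the same as the tag, so the CDN copies should consolidate back to the real URLs. A sketch under those assumptions - the framework and hostname are placeholders:

```python
from flask import Flask, Response, request

app = Flask(__name__)
CANONICAL_HOST = "www.example.com"  # the hostname that should be indexed (placeholder)

@app.after_request
def add_canonical_header(response: Response) -> Response:
    # Sent on every HTML response; when the same page is fetched via the CDN
    # hostname, this header still points Google at the original URL.
    if response.content_type and response.content_type.startswith("text/html"):
        response.headers["Link"] = (
            f'<https://{CANONICAL_HOST}{request.path}>; rel="canonical"'
        )
    return response

@app.route("/")
def home():
    return "<html><body>Home page</body></html>"  # placeholder page
```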
-
Using Brand Name in Page titles
Is it a good practice to append our brand name at the end of every page title? We have a very strong brand name but it is also long. Right now what we are doing is saying:
Product Name | Long brand name here
Product Category | Long brand name here
Is this the right way to do it, or should we just be going with ONLY the product and category names in our page titles? Right now we often exceed the 70-character recommendation limit.
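One simple way to handle the length trade-off is to append the brand only when the combined title still fits within the limit you're already using as a benchmark, and otherwise fall back to the product or category name alone. A small sketch with made-up names - the brand string and the 65-character cut-off are placeholder assumptions:

```python
BRAND = "Long Brand Name Here"  # placeholder brand suffix
MAX_LENGTH = 65                 # assumed display cut-off, tune to taste

def build_title(page_name: str) -> str:
    # Append the brand only when the full title still fits.
    with_brand = f"{page_name} | {BRAND}"
    return with_brand if len(with_brand) <= MAX_LENGTH else page_name

for name in ["Blue Widget Pro 3000",
             "Extra Long Product Name That Describes Every Feature In Detail"]:
    title = build_title(name)
    print(f"{len(title):>3}  {title}")
```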