Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
disavow link more than 100,000 lines
I received a huge amount of spammy links (most of them have a spam score of 100).

Currently my disavow file is around 85,000 lines, but I have at least 100,000 more domains that I should add.
All of them are domains; I don't have any individual backlink URLs in my file.

My problem is that Google doesn't accept a disavow file larger than 2MB and shows this message:

File too big: Maximum file size is 100,000 lines and 2MB

What should I do now?
@theuser23 said in disavow link more than 100,000 lines:

If you have long URLs, just divide the disavow file.

I wanted to say that Google allows uploading a disavow file per property. If you have links pointing to different pages, divide the spam links by those pages. For example, URL A might have 30,000 spam links, while URL B has 70,000. Note that Google does not support domain properties for disavow uploads.

You can post a link to your disavow file here.

I hope this helps.
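As a rough sketch of the "divide by target page" idea above: assuming you have a backlink export as `(source_domain, target_url)` pairs (the field layout and names here are illustrative, not from any particular tool), you could group the spam domains by the page they point to and write one disavow list per target:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_target(rows):
    """Map each target path to the set of `domain:` disavow entries pointing at it.

    rows: iterable of (source_domain, target_url) pairs.
    """
    groups = defaultdict(set)
    for source_domain, target_url in rows:
        # Use the target page's path as the grouping key.
        path = urlparse(target_url).path or "/"
        groups[path].add(f"domain:{source_domain}")
    return groups
```

Each resulting group can then be saved as its own .txt file and uploaded to the matching URL-prefix property.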
Hi friend,

The Google Disavow Tool accepts .txt files. If your file contains domains rather than individual pages, the .txt file will be about 200 KB, while the equivalent CSV can be about 1.5-1.8 MB. A .txt file is very light, unless your domains are longer than the Amazon River.

If you have long URLs, just divide the disavow file.

Top 100,000 domains (Alexa rank, .txt file, about 200 KB):
https://drive.google.com/file/d/1fuoeNM80DMGghd4CcQoLFC8OeKPCVxFQ/view?usp=sharing

I hope this helps. If you have any questions, you are welcome.
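The "divide the disavow file" advice can be sketched in Python. This is a minimal sketch, not an official tool: the filenames (`disavow.txt`, `disavow_partN.txt`) are illustrative, and the limits come straight from the error message quoted in the question:

```python
# Google's disavow upload limits, per the error message in the question.
MAX_LINES = 100_000
MAX_BYTES = 2 * 1024 * 1024  # 2 MB

def chunk_entries(entries, max_lines=MAX_LINES, max_bytes=MAX_BYTES):
    """Group disavow entries into chunks that each stay under both limits."""
    chunks, current, current_bytes = [], [], 0
    for entry in entries:
        size = len(entry.encode("utf-8")) + 1  # +1 for the trailing newline
        if current and (len(current) >= max_lines or current_bytes + size > max_bytes):
            chunks.append(current)
            current, current_bytes = [], 0
        current.append(entry)
        current_bytes += size
    if current:
        chunks.append(current)
    return chunks

def split_disavow(path="disavow.txt"):
    """Read a disavow file and write disavow_part1.txt, disavow_part2.txt, ..."""
    with open(path, encoding="utf-8") as f:
        entries = [ln.strip() for ln in f if ln.strip()]
    for i, chunk in enumerate(chunk_entries(entries), start=1):
        with open(f"disavow_part{i}.txt", "w", encoding="utf-8") as out:
            out.write("\n".join(chunk) + "\n")
```

Each part can then be uploaded to the appropriate property, since the limits apply per uploaded file.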
Related Questions
I'm having trouble removing SPAM from my site
I make a list of disavow links every week, but they keep popping up. Does anyone know a technique to remove them all at once?
SEO Tactics | | Gabriel1710 -
How to Diminish the spam score of the site?
I am here to discuss my problem. After the Google core update in October, my website's spam score is increasing rapidly. How can I control this problem? Please help me ASAP. My website name is Egypt Direct.
Link Building | | aprase1 -
block primary . xxx domain with disavow tool
Hi friends, I discovered a spam URL attack on my top pages with good Google positions for a specific keyword. Can I block a primary domain like .xxx with the disavow tool? There are hundreds of different domains, but the primary domain is always the same, for example like this: domain:xxx? Thanks
Intermediate & Advanced SEO | | netcomsia0 -
Should we Nofollow Social Links?
I've been asked the question of whether if we should nofollow all of our social links, would this be a wise thing to do? I'm not exactly getting a clear answer from search results and thought you guys would be best to ask 🙂 Thanks in advance.
Technical SEO | | JH_OffLimits0 -
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page and it uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/ The problem is that Google is indexing these links as pages in the SERPs. Of course when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com. Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file, however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | | UnbounceVan0 -
How to set up internal linking with subcategories?
I'm building a new website and am setting up internal link structure with subcategories and hoping to do so with best Seo practices in mind. When linking to a subcategory's main page, would I make the internal link www.xxx.com/fishing/ or www.xxx.com/fishing/index.html or does it matter? I'm just trying to avoid duplicate content I guess, if Google saw each page as a separate page. Any other cautions when using subdirectories in my navigation?
Technical SEO | | wplodge0 -
Drop Down Menu - Link Juice Depletion
Hi, We have a site with 7 top level sections, all of which contain a large number of subsections, which may then contain further subsections. To try and ensure the best user experience, we have a top navigation with the 7 top level sections and, when hovered, a selection of the key subsections. Although I like this format for the user, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links within every page on the site. In general each top section has a drop down with approx 10-15 subsections. This has therefore led to SeoMoz's tools issuing its too-many-internal-links warning. Alongside this, I am left wondering if I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top 5 subsections and place a link to them from our homepage, and by doing so I would be passing a greater amount of link juice down the line. So I guess my dilemma is between ensuring the user has as easy a time traversing the site as possible whilst I keep a close watch on where, and how, our link juice is distributed. One solution I am considering is whether nofollow links could be utilised within the drop down menus? This way I could have the desired user navigation and I would be in greater control of which pages link to which subsections. Would that even work? Any advice would be greatly appreciated, Regards, Guy
Technical SEO | | guycampbell1 -
How to find links to 404 pages?
I know that I used to be able to do this, but I can't seem to remember. One of the sites I am working on has had a lot of pages moving around lately. I am sure some links got lost in the fray that I would like to recover, what is the easiest way to see links going to a domain that are pointing to 404 pages?
Technical SEO | | MarloSchneider0