Should the following URLs be added to a disavow file or not?
-
Hey Moz friends,
Should I include the following spam link URLs in my disavow file, or will Google handle them automatically? I have thousands of URLs like these:
web-seek.org/the_worlds_most_visited_web_pages_42.html
web-seek.net/the_worlds_most_visited_web_pages_42.html
websearching.net/the_worlds_most_visited_web_pages_42/
websearch.pl/the_worlds_most_visited_web_pages_42.html
web-search.net/the_worlds_most_visited_web_pages_42/
web-pages.org/the_worlds_most_visited_web_pages_42/
web-page.org/the_worlds_most_visited_web_pages_42.html
the-world.net/the_worlds_most_visited_web_pages_42.html
the-internet.tv/the_worlds_most_visited_web_pages_42.html
the-internet.in/the_worlds_most_visited_web_pages_42.html
the-globe.tv/the_worlds_most_visited_web_pages_42/
theglobe.sk/the_worlds_most_visited_web_pages_42.html
theglobe.ru/the_worlds_most_visited_web_pages_42.html
theglobe.pl/the_worlds_most_visited_web_pages_42.html
I hope you can suggest a solution. Looking forward to your response.
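For reference, Google's disavow file is a plain-text list with one entry per line: a full URL disavows a single page, and a `domain:` prefix disavows an entire domain (comments start with `#`). For a network like this, domain-level entries are the usual approach. A minimal sketch using a few of the domains above:

```text
# The Globe network - links never solicited by us
domain:web-seek.org
domain:web-seek.net
domain:theglobe.sk
domain:theglobe.ru
domain:theglobe.pl
```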
-
That is just not the case. I saw the same advice from Google, but disavowing Globe domains helps in a big way.
-
I have heard the same, but experience shows that disavowing them most certainly helps, as of 2/12/2020.
There is a free utility that can find every Globe domain in a Moz Linking Domains CSV file:
https://dynamic.domains/disavow-utility.zip
Just use the_worlds_most_visited in the keyword field and it will find every single Globe domain in your file.
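The filtering that utility performs can be sketched in a few lines of Python. Note this is an illustrative sketch, not the linked tool (I haven't inspected the zip), and the `Source URL` column name is an assumption about the Moz export's header:

```python
import csv
import io

def extract_disavow_domains(csv_text, keyword, url_column="Source URL"):
    """Return sorted, de-duplicated 'domain:' disavow lines for every row
    whose linking-page URL contains the given keyword."""
    domains = set()
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        url = row.get(url_column, "")
        if keyword in url:
            # Strip the scheme and path to keep only the bare hostname.
            host = url.split("//")[-1].split("/")[0]
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

# Tiny mock export (the column names here are assumptions, not Moz's real headers).
sample = """Source URL,Anchor Text
http://theglobe.sk/the_worlds_most_visited_web_pages_42.html,links
http://example.com/blog/post,seo
http://web-seek.org/the_worlds_most_visited_web_pages_42.html,links
"""
print(extract_disavow_domains(sample, "the_worlds_most_visited"))
```

The resulting lines can be pasted directly into a disavow file, since Google accepts one `domain:` entry per line.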
Sam
-
I hear what you are saying. As you say, there are people on both sides of the fence. I get rid of them, and I'm pretty sure I've seen examples where doing so has actually benefited results.
-
I disagree, but I know there are lots of people on both sides of the fence on this one!
"The Globe is a known spam network", so one would assume Google is aware of it and simply ignores it.
Google say (emphasis mine):
"Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google's opinion of a page or site. For example, you or a search engine optimizer (SEO) you've hired may have built bad links to your site via paid links or other link schemes that violate our quality guidelines." - https://support.google.com/webmasters/answer/2648487?hl=en
And John Mueller said (emphasis mine):
"Random links collected over the years aren't necessarily harmful, we've seen them for a long time too and can ignore all of those weird pieces of web-graffiti from long ago. Disavow links that were really paid for (or otherwise actively unnaturally placed), don't fret the cruft." - https://twitter.com/JohnMu/status/1088929651593039872
If you didn't go out and generate those links, I wouldn't worry about them.
Disavowing a known spam link site does not help Google clean anything up, either.
-
The Globe is a known spam network, and you should disavow links from all spam sites, at the least. There have been several posts here over the past few months from users either seeing a dive in their results (and when I check their backlinks, I find huge numbers of Globe ones) or sharing their own disavow files. In most instances, users have agreed that The Globe is a nasty, spammy network that looks like it is selling links, which could potentially (in some circumstances, not all) lead to negative SEO impacts.
If you identify pages as being part of The Globe network, I would strongly advise you to disavow them. We've had to disavow thousands for clients who were negatively impacted, and removing them did help.
They are spammy link lists. The pages serve no function or purpose other than (apparently) SEO. Google say time and again that these types of pages and links clutter up the web and cause lots of problems; they don't like them. Just because a web page isn't some kind of satanic gambling BDSM site doesn't mean it is a good link that you should keep.
These days, standards are higher (IMO). Say no to these kinds of links.
-
Thanks for getting back to me. Can you share any article or forum post where Google says it will discount these types of links and won't penalize a site for them?
-
Sites like that will not do any harm (though they won't do anything for your rankings either, to be fair). I wouldn't worry about disavowing; Google will be well aware of them and will discount them, and they won't penalise you.