Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Cleaning Up a Spammy Domain vs. Starting Fresh with a New Domain
-
Hi - can you give me your opinion, please? If you look at murrayroofing.com and see the high spam score - and the fact that our domain has been put on some spammy sites over the years - would it be better and faster, in terms of placing higher in Google SERPs, to create a fresh new domain? My theory is we will spin our wheels trying to get unlisted from a lot of those spammy linking sites, and that it would be faster to see results using a fresh new domain than trying to clean up the current spammy domain. Thanks in advance - you guys have been awesome!!
-
Disavowing has nothing to do with traffic.
Disavowing is all about spam signals from spammy links. That and only that.
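For reference, the disavow file itself is just a plain text list (UTF-8, one entry per line) that you upload through Google Search Console's disavow tool. A minimal sketch - the domains below are placeholders, not links actually pointing at this site:

```
# disavow.txt - plain text, one entry per line
# Lines starting with # are comments and are ignored by Google.

# Discount every link from an entire domain:
domain:spammy-directory.example
domain:link-farm.example

# Or discount a single specific linking page:
http://spam-listings.example/roofing-links.html
```

Disavowing only asks Google to ignore those links as ranking signals - it does nothing to stop the sites from linking to you or sending traffic, which is the point above.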
-
Thanks again for all the advice - truly appreciated.
What are your thoughts on "disavowing" murrayroofing.com with Google, so that when it sends traffic to the new murrayroofingllc.com, Google will hopefully ignore it? Can you see our account in Moz? You can see the old domain is sending traffic since it is listed on the spammy sites.
-
You are always welcome.
If you have more questions, you can always hit me up on my Twitter @DigitalSpaceman
-
Thank you!!
-
Hard to say who is putting you on those websites, or why.
The only way to truly get rid of those backlinks is to reach out to those websites' owners. You'd obviously have to find someone who speaks the language.
Now, what you can do though is this:
- Disavow all those crappy links - that'll get Google to lower the "spam score" of your website;
- Block all traffic by IPs, geolocation and/or hostnames/referrers - that'll prevent the actual unrelated traffic (a rough sketch of the geolocation piece follows below)
That should clean it up pretty good.
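As a rough sketch of the geolocation piece: this assumes an Apache server with the legacy mod_geoip module and its country database installed - an assumption about the setup, and on Apache 2.4 the old Order/Deny directives also require mod_access_compat:

```
# Hypothetical .htaccess sketch - requires mod_geoip + a GeoIP database.
GeoIPEnable On

# Flag requests from countries you never do business in:
SetEnvIf GEOIP_COUNTRY_CODE CN BlockCountry

# Deny flagged requests (Apache 2.2-style syntax):
Order Allow,Deny
Allow from all
Deny from env=BlockCountry
```

If installing server modules isn't an option, a single country-blocking firewall rule in Cloudflare achieves the same thing once traffic is routed through it.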
Of course, that requires full control and ownership of that domain and website code. If you can't get that - again, my suggestion is just to part ways.
-
This is awesome info! Thank you. What are your thoughts on trying to get backlinks removed from sites in China where we have no way to contact them? None of the wording on the sites is in our language, and it seems like it would be impossible to get removed from some of them. Additional thoughts greatly appreciated. In analytics we see "more" traffic from China than the US.
I'm convinced a competitor may be listing us on these sites - or one of these SEO guys who get really pissed when we turn them down. Could they be out there putting our domain on listing sites?
-
Yeah, your suggestion makes sense.
Keep the old one while the new one is ranking up.
Now, here is the perfect scenario for you: keep working on the new site, and get full ownership of the old one. Then, through IP blocks, Cloudflare, removing all the spammy backlinks, etc., get rid of all or most of the spammy traffic and signals. And then redirect.
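When that final redirect step comes, it's a small server change. A sketch of what it could look like in the old domain's .htaccess, assuming the old site runs on Apache with mod_rewrite:

```
# Hypothetical sketch: permanently (301) redirect the cleaned-up old
# domain to the new one, preserving paths. Assumes Apache + mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?murrayroofing\.com$ [NC]
RewriteRule ^(.*)$ https://murrayroofingllc.com/$1 [R=301,L]
```

The 301 signals a permanent move, so whatever good signals remain on the old domain pass to the new one - which is exactly why the cleanup should happen before the redirect.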
-
Thank you again!
I should have been more clear: the old website gets traffic that does convert. If it loaded in under 10 seconds I'm sure a lot more would convert - super high bounce rate due to the slooooow loading of that site. But we do get "valid leads" every week from it. Not a lot of leads - maybe 5 a week - but our jobs are large-dollar jobs.
What is your thought on running both sites separately? We could go in and make sure they are not duplicates and assign different addresses and phone numbers to the old site. But this "seems" black hat. We would not be doing it to get both sites to rank, but just so we don't lose the traffic - then in a year or so get rid of it. What are your thoughts?
-
"... maybe a lot of traffic will convert. "
WILL convert? So it's not converting now? If so, it's kind of optimistic to think that will change, no?
Since you don't own the old domain, you can't really reliably do anything about it anyway.
At this point, I would say don't forward at all - start from scratch.
-
Thank you - yes, some of the traffic - maybe a lot of the traffic - will convert. The problem is old "printed" directories and other places where we can't update the domain. We get a lot of business from a printed catalog that won't change for a year or more.
I will look at the suggestions you made about IP limitations. The other issue is we don't "own" the original domain, so we have to ask the owner - who is also our IT guy - to change settings. This is another reason we bought the new domain.
Again thank you!
-
Couple ways you can go about it.
-
Is any of the traffic going to the old spammy domain any good? Does it convert? If not, then don't worry about redirecting - there wouldn't be any point, only spam signals.
-
If there is some good traffic, then do IP limitations, hostname limitations, etc. That can be done in .htaccess or on the server itself (a minimal sketch follows below). There are other, more elaborate ways to filter out spam traffic as well, but that depends on how familiar you or your IT guy are with them. One of the simplest solutions is to route all traffic through Cloudflare - it has quite nice spam filtering, and it's free.
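For what the .htaccess route can look like - a minimal sketch, assuming Apache 2.4 with mod_rewrite enabled; the referrer names and IP range are placeholders to be swapped for the actual offenders in your logs:

```
RewriteEngine On

# Turn away requests arriving from known spam referrers (placeholders):
RewriteCond %{HTTP_REFERER} spammy-directory\.example [NC,OR]
RewriteCond %{HTTP_REFERER} link-farm\.example [NC]
RewriteRule .* - [F,L]

# Block an abusive IP range outright (Apache 2.4 syntax):
<RequireAll>
  Require all granted
  Require not ip 203.0.113.0/24
</RequireAll>
```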
Hope this helps.
-
Thank you - we're talking about murrayroofingllc.com in particular. We are not sure how to forward the old domain to the new one - we "know how", we just don't know if we should. The reason we developed murrayroofingllc.com is because murrayroofing.com had a high spam score, and we got advice from this thread to go for a new domain.
Now the concern is: if we forward all the traffic from murrayroofing.com to murrayroofingllc.com, will the new domain murrayroofingllc.com be negatively affected by the spammy traffic? Somehow murrayroofing.com got on some spam sites, and we get a ton of spammy traffic from China. We don't want this traffic - and for these sites there is "no way" to ask them to remove our website from their spam sites in China.
All thoughts are welcome here-
-
Ta Larry
OK - nothing much of substance then. That said, if it were ranking, keeping it would be worth trying, as that is usually an easier or faster route to page 1.
Had a look at the Murray Roofing site, and it has not been optimised for the customer queries a roofing contractor would seek to rank for. As it seems you are keen to start afresh, you can do both in parallel - no harm to either.
That said, I would suggest you also look at your Google My Business structure - you're effectively a local play. Getting reviews and appearing in the local search pack for "roofing contractors Omaha" etc. is what we would consider a client priority.
All the best - go get them.
-
Only for a few, and we are in positions 49 and 50 for them.
-
Hi
Is the current site ranking for any terms of value?
-
Hi there,
Yes, absolutely get a new domain. If you look at DA, it's only 15 (not too bad in some cases). But if you look at the backlink profile, you'll see that most of the links are from listing sites - Homestead, YellowPages, EZlocal, etc. You can replicate that profile after a day of work. And, as you said, the spam score will only bring trouble.
Hope this helps.
Related Questions
-
Legacy domains
Hi all, A couple of years ago we amalgamated five separate domains into one, and set up 301 redirects from all the pages on the old domains to their equivalent pages on the new site. We were a bit tardy in using the "change of address" tool in Search Console, but that was done nearly 8 months ago now as well. Two years after implementing all the redirects, the old domains still have significant authority (DAs of between 20-35) and some strong inbound links. I expected to see the DA of the legacy domains taper off during this period and (hopefully!) the DA of the new domain increase. The latter has happened, although not as much as I'd hoped, but the DA of the legacy domains is more or less as good as it ever was. Google is still indexing a handful of links from the legacy sites, strangely even when it is picking up the redirects correctly. So, for example, if you do a site:legacydomain1.com query, it will give a list of results which includes pages where it shows the title and snippet of the page on newdomain.com, but the link is to the page on legacydomain1.com. What has prompted me to finally try and resolve this is that the server which hosted the original 5 domains is now due to be decommissioned, which obviously means the 301 redirects for the original pages will no longer be served. I can set up web forwarding for each of the legacy domains at the hosting level, but to maintain the page-by-page redirects I'd have to actually host the websites somewhere. I'd like to know the best way forward, both in terms of the redirect issue and in terms of the indexing of the legacy domains. Many thanks, Dan
Intermediate & Advanced SEO | clarkovitch | 0
-
New Site (redesign) Launched Without 301 Redirects to New Pages - Too Late to Add Redirects?
We recently launched a redesign/redevelopment of a site but failed to put 301 redirects in place for the old URLs. It's been about 2 months. Is it too late to even bother worrying about it at this point? The site has seen a notable decrease in site traffic/visits, perhaps due to this issue. I assume that once the search engines get an error on a URL, they will remove it from search results after a period of time. I'm just not sure if they will try to re-crawl those old URLs at some point, and if so, it may be worth it to have those 301 redirects in place. Thank you.
Intermediate & Advanced SEO | BrandBuilder | 0
-
Is a .ME Domain Effective for SEO?
I keep hearing about TLDs - .com, .org, .net - but what about the .me domain? Can it be effective for SEO? Could I beat my competitors if I choose .me? I also have the option of a .com or another TLD, but if I am building my name then .me suits me - I just need your advice for SEO purposes. Does the domain really make a difference in terms of SEO?
Intermediate & Advanced SEO | pnb567 | 0
-
Community inside the domain or in a separate domain
Hi there, I work for an ecommerce company as an online marketing consultant. They make kitchenware, microware and so on. They are reviewing their overall strategy, and as such they want to build up a community. Ideally, they would want to have the community on a separate domain. This domain wouldn't carry the logo of the brand, and the community wouldn't promote the brand itself. The brand would post content occasionally and link to the store domain. The reasoning behind this approach is to not get in the way of the community users, and also that the branded traffic acquired doesn't end up buying at the store. I like this approach, but I am concerned because the brand is not big enough to have two separate domains and lose all the authority associated with one strong domain. I would definitely have everything under the same domain, store and community; otherwise we would have to acquire traffic for two domains. 1. What do you think of both scenarios, one domain versus two? Which one is better? 2. Do you know any examples of ecommerce companies with successful communities within the store domain? Thanks and regards
Intermediate & Advanced SEO | footd | 0
-
Domain Forwarding - SEO Impacts?
I have a site that has been active for years - thinkbiglearnsmart.com. A while ago I purchased about 50 domain names that were relevant to my company. I still have those URLs and would like to use them to point to different pages on my site, just because they have good keywords in the URLs. For example, one is dreamweavertrainingclassesonlinelive.com. Currently they are all redirecting to my homepage. A. Is that hurting me? B. I would like to redirect each to the more relevant page, i.e. the page dedicated to Dreamweaver training (http://thinkbiglearnsmart.com/dreamweaver-creative-cloud-training-course/ ). Will this hurt my Dreamweaver keyword, for example, because there is already a 301 redirect on that page from a very old Dreamweaver link, which was something like thinkbiglearnsmart.com/dreamweaver? C. On my hosting account, where I can select where the URL forwards to, there are options for "location forwarding" and "frame forwarding" - currently they are set to frame forwarding. Which one is best? Any help is much appreciated!!! Thank you!
Intermediate & Advanced SEO | webbmason | 0
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day. We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time, based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages: super easy to implement; conserves crawl budget for large sites; ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages: doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages: does prevent Vehicle Details pages from being indexed; allows ALL pages to be crawled (advantage?).
Noindex disadvantages: difficult to implement (Vehicle Details pages are served using Ajax, so they have no <head> tag; the solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex tag based on querystring variables, similar to this Stack Overflow solution). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it). It also forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages - I say "force" because of the crawl budget required; the crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed. And it cannot be used in conjunction with robots.txt - after all, the crawler never reads the noindex meta tag if blocked by robots.txt.
Hash (#) URL advantages: by using hash (#) hrefs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links. Best of both worlds: crawl budget isn't overtaxed by thousands of noindex pages, and the internal links used to index robots.txt-disallowed pages are gone. Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?), and does not require complex Apache stuff.
Hash (#) URL disadvantages: is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that. If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal, in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these. Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive | 0
-
Merging Domains... Subdomains, Directories or Separate Sites?
Hello! I am hoping you can help me decide the best path to take here... A little background: I'm moving to a new company that has three old domains (the oldest is 10 years old), which get a lot of traffic from their e-letters. Until recently they have not cared about SEO, so the websites have some structural, coding, URL and other issues. The sites are indexed, but have a problem getting crawled and/or indexed for new content - I haven't delved into this yet, but am certain I will be able to fix any of these issues. These three domains are PR4, PR4 and PR5 and contain hundreds of unique articles. Here's the question... They want to move these three sites to their main company site (PR4) and create subdomains for each one. I am wondering if this is a good idea or not. I have merged sites before (creating categories and/or directories) and the end result is that the ONE big site is much more effective than TWO smaller, less authoritative sites. But the subdomain idea is something I am unsure about from an SEO perspective. Should we do this with subdomains? Or do you think we should keep the sites separate? How do Panda and Penguin play into this? Thanks in advance for the help! SD P.S. I'm not a huge advocate of using PR as a measurement tool, but since I can't reveal the actual domains, I figured I would list it as a reference point.
Intermediate & Advanced SEO | essdee | 0
How long does a new domain need to get a specific level of trust?
We are a small start-up in Germany in the sports and health sector. We are currently building a network of people in that sector and giving each person a separate WordPress blog. The idea is to create a big network of experts. My question is: how long does it take for Google to trust a completely new URL? We set up each project and create content on the page. Each week the owner of the site puts up an expert article that contains keywords, and we set certain links from other blogs, etc. Also, do you think it is more important for a site to get, say, 20 backlinks from anywhere, or 5 backlinks from very trusted blogs?
Intermediate & Advanced SEO | wellbo | 0