Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Cleaning up a Spammy Domain VS Starting Fresh with a New Domain
-
Hi - can you give me your opinion, please? If you look at murrayroofing.com and see the high spam score, and the fact that our domain has been put on some spammy sites over the years: would a fresh new domain place higher in the Google SERPs faster? My theory is that we will spin our wheels trying to get unlisted from a lot of those spammy linking sites, and that it would be faster to see results with a fresh new domain than by trying to clean up the current spammy domain. Thanks in advance - you guys have been awesome!!
-
Disavowing has nothing to do with traffic.
Disavowing is all about spam signals from spammy links. That and only that.
-
Thanks again for all the advice- Truly appreciated-
What are your thoughts on disavowing murrayroofing.com with Google, so that when it sends traffic to the new murrayroofingllc.com, Google will hopefully ignore it? Can you see our account in Moz? You can see the old domain is sending traffic, since it is listed on the spammy sites.
-
You are always welcome.
If you have more questions, you can always hit me up on my Twitter @DigitalSpaceman
-
Thank you!!
-
Hard to say who is putting you on those websites, or why.
The only way to truly get rid of those backlinks is to reach out to those websites' owners. Obviously, you'd have to find someone who speaks the language.
Now, what you can do though is this:
- Disavow all those crappy links - that'll get Google to lower the "spam score" of your website;
- Block traffic by IP, geolocation and/or hostname/referrer (that'll keep the actual unrelated traffic away)
That should clean it up pretty good.
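For reference, the disavow file Google expects is a plain UTF-8 text file uploaded through Search Console's disavow links tool, one entry per line. The domains below are placeholders, not the actual sites linking to murrayroofing.com:

```text
# Lines starting with "#" are comments and are ignored by Google
# Disavow every link from an entire domain
domain:spammy-directory.example
domain:link-farm.example
# Or disavow a single page's link instead of the whole domain
http://spam-site.example/listing/12345
```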
Of course, that requires full control and ownership of that domain and the website code. If you can't get that - again, my suggestion is just to part ways.
-
This is awesome info! Thank you. What are your thoughts on trying to get backlinks removed from sites in China, where we have no way to contact them? None of the wording on the sites is in our language, and it seems like it would be impossible to get removed from some of them. Additional thoughts greatly appreciated. In analytics we see "more" traffic from China than from the US.
I'm convinced a competitor may be listing us on these sites - or one of these SEO guys who get really pissed when we turn them down. Could they be out there putting our domain on listing sites?
-
Yeah, your suggestion makes sense.
Keep the old one while the new one is ranking up.
Now, here is the perfect scenario for you: keep working on the new site, and get full ownership of the old one. Then, through IP blocks, Cloudflare, removing all the spammy backlinks, etc., get rid of all or most of the spammy traffic and signals. And then redirect.
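If that final step is reached, the redirect itself can be a one-line, site-wide 301 in the old domain's .htaccess (this assumes an Apache host with mod_alias enabled; the target is the new domain discussed in this thread):

```apache
# Site-wide 301: every path on murrayroofing.com is forwarded to the
# same path on the new domain (mod_alias appends the remainder of the URL)
Redirect 301 / https://murrayroofingllc.com/
```

Note that this forwards path-for-path (e.g. /contact goes to /contact on the new site), so it works best if the new site keeps a matching URL structure.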
-
Thank you again!
I should have been more clear - the old website gets traffic that does convert. If it loaded in less than 10 seconds, I'm sure a lot more would convert; the site has a super high bounce rate due to its slooooow loading. But we do get "valid leads" every week from it - not a lot, maybe 5 a week - but our jobs are large-dollar jobs.
What is your thought on running both sites separately? We could go in and make sure they are not duplicates, and assign different addresses and phone numbers to the old site. But this "seems" black hat - we would not be doing it to get both sites to rank, just so we don't lose the traffic - then in a year or so we'd get rid of it. What are your thoughts?
-
"... maybe a lot of traffic will convert. "
WILL convert? So it's not converting now? If so, it's kind of optimistic to expect that to change, no?
Since you don't own the old domain, you can't really do anything reliable about it anyway.
At this point, I would say don't forward at all - start from scratch.
-
Thank you. Yes, some of the traffic - maybe a lot of it - will convert. The problem is old "printed" directories and other places where we can't update the domain. We get a lot of business from a printed catalog that won't change for a year or more.
I will look at the suggestions you made about IP limitations. The other issue is that we don't "own" the original domain, so we have to ask the owner - who is also our IT guy - to change settings. This is another reason we bought the new domain.
Again thank you!
-
There are a couple of ways you can go about it.
-
Is any of the traffic going to the old spammy domain any good? Does it convert? If not, then don't worry about redirecting - there wouldn't be any point, you'd only pass on spam signals.
-
If there is some good traffic, then set up IP limitations, hostname limitations, etc. That can be done in .htaccess or on the server itself. There are other, more elaborate ways to filter out spam traffic as well, but that depends on how familiar you or your IT guy are with them. One of the simplest solutions is to route all traffic through Cloudflare: it has quite nice spam filtering, and it's free.
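As a rough sketch of what those .htaccess limitations can look like on an Apache 2.4 server - the referrer pattern and IP range below are placeholders, not the thread's real spam sources:

```apache
RewriteEngine On
# Return 403 Forbidden to visits referred by a known spam directory
RewriteCond %{HTTP_REFERER} spammy-directory\.example [NC]
RewriteRule ^ - [F]

# Drop an abusive IP range outright (Apache 2.4 "Require" syntax)
<RequireAll>
  Require all granted
  Require not ip 203.0.113.0/24
</RequireAll>
```

Geolocation blocking needs something extra - a GeoIP module on the server or a proxy like Cloudflare - since plain .htaccess only sees IP addresses and request headers.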
Hope this helps.
-
Thank you - we're talking about murrayroofingllc.com in particular. We are not sure how to forward the old domain to the new one - we "know how", we just don't know if we should. The reason we developed murrayroofingllc.com is that murrayroofing.com had a high spam score, and we got advice in this thread to go for a new domain.
Now the concern is: if we forward all the traffic from murrayroofing.com to murrayroofingllc.com, will the new domain murrayroofingllc.com be negatively affected by the spammy traffic? Somehow murrayroofing.com got onto some spam sites and we get a ton of spammy traffic from China. We don't want this traffic, and for these sites there is "no way" to ask them to remove our website from their spam listings.
All thoughts are welcome here-
- 25 days earlier
-
Ta, Larry.
OK, nothing much of substance there. That said, if it were ranking, it would be worth keeping, as that is usually an easier and faster route to page 1.
I had a look at the Murray Roofing site, and it has not been optimised for the customer queries a roofing contractor would seek to rank for. As it seems you are keen to start afresh, you can do both in parallel - no harm to either.
That said, I would suggest you also look at your Google My Business listing - you're effectively a local play. Getting reviews and appearing in the local search pack for "roofing contractors Omaha" etc. is what we would consider a client priority.
All the best - go get them.
-
Only for a few, and we are in positions 49 and 50 for them.
-
Hi
Is the current site ranking for any terms of value?
-
Hi there,
Yes, absolutely get a new domain. If you look at DA, it's only 15 (not too bad in some cases). But if you look at the backlink profile, you'll see that most of the links are from listing sites - homestead, yellowpages, ezlocal, etc. You can replicate that profile with a day of work. And, as you said, the spam score will only bring trouble.
Hope this helps.