How long does a domain's bad reputation last?
-
I caught a dropped domain with a nice keyword but a poor reputation. It used to host malware, and WOT (a site review tool available for Chrome, among other browsers) has very negative reviews tied to the site. I assume Google must have records of this as well, because Chrome used to show a warning when I entered the site. My question is: how long will the bad reputation last if I build a legitimate website there?
-
Just post on Google's forums stating the new date of purchase, that you have checked the site and removed all the bad content, etc.
-
What kind of documentation can I keep? Isn't it enough that the date of registration is later than the site's wrongdoings?
-
It's a tough one to answer with any certainty - I have had reconsideration requests granted in a few weeks; other times it has taken twice as long.
If Google clears it, the domain is, for all intents and purposes, cleared of any wrongdoing in Google's eyes, but I can't tell you whether they keep an eye on it afterwards or not.
As Russ said, have some documentation at hand so that you can show them you are the new owner to help fight your corner.
And yes, look at all possible angles, including the one where the good name has a black mark next to it.
-
How long might such a reconsideration take? Does it clear the name entirely, or does something stick to it anyway? Should I take into account a scenario where the domain doesn't get its good name back (given that I run a legitimate website)?
-
Have some documentation available to prove that the domain dropped and that you are a different owner. People regularly tell Google that they "just bought this website and had no idea that it had problems". Google is going to want some evidence.
-
It is down to Google how long they would consider it a problem, but the thing to do here is to request a reconsideration of your site.
This is done through your Webmaster Tools account and gives you a chance to tell Google what you have done.
A bad reputation earned by the site in the past can only be rectified by doing some reputation management.
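As a side note: the Chrome warning mentioned in the question comes from Google's Safe Browsing list, and you can check whether a domain is still flagged there. Below is a minimal sketch using the Safe Browsing v4 Lookup API - YOUR_API_KEY is a placeholder for your own key, and example.com stands in for the domain in question:

// Check whether a URL is still flagged by Google Safe Browsing (v4 Lookup API).
const request = {
  client: { clientId: "reputation-check", clientVersion: "1.0" },
  threatInfo: {
    threatTypes: ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
    platformTypes: ["ANY_PLATFORM"],
    threatEntryTypes: ["URL"],
    threatEntries: [{ url: "http://example.com/" }]
  }
};

fetch("https://safebrowsing.googleapis.com/v4/threatMatches:find?key=YOUR_API_KEY", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(request)
})
  .then(response => response.json())
  // The API returns an empty object when nothing is flagged,
  // or a "matches" array describing any active threats.
  .then(data => console.log(data.matches || "No active Safe Browsing flags"));

An empty result here doesn't guarantee a clean reputation in the ranking sense - it only tells you the malware flag itself has been lifted.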
Related Questions
-
Help, no organic traffic recovery after new site launch (it's been 6 months)!
I worked with a team of developers to launch a new site back in March. I was (and still am) in charge of SEO for the site, including combining 4 sites into 1. I made sure 301 redirects were in place to combine the sites, and applied pretty much every SEO tactic I can think of to make sure the site would maintain rankings following launch. However, here we are 6 months later and YoY organic traffic is down 70% on average. Anyone mind taking a look at http://www.guestguidepublications.com and seeing if there's a glaring mistake I'm missing?!?!?! Thanks ahead of time!
Intermediate & Advanced SEO | Annapurna-Digital1
-
Bad Domain Links - Penguin? - Moz vs. Search Console Stats?
I've been trying to figure out why my site www.stephita.com has lost its Google ranking over the past few years. I had originally thought it was due to the Panda updates, but now I'm concerned it might be because of the Penguin update. It's hard for me to pinpoint, as I haven't been actively looking at my traffic stats in the past few years. So here's what I just noticed. In Google Search Console - Links to Your Site, I discovered there are 301 linking domains, over 75% of which seem to be spammy. I didn't actively create those links. I'm using Moz's Open Site Explorer tool to audit my site, and I noticed it shows a smaller set of linking domains, about 70 right now. Is there a reason why Moz wouldn't necessarily find all 301 domains? What's the best way to clean this up? I saw there's a disavow option in Google Search Console, but it states it's not the best way, as I should be contacting the webmasters of all the domains - and I assume it's impossible to get a real person on the other end to remove these link references. HELP! 🙂 What should I do?
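For reference, the disavow file that Search Console accepts is just a plain text list. A minimal sketch, with hypothetical placeholder domains rather than real offenders from this case - "domain:" lines disavow everything from a domain, "#" lines are comments, and a bare URL disavows a single page:

# Disavow file sketch - all domains below are hypothetical placeholders.
# Outreach attempted; site owners never responded.
domain:spammy-links.example
domain:link-farm.example
# Disavowing one specific page rather than a whole domain:
http://directory.example/listing-page.html

Google does treat disavow as a last resort after outreach, which is why comment lines documenting your contact attempts are worth keeping.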
Intermediate & Advanced SEO | TysonWong0
-
Canonicals, Social Signals and a Multi-Regional Website
Hi all, I have a website that is set up to target different countries using subfolders, for example /aus/, /us/, /nz/. The homepage itself is just a landing page that redirects to whichever country the user belongs to. For example, somebody accesses https://domain/ and will be redirected to one of the country-specific subfolders. The default subfolder is /us/, so all users will be redirected to it if their country has not been set up on the website. The content is mostly the same on each country site apart from localisation and, in some cases, content specific to that country. I have set up each country subfolder as a separate site in Search Console and targeted /aus/ to AU users and /nz/ to NZ users. I've also left the /us/ version un-targeted to any specific geographical region. In addition to this, I've also set up hreflang tags for each page on the site which link to the same content on the other country subfolders. I've targeted /aus/ and /nz/ to en-au and en-nz respectively, and targeted /us/ to en-us and x-default, as per various articles around the web. We generally advertise our links without a country code prefix, and the system will automatically redirect the user to the correct country when they hit that URL. For example, somebody accesses https://domain/blog/my-post/, and a 302 will be issued to https://domain/aus/blog/my-post/ or https://domain/us/blog/my-post/ etc. The country-less links are advertised on Facebook and in all our marketing campaigns. Overall, I feel our website is ranking quite poorly and I'm wondering if poor social signals are a part of it. We have a decent social following on Facebook (65k) and post regular blog posts to our Facebook page that tend to pique quite a bit of interest. I would have expected that this would contribute to our ranking at least somewhat. I am wondering whether the country-less link we advertise on Facebook could be causing Googlebot to ignore it as a social signal for the country-specific pages on our website. For example, Googlebot indexes https://domain/us/blog/my-post/ and looks for social signals for https://domain/us/blog/my-post/ specifically; however, it doesn't pick up anything because the campaign URL we use is https://domain/blog/my-post/. If that is the case, I am wondering how I would fix it to receive the appropriate social signals for /us/blog/my-post/, /aus/blog/my-post/ & /nz/blog/my-post/. I am wondering if changing the canonical URL of each page to the country-less URL would improve my social signals and performance in the search engines overall. I would be interested to hear your feedback. Thanks
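For reference, a minimal sketch of the hreflang annotations described above, reusing the placeholder URLs from the question - each country version of a page would carry the full set, with x-default pointing at the /us/ version as stated:

<link rel="alternate" hreflang="en-au" href="https://domain/aus/blog/my-post/" />
<link rel="alternate" hreflang="en-nz" href="https://domain/nz/blog/my-post/" />
<link rel="alternate" hreflang="en-us" href="https://domain/us/blog/my-post/" />
<link rel="alternate" hreflang="x-default" href="https://domain/us/blog/my-post/" />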
Intermediate & Advanced SEO | destinyrescue0
-
Duplicate Content through 'Gclid'
Hello, We've had the known problem of duplicate content through the gclid parameter added by Google AdWords. As per Google's recommendation, we added the canonical tag to every page on our site, so when the bot came to each page it would go 'Ah-ha, this is the original page'. We also added the parameter to the URL parameters list in Google Webmaster Tools. However, it now seems as though a canonical is automatically being given to these newly created gclid pages; see the search below: https://www.google.com.au/search?espv=2&q=site%3Awww.mypetwarehouse.com.au+inurl%3Agclid&oq=site%3A&gs_l=serp.3.0.35i39l2j0i67l4j0i10j0i67j0j0i131.58677.61871.0.63823.11.8.3.0.0.0.208.930.0j3j2.5.0....0...1c.1.64.serp..8.3.419.nUJod6dYZmI Therefore these new pages are now being indexed, causing duplicate content. Does anyone have any idea what to do in this situation? Thanks, Stephen.
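For reference, a minimal sketch of the canonical arrangement described above, with a hypothetical product URL - the same tag is served on both the clean page and its gclid variant, pointing at the parameter-free URL:

<!-- Served on both of these hypothetical URLs:
     https://www.mypetwarehouse.com.au/some-product/
     https://www.mypetwarehouse.com.au/some-product/?gclid=abc123 -->
<link rel="canonical" href="https://www.mypetwarehouse.com.au/some-product/" />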
Intermediate & Advanced SEO | MyPetWarehouse0
-
Getting into Google News, URLs & Sitemaps
Hello, I know that one of the 'technical requirements' to get into Google News is that the URLs have unique numbers at the end, BUT that requirement can be circumvented if you have a Google News sitemap. I've purchased the Yoast Google News sitemap plugin (https://yoast.com/wordpress/plugins/news-seo/), BUT I just found out that you cannot submit a Google News sitemap until you are accepted into Google News. Thus, my question is: do you need to add the digits to the URLs temporarily until you get in and can submit a Google News sitemap, OR is it OK to apply without them and take care of the sitemap after you get in? If anyone has any other tips about getting into Google News, that would be great! Thanks!
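For reference, a minimal sketch of a Google News sitemap entry of the kind such a plugin generates, with hypothetical publication details:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/news/my-article/</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2015-09-01T12:00:00+00:00</news:publication_date>
      <news:title>Hypothetical Article Title</news:title>
    </news:news>
  </url>
</urlset>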
Intermediate & Advanced SEO | stacksnew0
-
Could an HTML <select> with a large number of <option value="<url>">'s affect my organic rankings?
Hi there, I'm currently redesigning my website, and one particular page lists hotels in New York. Some functionality I'm thinking of adding is to let the user find hotels close to specific concert venues in New York. My current thinking is to provide the following select element on the page - selecting any one of the options will automatically redirect to my page for that concert venue. The purpose of this isn't to affect the organic traffic - I'm simply introducing this as a tool to help customers find the right hotel, but I certainly don't want it to have an adverse effect on my organic traffic. I'd love to know your thoughts on this. I must add that in certain cities, such as New York, there could be up to 450 different options in this select element.

<select onchange="location=options[selectedIndex].value;">
  <option value="">Show convenient hotels for:</option>
  <option value="http://url1..">1492 New York</option>
  <option value="http://url2..">Abrons Arts Center</option>
  <option value="http://url3..">Ace of Clubs New York</option>
  <option value="http://url4..">Affairs Afloat</option>
  <option value="http://url5..">Affirmation Arts New York</option>
  <option value="http://url6..">Al Hirschfeld Theatre</option>
  <option value="http://url7..">Alice Tully Hall</option>
  ...
</select>

Many thanks, Mike
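One hedged suggestion on the crawlability angle, sketched with the placeholder URLs from the question: serve the venues as ordinary anchor links, which search engines can follow and weigh as normal internal links, and let JavaScript swap the list for the dropdown so users still get the select element:

<!-- Crawlers see a plain list of links; JavaScript replaces it with the dropdown. -->
<ul id="venue-links">
  <li><a href="http://url1..">1492 New York</a></li>
  <li><a href="http://url2..">Abrons Arts Center</a></li>
  <li><a href="http://url3..">Ace of Clubs New York</a></li>
</ul>
<script>
  var list = document.getElementById("venue-links");
  var select = document.createElement("select");
  var prompt = document.createElement("option");
  prompt.value = "";
  prompt.textContent = "Show convenient hotels for:";
  select.appendChild(prompt);
  // Copy each anchor into an equivalent <option>.
  var anchors = list.getElementsByTagName("a");
  for (var i = 0; i < anchors.length; i++) {
    var option = document.createElement("option");
    option.value = anchors[i].href;
    option.textContent = anchors[i].textContent;
    select.appendChild(option);
  }
  select.onchange = function () { location = this.value; };
  list.parentNode.replaceChild(select, list);
</script>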
Intermediate & Advanced SEO | mjk260
-
No longer showing for 'money' phrases but long tail combinations rank high?
I hope someone can shed some light on this, as I've been pulling my hair out so much there's hardly any left! Background: a 12-year-old website that for about 10 years had top-3 rankings for hundreds of phrases, but rankings first dropped off in August 2011. Panda seemed to be the cause, but finding the exact issue is hard. We are an online travel agent and every hotel page has duplicate content copied from other websites. This has not been changed, although lots of sections of the site still rank well, as do the hotel pages themselves. Lots of internal duplicate issues have been resolved, but with no effect. Our old style of link, link, link all day long with our 2-word main key phrase as anchor text has given us an unnatural backlink profile, but no message has been left by Google about this in WMT (yet). Internal link structure is poor, with all pages linking back to the homepage with our 'money' 2-word phrase in 3 places. Penguin wiped out two thirds of all our backlinks back in May 2012. Why, then, do we still rank for our 'money' phrase on the homepage when it has some extra words included and becomes long tail? e.g. CityName Apartments (money phrase) - now ranks page 2-3. CityName Apartments to rent for the night - ranks #2 on Google in all countries. To make things more confusing, other pages rank really well for similar money phrases, e.g. CityName Apartments Offers - ranks 2nd out of 185,000,000 results (not the homepage). It seems only the homepage is affected (where 95% of inbound links point), but if the site-wide duplicates or unnatural link profile were flagged, it would affect more than one page of the site. Wouldn't it?
Intermediate & Advanced SEO | lchoice0
-
Questions regarding Google's "improved handling of URLs with parameters"
Google recently posted about improved handling of URLs with parameters: http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html I have a couple of questions: Is it better to canonicalize URLs or use parameter handling? Will Google inform us if it finds a parameter issue? Or should we prepare a list of parameters that need to be addressed?
Intermediate & Advanced SEO | nicole.healthline0