Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
When you change your domain, how long do you have to wait for Google to return the traffic you used to have?
-
Hello.
Twenty days ago, I changed my domain from uclasificados.net to uclasificados.com, 301-redirecting all URLs, and I have been losing rankings since that moment.
I was wondering if changing it back could be the solution, but some experts recommended I not do that, because it could make things worse.
Right now I receive almost 50% of the traffic I used to receive, and I have tried a lot of link-building strategies to recover, but nothing has worked so far.
Even though I notified Google of the change and resubmitted my new sitemap, I don't see that it has improved my situation in any way, and in Webmaster Tools I still see search stats from my last website (the site that was uclasificados.com before the change).
What should I do to recover faster?
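A quick way to audit a move like this is to confirm that every old URL answers with a single 301 pointing at its new home. Here is a minimal sketch, assuming Python 3 with the requests library (the sample URLs are illustrative placeholders):

```python
import requests

# Illustrative sample of old URLs to spot-check; in practice, pull the
# full list from the old sitemap or from server logs.
OLD_URLS = [
    "http://uclasificados.net/",
    "http://uclasificados.net/some-listing",
]

for old_url in OLD_URLS:
    # allow_redirects=False lets us inspect the first hop ourselves.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code != 301:
        print(f"WARN {old_url} returned {resp.status_code} (expected 301)")
    elif "uclasificados.com" not in location:
        print(f"WARN {old_url} redirects to {location}")
    else:
        print(f"OK   {old_url} -> {location}")
```

Anything that answers with a 302, a 404, or a chain of hops is a candidate for why link equity isn't being passed.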
-
Thanks man, all your advice has been very helpful.
-
Agree with others. We have seen 3-6 months depending upon the size of the site. I would cover all the basics first, meaning:
1. Sitemap submission
2. All pages set to fetch in GWT, Bing WT
3. Updated crawl request of your site and all linked pages
4. Cleanup of any and all backlinks to old site domain, including social profiles
5. Updated sitemap crawl frequency (daily vs weekly) just to spur Google along
6. Manual search for broken backlinks or old 404 pages (one way to automate this is sketched below)
Hope this helps!
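For point 6, a minimal sketch in the same vein, assuming Python 3 with the requests library and a standard XML sitemap on the new domain (the sitemap URL is an illustrative placeholder). It fetches every URL the sitemap lists and flags anything that doesn't come back 200:

```python
import xml.etree.ElementTree as ET

import requests

# Illustrative sitemap location; substitute the real one.
SITEMAP_URL = "https://uclasificados.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull every <loc> entry out of the sitemap.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    # HEAD is enough to read the status code without downloading the body.
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status != 200:
        print(f"{status} {url}")
```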
-
Agree with Andy and Donford.
I've seen it take as long as 8 months. It really depends on how long it takes for Google to reindex your site and every other page that previously pointed (backlinked) to you. It also assumes, as Andy-Halliday has pointed out, you've done your redirects correctly.
Don't give up. Don't change it back. Be patient. The only thing you can do is earn more and better links and that's not really recovering faster, it's augmenting your existing authority.
-
Hi
Sorry, there is no 'x' date that I can give you. It all depends on Google and whether you have done all the 301s etc. correctly.
When you say you are doing 'link building' strategies, what do you mean? Maybe this is the problem, if you have got a bad link somewhere.
I wouldn't recommend going back: you wouldn't get the traffic back to what you had, and as the 'experts' said, it would cause you more problems.
I would check all the 301s are set up correctly. Did you actually change anything on the site at the same time, or was it just the URL that you changed?
Thanks
Andy
-
My personal experience has been up to six months.
May not be the best of news for you, but I certainly wouldn't change it back. That could further complicate things.
For now I would continue link building where you can and give it some more time.
Hope this helps,
Don
Related Questions
-
Should we use Cloudflare?
Hi all, we want to speed up our website (hosted on WordPress, traffic around 450,000 page views monthly), and we use lots of images. We're wondering about setting up on Cloudflare; however, after searching a bit in Google, I have seen some people say the change in IP, or possibly sharing IPs with bad neighbourhoods, can really hit search rankings. So I was wondering what the latest thinking is on this subject: would the increased speed and local server locations be a boost for SEO, more so than any potential loss of rankings from changing IP? Thanks!
Technical SEO | tiromedia
-
Huge drop in rankings, traffic and impressions after changing to CloudFlare
Hi there, In October, one of our customer's programmers made a change on their website to optimize its loading speed. Since then, all of the SEO metrics have dropped. Apparently, the change was to move to CloudFlare and to add Gzip compression. I was talking with the programmer and he told me he had no idea why that happened. Now, five months later, the SEO metrics haven't come back yet. What seems so weird is that two keywords in particular had the most massive drop. Those two keywords were the top keywords (more than 1k impressions a month), and now it's like there are no impressions or clicks at all. Did anyone have the same thing happen to them? Do you have any idea what could help in this case?
Technical SEO | H.M.N.
-
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters tool, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool:
https://www.mydomain.com/blah.html?search_garbage_url_addition
On the confirmation page, the URL actually shows as:
http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition
I don't want to accidentally remove my main URL or cause problems. Is this the way it should look? AND PART 2 OF MY QUESTION: if you see the search description in Google for a page you want removed saying the following in the SERP results, should I still go to the trouble of putting in the removal request?
www.domain.com/url.html?xsearch_... A description for this result is not available because of this site's robots.txt – learn more.
Technical SEO | sparrowdog
-
Can I use a 410'd page again at a later time?
I have old pages on my site that I want to 410 so they are totally removed, but later down the road if I want to utilize that URL again, can I just remove the 410 error code and put new content on that page and have it indexed again?
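For what it's worth, serving a 410 and later reviving the URL is just a routing decision on the server. A minimal sketch, assuming a Python/Flask app (the paths and the GONE set are illustrative, not from the question):

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical set of retired paths that should return 410.
GONE = {"old-page", "discontinued-offer"}

@app.route("/<path:page>")
def serve(page):
    if page in GONE:
        # 410 Gone signals a deliberate, permanent removal to crawlers.
        abort(410)
    # Removing a path from GONE later restores a normal 200 response,
    # after which the URL can be crawled and indexed again.
    return f"Content for /{page}"

if __name__ == "__main__":
    app.run()
```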
Technical SEO | WebServiceConsulting.com
-
Google's "cache:" operator is returning a 404 error.
I'm doing the "cache:" operator on one of my sites and Google is returning a 404 error. I've swapped out the domain with another and it works fine. Has anyone seen this before? I'm wondering if G is crawling the site now? Thx!
Technical SEO | AZWebWorks
-
Javascript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through this simple script. When the bounce rate goes way down, rankings dramatically increase (interesting study, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates the reporting it does to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then you sit back and watch the dollars fly in. 🙂
Technical SEO | BenRWoodard
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
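One alternative that avoids touching robots.txt at all is to send an X-Robots-Tag header from the staging host only, so nothing can leak over to www.domain.com. A minimal sketch, assuming a Python/Flask app fronts the staging site (the hostname check is illustrative):

```python
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def block_staging_indexing(response):
    # Tag only responses served from the staging hostname (illustrative).
    if request.host.startswith("staging."):
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response

@app.route("/")
def index():
    return "staging content"

if __name__ == "__main__":
    app.run()
```
-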
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL, and in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe. We are considering the following approaches:
1. First Click Free for Web Search: an opt-in service by Google which is widely used for this sort of approach, and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards.
2. No Index, No Follow: if we simply noindex, nofollow the individual deal pages where the overlay is situated, will this remove the "cloaking offense" and therefore the risk of a penalty?
3. Partial View: if we show one or two paragraphs of text from the deal page with the rest covered up by the subscribe-now lockup, will this still be cloaking?
I will write up my first SEOMoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.
Technical SEO | Red_Mud_Rookie