Site offline - Mitigating measures?
-
Hi,
Our domain has expired, and it could take up to 48 hours to recover our website. Apart from the obvious damage to our image, I'm worried Google will think we have simply vanished.
Any recommendations? Maybe update something in Webmaster Tools? Without control of the domain, we can't even set up a temporary redirect.
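(For reference, when you still control the server during planned downtime, the usual way to tell crawlers the outage is temporary is to answer everything with HTTP 503 plus a Retry-After header. A minimal sketch using Python's built-in http.server; the handler name and 48-hour retry value are just illustrative:)

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 so crawlers treat the downtime as temporary."""

    def do_GET(self):
        self.send_response(503)                      # Service Unavailable
        self.send_header("Retry-After", "172800")    # suggest re-crawling in 48h
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>We'll be back shortly</h1>")

    def log_message(self, fmt, *args):
        pass  # keep the example quiet

# To run it (blocks forever, so commented out here):
# HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

In Jaime's case this doesn't apply (no DNS means no requests ever reach the server), but it's the right move for any outage where the box is still reachable.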
Thanks!
Jaime
-
Oh, that's a bummer! Unfortunately, since you don't own the domain right now, there's not a ton to be done. Fortunately, 48 hours isn't SO bad. Google will re-crawl you when you re-launch (keep a close eye on it to make sure they do so correctly and all your pages get indexed), and you'll probably see a dip in traffic and rankings for a while, but you should recover within a month or two.
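(The "keep a close eye" step can be partly automated: after relaunch, walk your sitemap and flag any page that no longer returns 200, since those are the ones Google will drop. A minimal sketch, assuming a standard sitemap.xml; the function names are my own:)

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a standard sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def spot_check(sitemap_url):
    """Fetch the sitemap and report any page that doesn't answer 200."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = sitemap_urls(resp.read().decode("utf-8"))
    for url in urls:
        try:
            status = urllib.request.urlopen(url).status
        except urllib.error.HTTPError as e:
            status = e.code
        if status != 200:
            print(f"{status}  {url}")
```

Run `spot_check("https://yourdomain.com/sitemap.xml")` once the domain resolves again, then compare the clean list against the indexed-pages count in Webmaster Tools over the following weeks.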
I'd spend this time thinking up some ways to quickly earn some new links - maybe a new piece of content, or building some new relationships? That will mitigate the effects from this. Good luck!
-
Hello,
If you have already started the domain renewal process, then don't worry about Google. Don't do anything in Google Webmaster Tools; just wait for your domain to be reactivated. Google will re-crawl your site and update its index accordingly.