How long for a sitewide 301 to reindex?
-
Hey Gang,
Finally joined the big boys here, excited to see what we all can do together.
Here is my situation. I have been struggling since Panda 1.0 with a particular site, www.burnworld.com. Over 2011 we figured out what the issues were with the content and did a major cleanup. This seemed to help toward the end of 2011. However, further Panda updates this year, mainly in April, have struck again. This was after adding a WordPress blog to the site in late 2011, so it was a mix of a traditional HTML site and a WordPress blog. Thinking that this could be an issue, in May this year we transferred all the content over to WordPress only. We did keep the same linking structure, using a permalink plugin to set specific URLs.
Fast-forward to Panda 20. This wiped out all rankings, and then we could not even rank for our own content. One site that syndicates our content now ranks for it instead of us, and many 'feed' sites that scrape our feeds also rank instead of us.
Okay, now to my original question. Two weeks ago we pulled the plug and decided it might be best to start over on www.burnworld.net. The .net had previously been a WordPress blog (shut down earlier in 2012) and sat with about five pages of content until we did the 301.
So today none of the pages are in the main index, and I am wondering if the 301 might have been a mistake, since it points to an existing site that never really ranked. Would it have been better to start on a brand-new domain?
How long have others seen it take before Google puts pages back in the main index?
I'd like to figure out the best action to take to get back into Google's good graces. I'll keep this thread updated so others with this issue can hopefully have a resource to turn to.
BTW- nothing has changed with Binghoo, rankings are all the same and they have updated the domain change properly.
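For context, a sitewide 301 like the one described is usually set at the server level. Assuming an Apache host with mod_rewrite (the thread doesn't show the actual config, so this is only a sketch), the old domain's .htaccess might look roughly like this:

```apache
# .htaccess on the old domain (burnworld.com)
RewriteEngine On
# Match both the bare and www hostnames of the old domain
RewriteCond %{HTTP_HOST} ^(www\.)?burnworld\.com$ [NC]
# Redirect path-for-path to the new domain with a permanent 301
RewriteRule ^(.*)$ http://www.burnworld.net/$1 [R=301,L]
```

Because the poster kept the same permalink structure, a path-for-path rule like this maps every old URL to its equivalent on the new domain in one pass.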
-
That depends upon why the page stopped ranking. If it was being filtered for duplicate content then rewriting might bring it back if your new content is unique.
If it stopped ranking because you have an overabundance of exact match anchor text links pointing at the page then rewriting will probably not work.
-
I appreciate the feedback EGOL. Is this statement based on experience with other sites? I don't see how all the content can be worthless.
So, getting back to my original question: how long on average does Google take to add a site back into the index after a 301 redirect? Webmaster Tools shows that 165 out of 170 pages from the sitemaps are indexed, but Index Status only shows 12, which was the number before the switch, so nothing has changed there.
-
Appreciate the feedback Paul, regardless of what it is.
The main reason for the 301 was that there was a complete site redesign: layout, internal linking structure, rewriting/updating/removing content, etc. I was hoping the domain change would be treated as a new site with all the updates we made, and that it would eliminate the hornet's nest of the .com, which was a hybrid mix of HTML and WordPress.
Because of all these changes, is Google just filtering out the content until it knows what to do with it? Meaning, has it crawled it enough times to know what it is? It's just strange that Binghoo handled the domain change properly but Google has completely filtered it all out.
-
IMO this domain and its content is screwed. Worthless.
I would start fresh on a new domain.
Don't reuse content from the original site. Don't redirect anything from the original site.
I would start on a new domain and not syndicate anything or republish anything.
That's what I would do... others might do differently.
-
Not encouraging to hear, EGOL, but if that's the reality with Google now, then it really does look like they are pushing the little guy out regardless of the PR on the site.
So do you think my content has a chance of ever showing back up in the Google SERPs after the sitewide 301 redirect? Should I undo the 301, or does that cause another potential ranking issue?
-
Looking at this purely from a technical perspective, Rob... (from EGOL's observations above you have content issues as well)
I think you've misunderstood the biggest challenge to recovering from a heavy Panda hammering with a new site.
When you create a new site for recovery purposes, you must completely dissociate it from the original damaged site. By 301-redirecting the old URL to the new, you've simply told Google that the old site (with all its algorithmic penalties) now exists at the new URL. So the new site inherits all the de-ranking of the old site.
This is why Panda recovery is so tough: you need to literally start again from scratch with the new site. You can't even easily redirect the human traffic from the old site to the new one without risking bringing the "penalties" with them.
And this makes sense from the Search Engine perspective. If they've de-ranked your existing site, it wouldn't make sense to allow you to just point to a new domain and sidestep the penalties. Otherwise, nefarious webmasters would just run manipulative sites as hard as they could until they got penalised, then point to a new domain and pick up where they left off until they got caught again.
Looks to me like you're going to need to completely disconnect the two sites (I'm even paranoid enough to suggest a new Google Analytics account and new hosting/domain registration if you want to go that far) and start a significant content rewrite and new content program.
And that's why Panda/Penguin recovery with a new site is so tough. You have to rebuild from the ground up with no help from your previous site. Google has clearly told you it considers your existing content of little value as a search result. Moving that content to a new domain isn't going to change that opinion. Only new, higher-quality content (as well as new additional authority/ranking factors) will.
Sorry to be such a downer on your first post (welcome to the SEOMoz Clan, by the way!) but you have a steep road ahead of you and it would be unfair to mislead you otherwise.
Paul
-
When other people start grabbing your content, or you start grabbing theirs, the sites that survive filters and Panda are usually the most powerful ones with the most domain authority.
I had a couple hundred pages on one of my sites that appeared verbatim on many other sites, and it got hit with a Panda problem... and this is a PR7 domain, and the Panda-problem pages were PR4 and PR5.
I escaped panda in a few months by deleting some of the problem content and noindexing the rest. I was lucky to have thousands of pages of original content left after getting rid of the dupes.
If you have a juvenile site and the scrapers get you or you syndicate or republish... then you have gotten the kiss of death.
That's my opinion. Others might believe differently.
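For anyone wanting to replicate the noindex part of that cleanup without editing every page, one common route is an X-Robots-Tag response header set via .htaccess (a generic sketch, not EGOL's actual setup; the folder name here is hypothetical). A robots meta tag in each page's head accomplishes the same thing page by page.

```apache
# .htaccess placed inside the directory holding the duplicate pages
# (a hypothetical /syndicated/ folder, for illustration only)
<IfModule mod_headers.c>
    # Keep these pages out of the index but let crawlers follow their links
    Header set X-Robots-Tag "noindex, follow"
</IfModule>
```

The header approach also covers non-HTML files (PDFs, images) that can't carry a meta tag.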
-
Hey EGOL, now that you mention content: do you think changing the content of the affected pages will help, or is it useless?
I am working on changing the content of a few pages that stopped ranking; what I mean is, I am just rewriting the text in the articles.
Will they be re-ranked with the new content?
-
Yeah, that's what I'm thinking too. All these RSS scraper sites are now outranking the original content. It does not seem to make sense to me, though, as all these sites link back to the original article.
Would it make sense to put time into redoing the content to try to outrank my original content that is now strewn across scrapers and feed sites?
I thought that Google made an update to make finding and crediting the original source more reliable. Is that true only for certain 'brands'?
-
I searched Google for sentences from your content in quotes. Like this...
"It’s finally done! Our official review of DVD to iPod Converting Software for 2013."
I saw your site in those SERPs but I also saw lots of other sites with the same content. I saw this for six or seven sentences from various parts of your site.
IMO this site with this content is screwed.
I would start working on another project. This one might come back in a couple of years after other sites that have your content die off or your content rolls so deep into their site that it is not getting any more linkjuice.
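EGOL's duplicate check (searching Google for exact sentences wrapped in quotes) is easy to script. Here is a minimal sketch that pulls distinctive sentences from a page's text and formats them as exact-match queries you can paste into a search engine; the sentence splitting is deliberately naive:

```python
import re

def quoted_queries(text, max_queries=5, min_words=6):
    """Build exact-match search queries from sentences in `text`.

    Splits on sentence-ending punctuation (naively) and keeps only
    sentences long enough to be distinctive, wrapping each in double
    quotes so a search engine treats it as an exact phrase.
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    queries = []
    for s in sentences:
        s = s.strip()
        if len(s.split()) >= min_words:
            queries.append(f'"{s}"')
        if len(queries) >= max_queries:
            break
    return queries

sample = ("It's finally done! Our official review of DVD to iPod "
          "Converting Software for 2013. Short one.")
print(quoted_queries(sample))
```

Running the quoted queries by hand (a few sentences from different parts of the site, as EGOL did) quickly shows how widely each passage has been copied.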