Getting 260,000 pages re-indexed?
-
Hey there guys,
I was recently hired to do SEO for a big forum: move the site to a new domain and get it back up to its rankings after the move. This all went quite well, except that we lost about a third of our traffic. Although I expected some traffic to drop, this is quite a lot and I'm wondering what caused it. The big keywords are still pulling the same traffic, but I feel that a lot of the small threads on the forum have been de-indexed. Now, with a site with 260,000 threads, do I just take my loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed?
Thanks!
-
Great, I'm going to try that, thanks a lot!
-
Link to your category pages... Or a good idea might be to prepare pages by topic that feature (and link to) some of the most informative and popular threads.
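Picking which threads to feature on those topic pages could even be scripted, e.g. by taking the most-viewed threads per topic. A minimal Python sketch of that idea (the thread data, topics, URLs, and view counts here are all made up for illustration):

```python
from collections import defaultdict

# Hypothetical thread data: (topic, title, url, view_count)
threads = [
    ("green-tea", "Best water temperature?", "/t/101", 5400),
    ("green-tea", "Matcha whisking basics", "/t/102", 2100),
    ("green-tea", "Storing loose leaf", "/t/103", 900),
    ("black-tea", "Assam vs Ceylon", "/t/201", 4300),
    ("black-tea", "Milk first or last?", "/t/202", 8800),
]

def build_hub_links(threads, per_topic=2):
    """For each topic, pick the most-viewed threads to feature on a hub page."""
    by_topic = defaultdict(list)
    for topic, title, url, views in threads:
        by_topic[topic].append((views, title, url))
    hubs = {}
    for topic, items in by_topic.items():
        items.sort(reverse=True)  # highest view count first
        hubs[topic] = [(title, url) for _, title, url in items[:per_topic]]
    return hubs

hubs = build_hub_links(threads)
for topic, links in hubs.items():
    print(topic, links)
```

Each hub page then links to a handful of popular, informative threads, giving spiders a short path into the deep content.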
-
-
We didn't actually do a 404, we 301'd everything, and I do mean everything, to our new domain.
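For what it's worth, a quick way to spot-check a blanket redirect like that is to verify each old URL answers with a single-hop 301 pointing at the new host. A minimal Python sketch (the domains and URLs are made up; the status/Location pairs stand in for what a crawler would actually collect):

```python
from urllib.parse import urlparse

def is_clean_301(status_code, location, new_host):
    """Return True if a response looks like a proper one-hop 301 to the new domain."""
    if status_code != 301:
        return False  # 302/307 redirects don't pass signals the same way
    return urlparse(location).netloc == new_host

# Example collected responses: old URL -> (status code, Location header)
responses = {
    "http://old-forum.example/thread/123": (301, "http://new-forum.example/thread/123"),
    "http://old-forum.example/thread/456": (302, "http://new-forum.example/thread/456"),
    "http://old-forum.example/thread/789": (301, "http://old-forum.example/index"),
}

for old_url, (status, location) in responses.items():
    ok = is_clean_301(status, location, "new-forum.example")
    print(old_url, "OK" if ok else "CHECK")
```

Anything flagged CHECK (a 302, or a redirect landing back on the old host) is worth fixing before blaming the index itself.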
-
Yes
-
Aye, that's what I thought as well
-
Nothing changed except for the ads, which we placed better. Site speed is the same because we didn't move hosts; it has actually improved lately because we hired someone to optimize it. The backlinks coming in have transferred, and we're building new ones. The thing is, the site itself is ranking really well for its new keywords; it's just these old ones that have apparently died.
-
-
260,000 threads indeed; they go back to 2006, though, so we've had some time to accumulate posts.
Throwing those PR5 links in there would help, of course, but where do I point them? How deep do I link? I could link to all 260,000 threads, but I believe that would be a little crazy.
-
Checklist:
- 404s: done
- 301s: done
- It's been two months, so by now Google must have settled down with the traffic.
How about on-page factors?
- Page title
- Layout
- Ads
- Site speed
- Outbound links
You need to check if they are all the same.
If it's not this, then I'm afraid I can't come up with any more points to help you.
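Comparing an on-page factor like the page title across old and new versions can be scripted rather than eyeballed. A rough Python sketch using only the standard library (the HTML snippets are invented examples, not real pages):

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the first <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleParser()
    parser.feed(html)
    return parser.title.strip()

# Hypothetical saved copies of the same thread on the old and new domain
old_html = "<html><head><title>Tea brewing tips - OldForum</title></head></html>"
new_html = "<html><head><title>Tea brewing tips - NewForum</title></head></html>"
print(extract_title(old_html) == extract_title(new_html))
```

Run over a sample of thread URLs, any mismatch points at a template change that could explain lost rankings.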
-
-
While this may be true in general, I would like to point out that the loss of traffic here was caused by the domain move.
-
Almost two months now.
-
How long has it been since you moved your site?
-
260,000 threads?
How many inbound links do you have to hold all of that page mass in the index?
If you don't have lots of high PR deep links into the site the spiders will visit obscure pages infrequently and will forget about them.
You need to link deep into these pages at multiple points with heavy PR. That will force a continuous and recurring stream of spiders down into the mass and require them to chew their way out. I think that you need a few dozen PR5 links at least for healthy indexing.
-
We've checked Google Webmaster Tools for 404s and crawl errors, all of which we fixed a day after moving. I can't check all the pages in the SEOmoz tools because of the limit. We did do a complete 301, actually, redirecting every page to its new location.
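When the Webmaster Tools reports feel too limited, one workaround is grepping your own access logs for 404s served to Googlebot. A rough Python sketch (the log lines and their format are invented for illustration; a real combined-format log would look similar):

```python
import re

# Hypothetical access log lines (combined log format)
log_lines = [
    '66.249.66.1 - - [01/Jan/2012:00:00:01] "GET /thread/123 HTTP/1.1" 404 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jan/2012:00:00:02] "GET /thread/456 HTTP/1.1" 200 9001 "-" "Googlebot/2.1"',
    '10.0.0.5 - - [01/Jan/2012:00:00:03] "GET /thread/789 HTTP/1.1" 404 512 "-" "Mozilla/5.0"',
]

pattern = re.compile(r'"GET (?P<path>\S+) [^"]*" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$')

def googlebot_404s(lines):
    """Return request paths that came back 404 to a Googlebot user agent."""
    hits = []
    for line in lines:
        m = pattern.search(line)
        if m and m.group("status") == "404" and "Googlebot" in m.group("agent"):
            hits.append(m.group("path"))
    return hits

print(googlebot_404s(log_lines))
```

This surfaces dead thread URLs the spider is still hitting, even ones Webmaster Tools hasn't reported yet.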
-
I would check Google Webmaster Tools for 404s and crawl errors and fix them first.
I would then do the same using the SEOmoz tools.
After all that, I would do a complete 301 from the old domain to the new domain.
Hope this helps.