XML Sitemap Questions For Big Site
-
Hey Guys,
I have a few questions about XML sitemaps.
-
For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found out that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml).
-
If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep regenerating your profile sitemaps with third-party software (sitemapwriter)?
-
If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this?
-
Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it?
Thank you!
-
-
Thanks for the input guys!
I believe Twitter and Facebook don't run sitemaps for their profiles; what they have is a directory of all their profiles (Twitter: https://twitter.com/i/directory/profiles Facebook: https://www.facebook.com/find-friends?ref=pf) and they use that to get their profiles crawled. However, I feel the best approach is XML sitemaps, and Google Plus actually does this with their profiles (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). Quite frankly, I would rather follow Google than FB or Twitter... I'm just now wondering how the hell they upkeep that monster! Does it roll over to a new sitemap file every time one hits 50k URLs? And how often do they update their sitemaps: daily, weekly, or monthly, and how?
One other question I have is whether there are any penalties for getting a lot of pages crawled at once. Meaning, one day we have 10 pages and the next we have 10,000 or 50,000 pages...
Thanks again guys!
-
I guess the way I was explaining it was for scalability on a large site. You have to think: a site like FB or Twitter with hundreds of millions of users still has the limitation of only 50k URLs per sitemap file. So if they are running sitemaps, they have thousands of them.
-
I'm not a web developer, so this may be wrong, but I feel like it might be easier to just add every user to the XML sitemap and then add a noindex robots meta tag on the pages of users who don't want their profiles indexed.
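If it helps, that per-page opt-out is just the standard robots meta tag, which the profile template would conditionally emit in the page `<head>`, something like:

```html
<!-- emitted only on profiles whose owner opted out of indexing -->
<meta name="robots" content="noindex">
```

One caveat: once a profile carries noindex, keeping it in the sitemap mostly just wastes crawl budget, so dropping opted-out users from the sitemap as well is the cleaner long-term setup.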
-
If it were me and someone were asking me to design a system like that, I would design it in a few parts.
First I would create an application that handled the sitemap minus profiles: just your sign-up page, terms of service, and whatever static pages like that.
Then I would design a system that handled the actual profiles. It would be pretty complex and resource-intensive as the site grew, but the main idea flows like this:
Start generation, grab the user record with ID 1 from the database, check whether it is indexable (move to the next if not), see what pages are connected, write them to the XML file, then loop back and repeat with record #2.
There are a few concessions you have to make: you need to keep track of the number of records in a file so you know when to start another one, because you can only have 50k URLs in a single sitemap file.
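That loop, with the 50k rollover folded in, can be sketched roughly like this. (The author works in PHP; this is a Python illustration, and the example.com base URL and filename pattern are placeholder assumptions.)

```python
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # sitemap protocol limit per file


def write_profile_sitemaps(users, base_url="https://example.com"):
    """Chunk indexable profile URLs into sitemap files of at most 50k entries.

    `users` is an iterable of (username, indexable) pairs; in a real system
    this would stream rows from the database. Returns the generated files
    as {filename: xml_string} so the caller can write them out.
    """
    files = {}
    chunk = []

    def flush():
        # Write the current chunk out as one numbered sitemap file.
        if not chunk:
            return
        body = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in chunk)
        name = f"sitemap-profiles-{len(files) + 1}.xml"
        files[name] = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{body}</urlset>"
        )
        chunk.clear()

    for username, indexable in users:
        if not indexable:  # user opted out: skip entirely
            continue
        chunk.append(f"{base_url}/{username}")
        if len(chunk) == MAX_URLS:  # hit the 50k limit, start a new file
            flush()
    flush()
    return files
```

In practice you would stream users in batches by primary key rather than loading them all, but the chunk-and-flush shape stays the same.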
The way I would handle the whole process for a large site is this: sync the required tables via a weekly or daily cron to another instance (server). Call the PHP script (because that is what I use) that creates the first sitemap for the normal site-wide pages. At the end of that sitemap, put the location of the user-profile sitemap, then at the end of the script, execute the user-profile sitemap generating script. At the end of each sitemap, put the location of the next sitemap file, because as you grow it might take 2 to 10,000 sitemap files.
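For what it's worth, the sitemap protocol's own way of tying many files together is a sitemap index file (itself limited to 50k sitemap entries) rather than a pointer at the end of each file; you then submit just the index in robots.txt or Search Console. A minimal sketch, with placeholder filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-static.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-profiles-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-profiles-2.xml</loc>
  </sitemap>
</sitemapindex>
```

This is presumably how Google Plus keeps its profiles-sitemap.xml manageable: the generating script only has to append one `<sitemap>` entry per file it produces.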
One thing I would make sure to do is get a list of crawler IP addresses and set up an allow/deny rule in your .htaccess. That way you can make the sitemaps visible only to the search engines.
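A minimal sketch of that rule for Apache 2.4, assuming the sitemap files follow a sitemap-*.xml naming pattern; the range shown is one published Googlebot range, but these lists change, so they have to be kept current:

```apache
# Serve sitemap files only to allowed crawler IPs (example range only)
<FilesMatch "^sitemap.*\.xml$">
    Require ip 66.249.64.0/19
</FilesMatch>
```

Worth noting that Google's documented way to verify Googlebot is a reverse-then-forward DNS lookup rather than a static IP list, so an IP allow-list here is a convenience, not a guarantee.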