XML Sitemap Questions for a Big Site
-
Hey Guys,
I have a few questions about XML sitemaps.
-
For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found out that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google+ has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml).
-
If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep updating your profile sitemaps with third-party software (sitemapwriter)?
-
If a user chooses not to have their profile indexed (by default it will be indexable), how do we go about deindexing that profile? Is there an automatic way of doing this?
-
Lastly, has anyone dabbled with Google Sitemap Generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it?
Thank you!
-
-
Thanks for the input guys!
I believe Twitter and Facebook don't run sitemaps for their profiles; what they have is a directory of all their profiles (Twitter: https://twitter.com/i/directory/profiles, Facebook: https://www.facebook.com/find-friends?ref=pf) and they use that to get their profiles crawled. However, I feel the best approach is through XML sitemaps, and Google+ actually does this with their profiles (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). Quite frankly, I would rather follow Google than FB or Twitter... I'm just now wondering how the hell they keep up that monster! Does it create a new sitemap every time one hits 50k? When do they update their sitemap: daily, weekly, or monthly? And how?
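For what it's worth, the sitemap protocol handles the "monster" case with a sitemap index file: a parent XML file that only lists the child sitemap files, each of which is capped at 50,000 URLs. A minimal sketch of what that looks like (the URLs and dates here are made up, not Google's actual entries):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemaps/profiles-1.xml</loc>
    <lastmod>2013-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemaps/profiles-2.xml</loc>
    <lastmod>2013-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

When a child file fills up, you start a new one and add another `<sitemap>` entry; only the index file needs to be resubmitted to the search engines.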
One other question I have is whether there are any penalties for getting a lot of pages crawled at once. Meaning, one day we have 10 pages and the next we have 10,000 or 50,000 pages...
Thanks again guys!
-
I guess the way I was explaining it was for scalability on a large site. You have to think: a site like FB or Twitter with hundreds of millions of users still has the limitation of only 50k records per sitemap. So if they are running sitemaps, they have hundreds.
-
I'm not a web developer, so this may be wrong, but I feel like it might be easier to just add every user to the XML sitemap and then add a noindex robots meta tag on the pages of users who don't want their profiles to be indexed.
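That opt-out is just a one-line tag in the `<head>` of the profile page template, rendered conditionally on the user's privacy setting:

```html
<!-- Rendered only on profiles whose owner opted out of indexing -->
<meta name="robots" content="noindex">
```

An equivalent alternative is sending an `X-Robots-Tag: noindex` HTTP header, which also works for non-HTML resources.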
-
If it were me and someone were asking me to design a system like that, I would design it in a few parts.
First, I would create an application that handled the sitemap minus profiles: just your sign-up pages, terms of service, and whatever static pages like that.
Then I would design a system that handled the actual profiles. It would be pretty complex and resource-intensive as the site grew, but the main idea flows like this:
Start generation, grab the user record with ID 1 from the database, check whether it is indexable (move to the next if not), see what pages are connected to it, write them to the XML file, then loop back and start with record #2.
There are a few constraints you have to work within: you need to keep track of the number of records in a file so you know when to start another one, because you can only have 50k records in one sitemap file.
The way I would handle the process in total for a large site is this: sync the required tables via a daily or weekly cron to another instance (server). Call the PHP script (because that is what I use) that creates the first sitemap for the normal site-wide pages. At the end of that sitemap, put the location of the user-profile sitemap, and at the end of the script, execute the user-profile sitemap-generating script. At the end of each sitemap file, put the location of the next one, because as you grow it might take anywhere from 2 to 10,000 sitemap files.
One thing I would make sure to do is get a list of crawler IP addresses and set up allow/deny rules in your .htaccess. That way you can make the sitemaps visible only to the search engines.
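A word of caution on that last step: Google doesn't publish a static list of crawler IPs (it recommends verifying Googlebot via reverse DNS), so a hard-coded allowlist can silently go stale. If you did it anyway, an Apache 2.2-style .htaccess rule would look roughly like this (the file pattern and IP range are placeholders):

```apache
# Hypothetical: expose profile sitemaps only to listed crawler IPs.
<FilesMatch "^profiles-.*\.xml$">
  Order Deny,Allow
  Deny from all
  Allow from 66.249.64.0/19
</FilesMatch>
```

Many sites skip this entirely and just reference the sitemaps from robots.txt, since there's usually no harm in the files being public.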