What kind of data storage and processing is needed
-
Hi,
So after reading a few posts here I have realised that it is a big deal to crawl the web and index all of its links.
For that, I appreciate seomoz.org's efforts.
I was wondering what kind of infrastructure they might need to get this done?
cheers,
Vishal
-
Thank you so much, Kate, for the explanation. It really helps me understand the process better.
-
Hi vishalkhialani!
I thought I would answer your question with some detail that might satisfy your curiosity (although I know more detailed blog posts are in the works).
For Linkscape:
At the heart of our architecture is our own column-oriented data store - much like Vertica, although far more specialized for our use case, particularly in terms of the optimizations around compression and speed.
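For anyone curious what "column-oriented" means in practice, here is a minimal Python sketch of the idea. The column names, toy values, and the zlib codec are my own illustration; Linkscape's actual store and compression schemes are proprietary and far more sophisticated.

```python
import zlib

# A row-oriented store keeps whole records together; a column-oriented store
# keeps each attribute in its own contiguous array. A query that only needs
# one attribute ("link counts for these URLs") then touches far less data,
# and similar values sit next to each other, which compresses very well.
pages = [
    {"url": "http://a.example/", "mozrank": 5.1, "links": 120},  # toy data
    {"url": "http://b.example/", "mozrank": 3.7, "links": 48},
    {"url": "http://c.example/", "mozrank": 3.7, "links": 52},
]

# Pivot the rows into columns.
columns = {key: [row[key] for row in pages] for key in pages[0]}

# Compress one column independently -- hypothetical codec choice for illustration.
packed_links = zlib.compress(",".join(str(v) for v in columns["links"]).encode())
print(len(packed_links), "bytes for the 'links' column")
```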
Each month we crawl between 1 and 2 petabytes of data, strip out the parts we care about (links, page attributes, etc.), compute a link graph of how all those sites link to one another (typically between 40 and 90 billion URLs), and then calculate our metrics from those results. Once we have all of that, we precompute lots of views of the data, which is what gets displayed in Open Site Explorer or retrieved via the Linkscape API. These resulting views of the data total over 12 terabytes (and this is all compressed raw-text data - so it is a LOT of information). Making this fast and scalable is certainly a challenge.
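To make the "link graph plus metrics" step concrete, here is a heavily simplified Python sketch that runs the textbook PageRank iteration over a toy edge list. Linkscape's own metrics (mozRank, mozTrust, etc.) are proprietary and computed over tens of billions of URLs, so treat both the formula and the tiny data set purely as an illustration of the shape of the computation.

```python
from collections import defaultdict

# Toy edge list: (source_url, target_url) pairs extracted from crawled pages.
edges = [
    ("http://a.example/", "http://b.example/"),
    ("http://a.example/", "http://c.example/"),
    ("http://b.example/", "http://c.example/"),
    ("http://c.example/", "http://a.example/"),
]

# Build the link graph as adjacency lists.
out_links = defaultdict(list)
nodes = set()
for src, dst in edges:
    out_links[src].append(dst)
    nodes.update((src, dst))

# Textbook PageRank-style iteration (NOT the actual mozRank formula).
damping = 0.85
rank = {n: 1.0 / len(nodes) for n in nodes}
for _ in range(20):
    new_rank = {n: (1 - damping) / len(nodes) for n in nodes}
    for src, targets in out_links.items():
        share = damping * rank[src] / len(targets)
        for dst in targets:
            new_rank[dst] += share
    rank = new_rank

for url, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{url}  {score:.3f}")
```

The precomputed "views" are essentially results like these, aggregated per URL, subdomain, and root domain, written out ahead of time so the API never has to do graph work at request time.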
For the crawling, we operate 10-20 boxes that crawl all the time.
For processing, we spin up between 40-60 instances to create the link graph, metrics and views.
And the API serves the index from S3 (Amazon's cloud storage) with 150-200 instances (but this was only 10 a year ago, so we are seeing a lot of growth). All of this is Linux and C++ (with some Python thrown in here and there).
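For the serving side, the pattern of an API box pulling a precomputed view out of S3 might look roughly like the boto3 sketch below. The bucket name, key scheme, and JSON payload are hypothetical; I have no visibility into the actual Linkscape index layout, and the real service is C++ with its own caching, not Python.

```python
import hashlib
import json

import boto3  # pip install boto3; AWS credentials come from the environment

s3 = boto3.client("s3")

BUCKET = "linkscape-views-example"  # hypothetical bucket name


def fetch_view(url):
    """Fetch the precomputed link-metrics view for a URL.

    Views are assumed to be stored as JSON objects keyed by a hash of the
    canonical URL, so a lookup is a single GET with no server-side compute.
    """
    key = "views/" + hashlib.sha1(url.encode()).hexdigest()
    response = s3.get_object(Bucket=BUCKET, Key=key)
    return json.loads(response["Body"].read())


# Example: metrics = fetch_view("http://www.seomoz.org/")
```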
For custom crawl:
We use similar crawling algorithms to Linkscape, only we keep the crawls per site and also compute issues (like which pages are duplicates of one another). Then each of those crawls is processed and precomputed to be served quickly and easily within the web app (so calculating the aggregates and deltas you see in the overview sections).
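The "which pages are duplicates of one another" part can be pictured as grouping pages by a fingerprint of their normalized content. Here is a minimal sketch; real duplicate detection typically relies on near-duplicate techniques such as shingling or simhash rather than this exact-match hash, so take it only as the basic idea.

```python
import hashlib
from collections import defaultdict


def fingerprint(html):
    """Hash a crudely normalized version of the page content."""
    normalized = " ".join(html.lower().split())  # collapse whitespace, ignore case
    return hashlib.sha1(normalized.encode()).hexdigest()


def find_duplicates(pages):
    """pages: iterable of (url, html). Returns groups of URLs sharing a fingerprint."""
    groups = defaultdict(list)
    for url, html in pages:
        groups[fingerprint(html)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]


print(find_duplicates([
    ("http://x.example/a", "<p>Hello  World</p>"),
    ("http://x.example/b", "<p>hello world</p>"),
    ("http://x.example/c", "<p>Something else</p>"),
]))
```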
We use S3 for archival of all old crawls, Cassandra for some of the details you see in detailed views, and the web app database to serve a lot of the overviews and aggregates.
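For the Cassandra-backed detail views, the write path might look something like the sketch below using the DataStax Python driver. The keyspace, table, and schema are invented for illustration; I don't know the actual crawl-detail data model, and the production code is Ruby and C++ rather than Python.

```python
from cassandra.cluster import Cluster  # pip install cassandra-driver

# Connect to a local cluster; in production the contact points would be real nodes.
cluster = Cluster(["127.0.0.1"])
session = cluster.connect("custom_crawl")  # hypothetical keyspace

# Hypothetical table:
#   CREATE TABLE page_issues (
#       site_id text, url text, issue text, detected_at timestamp,
#       PRIMARY KEY ((site_id), url, issue));
insert = session.prepare(
    "INSERT INTO page_issues (site_id, url, issue, detected_at) "
    "VALUES (?, ?, ?, toTimestamp(now()))"
)
session.execute(insert, ("site-123", "http://x.example/a", "duplicate_content"))

# A detail view then becomes a single-partition read per site:
rows = session.execute(
    "SELECT url, issue FROM page_issues WHERE site_id = %s", ("site-123",)
)
for row in rows:
    print(row.url, row.issue)
```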
Most of the code here is Ruby, except for the crawling and issue processing, which is C++. All of it runs on Linux.
Hope that helps explain! Definitely let me know if you have more questions though!
Kate -
It is nowhere near that many. I attached an image from when I saw Rand moving the server to the new building. I think this may be why there have been so many issues with the Linkscape crawl recently.
-
@keri and @Ryan
Will ask them. My guess is around a thousand server instances.
-
Good answer from Ryan, and I caution that even then you may not get a direct answer. It might be similar to asking Google just how many servers they have. SEOmoz is fairly open with information, but that may be a bit beyond the scope of what they are willing to answer.
-
A question of this nature would probably be best asked as your one private question per month. That way you will be sure to receive a direct reply from a SEOmoz staff member. You could also try the help desk, but that may be a stretch.
All I can say is that it takes tremendous amounts of resources. Google does it very well, but we all know they generate over $30 billion in revenue annually.
There are numerous crawling programs available, but the problem is the server hardware required to run them.
I am only responding because I think your question may otherwise go unanswered and I wanted to point you in a direction where you can receive some info.