Academic Training Lecture Regular Programme

Big Data Challenges in the Era of Data Deluge (2/2)

by Ilya Volvovski (Senior Software Architect, Cleversafe, USA)

Europe/Zurich
4/3-006 - TH Conference Room (CERN)

Description

For better or for worse, the amount of data generated in the world grows exponentially. The year 2012 was dubbed the year of Big Data and the Data Deluge; in 2013, petabyte scale is referenced matter-of-factly, and exabyte size is now in the vocabulary of storage providers and large organizations. Traditional copy-based technology does not scale into this territory: relational databases give up at many billions of rows per table, and typical file systems are not designed to store trillions of objects. Disks fail; networks are not always available. Yet individuals, businesses, and academic institutions demand 100% availability with no data loss. Is this a dead end? These lectures describe a storage system based on the IDA (Information Dispersal Algorithm) that is unlimited in scale and offers a very high level of reliability, availability, and unbounded, scalable indexing. All of this is achieved without any central facility anywhere in the system, and thus with no single point of failure and no scalability barriers.
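The core IDA idea mentioned above can be illustrated with a toy sketch (an assumption for illustration only, not the lecturer's implementation): data is cut into n slices such that any k of them suffice to reconstruct it, so the loss of a slice, e.g. a failed disk, causes no data loss. Real IDA schemes such as Rabin's use erasure codes over finite fields; this minimal sketch uses (n=3, k=2) with two data slices plus an XOR parity slice.

```python
def disperse(data: bytes) -> list[bytes]:
    """Split data into 3 slices; any 2 of them can reconstruct it (toy scheme)."""
    if len(data) % 2:
        data += b"\x00"            # pad to even length (illustration only)
    half = len(data) // 2
    a, b = data[:half], data[half:]
    parity = bytes(x ^ y for x, y in zip(a, b))  # XOR parity slice
    return [a, b, parity]

def reconstruct(slices: list) -> bytes:
    """Rebuild the data from any 2 of the 3 slices (None marks a lost slice)."""
    a, b, p = slices
    if a is None:
        a = bytes(x ^ y for x, y in zip(b, p))   # recover a = b XOR parity
    elif b is None:
        b = bytes(x ^ y for x, y in zip(a, p))   # recover b = a XOR parity
    return a + b

original = b"exabyte!"
slices = disperse(original)
slices[1] = None                   # lose one slice, e.g. a failed disk
assert reconstruct(slices) == original
```

With k-of-n dispersal the system tolerates n-k simultaneous failures while storing only n/k times the data, instead of the n-fold overhead of full copies, which is why copy-based technology stops scaling first.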

Discussed in this lecture:

  • What it takes to build a practical, modern storage system
  • Major practical system characteristics
  • Examples of how these principles can be applied
Organised by

Sponsor: Maria Dimou