Academic Training Lecture Regular Programme

Big Data Challenges in the Era of Data Deluge (1/2)

by Ilya Volvovski (Senior Software Architect, Cleversafe, USA)

4/3-006 - TH Conference Room (CERN)

For better or for worse, the amount of data generated in the world grows exponentially. The year 2012 was dubbed the year of Big Data and the Data Deluge; in 2013 the petabyte scale is referenced matter-of-factly, and the exabyte is now in the vocabulary of storage providers and large organizations. Traditional copy-based technology does not scale into this territory: relational databases give up at many billions of rows per table, and typical file systems are not designed to store trillions of objects. Disks fail and networks are not always available, yet individuals, businesses and academic institutions demand 100% availability with no data loss. Is this a dead end? These lectures describe a storage system based on the Information Dispersal Algorithm (IDA) that is unlimited in scale, with a very high level of reliability and availability and unbounded scalable indexing. All of this is achieved without any central facility anywhere in the system, and thus with no single point of failure and no scalability barriers.
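The core idea of information dispersal can be sketched in a few lines. The sketch below is illustrative only, not Cleversafe's implementation: it uses the prime field GF(257) for readability (production systems typically work in GF(2^8)), and the names `disperse` and `reconstruct` are invented for this example. Data is cut into groups of k bytes, each group defines a degree-(k-1) polynomial, and n slices are produced by evaluating it at n points; any k slices recover the polynomial, and hence the data.

```python
# Minimal (n, k) information-dispersal sketch in the spirit of Rabin's IDA.
# Assumption: arithmetic over the prime field GF(257) so every byte is a
# field element; a real system would use GF(2^8) to keep slices byte-sized.
P = 257  # prime modulus

def _lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at x, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if j != i:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        # modular inverse of den via Fermat's little theorem
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def disperse(data, n, k):
    """Split `data` into n slices such that any k of them can rebuild it."""
    padded = data + b"\0" * ((-len(data)) % k)
    slices = [[] for _ in range(n)]
    for g in range(0, len(padded), k):
        # the k bytes of this group are the points (1, b0), ..., (k, b_{k-1})
        points = [(j + 1, padded[g + j]) for j in range(k)]
        for i in range(n):
            slices[i].append(_lagrange_eval(points, i + 1))
    return slices, len(data)

def reconstruct(shares, k, length):
    """Rebuild the data from any k (slice_index, slice_values) pairs."""
    shares = shares[:k]
    out = []
    for g in range(len(shares[0][1])):
        points = [(x, vals[g]) for x, vals in shares]
        out.extend(_lagrange_eval(points, t) for t in range(1, k + 1))
    return bytes(out[:length])
```

With n = 5 and k = 3, for example, the system tolerates the loss of any two slices (two failed disks or unreachable sites) with a storage overhead of only n/k = 1.67x, instead of the 3x cost of keeping three full copies.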

In this lecture we discuss:

  • Storage industry trends
  • Existing storage system limitations
  • Modern storage requirements
  • The Information Dispersal Algorithm and its ability to solve these storage problems

Sponsor: Maria Dimou