Foundations of Differentially Oblivious Algorithms

Elaine Shi


It is well-known that a program's memory access pattern can leak information about its input. To thwart such leakage, most existing works adopt the technique of oblivious RAM (ORAM) simulation. Such an obliviousness notion has stimulated much debate. Although ORAM techniques have significantly improved over the past few years, the concrete overheads are arguably still undesirable for real-world systems; part of this overhead is in fact inherent, due to a well-known logarithmic ORAM lower bound by Goldreich and Ostrovsky. To make matters worse, when the program's runtime or output length depends on secret inputs, it may be necessary to perform worst-case padding to achieve full obliviousness, thus incurring possibly (super-)linear overheads.

Inspired by the elegant notion of differential privacy, we initiate the study of a new notion of access pattern privacy, which we call "(ε, δ)-differential obliviousness". We separate the notion of (ε, δ)-differential obliviousness from classical obliviousness by considering several fundamental algorithmic abstractions, including sorting small-length keys, merging two sorted lists, and range query data structures (akin to binary search trees). We show that by adopting differential obliviousness with reasonable choices of ε and δ, not only can one circumvent several impossibilities pertaining to full obliviousness, one can also, in several cases, obtain meaningful privacy with little overhead relative to the non-private baselines (i.e., having privacy "almost for free"). On the other hand, we show that for very demanding choices of ε and δ, the same lower bounds for oblivious algorithms would be preserved for (ε, δ)-differential obliviousness.
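The parameters ε and δ play the same role here as in differential privacy, but applied to access patterns rather than outputs. As a rough sketch (not necessarily the talk's exact formalization), the requirement can be stated as follows, where I and I' are any two neighboring inputs and S is any set of access-pattern sequences:

```latex
% Hedged sketch of (epsilon, delta)-differential obliviousness,
% by analogy with the standard differential privacy condition:
\Pr\big[\mathsf{Accesses}(I) \in S\big]
  \;\le\; e^{\varepsilon} \cdot \Pr\big[\mathsf{Accesses}(I') \in S\big] \;+\; \delta
```

Intuitively, an observer of the memory access pattern cannot confidently distinguish neighboring inputs, even though the access pattern need not be identically distributed on all inputs as full obliviousness would demand.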

Time and Place

Tuesday, July 2, 4:15pm
Gates 463A