Babak Hassibi, California Institute of Technology
Abstract: We study the information flow in networks through the notion of entropic vectors. Given n random variables, the (2^n - 1)-dimensional vector obtained from all possible joint entropies is called an entropic vector. It turns out that the space of entropic vectors is a convex cone and that a large class of network information theory problems can be cast as linear optimization over this convex cone. While this formulation circumvents the "non-convex" and "infinite-letter" characterizations of earlier formulations, it still does not lead to a solution, since a characterization of the convex cone of entropic vectors is not known for n > 4 random variables. In this talk, we develop some inner and outer bounds to this space, and describe the connections to finite group theory, quasi-uniform distributions, non-Shannon inequalities, matroids, and Cayley's hyperdeterminant. We review the insufficiency of linear network codes and describe Ingleton-bound-violating finite groups. As a concrete example, we show how determining optimal linear codes over GF(2), for arbitrary networks, reduces to linear programming. We also develop Markov chain Monte Carlo methods for designing optimal nonlinear network codes.
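As a small illustration of the central object in the abstract, the sketch below (an assumption of this page, not part of the talk) computes the entropic vector of n random variables from their joint probability mass function: one joint entropy H(X_S) for each of the 2^n - 1 nonempty subsets S. The function name `entropic_vector` is hypothetical.

```python
import itertools
import numpy as np

def entropic_vector(p):
    """Given a joint pmf p of shape (k1, ..., kn) over n random variables,
    return a dict mapping each nonempty subset S of {0, ..., n-1} to the
    joint entropy H(X_S) in bits -- the (2^n - 1)-dimensional entropic vector."""
    n = p.ndim
    h = {}
    for r in range(1, n + 1):
        for S in itertools.combinations(range(n), r):
            # Marginalize onto the variables in S, then compute Shannon entropy.
            axes = tuple(i for i in range(n) if i not in S)
            m = p.sum(axis=axes).ravel()
            m = m[m > 0]  # drop zero-probability outcomes (0 log 0 = 0)
            h[S] = float(-(m * np.log2(m)).sum())
    return h

# Example: X1 a uniform bit and X2 = X1 (fully correlated),
# so H(X1) = H(X2) = H(X1, X2) = 1 bit.
p = np.array([[0.5, 0.0], [0.0, 0.5]])
hv = entropic_vector(p)
```

Every vector produced this way lies in the entropy cone; the open question the abstract refers to is the converse, characterizing exactly which vectors arise for n > 4.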
Biography: Babak Hassibi is professor and executive officer of electrical engineering at the California Institute of Technology, where he has been since 2001. From 1998 to 2001 he was a member of the technical staff at the Mathematical Sciences Research Center at Bell Laboratories, Murray Hill, NJ, and prior to that he obtained his PhD in electrical engineering from Stanford University. His research interests span different aspects of communications, signal processing, and control. Among other awards, he is a recipient of the David and Lucile Packard Foundation Fellowship and the Presidential Early Career Award for Scientists and Engineers (PECASE).
Host: Alex Dimakis, dimakis [at] usc.
Back to CommNetS Seminar Page