==Entropy, Networks and Information Flow==

Babak Hassibi, California Institute of Technology
Abstract: We study the information flow in networks through the notion of entropic vectors. Given n random variables, the (2^n - 1)-dimensional vector obtained from all possible joint entropies is called an entropic vector. It turns out that the space of entropic vectors is a convex cone and that a large class of network information theory problems can be cast as linear optimization over this convex cone. While this formulation circumvents the "non-convex" and "infinite-letter" characterizations of earlier formulations, it still does not lead to a solution, since a characterization of the convex cone of entropic vectors is not known for n >= 4 random variables. In this talk, we develop some inner and outer bounds to this space, and describe the connections to finite group theory, quasi-uniform distributions, non-Shannon inequalities, matroids, and Cayley's hyperdeterminant. We review the insufficiency of linear network codes and describe Ingleton-bound-violating finite groups. As a concrete example, we show how determining optimal linear codes over GF(2), for arbitrary networks, reduces to linear programming. We also develop Markov chain Monte Carlo methods for designing optimal nonlinear network codes.
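To make the central definition concrete: given a joint distribution of n random variables, the entropic vector collects the joint entropy of every nonempty subset of the variables. The sketch below (an illustration only, not part of the talk; the function name and pmf-array representation are my own choices) computes this vector for a small joint pmf:

```python
from itertools import combinations
import numpy as np

def entropic_vector(pmf):
    """Compute the (2^n - 1)-dimensional entropic vector of n random
    variables, given their joint pmf as an n-dimensional array.
    Returns a dict mapping each nonempty subset of variable indices
    to its joint entropy in bits."""
    n = pmf.ndim
    vec = {}
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            # Marginalize out the variables not in the subset.
            axes = tuple(i for i in range(n) if i not in subset)
            marg = pmf.sum(axis=axes) if axes else pmf
            p = marg.ravel()
            p = p[p > 0]  # drop zero-probability outcomes (0 log 0 = 0)
            vec[subset] = -np.sum(p * np.log2(p))
    return vec

# Example: two independent fair bits.
pmf = np.full((2, 2), 0.25)
v = entropic_vector(pmf)
# Independence gives H(X1) = H(X2) = 1 bit and H(X1, X2) = 2 bits.
```

For n = 2 this yields the 3 entries H(X1), H(X2), H(X1, X2); in general the dict has 2^n - 1 entries, one per nonempty subset, which is exactly the vector whose closure forms the convex cone discussed in the abstract.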
Biography: Babak Hassibi is professor and executive officer of electrical engineering at the California Institute of Technology, where he has been since 2001. From 1998 to 2001 he was a member of the technical staff at the Mathematical Sciences Research Center at Bell Laboratories, Murray Hill, NJ, and prior to that he obtained his PhD in electrical engineering from Stanford University. His research interests span different aspects of communications, signal processing, and control. Among other awards, he is a recipient of the David and Lucile Packard Foundation Fellowship and the Presidential Early Career Award for Scientists and Engineers (PECASE).
Host: Alex Dimakis, dimakis [at] usc.
Back to [[start | CommNetS Seminar Page ]]
entropy_networks_and_information_flow.txt · Last modified: 2016/09/01 19:15 (external edit)