Communications and Signal Processing Seminar

Shared Information

Prakash Narayan, Professor, University of Maryland, ECE Department

Shannon's mutual information between two random variables is a fundamental and venerable concept in information and communication theory, statistics, and beyond. What is a measure of mutual dependence among an arbitrary number of random variables? A notion of "shared information" among multiple terminals, which observe correlated random variables and communicate interactively among themselves, is shown to play a useful role in certain problems of distributed processing and computation. A larger role for shared information, which for two terminals particularizes to mutual information, is an open and intriguing question. This talk is based on joint work with Imre Csiszár, Sirin Nitinawarat, Himanshu Tyagi, and Shun Watanabe.
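
As background (not drawn from the talk itself), the following is a minimal sketch of Shannon's mutual information for two discrete random variables, computed directly from a joint probability mass function; the joint distribution below is a hypothetical example chosen only for illustration. The shared information discussed in the talk generalizes this two-variable quantity to an arbitrary number of terminals, and its definition is not reproduced here.

    import numpy as np

    # Hypothetical joint pmf p(x, y) of two binary random variables.
    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])

    p_x = p_xy.sum(axis=1)   # marginal distribution of X
    p_y = p_xy.sum(axis=0)   # marginal distribution of Y

    # Shannon mutual information:
    # I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    mask = p_xy > 0          # skip zero-probability cells (0 log 0 = 0)
    I = (p_xy[mask] * np.log2(p_xy[mask] / np.outer(p_x, p_y)[mask])).sum()

    print(f"I(X;Y) = {I:.4f} bits")

For this symmetric example distribution, the computation yields roughly 0.28 bits.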

Prakash Narayan received the Bachelor of Technology degree in Electrical Engineering from the Indian Institute of Technology, Madras, in 1976, and the M.S. and D.Sc. degrees in Systems Science and Mathematics, and Electrical Engineering, respectively, from Washington University, St. Louis, MO, in 1978 and 1981.

He is Professor of Electrical and Computer Engineering at the University of Maryland, College Park, with a joint appointment at the Institute for Systems Research. His research and teaching interests are in network information theory and coding, communication and signal processing.

Narayan has served as Associate Editor for Shannon Theory for the IEEE Transactions on Information Theory and on its Executive Editorial Board. He is currently Executive Editor of these Transactions and will serve as Editor-in-Chief beginning in 2017. He is a Fellow of the IEEE.

Sponsored by

ECE

Faculty Host

Dave Neuhoff