• Brain City Berlin, TU Berlin, Onur Günlü

Exploring the "ultimate limits"

Guest contribution by Brain City Ambassador Dr.-Ing. Onur Günlü, research group leader and lecturer at the Information Theory and Applications Chair of the Technische Universität Berlin (TU Berlin).

Consider the fundamental question of wireless communication: “How much information can be transmitted reliably over a medium from a point A to a point B?” This question was not even posed until a genius called Claude E. Shannon realized that it is the right question for a wireless communication engineer to ask. He answered it in a single paper in 1948. He applied many tricks to obtain the ultimate limit for wireless communications, but probably the most novel one was to strip away the “semantic meaning” of the information to be transmitted.

This means that any piece of information can be represented solely by the amount of uncertainty in it, measured by a quantity called “Entropy”. For instance, the messages “I want to eat an apple” and “I want to get vaccinated” might have the same entropy value, in which case they carry the same amount of information. Thanks to this fundamental trick, Shannon was able to derive the ultimate limit for wireless communications and single-handedly established a new research area called “Information Theory”. The ultimate limit for communication over a noisy channel is defined as the “Channel Capacity”. One can think of the channel capacity as the maximum achievable dependency between the transmitted and received information. The larger the channel capacity, the more information can be transmitted reliably from point A (the transmitter) to point B (the receiver) over a noisy channel. Moreover, the channel capacity does not depend on the semantic meaning of the transmitted information: each sentence is represented by, e.g., binary digits (bits), regardless of what it means.
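Both notions can be made concrete in a few lines of Python. The sketch below computes the entropy of a message's empirical character distribution and the capacity of a binary symmetric channel; the channel model and the function names are illustrative choices for this article, not part of the project described here.

```python
from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    """Shannon entropy (bits per symbol) of the empirical character distribution."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def bsc_capacity(p: float) -> float:
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0  # a deterministic channel carries one full bit per use
    binary_entropy = -p * log2(p) - (1 - p) * log2(1 - p)
    return 1.0 - binary_entropy

# A message over one symbol has zero uncertainty; a fair binary choice has one bit.
print(entropy_bits("aaaa"), entropy_bits("ab"))
```

Note that two very different sentences with the same character statistics yield the same entropy value, which is exactly Shannon's point: the limit does not care what the message means.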


Researchers from Berlin, Atlanta and Siegen

An international collaboration project named “Optimal Code Constructions for Private and Secure Function Computations”, between Dr. Onur Günlü (TU Berlin & Brain City Berlin), Prof. Matthieu Bloch (Georgia Institute of Technology, USA), and Prof. Rafael F. Schaefer (University of Siegen, Germany), is funded by the Deutsche Forschungsgemeinschaft (German Research Foundation, DFG) to stimulate cooperation between Berlin, Atlanta, and Siegen. All partners have a background in information theoretic security: the partners from Germany focus more on practical code design (TU Berlin) and the adversarial perspective (Uni Siegen), whereas the partner from the US focuses more on quantum-inspired, asymptotically optimal constructions. He designs optimal methods for regular communication systems, such as wireless communication, by drawing inspiration from security methods proposed for quantum key distribution (QKD). The common goal of all project partners is to provide digital security and privacy in a provable way. The main question to be answered through this collaboration is: “How should one compute functions in a network with security and privacy guarantees?”

There are numerous applications in which such security and privacy questions arise, including hot new topics such as distributed Machine Learning (ML) and Internet-of-Things (IoT) networks with smart sensors. As in any information theory project, the first collaborative step is to define the problem at hand in such a way that it can be solved optimally. Following Shannon’s methodology, the project partners strip away the “nonessential” complications of the real-world applications, so that once the ultimate limits are obtained, the complications can be reinserted into the problem to extend the results.

Information theory guarantees security even against attackers with an unlimited amount of computational power and time

Why is this collaboration important beyond scientific curiosity and the beautiful limits one expects to obtain? Rather than answering this question separately for each application, it is possible to give a generic answer for security and privacy applications. One of the strongest points of information theoretic security results is that they guarantee security even against attackers with an unlimited amount of computational power and time. The way to provide such guarantees is to obtain a probabilistic model of the application at hand.
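A classic illustration of security that holds even against computationally unbounded attackers is the one-time pad. It is not part of the project described here, but it captures the flavor of an information theoretic guarantee in a few lines, assuming the key is truly random, as long as the message, and never reused:

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    """One-time pad: XOR with a uniformly random key of equal length.
    Encryption and decryption are the same operation."""
    assert len(key) == len(data), "key must be exactly as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"I want to get vaccinated"
key = secrets.token_bytes(len(message))  # fresh uniform key, used only once

ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)
assert recovered == message
```

Without the key, every plaintext of the same length is equally likely given the ciphertext, so no amount of computing power helps the attacker; the price is a key as long as the message, which is exactly the kind of cost that optimal code constructions aim to reduce under weaker but still provable guarantees.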

For instance, assume that we are all genuinely interested in the security and privacy of individuals in, e.g., the European Union (EU). We then need to realize that, unlike reliability guarantees, security and privacy guarantees cannot be proved by extensive simulations. One reason is that we can never be sure an attacker will not develop a new attack method that was not considered. For instance, believing that no country will ever have a quantum computer that can handle many qubits would lead to the assumption that we can keep using current cryptographic primitives to provide digital security. Such a belief can be detrimental, however, if someone manages to build a quantum computer powerful enough to break various cryptographic primitives in use. As an alternative, the project partners will use information theoretic methods to provably eliminate any unexpected security and privacy leaks: no attacker, not even one with an unlimited amount of computing power and time, can crack a properly modeled system. Interestingly, since 2008 we have even known how to achieve the ultimate (security and privacy) limits that will be determined during the project, thanks to Prof. Erdal Arikan’s (Bilkent University, Turkey) polar codes, which are already used in 5G networks.

The project “Optimal Code Constructions for Private and Secure Function Computations” is expected to bring Berlin, Atlanta, and Siegen closer through reciprocal visits, while illustrating once again that every scientific success is the result of accumulated human knowledge, nourished naturally by diversity.
