Large graph analysis is an area of mathematics with increasing relevance for networked systems.

In 1959, Bell Labs mathematician Edgar Gilbert introduced the G(n,p) model of random graphs – an early example of what are now known as small-world networks. In such networks, any pair of vertices is surprisingly close despite the size of the overall network, a property that is foundational for social networks and other modern information networks.
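The small-world property of G(n,p) is easy to observe directly. The sketch below (illustrative helper names `gnp` and `bfs_distances` are our own) samples a 500-vertex graph in which each edge appears independently with probability p, then measures hop distances from one vertex by breadth-first search; even though the graph has 500 vertices, every reachable vertex is typically only a handful of hops away.

```python
import random
from collections import deque

def gnp(n, p, seed=1):
    """Sample a G(n,p) random graph: each of the n*(n-1)/2
    possible edges is present independently with probability p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def bfs_distances(adj, src):
    """Hop distances from src via breadth-first search."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return dist

n, p = 500, 0.02               # expected degree ~10, well above the
adj = gnp(n, p)                # connectivity threshold ln(n)/n
dist = bfs_distances(adj, 0)
print(len(dist), max(dist.values()))
```

With an expected degree of np = 10, distances grow roughly like log(n)/log(np), which is why the farthest vertex is only a few hops away rather than hundreds.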

Today, Nokia Bell Labs researchers are applying the mathematics of systems and networks to several complex challenges. For example, asymptotic analysis (including diffusion approximations, large deviations and stochastic programming) has been used to optimize scheduling for the complex time- and space-division multiplexing in wireless, virtual and optical systems. Sem Borst has been investigating queueing networks with heavy-tailed arrival or response times, which aids the modeling and control of cloud latencies. Sasha Stolyar has characterized simple, asymptotically optimal control rules for distributed service systems, such as large-scale virtual machine placement.
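As a minimal illustration of the kind of queueing analysis involved (a textbook sketch, not the researchers' actual models), the Lindley recursion W_{k+1} = max(0, W_k + S_k - A_{k+1}) simulates the waiting time in a single-server FIFO queue. For an M/M/1 queue the simulated mean delay can be checked against the closed-form answer rho/(mu - lambda):

```python
import random

def mm1_mean_wait(lam, mu, n_customers, seed=42):
    """Estimate the mean queueing delay in an M/M/1 queue via the
    Lindley recursion W_{k+1} = max(0, W_k + S_k - A_{k+1})."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        s = rng.expovariate(mu)    # service time of this customer
        a = rng.expovariate(lam)   # interarrival gap to the next one
        w = max(0.0, w + s - a)
    return total / n_customers

lam, mu = 0.5, 1.0                 # utilization rho = 0.5
est = mm1_mean_wait(lam, mu, 200_000)
exact = lam / (mu * (mu - lam))    # M/M/1 mean wait: rho/(mu - lam)
print(round(est, 3), exact)
```

Replacing the exponential `expovariate(mu)` draw with a heavy-tailed (e.g. Pareto) service time of the same mean makes the simulated delays far larger and more erratic, which is exactly why heavy-tailed queueing networks need their own analysis.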

Random matrices and other techniques that encode the connectivity of a graph turn out to be extremely useful for estimating capacity and congestion in complicated networked systems. The resulting eigenvalues and eigenvectors provide closed-form analytical estimates, making detailed and time-consuming simulations unnecessary, at least as first estimates. Milan Bradonjic has explored dynamically evolving random network processes, such as percolation, that underpin the performance of many network protocols. Sem Borst and Iraj Saniee have used Markov random fields to precisely model the self-organizing protocols increasingly in demand in wireless systems and the cloud. Marty Reiman has explored network game theory to better model and understand large-scale, multi-party systems and their impact on yield management of telecom systems.
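The percolation processes mentioned above can be sketched with a few lines of code (illustrative helper name `largest_cluster_fraction` is our own). On a 2D square lattice, keeping each edge independently with probability p produces only small clusters below the critical threshold p_c = 1/2, while above it a giant cluster abruptly spans the lattice – the kind of phase transition that governs whether a network protocol sees a connected system at all:

```python
import random

def largest_cluster_fraction(L, p, seed=0):
    """Bond percolation on an L x L square lattice: keep each edge
    with probability p, then return the size of the largest connected
    cluster as a fraction of all L*L sites (via union-find)."""
    rng = random.Random(seed)
    parent = list(range(L * L))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(L):
        for j in range(L):
            v = i * L + j
            if j + 1 < L and rng.random() < p:   # edge to the right
                union(v, v + 1)
            if i + 1 < L and rng.random() < p:   # edge downward
                union(v, v + L)

    sizes = {}
    for v in range(L * L):
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / (L * L)

low = largest_cluster_fraction(60, 0.3)    # subcritical: p < 1/2
high = largest_cluster_fraction(60, 0.7)   # supercritical: p > 1/2
print(round(low, 3), round(high, 3))
```

The sharp contrast between the two regimes is the point: small changes in a local connection probability can flip the global behavior of the network.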