MCA mathematicians have improved an algorithm for the fundamentally intractable problem of building energy-efficient wireless communication networks

Scientists have investigated a fundamental intractable problem in building energy-efficient wireless communication networks: each node of the network must be assigned a transmission power such that every node can transmit its data to the base station, possibly relaying through other nodes. In other words, the goal is to build a connected communication network with the minimum total transmission power over all nodes.
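The model can be illustrated with a short sketch. Assuming Euclidean distances and the commonly used symmetric-link model (a link exists only when the power of both endpoints covers the distance between them), checking whether a given power assignment yields a connected network looks roughly like this; the function and variable names are illustrative, not taken from the researchers' work:

```python
import math
from itertools import combinations

def is_connected(nodes, power):
    """Check whether the network is connected under a power assignment.
    Symmetric-link model: a link (u, v) exists only when both nodes'
    transmission power covers the distance between them."""
    n = len(nodes)
    adj = {v: [] for v in range(n)}
    for u, v in combinations(range(n), 2):
        d = math.dist(nodes[u], nodes[v])
        if power[u] >= d and power[v] >= d:
            adj[u].append(v)
            adj[v].append(u)
    # depth-first search from node 0
    seen, stack = {0}, [0]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

# Toy instance: three nodes on a line, each reaching distance 1.
nodes = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
power = [1.0, 1.0, 1.0]
print(is_connected(nodes, power))  # True
print(sum(power))                  # total power to be minimized: 3.0
```

The objective is then to find, among all assignments making `is_connected` true, one with minimal total power.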

Wireless sensor networks are used to monitor environmental conditions (air and water pollution, forest fires, landslides). Sensors are often battery-powered, so the lifetime of a sensor network depends on the energy efficiency of its wireless communication network.

In 2017, René van Bevern, head of the algorithms laboratory at the Department of Mathematics and Mechanics of NSU, together with his Berlin colleagues, proved that the problem theoretically admits an efficient algorithm when the network with n nodes already consists of O(log n) connected components that must be joined with a minimum increase in total transmission power. This situation arises, for example, when a previously connected network splits into several components after node failures, or when several areas are monitored, with the nodes in each area placed in a fairly regular structure. Such a structure emerges when the overlaps between the sensors' scanning areas are minimized (also for energy-efficiency reasons). For this theoretical result, the scientists from NSU and Berlin received an award in 2017 at a large European congress on algorithms. Mathematically, they proved that for sufficiently large n their algorithm is faster than the known algorithms by an arbitrarily large factor: its running time grows polynomially in n, whereas the running time of the previously known algorithms grows exponentially.
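To give a feel for the task of joining existing components with a minimal power increase, here is a simple Kruskal-style greedy sketch: repeatedly add the cheapest inter-component link, where a link's cost is the extra power both endpoints need to reach each other. This is only an illustrative heuristic under the symmetric-link assumption, not the fixed-parameter algorithm the researchers developed, and all names are hypothetical:

```python
import math

def connect_components(nodes, power, components):
    """Greedy heuristic: repeatedly add the cheapest inter-component
    link until one component remains.  NOT the exact fixed-parameter
    algorithm from the paper; an illustrative sketch only."""
    comp = {}
    for c, members in enumerate(components):
        for v in members:
            comp[v] = c
    power = list(power)
    while len(set(comp.values())) > 1:
        best = None
        for u in range(len(nodes)):
            for v in range(u + 1, len(nodes)):
                if comp[u] == comp[v]:
                    continue
                d = math.dist(nodes[u], nodes[v])
                # extra power both endpoints need to cover distance d
                cost = max(0.0, d - power[u]) + max(0.0, d - power[v])
                if best is None or cost < best[0]:
                    best = (cost, u, v, d)
        _, u, v, d = best
        power[u] = max(power[u], d)
        power[v] = max(power[v], d)
        cu, cv = comp[u], comp[v]
        for w in comp:                 # merge the two components
            if comp[w] == cv:
                comp[w] = cu
    return power

# Two two-node components on a line; the cheapest bridge is 1-2.
nodes = [(0.0, 0.0), (1.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
print(connect_components(nodes, [1.0] * 4, [[0, 1], [2, 3]]))
# [1.0, 2.0, 2.0, 1.0]
```

A greedy choice like this is not always optimal, because raising one node's power can serve several links at once; handling that interaction is exactly what makes the exact problem hard.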
The question of the algorithm's practical value remained open: could it be that the new algorithm is faster only for values of n exceeding the number of particles in the Universe? In 2018, this question was taken up by Pavel Smirnov, then a master's student at the Department of Information Technology of NSU and now a first-year master's student at the Department of Mathematics and Mechanics. Having found that the algorithm in its originally proposed form was practically uncompetitive, he accelerated it both in practice and in theory, which became the main result of his master's thesis and of an article in the Q1 journal INFORMS Journal on Computing.

Pavel comments:

On the one hand, the problem itself has applications in the field of wireless networks, so there is practical interest in solving it in reasonable time. From a theoretical point of view, which is closer to me personally, the problem is interesting because it is NP-hard, that is, it apparently cannot be solved quickly in the general case. Previous approaches to the solution were based on solving an integer programming problem. We tried to compete with these approaches using parameterization methods and achieved some success.
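The NP-hardness mentioned above can be felt in the naive exact approach: enumerate candidate power levels for every node (an optimal solution always uses distances to other nodes as power values) and keep the cheapest connected assignment. Its running time explodes exponentially with n, which is why exact solvers turn to integer programming or, as here, to parameterization. A self-contained sketch with illustrative names:

```python
import math
from itertools import product, combinations

def min_total_power(nodes):
    """Exact brute force for minimum-power symmetric connectivity:
    try every assignment where each node's power is one of its
    distances to the other nodes.  Exponential in n; for illustration
    on tiny instances only."""
    n = len(nodes)
    dist = [[math.dist(nodes[u], nodes[v]) for v in range(n)]
            for u in range(n)]
    candidates = [sorted({dist[v][u] for u in range(n) if u != v})
                  for v in range(n)]
    best = math.inf
    for power in product(*candidates):
        if sum(power) >= best:
            continue  # cannot improve on the best assignment so far
        # build the symmetric-link graph and test connectivity
        adj = {v: [] for v in range(n)}
        for u, v in combinations(range(n), 2):
            if power[u] >= dist[u][v] and power[v] >= dist[u][v]:
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {0}, [0]
        while stack:
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        if len(seen) == n:
            best = sum(power)
    return best

print(min_total_power([(0.0, 0.0), (1.0, 0.0), (3.0, 0.0)]))  # 5.0
```

With each node having up to n-1 candidate levels, the search space is roughly (n-1)^n assignments, hopeless already for a few dozen nodes.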

Even on small networks with n = 200 nodes and five disconnected components, the new algorithm (on the left in the figure) finds an optimal solution up to 100 times faster than the previously known algorithm (on the right): the new algorithm takes 10 seconds instead of 16 minutes, and its advantage only grows with network size. It is worth noting that both algorithms were tasked only with "reconnecting" the network; that is, both exploit the special structure of the sensor network supplied as input.