Getting an Edge at High Speeds: Randomized Algorithms and Networking Hardware

Published August 17, 2016, 21:26
Even commercial router vendors have adopted randomized algorithms in a few cases because of their simplicity, speed, and memory efficiency. Further, because of the opportunity to see every member of the population (i.e., every arriving packet) and preserve summary information about the entire population, such randomized algorithms can obtain an 'edge' over standard algorithms that merely sample the population. I illustrate this thesis with three algorithms. First, I will describe a simple algorithm for finding sources that send a large proportion of traffic, and its application in a worm detection chip. Second, I will describe an algorithm that provides an inexpensive technique for measuring the average and variance of packet latencies and loss on a link. By contrast, the majority of routers have no support for fine-grained latency measurement; managers must instead rely on approximate methods such as sending probe packets or using 'tomographic' techniques. If time permits, I will describe a third algorithm that allows scalable logging, say of millions of network addresses infected after an attack, using only a small amount of memory. In all three cases I will quantify the edge obtained over simple sampling.
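
The abstract does not spell out the data structure behind the heavy-source detector, but a count-min sketch is a standard way to get the kind of edge described: every arriving packet updates a few small counters, so the summary covers the whole population rather than a sample. The Python sketch below is a minimal illustration under that assumption; the table sizes, the 5% threshold, and the packet_sources stream are all hypothetical.

    import hashlib

    class CountMinSketch:
        def __init__(self, width=1024, depth=4):
            self.width = width
            self.depth = depth
            self.tables = [[0] * width for _ in range(depth)]

        def _indexes(self, key):
            # One position per row; sha256 stands in for the cheap
            # hardware hash functions a router would actually use.
            for row in range(self.depth):
                digest = hashlib.sha256(f"{row}:{key}".encode()).digest()
                yield row, int.from_bytes(digest[:8], "big") % self.width

        def add(self, key, count=1):
            for row, idx in self._indexes(key):
                self.tables[row][idx] += count

        def estimate(self, key):
            # Collisions only inflate counters, so the minimum over
            # rows is an upper bound on the true count.
            return min(self.tables[row][idx] for row, idx in self._indexes(key))

    # Hypothetical stream: one source sends a third of all packets.
    packet_sources = ["10.0.0.1"] * 250 + [f"10.0.0.{i}" for i in range(2, 502)]

    sketch = CountMinSketch()
    flagged = set()
    total = 0
    for src in packet_sources:
        sketch.add(src)
        total += 1
        if total >= 100 and src not in flagged and sketch.estimate(src) > 0.05 * total:
            flagged.add(src)
            print("possible heavy source:", src)

Because the sketch sees every packet, the estimate for a genuinely heavy source is exact up to collision noise, whereas a sampling approach would have to extrapolate from a small fraction of the stream.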
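For the latency measurement, the abstract again states only the goal; the published technique it most resembles is the Lossy Difference Aggregator, in which each end of a link keeps just a packet count and a running sum of timestamps, with no per-packet state. The toy simulation below shows the no-loss case for the mean only; real designs hash packets into banks so that banks touched by loss can be discarded, and estimating the variance requires a further sketch. All numbers here are hypothetical.

    import random

    # Sender records (count, sum of send timestamps); receiver records
    # (count, sum of arrival timestamps). If the two counts agree (no
    # loss), the difference of the sums gives the mean one-way latency.
    tx_count = tx_sum = 0
    rx_count = rx_sum = 0.0

    for _ in range(10_000):
        send_time = random.uniform(0.0, 1.0)
        latency = random.gauss(0.002, 0.0003)  # ~2 ms mean, hypothetical
        tx_count += 1
        tx_sum += send_time
        rx_count += 1
        rx_sum += send_time + latency

    if tx_count == rx_count and rx_count > 0:
        print("mean latency:", (rx_sum - tx_sum) / rx_count)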
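The third algorithm is likewise described only at the level of its goal; a Bloom filter is the classic structure for logging membership of millions of addresses in a small, fixed amount of memory, at the cost of a tunable false-positive rate. The sketch below is an illustrative assumption, not the talk's construction; the bit-array size and hash count are hypothetical.

    import hashlib

    class BloomFilter:
        def __init__(self, num_bits=1 << 20, num_hashes=5):
            self.num_bits = num_bits
            self.num_hashes = num_hashes
            self.bits = bytearray(num_bits // 8)  # 128 KB for a million bits

        def _positions(self, key):
            for i in range(self.num_hashes):
                digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
                yield int.from_bytes(digest[:8], "big") % self.num_bits

        def add(self, key):
            for pos in self._positions(key):
                self.bits[pos // 8] |= 1 << (pos % 8)

        def __contains__(self, key):
            # False positives are possible; false negatives are not.
            return all(self.bits[pos // 8] & (1 << (pos % 8))
                       for pos in self._positions(key))

    # Log suspected-infected addresses, then query membership later.
    infected = BloomFilter()
    infected.add("192.0.2.7")
    print("192.0.2.7" in infected)     # True
    print("198.51.100.9" in infected)  # almost certainly False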