MIT Researchers Say All Network Congestion Algorithms Are Unfair
We’re all using more data than ever before, and the bandwidth caps ISPs force on us do little to slow people down — they’re just a tool to make more money. Legitimate network management has to go beyond penalizing people for using more data, but researchers from MIT say the algorithms that are supposed to do that don’t work as well as we thought. A newly published study suggests that it’s impossible for these algorithms to distribute bandwidth fairly.
We’ve all been there, struggling to get enough bandwidth during peak usage to stream a video or upload large files. Your devices don’t know how fast to send packets because they lack information on upstream network conditions. If they send packets too slowly, you waste available bandwidth. If they send too fast, packets can be lost, and resending them causes delays. You have to rely on the network to adjust, which can be frustrating even though academics and businesses have spent years developing algorithms meant to reduce the impact of network saturation. These systems, like the BBR algorithm devised by Google, aim to control the delays caused by packets waiting in queues on the network so that everyone gets some bandwidth.
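To make that trade-off concrete, here is a minimal sketch of a delay-based rate controller in Python. It is loosely in the spirit of algorithms like BBR, not Google’s actual implementation, and every name and threshold in it is an illustrative assumption.

```python
# A minimal sketch of delay-based rate control (illustrative only; not BBR).
# The function name, thresholds, and step sizes are all assumptions.

def adjust_rate(current_rate, measured_rtt, base_rtt,
                increase_step=0.05, decrease_factor=0.85):
    """Raise the send rate while queueing delay looks low; back off when
    the round-trip time climbs well above the uncongested baseline."""
    queueing_delay = measured_rtt - base_rtt
    if queueing_delay < 0.1 * base_rtt:
        # Little queue buildup observed: probe for more bandwidth.
        return current_rate * (1 + increase_step)
    else:
        # Rising delay suggests a queue is forming: back off multiplicatively.
        return current_rate * decrease_factor
```

For example, `adjust_rate(100.0, measured_rtt=1.02, base_rtt=1.0)` would probe upward to 105, while a measured RTT of 1.3 against the same baseline would trigger a back-off to 85.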
But can this type of system ever be equitable? The new study contends that there will always be at least one sender who gets screwed in the deal. This hapless connection will get no data while others get a share of what’s available, a problem known as “starvation.” The team developed a mathematical model of network congestion and fed it all the algorithms currently used to control congestion. No matter what they did, every scenario ended up shutting out at least one user.
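The study’s formal model is far more rigorous, but a toy simulation can show the flavor of the failure. In this hypothetical setup (my own construction, not the MIT model), two senders run the same delay-based rule, yet sender B measured its baseline round-trip time while the link was already congested, so it never notices the queue it creates; sender A keeps backing off until it is effectively starved.

```python
# Toy illustration of starvation; all constants and the queue model are
# assumptions for demonstration, not the paper's mathematics.

CAPACITY = 100.0   # packets per tick the bottleneck link can drain
BASE_RTT = 1.0     # true uncongested round-trip time (arbitrary units)

def adjust_rate(rate, rtt, believed_base):
    # The same delay-based rule for both senders: probe upward while the
    # apparent queueing delay is small, back off multiplicatively otherwise.
    queueing = rtt - believed_base
    return rate * 1.05 if queueing < 0.1 * believed_base else rate * 0.85

def queueing_delay(total_rate):
    # Crude queue model: delay grows once offered load exceeds capacity.
    return max(0.0, (total_rate - CAPACITY) / CAPACITY)

rate_a, rate_b = 50.0, 50.0
believed_base_a = BASE_RTT        # sender A's baseline estimate is accurate
believed_base_b = BASE_RTT + 0.5  # sender B's was polluted by a standing queue

for _ in range(200):
    rtt = BASE_RTT + queueing_delay(rate_a + rate_b)
    rate_a = adjust_rate(rate_a, rtt, believed_base_a)
    rate_b = adjust_rate(rate_b, rtt, believed_base_b)

# Sender A's share collapses toward zero while B fills the link.
print(f"sender A: {rate_a:.2f}  sender B: {rate_b:.2f}")
```

Because sender B’s inflated baseline hides the growing queue from it, B keeps probing upward while A interprets the same rising delay as congestion and retreats, exactly the kind of lopsided outcome the researchers describe.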
The problem appears to be the overwhelming complexity of the internet. Algorithms use signals like packet loss and delay to estimate congestion, but packets can also be delayed for reasons unrelated to congestion. This “jitter” is unpredictable and causes the algorithms to spiral toward starvation, say the researchers. This led the team to classify these systems as “delay-convergent algorithms,” a category for which starvation is inevitable.
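As a rough illustration of that point (again an assumption-laden sketch, not the paper’s math), a few noisy RTT samples are enough to flip a delay-based controller between speeding up and backing off even though the actual queue never changes:

```python
import random

# Illustrative only: non-congestive "jitter" added to each RTT sample can
# push a delay-based controller across its decision threshold even when
# the queue itself hasn't changed. The threshold matches the earlier sketch.

BASE_RTT = 1.0
THRESHOLD = 0.1 * BASE_RTT   # assumed decision boundary

true_queueing_delay = 0.08   # below threshold: the link is not congested
for sample in range(5):
    jitter = random.uniform(-0.05, 0.05)  # delay unrelated to congestion
    measured = BASE_RTT + true_queueing_delay + jitter
    decision = "back off" if measured - BASE_RTT >= THRESHOLD else "speed up"
    print(f"sample {sample}: apparent queueing "
          f"{measured - BASE_RTT:.3f} -> {decision}")
```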
Study author and MIT grad student Venkat Arun explains that the failure modes identified by the team have been present on the internet for years. The fact that no one noticed them speaks to the difficulty of the problem. Existing algorithms may be unable to avoid starvation, but the researchers believe a solution is possible. They are continuing to explore other classes of algorithms that could do a better job, perhaps by accepting wider variation in delay across a network. The same modeling tools could also help us understand other unsolved problems in networked systems.