Too Many Connections Weaken Networks

itwbennett writes "Conventional wisdom holds that more connections make networks more resilient, but a team of mathematicians at UC Davis has found that this is only true up to a point. The team built a model to determine the ideal number of cross-network connections. 'There are some benefits to opening connections to another network. When your network is under stress, the neighboring network can help you out. But in some cases, the neighboring network can be volatile and make your problems worse. There is a trade-off,' said researcher Charles Brummitt. 'We are trying to measure this trade-off and find what amount of interdependence among different networks would minimize the risk of large, spreading failures.' Brummitt's team published its work (PDF) in the Proceedings of the National Academy of Sciences."
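As a rough, purely illustrative sketch of the kind of trade-off described above (this is not the paper's actual model; the ring topology, the capacity, the toppling rule, and every parameter below are assumptions made for the example), here is a small sandpile-style simulation that compares average cascade size at different levels of cross-network coupling:

```python
# Toy cascade model on two coupled networks (illustrative only, not the
# published model). Load is dropped one unit at a time; a node that exceeds
# its capacity "topples" and passes load to neighbours, possibly across the
# cross-network links, so cascades can spread between the two networks.
import random
from statistics import mean

def make_ring(n, offset):
    """Ring network: node offset+i connects to its two neighbours in the same ring."""
    return {offset + i: {offset + (i - 1) % n, offset + (i + 1) % n} for i in range(n)}

def mean_cascade_size(n_per_net=50, cross_links=5, capacity=3, drops=2000, seed=0):
    rng = random.Random(seed)
    # Two separate ring networks: A holds nodes 0..n-1, B holds nodes n..2n-1.
    graph = make_ring(n_per_net, 0)
    graph.update(make_ring(n_per_net, n_per_net))
    # Add the chosen number of random cross-network edges (duplicates collapse).
    for _ in range(cross_links):
        a = rng.randrange(n_per_net)
        b = n_per_net + rng.randrange(n_per_net)
        graph[a].add(b)
        graph[b].add(a)
    load = {node: 0 for node in graph}
    nodes = list(graph)
    sizes = []
    for _ in range(drops):
        load[rng.choice(nodes)] += 1                  # drop one unit of load
        topplings = 0
        unstable = [v for v in nodes if load[v] > capacity]
        while unstable:
            v = unstable.pop()
            while load[v] > capacity:                 # topple until stable
                load[v] -= capacity + 1
                topplings += 1
                # Pass one unit to up to capacity+1 neighbours; the rest dissipates.
                for u in rng.sample(sorted(graph[v]), min(capacity + 1, len(graph[v]))):
                    load[u] += 1
                    if load[u] > capacity:
                        unstable.append(u)
        sizes.append(topplings)
    return mean(sizes)

if __name__ == "__main__":
    for k in (0, 5, 20, 80):
        print(f"{k:3d} cross-links -> mean cascade size {mean_cascade_size(cross_links=k):.2f}")
```

Sweeping cross_links this way is one crude proxy for the trade-off in the summary: a few cross-network edges can help absorb local stress, while heavy coupling gives cascades more routes to spread, at least in this toy setup.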
  • Primitive (Score:5, Interesting)

    by gilgongo ( 57446 ) on Saturday February 25, 2012 @03:52PM (#39159859) Homepage Journal

    I'm sure that in 100 years' time, people will look back on our understanding of networks, information and culture in the same way as we look back on people's understanding of the body's nervous or endocrine systems 100 years before now. This study hints at our lack of knowledge about what the hell is happening.

  • Too many cooks.... (Score:5, Interesting)

    by bwohlgemuth ( 182897 ) <`moc.liamg' `ta' `htumeglhowb'> on Saturday February 25, 2012 @04:29PM (#39160049) Homepage
    As a telecom geek, I see many people create vast, incredibly complex networks that end up being more difficult to troubleshoot and manage because they involve non-standard designs which fail when people wander in and make mundane changes. And then when these links fail or go down for maintenance... surprise, there's no 100% network availability.

    Three simple rules for networks...

    Simple enough to explain to your grandmother.
    Robust enough to handle an idiot walking in and disconnecting something.
    Reasonable enough to be able to be maintained by Tier I staffing.
  • Re:Primitive (Score:5, Interesting)

    by Anonymous Coward on Saturday February 25, 2012 @04:39PM (#39160091)

    Cascades on power networks happen when you suddenly lose a source without rejecting the drain. I.e., the load remains high, but the flow required to supply that load suddenly has to shift because a link went down due to failure or overload.

    There is a protection against this: it is called selective load rejection. You shut off large groups of customers, plain and simple, and you do it very, very fast. Then you reroute the network to make sure power will be able to flow over links that will not overload, and do a staggered reconnect of the load you rejected (a toy sketch of this sequence follows below this comment).

    That costs BIG $$$$ (in fines, lost revenue, and indirect damage due to brown-outs), and there is a silent war to get someone ELSE to reject load instead of your network. The only things that balance it are extremely steep fines among the power networks themselves and, in countries that are not nasty jokes, the government regulatory body.

    I am not exactly sure how to move that to a BGP4, IS-IS or OSPFv2/v3 network, where instead of a sink pressure, you have a source pressure.
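As a purely illustrative sketch of the shed-then-reconnect sequence described in the comment above (the megawatt figures, the single line failure, and the greedy shed-the-largest-block rule are all invented for the example; real schemes use underfrequency/undervoltage relays and contractual priority tiers):

```python
# Toy illustration of selective load rejection followed by staggered reconnect.
# All numbers and the shedding rule are hypothetical.
def shed_load(blocks_mw, surviving_capacity_mw):
    """Disconnect the largest demand blocks until the rest fits the capacity.
    Returns (kept_blocks, shed_blocks)."""
    kept = sorted(blocks_mw, reverse=True)
    shed = []
    while kept and sum(kept) > surviving_capacity_mw:
        shed.append(kept.pop(0))          # drop the biggest remaining block
    return kept, shed

if __name__ == "__main__":
    # Hypothetical system: 300 MW of demand split into blocks, two 180 MW lines.
    demand_blocks = [90, 70, 60, 45, 35]   # MW per customer group
    line_ratings = [180, 180]
    # One line trips: only 180 MW of transfer capacity survives.
    surviving = sum(line_ratings) - 180
    kept, shed = shed_load(demand_blocks, surviving)
    print(f"kept {sum(kept)} MW {kept}, shed {sum(shed)} MW {shed}")
    # Staggered reconnect once the tripped line returns: restore one block at a
    # time, re-checking headroom before each step.
    capacity = sum(line_ratings)
    for block in sorted(shed):
        if sum(kept) + block <= capacity:
            kept.append(block)
            print(f"reconnected {block} MW, now serving {sum(kept)} MW")
```

The point of the toy is only the ordering: shed fast and coarsely first so surviving links stay under their ratings, then restore load in stages while re-checking headroom before each block.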

  • by dietdew7 ( 1171613 ) on Saturday February 25, 2012 @04:44PM (#39160131)
    Could these types of models be applied to government or corporate hierarchies? I've often heard about economies of scale, but my experience with large organizations is that they have too much overhead and inertia. I wonder if mathematicians could come up with a most efficient organization size and structure.
  • by Attila Dimedici ( 1036002 ) on Saturday February 25, 2012 @05:19PM (#39160305)
    One of the things that technological changes since the mid-70s have taught us is that the most efficient organization size and structure change as technology changes. There are two factors in dynamic tension, and as the relationship between them changes, the efficiency point of organizations changes. One of those factors is speed of communication: as we become able to communicate faster over long distances, the most efficient organization tends toward a more centralized, larger one. The other is processing capacity: as we become able to process information faster and more efficiently, a smaller, more distributed organization becomes more efficient. There are probably other factors that affect this dynamic as well.

Say "twenty-three-skiddoo" to logout.

Working...