#2
What some users thought was their imagination turns out to be a very real problem. Sometimes you try to reach a site and mysteriously can't get to it. Perhaps you think it's simply your imagination, something wrong with your computer, or that the site is under attack. Or maybe the server hosting the site went down. The cause, however, could be far more mysterious. With the massive amount of traffic crossing the internet today, weird things are bound to happen, but perhaps the strangest is the recently discovered, long-suspected phenomenon of internet "black holes." Traffic is sometimes unintentionally routed into these holes and lost forever, even though a working path existed between the sending computer and the receiving computer.

University of Washington researchers developed a system known as Hubble, which scours the dark depths of the internet for black holes and posts the results on its website. The result is an ever-shifting map of the internet's weak spots. Users can view the map or type in a specific web address to check it for problems. The university will present its research in San Francisco at the Usenix Symposium on Networked Systems Design and Implementation.

Ethan Katz-Bassett, a UW doctoral student in computer science and engineering, explains how the research debunks the common misconception that if both the sender and receiver are active, the internet "just works." Says Katz-Bassett, "There's an assumption that if you have a working Internet connection then you have access to the entire Internet. We found that's not the case."

The scientists named the project after the NASA telescope because they saw its search and mapping through mazes of cables and routers as analogous to the Hubble telescope's search of the murky cosmic depths. Finding analogies between cosmology and internet research is actually quite common; in fact, the study of the internet is often labeled "Internet astronomy." Explains Katz-Bassett, "It's the idea of peering into the depths of something and trying to figure out what's going on, without having direct access."

The UW researchers probe extensively around the internet to find computers that are reachable from some parts of the internet but not others, a phenomenon known as partial reachability. To rule out short blips, an error must be present in two consecutive 15-minute trials. Hubble found that nearly 7 percent of the world's computers experienced such a problem at least once during a three-week period last fall. Arvind Krishnamurthy, a UW research assistant professor of computer science and engineering and Katz-Bassett's doctoral adviser, states, "When we started this project, we really didn't expect to find so many problems. We were very surprised by the results we got."

The new online map produced by Katz-Bassett and Krishnamurthy is updated every 15 minutes. Problems are flagged, and the numerical address of the affected group of computers is listed. An address can describe a few hundred to a few thousand affected computers. For each affected address, Hubble lists the percentage of successful probes and how long the problem has persisted. By clicking a flag, the user gains access to a list of which locations could or could not reach the address. The researchers hope to eventually include information on what caused each black hole as well.

Hubble utilizes the PlanetLab network, a shared worldwide network of academic, industrial and government computers.
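Out of curiosity, here's a rough sketch (in Python) of what that partial-reachability rule could look like: probe a target from several vantage points, and only flag it if two consecutive rounds each show a mix of successes and failures. This is not Hubble's actual code; the vantage-point names and the probe_from() helper are made up, and the probe is just a local ping so the logic can be run anywhere.

```python
# Rough sketch of the partial-reachability rule described above, not Hubble's
# actual code. Assumptions: the vantage point names and probe_from() are invented,
# and probe_from() just wraps a local ping so the logic can be run anywhere.
import subprocess
from typing import Dict, List

def probe_from(vantage_point: str, target: str) -> bool:
    """Pretend to probe `target` from `vantage_point` (really probes locally)."""
    result = subprocess.run(
        ["ping", "-c", "2", "-W", "2", target],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

def run_round(vantage_points: List[str], target: str) -> Dict[str, bool]:
    """One probing round: record which vantage points could reach the target."""
    return {vp: probe_from(vp, target) for vp in vantage_points}

def is_partial_reachability(round_a: Dict[str, bool], round_b: Dict[str, bool]) -> bool:
    """Flag a target only if both consecutive rounds saw a mix of success and failure."""
    def mixed(results: Dict[str, bool]) -> bool:
        return any(results.values()) and not all(results.values())
    return mixed(round_a) and mixed(round_b)

if __name__ == "__main__":
    vps = ["vp-us", "vp-eu", "vp-asia"]   # hypothetical vantage points
    target = "192.0.2.1"                  # documentation address, example only
    first = run_round(vps, target)
    # The real system waits 15 minutes between rounds; skipped here for brevity.
    second = run_round(vps, target)
    if is_partial_reachability(first, second):
        print(f"{target}: possible black hole (partial reachability in two consecutive rounds)")
    else:
        print(f"{target}: no persistent partial reachability observed")
```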
Hubble uses about 100 computers in 40 countries to accomplish its global probing, and it currently monitors 90 percent of the internet, according to the researchers. The new site should be helpful both to inquisitive users wondering at the cause of their frustrations and to network administrators. Administrators currently tend to diagnose such problems through online discussion boards, a rather poor means of diagnosis. Katz-Bassett states, "You would think that the network operators of Internet service providers would have access to better data. That's not the case. The general approach has been to mail something out to a listserv and say, 'Hey, can you try this and see if you have a problem?'" The ultimate goal of the project is to make the internet more reliable and easier to use. Says Krishnamurthy, "We want to give operators a way to tell what's going on quicker, catch problems quicker and solve them quicker." The research is being funded by the National Science Foundation.

Well, what can I say. This obviously answers my questions about why I have so much trouble connecting to some sites (including this one) at times. Every now and then I would try to load a site and it would come up with "Could Not Load Page" or something like that, and no matter how many times I retried I'd get the same result. I always thought it might be a problem on my end, but when I can connect to other sites perfectly fine, I figure the actual site must be down or playing up or something. But I guess the data just gets lost between connections. I wonder if this takes up any of my download quota.
#4
Hah, internet black holes... just a fancy name for malfunctioning routers. [rolleyes] I know for sure there's nothing wrong with my router. I mean, if I can connect to every site but one, then connect to that same site the next day without any trouble, then it's not something on my end.
#5
Which routers, though: server routers, user routers or host routers?