Let us compare two methods for setting the TCP TimeoutInterval. The method described in class (and in the textbook) is the Jacobson/Karels algorithm, which sets TimeoutInterval = EstimatedRTT + 4 * DevRTT. An early, unnamed algorithm did not use DevRTT and simply set TimeoutInterval = 2 * EstimatedRTT. Consider an ongoing TCP connection with an RTT estimate of 350 ms and an RTT deviation estimate of 40 ms. Now suppose that subsequent RTT measurements jump to 900 ms. Compare the behavior of the early EWMA algorithm (which does not use the deviation) to the Jacobson/Karels algorithm by calculating the sequence of TimeoutInterval values each algorithm computes. For Jacobson/Karels, assume alpha = 0.125 and beta = 0.25 as discussed in the text.
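To check your hand calculations, the two update rules can be simulated directly. The sketch below uses the standard EWMA forms from the textbook, EstimatedRTT = (1 - alpha) * EstimatedRTT + alpha * SampleRTT and DevRTT = (1 - beta) * DevRTT + beta * |SampleRTT - EstimatedRTT|, and assumes DevRTT is updated using the old EstimatedRTT (the ordering used in RFC 6298); the number of iterations shown is arbitrary.

```python
# Sketch: simulate both timeout rules over repeated 900 ms samples,
# starting from EstimatedRTT = 350 ms, DevRTT = 40 ms.
alpha, beta = 0.125, 0.25
est, dev = 350.0, 40.0

for i in range(1, 6):
    sample = 900.0
    # Update DevRTT first, using the pre-update EstimatedRTT
    # (assumed ordering, per RFC 6298).
    dev = (1 - beta) * dev + beta * abs(sample - est)
    est = (1 - alpha) * est + alpha * sample
    early = 2 * est        # early rule: TimeoutInterval = 2 * EstimatedRTT
    jk = est + 4 * dev     # Jacobson/Karels: EstimatedRTT + 4 * DevRTT
    print(f"sample {i}: early = {early:.2f} ms, Jacobson/Karels = {jk:.2f} ms")
```

Note which algorithm's timeout first exceeds the new 900 ms RTT; that is the behavioral difference the problem is asking you to explain.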
Submit your username-hw05.doc, username-hw05.pdf, or username-hw05.txt file with the answers to these questions on Moodle.
P1 is worth 20%; the 8 problems from the textbook are worth 10% each.
You should attempt to answer questions for partial credit even if you're unsure of the answers.
Cooperation in understanding programming concepts and system features is encouraged, but the actual solution of the assignments, including all programming, must be your individual work. For example, copying any part of someone else's program without attribution is plagiarism, even if you have modified the code. The University takes acts of cheating and plagiarism very seriously.