Hey, I'm in a networking class and I'm trying to get a handle on this problem but I'm not sure if I'm getting it.
The problem is stated as follows:
Computer A is connected to a router by a link that is 1 Mbps and has a propagation delay of 1 msec. Computer B is connected to that same router, by a link that is 2 Mbps with a propagation delay of 2 msec.
If I send a 1000-byte packet from computer A to the router, it only takes 1 msec to get there, right? Are there any other delays I should account for in this calculation? And if it goes from A all the way to B, is it 3 msec? I feel like that makes it way too simple, but I don't know what else I should be accounting for.
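My current guess is that transmission delay (packet size divided by link bandwidth) might be the piece I'm missing, so I tried sketching the numbers in Python to check myself. The function name and the store-and-forward assumption are mine, not from the problem, so please correct me if this is wrong:

```python
# Hypothetical sketch of my reasoning -- not sure this is the right model.
# Guess: per-link delay = transmission delay + propagation delay.

PACKET_BITS = 1000 * 8  # 1000-byte packet

def link_delay_ms(bandwidth_bps, prop_ms):
    """Transmission delay (bits / bandwidth) plus propagation delay, in msec."""
    transmission_ms = PACKET_BITS / bandwidth_bps * 1000
    return transmission_ms + prop_ms

a_to_router = link_delay_ms(1_000_000, 1)  # 1 Mbps link, 1 msec propagation
router_to_b = link_delay_ms(2_000_000, 2)  # 2 Mbps link, 2 msec propagation

print(a_to_router)                 # 9.0 msec, not just 1?
print(a_to_router + router_to_b)   # 15.0 msec A to B, if the router is store-and-forward
```

If that's right, the 1 msec I wrote above would only be the propagation part, and the packet would actually take 9 msec to reach the router. But I'm not confident about the store-and-forward part, which is why I'm asking.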
I appreciate any help anyone can offer.