
Why are my M-Lab results different from other speed tests?

Ron Dallmeier

Internet performance tests may produce different results for many reasons. Three of the main reasons are described below:

1. Differences in the location of testing servers

Every performance test has two parts:

client: This is the software that runs on the user’s machine and shows the user their speed results.
server: This is the computer on the Internet to which the client connects to complete the test.

A test generates data between the client and the server, and measures performance between these two points. The location of these two points is important for understanding the results of a given test.

If the server is located within your Internet Service Provider’s (ISP’s) own network (also known as the “last mile”), this is referred to as an “on-net” measurement. This approach tells you how your Internet connection performs within your ISP’s own network, but it does not necessarily reflect the full experience of using the Internet, which almost always involves inter-network connections (connections between networks) to reach content and services hosted outside of your ISP. Results from on-net testing are often higher than those from other methods, since the “distance” traveled is generally shorter and the network is entirely controlled by one provider (your ISP).

“Off-net” measurements occur between your computer and a server located outside of your ISP’s network. This means that traffic crosses inter-network borders and often travels longer distances. Off-net testing frequently produces lower results than on-net testing.

M-Lab’s measurements are always conducted off-net. This way, M-Lab is able to measure performance from testers’ computers to locations where popular Internet content is often hosted. Because inter-network connections are included in the test, users get a realistic sense of the performance they can expect when using the Internet.

2. Differences in testing methods

Different Internet performance tests measure different things in different ways. M-Lab’s NDT test tries to transfer as much data as it can in ten seconds (both up and down), using a single connection to an M-Lab server. Other popular tests try to transfer as much data as possible at once across multiple connections to their server. Neither method is “right” or “wrong,” but using a single stream is more likely to help diagnose problems in the network than multiple streams would. Learn more about M-Lab’s NDT methodology.
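To make that difference concrete, here is a rough Python sketch (not M-Lab’s actual NDT client) that times a bulk download over one connection and then over several connections in parallel. The test URL is a hypothetical placeholder you would need to point at a real large file, and the ten-second window simply mirrors the NDT transfer duration mentioned above.

import threading
import time
import urllib.request

TEST_URL = "http://example.com/large-test-file"  # hypothetical placeholder
DURATION = 10  # seconds, roughly matching NDT's transfer window

def download(counter, lock):
    # Each thread opens its own TCP connection and reads until time runs out.
    deadline = time.monotonic() + DURATION
    with urllib.request.urlopen(TEST_URL) as resp:
        while time.monotonic() < deadline:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            with lock:
                counter[0] += len(chunk)

def measure(streams):
    counter, lock = [0], threading.Lock()
    threads = [threading.Thread(target=download, args=(counter, lock))
               for _ in range(streams)]
    start = time.monotonic()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    elapsed = time.monotonic() - start
    return counter[0] * 8 / elapsed / 1e6  # megabits per second

print("single stream:", round(measure(1), 1), "Mbit/s")
print("four streams: ", round(measure(4), 1), "Mbit/s")

On a path with some loss or latency, the multi-stream number will usually come out higher, which is one reason multi-stream tests tend to report bigger figures than NDT’s single stream.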

I would suggest a couple of things. When you run the M-Lab test, note which server it is running the test against; the server name is shown while the test is running. It will be something like: NDT.IUPUI.MLAB1.ORD02.MEASUREMENT-LAB.ORG

ORD is the airport code for Chicago O’Hare in this example. It is certainly important to note which server was used.
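If you want to pull that code out of the hostname programmatically, here is a minimal sketch, assuming the name always follows the pattern above (the fourth dot-separated label, e.g. ORD02, is the metro code plus a site number):

def metro_code(hostname):
    # e.g. "NDT.IUPUI.MLAB1.ORD02.MEASUREMENT-LAB.ORG" -> site label "ORD02"
    site = hostname.upper().split(".")[3]
    return site[:3]  # "ORD" (Chicago O'Hare)

print(metro_code("NDT.IUPUI.MLAB1.ORD02.MEASUREMENT-LAB.ORG"))  # prints ORD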

Another suggestion would be to select “Details” on the results page. There you will see a summary of the test results, which can shed some light on whether the test observed any potential causes of poor results. Two big factors for TCP-based throughput are packet loss and latency; see the sketch after the example output below. Here is an example:

Your system: –
Plugin version: – (-)

TCP receive window: 893408 current, 894848 maximum
0.00 % of packets lost during test
Round trip time: 21 msec (minimum), 68 msec (maximum), 33 msec (average)
Jitter: –
0.00 seconds spent waiting following a timeout
TCP time-out counter: 232
473 selective acknowledgement packets received

No duplex mismatch condition was detected.
The test did not detect a cable fault.
No network congestion was detected.

0.9633 % of the time was not spent in a receiver limited or sender limited state.
0.0000 % of the time the connection is limited by the client machine’s receive buffer.
Optimal receive buffer: – bytes
Bottleneck link: –
469 duplicate ACKs set
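To see why packet loss and latency matter so much, here is a back-of-the-envelope sketch using two well-known rules of thumb: the Mathis et al. approximation, which caps single-stream TCP throughput at roughly MSS / (RTT × √loss), and the simpler bound of one receive window per round trip. The 0.5% loss rate below is made up purely for illustration, since the report above shows 0.00%; the RTT and window values are taken from the example.

import math

def mathis_limit(mss_bytes, rtt_s, loss_rate, c=1.22):
    # Approximate upper bound on single-stream TCP throughput (Mathis et al.).
    # Only meaningful when there is some packet loss.
    return (mss_bytes * 8 / rtt_s) * (c / math.sqrt(loss_rate))  # bits/s

def window_limit(window_bytes, rtt_s):
    # A connection can never move more than one receive window per round trip.
    return window_bytes * 8 / rtt_s  # bits/s

# Illustrative numbers: a typical 1460-byte MSS, the 33 ms average round trip
# time and 894848-byte maximum window from the report above, and a
# hypothetical 0.5% loss rate.
print(round(mathis_limit(1460, 0.033, 0.005) / 1e6, 1), "Mbit/s loss-limited")
print(round(window_limit(894848, 0.033) / 1e6, 1), "Mbit/s window-limited")

With these example numbers the loss cap works out to roughly 6 Mbit/s, even though the window alone would allow more than 200 Mbit/s, which is why even a small amount of loss on a long, high-latency path can dominate a speed-test result.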

…Ron
