KB/s Kb/s Ko/s configuration

Hi!

Congratulations on your great monitor!

First, there is a translation error in the units in French :grin:

KB/s = Ko/s (byte = octet)
Kb/s = Kb/s (bit = bit)

Next, could you add a setting to toggle the scale between KB/s and Kb/s, and another setting to choose

1 K = 1000 <> 1024

please!
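For what it's worth, the two settings amount to a small formatting function. Here is an illustrative Python sketch (not GlassWire's actual code; the function name, the `unit` strings, and the `base` parameter are my own invention):

```python
def format_rate(bytes_per_sec: float, unit: str = "KB/s", base: int = 1000) -> str:
    """Format a rate measured in bytes/second.

    unit: "KB/s" shows kilobytes, "Kb/s" shows kilobits (1 byte = 8 bits).
    base: the kilo multiplier, either 1000 (metric) or 1024 (binary).
    """
    value = bytes_per_sec * 8 if unit == "Kb/s" else bytes_per_sec
    return f"{value / base:.1f} {unit}"

print(format_rate(128_000))                 # → 128.0 KB/s  (bytes, K = 1000)
print(format_rate(128_000, unit="Kb/s"))    # → 1024.0 Kb/s (bits,  K = 1000)
print(format_rate(128_000, base=1024))      # → 125.0 KB/s  (bytes, K = 1024)
```

The same approach would work for the French labels: only the unit string changes (`Ko/s` instead of `KB/s`), never the arithmetic.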

Many thanks,



Thanks for reporting this. I’ll report the problem to the dev team.

Could you post an example network monitor that is correctly translated to French with the Ko/s designation so we can better understand it?

I have two others and they make the same mistake! I can live with the KB/Kb terminology, but it's a bad translation. It should be Ko/Kb in French.

In any case, you could make the scaling customizable so the user can choose KB units or Kb units, as all the others do; and also let the K multiplier be set to either 1000 or 1024 at the user's convenience, as Networx does!

Best regards


Thanks! We don’t speak French, so we’re hesitant to make changes to something as important as network terminology. :smile: I hope the rest of the translations look OK?

I will check out Networx to see what you mean.

Yup – here it is: https://en.wikipedia.org/wiki/Octet_(computing)

Long ago (but not so far away) I remember studying about how a byte can have varying numbers of bits depending on the system it ran on and the purpose for the encoding. I can’t remember any of the details (remember, long ago) but at my age I’m proud that I can remember even the idea. :smile:


Hi RichLife69,

I think your memory fails you :smile: :

a Byte (octet) is always a set of 8 bits, regardless of the system :wink:

The problem with a kilobyte (KB) is that the metric system defines kilo as ×1000! In the computer world, the closest thing to ×1000 bitwise is 2^10 = 1024, since every bit sequence leads to a power of 2 in decimal! So:

1 bit = 2
2 bits = 4
3 bits = 8
4 bits = 16
5 bits = 32
6 bits = 64
7 bits = 128
8 bits (1 Byte) = 256
9 bits = 512
10 bits = 1024

So, a Kb = 1024 bits computerwise, or 1000 bits metricwise! But ISPs (internet service providers) will always choose the figure that boosts their performance numbers in the eyes of the client, so they choose K = 1000 bits in order to tell clients that their bandwidth is more performant than the competition's would look with the computer multiplier of 1024 bits! How so? Think of grocery fruit crates. If one crate (blue) contains 1000 fruits and another crate (red) contains 1024 fruits, which grocery store is the better performer: the one who sells 50 blue crates per hour, or the one who sells 50 red crates per hour? I guess you figured it out :wink:! Said another way, if a grocery store sells 1,024,000 fruits per day, it will tend to advertise 1024 crates/day (blue) instead of 1000 crates/day (red)! The blue figure lures the reader into thinking the store outsells a competitor advertising 1000 crates/day (red)!
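To put a quick number on the crate analogy: the very same measured link speed reads as a larger figure when K = 1000 is used as the divisor. A small Python check (the speed value is just a made-up example):

```python
bits_per_second = 10_240_000  # a hypothetical measured link speed

si_kb = bits_per_second / 1000   # metric kilo: the "blue crate" divisor
bin_kb = bits_per_second / 1024  # binary kilo: the "red crate" divisor

print(f"{si_kb:.0f} Kb/s")   # → 10240 Kb/s -- the bigger, nicer-looking figure
print(f"{bin_kb:.0f} Kb/s")  # → 10000 Kb/s -- same speed, smaller figure
```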

But this is misleading reasoning, since an ISP's job is to “hose” your computer with a hose full of bits, and what counts is the number of bits per second they can deliver. In that context the metric decimal multiplier is the logical one to use, and the computer's 1024 multiplier is USELESS! So … Ken_GlassWire, forget that multiplier request!!!



Good points, Benny.

But actually my memory isn’t all that bad. In the 1960s and even the ’70s (before more specific standardization), some computer manufacturers chose to increase the number of bits in a byte for their own purposes. So if you worked with one of those systems, you had to be aware of the extra bit (often used similarly to a checksum). As computers became more generally available, these special uses fell out of favor.

Also in the past, the term “byte” was often used to mean the number of bits needed to represent a character. In some systems, that could be 9 bits. And going farther back, byte could have meant 5 or 6 bits.

Fortunately, we mostly don’t need to be concerned about that today (except, as you note, for the providers who manipulate terms for their own benefit).

I understand your point, RichLife69.


For anyone who is wondering what these guys are talking about the Wikipedia article on the byte gives a good background.

If you’re interested in the history of it all, see the short article WHY IS A BYTE 8 BITS? OR IS IT? by Bob Bemer. In the 1960s and ’70s bytes were often smaller than 8 bits because there were many 3-bit, 4-bit, 5-bit, 6-bit and 7-bit CPUs.
