Is there a special name for the zero-to-one value range?

Silly Question Time!

// "I for one like Roman numerals."

The 0…1 range is such a common thing in coding, but I never know what to call it. Does it have an accepted name, in the way “percent” applies to parts of 100?

In my own code I call it p or percent, but since that’s not what it actually is, it takes me a whole sentence to describe the range whenever I explain it to newcomers.

Thanks for reading,

“Normalized” is often used for that, especially in the context of taking an arbitrary range of other values and mapping it into 0…1.
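For example, here is a minimal sketch of that remapping (sometimes called an “inverse lerp”); the normalize01 name and the 0…200 health range are just made up for illustration:

#include <stdio.h>

/* Map a value from an arbitrary [min, max] range into 0..1.
   Assumes max != min. */
float normalize01(float value, float min, float max)
{
    return (value - min) / (max - min);
}

int main(void)
{
    /* e.g. 75 health out of a 0..200 range -> 0.375 */
    printf("%f\n", normalize01(75.0f, 0.0f, 200.0f));
    return 0;
}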

Normalized number:

A number is normalized when it is written in scientific notation with one nonzero decimal digit before the decimal point.[1] Thus, a real number written out in normalized scientific notation has the form ± d0.d1d2d3… × 10^n, where n is an integer and d0 is a nonzero digit.
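Just to put concrete numbers to that definition: 12345 written in normalized scientific notation is 1.2345 × 10^4, and 0.0042 is 4.2 × 10^-3.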

getyour411 explained it much more simply, of course.
The best example of normalized numbers is Perlin noise; the entire algorithm is based on scaled normalized values (see the sketch below).
Of course it is confusing to people who aren’t programmers, because when they see “a number between 0 and 1” it tends to invoke the idea that there are only two possible outcomes.
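For what it’s worth, here is the same remapping idea specialized to noise output. This sketch assumes the noise function returns values in roughly -1…1, as classic Perlin implementations commonly do; the to01 name and the sample value are just for illustration:

#include <stdio.h>

/* Remap a noise sample from roughly -1..1 into 0..1. */
float to01(float noise_sample)
{
    return (noise_sample + 1.0f) * 0.5f;
}

int main(void)
{
    /* A made-up sample value standing in for a real noise call. */
    printf("%f\n", to01(-0.25f));  /* prints 0.375000 */
    return 0;
}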