Encoding metadata in a texture

I am trying to procedurally generate a texture that will define the locations of resources on terrain. The idea is to have a direct relationship between the display of the locations and what is actually at each location. I am having some trouble with the display, though (and possibly the storage).

First, I use a Voronoi noise function to get a set of regions, which I then desaturate by a great deal so that only a few regions are left over; these represent the locations of resources. Relevant portion of code (which runs inside an x/y loop; "value" is the value returned for this coordinate by the noise function):

// now that we have a value, we need to adjust it. Values are returned in
// a range of -1 to +1. We do not want every single node to represent a
// resource deposit, so let's clamp any values to be between 0.0 and 1.0.
// This should effectively give us a 50% resource deposit rate
value = Mathf.Clamp(value, 0.0f, 1.0f);
value = value - 0.99f; // gimme 1% of the 50%

float resID = 0;
if(value > 0) {
	// this is a positive value, which means it is officially a deposit! now we need to assign
	// it a resource ID. we'll store the id in the R channel, and since this is a float, we will
	// adjust the id such that it is 1% of 1.0 (id 1 = 0.01, id 2 = 0.02, etc).
	if(resIDLookup.ContainsKey(value)) {
		// we already have an id for this, use it
		resID = resIDLookup[value];
	} else {
		//this is new, generate
		resID = ResourceList.Natural[UnityEngine.Random.Range(0, 8)].id * 0.01f;
		resIDLookup.Add(value, resID);
	}

	// saturate value for G channel (might be temporary)
	value = 1.0f;
} 

// NOTE: for now, the G channel just marks whether a resource is here at ALL. in reality
//		 this is not necessary at all. 
Color color = new Color(resID, value, 0, 1);

int index = (x * mapWidth) + y;
pixels[index] = color;

This code takes the id we have assigned to the region (converted to a float where id 1 = 0.01, etc.) and places it in the red channel of the pixel. After this loop completes, I finalize the texture like so:

resourceMap.filterMode = FilterMode.Point;
resourceMap.SetPixels(pixels, 0);
resourceMap.Apply();

What we end up with is a black image with a few sporadic splotches of green representing the deposits. Like this:

So far so good! The trouble arises when I try to visualize this.

I have a shader for my terrain where I have added a few lines of code to display these regions based on a parameter defining which resource deposit type I wish to see:

if(_ShowResourceLocation != 0 && cR.r != 0) {
	if(int(cR.r * 100) == _ShowResourceLocation) {
		output = lerp(output, cR, 0.5);
	}
}

cR is the sample of the resource texture. _ShowResourceLocation is a float param that represents the int version of the id. 0 means don’t show any resources at all. “output” is the result of all previous processing unrelated to this.

This is where the problem arises. For some reason, I can only see resources 1, 3, 5, and 7. When I set _ShowResourceLocation to 2, 4, or 6, I see nothing. However, when I switch to bilinear filtering, the even values show outlines around areas that actually have odd-valued IDs.

Why would something like this happen? I feel like something is happening under the hood that I do not understand that is perhaps rounding my even values. Overall, I am just trying to encode some game related data into my texture for use later, without it being altered. Any ideas?

Since your texture is probably 8 bits per channel, while the float value you use when generating the texture is 32 bits, you can't really guarantee that sampling the texture in the shader will return the same value you wrote out. Also, with filtering enabled you'll get intermediate values, so you probably want to stick with point filtering.

Say you write an id of 1. In your current implementation you multiply that by 0.01. Remember that the texture is really composed of byte values for each channel, so 0.01 is multiplied by 255 to get the 8-bit channel value, which gives us 2.55. I suspect Unity is rounding that value rather than truncating, so let's say you have 3 for the channel. Then when you sample the channel, you get a float: 3 / 255, or 0.01176. Now you multiply by 100 and cast to int, which I'm pretty sure is a floor operation, which results in 1. That seems okay, but now run through the process with an id value of 2:

C#:

round((2 * 0.01) * 255) = 5

Shader:

int((5 / 255) * 100) = 1
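Running the same round trip for every id makes the pattern obvious. Here's a quick sketch in Python that models the 8-bit channel (assuming, as above, that Unity rounds to nearest on write and that the shader's int cast floors on read):

```python
# Simulate the write/read round trip for resource ids,
# assuming an 8-bit texture channel.

def round_trip(res_id):
    # C# side: id * 0.01 quantized to an 8-bit channel (round to nearest)
    byte = round(res_id * 0.01 * 255)
    # Shader side: sample back as byte/255, then int(value * 100) floors
    return int(byte / 255 * 100)

for res_id in range(1, 9):
    print(res_id, "->", round_trip(res_id))
```

Every even id collapses onto the odd id below it (2 -> 1, 4 -> 3, 6 -> 5, 8 -> 7), which is exactly why only 1, 3, 5, and 7 ever match in the shader.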

What is the range of your resource ids? I would use the SetPixels32 method to write out exact byte values. Then when you sample the value, you should be able to just multiply by 255 (and round) to get the exact id back.
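To sketch why raw bytes survive the round trip, here is the same kind of model in Python (again assuming an 8-bit channel; the `+ 0.5` guards against the sampled float landing a hair below the integer):

```python
# Store the id directly as a byte (as SetPixels32 would),
# then recover it from the normalized 0.0-1.0 float the shader sees.

def encode(res_id):
    return res_id                      # raw byte value, 0-255

def decode(sampled):
    return int(sampled * 255 + 0.5)    # round to nearest, not floor

# Every id in the byte range survives the round trip exactly.
assert all(decode(encode(i) / 255) == i for i in range(256))
```

The shader-side equivalent of `decode` would be something like `int(cR.r * 255 + 0.5)` in place of your `int(cR.r * 100)`.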