Problems with UV on created mesh

Hi everyone!

I am trying to make an in-game map editor that will let the user create asteroid belts and asteroid fields of various shapes by specifying vertices (something like that).

Currently I’m using a script from (this site). It works great because I can use the same script to handle both asteroid belts and asteroid fields.

The problem emerges when trying to assign a material to the object, both via script and when assigning it manually. It seems like the image is scaled up so drastically that no matter how large I set the Tiling, the whole object uses only one pixel of the image. (When I switch it from Cutout to Texture the whole object is either fully transparent or black, and when manipulating the Offset I can get other pixels of the texture to fill the whole object.) I think it’s because of badly assigned UVs, but I can’t make much sense of the script from the site above.

If somebody would help me understand what I’m doing wrong, or maybe point me towards other methods of creating meshes in real time with holes in them, it would be greatly appreciated.

You don’t have a problem with your UVs; you simply don’t have any UVs at all. The UV coordinates need to be created alongside the vertex positions.
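You can verify this quickly. Here's a minimal diagnostic sketch (the class name UvCheck is just a placeholder) that logs the size of the generated mesh's uv array; if it reports 0 UVs while the vertex count is non-zero, the mesh has no UV data at all:

```csharp
using UnityEngine;

// Minimal diagnostic sketch: attach to the GameObject that holds the generated mesh.
// If uv.Length is 0 while vertexCount is not, the mesh has no UVs at all.
public class UvCheck : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Debug.Log("Vertices: " + mesh.vertexCount + ", UVs: " + mesh.uv.Length);
    }
}
```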

However, it depends on how you want the texture to appear on the mesh. If it’s an arbitrarily created mesh, there is no one-size-fits-all solution.

Look up UV mapping and first understand what it’s all about. Each vertex has a Vector3 position which defines its location in 3d space. However, if you want to use texture mapping, each vertex also needs an additional UV coordinate (a Vector2) which defines the position of the vertex in “texture space”. You might want to download my UVViewer, which lets you view the UV map of an object inside Unity. Just try it with some of the default meshes (sphere, capsule, cube) and you’ll see how each triangle is mapped onto the texture.
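To make the relationship concrete, here's a minimal sketch (not taken from the script you linked; the class name is hypothetical) that builds a single quad and gives every vertex both a Vector3 position and a matching Vector2 UV coordinate:

```csharp
using UnityEngine;

// Minimal sketch: a 1x1 quad where each vertex gets a Vector3 position
// and a matching Vector2 UV coordinate in texture space (0..1).
[RequireComponent(typeof(MeshFilter), typeof(MeshRenderer))]
public class QuadWithUvs : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        mesh.vertices = new Vector3[]
        {
            new Vector3(0, 0, 0),
            new Vector3(1, 0, 0),
            new Vector3(1, 1, 0),
            new Vector3(0, 1, 0)
        };

        // One UV per vertex; (0,0) is the bottom-left of the texture, (1,1) the top-right.
        mesh.uv = new Vector2[]
        {
            new Vector2(0, 0),
            new Vector2(1, 0),
            new Vector2(1, 1),
            new Vector2(0, 1)
        };

        mesh.triangles = new int[] { 0, 2, 1, 0, 3, 2 };
        mesh.RecalculateNormals();

        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

Note that the uv array must have exactly one entry per vertex; that pairing is what tells the renderer which part of the texture each triangle should sample.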

The process of mapping the triangles onto the texture is called “unwrapping”, and it’s usually done by the artist who created the model, inside a modelling application. This is usually a manual process, just like the modelling itself, though modelling applications often provide tools and fairly complex algorithms to simplify it.

When you create procedural geometry you have to take care of everything yourself. If it’s just a 2d mesh it might be a bit easier, but in the end it depends on how you want it to look. For 2d a common approach is to use the actual world / local space coordinates as UV coordinates. Just make sure the texture’s wrap mode is set to Repeat.
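As a sketch of that approach (assuming your asteroid-field mesh lies roughly in the local XZ plane; swap the axes if yours is in XY, and the class name and tileSize parameter are just illustrative), you could derive the UVs directly from the vertex positions after the mesh has been generated:

```csharp
using UnityEngine;

// Sketch: planar UV mapping for an already generated, roughly flat mesh.
// UVs are taken from the local-space X/Z of each vertex, scaled by tileSize,
// so the texture repeats evenly across the surface regardless of its outline.
public class PlanarUvMapper : MonoBehaviour
{
    public float tileSize = 10f;   // local-space units covered by one texture repeat (illustrative)

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        Vector2[] uvs = new Vector2[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            // Project onto the XZ plane and scale; values outside 0..1 are fine
            // as long as the texture's wrap mode is set to Repeat.
            uvs[i] = new Vector2(vertices[i].x, vertices[i].z) / tileSize;
        }

        mesh.uv = uvs;
    }
}
```

The wrap mode can be set in the texture's import settings or at runtime via texture.wrapMode = TextureWrapMode.Repeat; with Clamp, anything outside the 0..1 range would just smear the edge pixels.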

From your question we can’t determine how your meshes will actually look or how you want the texture mapping to behave.