UV mapping Icosphere after subdivision issue

First things first: I have successfully created an icosphere (icosahedron), applied a material to it, and set the initial UVs without too much of an issue. This is using the icosphere code that's floating around. After creating the sphere I have it set up to chunk the sphere out into several GameObjects… essentially doing a terrain-generation thing here.
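
The chunking itself is nothing fancy; boiled down it's roughly this (simplified sketch, not my exact code; ChunkBuilder and CreateChunk are made-up names):

using System.Collections.Generic;
using UnityEngine;

static class ChunkBuilder
{
    // Builds one GameObject from a subset of the full sphere's triangles.
    // "sourceTriangles" holds indices into the full icosphere's vertex/UV arrays.
    public static GameObject CreateChunk(string name, int[] sourceTriangles,
                                         Vector3[] vertices, Vector2[] uvs,
                                         Material material)
    {
        var chunkVerts = new List<Vector3>();
        var chunkUVs = new List<Vector2>();
        var chunkTris = new List<int>();
        var remap = new Dictionary<int, int>(); // source vertex index -> chunk vertex index

        foreach (int src in sourceTriangles)
        {
            int dst;
            if (!remap.TryGetValue(src, out dst))
            {
                dst = chunkVerts.Count;
                remap.Add(src, dst);
                chunkVerts.Add(vertices[src]);
                chunkUVs.Add(uvs[src]); // UVs come along for free, no recalculation
            }
            chunkTris.Add(dst);
        }

        var mesh = new Mesh();
        mesh.vertices = chunkVerts.ToArray();
        mesh.uv = chunkUVs.ToArray();
        mesh.triangles = chunkTris.ToArray();
        mesh.RecalculateNormals();

        var go = new GameObject(name);
        go.AddComponent<MeshFilter>().sharedMesh = mesh;
        go.AddComponent<MeshRenderer>().sharedMaterial = material;
        return go;
    }
}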

The problem: after the initial creation I can create the chunks with their respective UVs no problem, and I can even subdivide the mesh on those chunks a couple of times without issues. It's when a chunk gets subdivided more than 4 times that problems show up with the UVs (I'm assuming it's the UVs).

Now, the code I use to get the texture coordinates is:

private Vector2 GetTextureCoord(Vector3 normal)
{
    // Equirectangular mapping: longitude from atan2 drives U, latitude from acos drives V.
    // Assumes "normal" is a unit vector pointing from the sphere's center to the vertex.
    float u = (Mathf.Atan2(normal.x, normal.z) + Mathf.PI) / (Mathf.PI * 2f);
    float v = Mathf.Acos(normal.y) / Mathf.PI; // same value as (Acos(y) + PI) / PI - 1
    return new Vector2(u, v);
}
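
Each vertex just gets its UV from its own normalized position (the sphere is centered on the origin, so position and normal point the same way), roughly:

// vertices is the icosphere's vertex array; positions double as normals
// because the sphere is centered on the origin.
Vector2[] uvs = new Vector2[vertices.Length];
for (int i = 0; i < vertices.Length; i++)
    uvs[i] = GetTextureCoord(vertices[i].normalized);
mesh.uv = uvs;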

But I also had to write a shader, because otherwise there was an extremely ugly seam running from pole to pole:

Shader "Custom/isoshader" {
	Properties {
			decal ("Base (RGB)", 2D) = "black" {}
			decBump ("Bumpmap (RGB)", 2D) = "bump" {}
	    }
		SubShader {
			Pass {
			Fog { Mode Off }
			Tags { "RenderType"="Opaque" }
			LOD 200
			
			CGPROGRAM
			
			#pragma vertex vert
			#pragma fragment frag
			#define PI 3.141592653589793238462643383279
			
			sampler2D decal;
			sampler2D decBump;
			
	 		struct appdata {
			    float4 vertex : POSITION;
			    float4 color : COLOR;
			    float4 texcoord : TEXCOORD0;
			};
	 
			struct v2f {
				float4 pos : SV_POSITION;
			    float4 tex : TEXCOORD0;
			    float4 col : COLOR0;
			    float3 pass_xy_position : TEXCOORD1;
			};
			
			v2f vert(appdata v){
				v2f  o;
				o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    			o.pass_xy_position = v.vertex.xyz;
    			o.tex = v.texcoord;
    			o.col = v.color;
				return o;
			}
	  
			float4 frag(v2f i) : COLOR {
				float3 tc = i.tex;
				tc.x = (PI + atan2(i.pass_xy_position.x, i.pass_xy_position.z)) / (2 * PI);
				float4 color = tex2D(decal, tc);
				return color;
			}
	 
			ENDCG
		}
	}
}

I can put the chunks through 4 subdivisions fine, but once one hits the 5th (I think; regardless of which chunk it is, by the time it reaches a certain level of subdivision) the texture gets what look like mini squished poles on every chunk that has reached that level.

Here are a couple of pictures. The first one is from before it hits the ugly subdivision level:

and this one is from when it gets to that level:

So if someone could either point out flaws in my UV-related code or help me understand why this happens once the mesh is subdivided that far, I'd be grateful.

SIDE NOTE: When I subdivide the vertices, I also subdivide the UV coordinates in the same pass, like so:

// p1 and p2 are the indices of the edge being split; the new vertex's UV
// is just the average of the two endpoint UVs.
Vector2 uvpoint1 = oldUVs[p1];
Vector2 uvpoint2 = oldUVs[p2];
Vector2 uvMiddle = new Vector2((uvpoint1.x + uvpoint2.x) / 2f,
                               (uvpoint1.y + uvpoint2.y) / 2f);

oldUVs.Add(uvMiddle);
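
For context, that averaging lives inside the usual midpoint step of the subdivision. The whole step is roughly this (simplified sketch based on the standard icosphere code, not necessarily exactly what I have; needs System.Collections.Generic and UnityEngine):

// Simplified sketch of splitting one edge during subdivision. p1/p2 index the
// chunk's current vertex/UV lists; the cache keeps an edge shared by two
// triangles from being split twice.
private int GetMiddlePoint(int p1, int p2,
                           List<Vector3> oldVertices, List<Vector2> oldUVs,
                           Dictionary<long, int> midpointCache, float radius)
{
    // One key per undirected edge.
    long key = ((long)Mathf.Min(p1, p2) << 32) + Mathf.Max(p1, p2);
    int cached;
    if (midpointCache.TryGetValue(key, out cached))
        return cached;

    // New vertex: midpoint of the edge, pushed back out onto the sphere.
    Vector3 middle = ((oldVertices[p1] + oldVertices[p2]) * 0.5f).normalized * radius;
    oldVertices.Add(middle);

    // New UV: the average of the endpoint UVs (the snippet above).
    oldUVs.Add((oldUVs[p1] + oldUVs[p2]) * 0.5f);

    int newIndex = oldVertices.Count - 1;
    midpointCache.Add(key, newIndex);
    return newIndex;
}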

[EDIT]
I figure it has something to do with the shader, because periodically, while moving around the scene to look at the objects, random chunks will switch their texture to the pinched one in the pictures above. I'm a complete noob at writing shaders, and I've been trying to figure out how to fix the UV seam on an icosphere through a script instead of the shader, mainly so I could avoid these problems. Any help would be appreciated.
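
The direction I've been considering for a script-only fix is something like this (untested sketch, not code I actually have working): find the triangles whose U values straddle the wrap from ~1 back to 0, duplicate the offending vertices, and push their U past 1 so interpolation never runs backwards across the face (with the texture's wrap mode set to Repeat).

// Untested sketch (needs System.Collections.Generic + UnityEngine): fix the
// wrap seam in the mesh data instead of the shader. Any triangle whose three
// U values span more than half the texture is straddling the atan2 wrap.
static void FixWrappedUVs(List<Vector3> verts, List<Vector2> uvs, int[] tris)
{
    var duplicated = new Dictionary<int, int>(); // original index -> shifted copy

    for (int t = 0; t < tris.Length; t += 3)
    {
        float u0 = uvs[tris[t]].x, u1 = uvs[tris[t + 1]].x, u2 = uvs[tris[t + 2]].x;
        if (Mathf.Max(u0, u1, u2) - Mathf.Min(u0, u1, u2) < 0.5f)
            continue; // this triangle does not cross the seam

        for (int c = 0; c < 3; c++)
        {
            int i = tris[t + c];
            if (uvs[i].x > 0.5f)
                continue; // only the low-U side of the seam needs shifting

            int copy;
            if (!duplicated.TryGetValue(i, out copy))
            {
                copy = verts.Count;
                verts.Add(verts[i]);                           // duplicate the position
                uvs.Add(new Vector2(uvs[i].x + 1f, uvs[i].y)); // push U past 1; Repeat wrap handles it
                duplicated.Add(i, copy);
            }
            tris[t + c] = copy;
        }
    }
}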

[EDIT-2]
It happens mainly when I subdivide 8 times. I create a main object with a mesh, verts and UVs, subdivide that, then chunk each triangle out to its own object. The new object gets its vertices and UVs from the main mesh; I did this so I wouldn't have to recalculate all the UVs from scratch, and it all works fine and dandy. I can subdivide the new object's mesh (the spherical chunk) 4 more times without issues, but once I do it a fifth time it causes problems. It's almost like it takes the UVs for the poles and copies them onto every chunk mesh that's been subdivided that fifth time.

I've already posted the code for the function that creates the initial UV coordinates, what I do to subdivide the initial UVs on a chunk, and the shader. I'm fairly confident it's mostly the shader. There is a specific reason I'm using an icosphere/geosphere/tessellated sphere over a UV sphere; I know both have their own respective issues. Hopefully I've included enough information for someone far more skilled than I am to either point me in the right direction on what I can change, or even share a snippet of code I haven't thought of. Really, anything will help.

Again, thanks for any help.

UV mapping a sphere triangle mesh - MFT Development
Please read this, as I am in the same boat. This is the reason it is happening: some of the normals are flipped, so we have to go through all of the triangles, take the three vertices that make up each face, and cross them to flip the normal…
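
Roughly what I mean, as a sketch (assumes the sphere is centered on the mesh's local origin):

// Sketch: find triangles whose winding faces inward and flip them.
static void FlipInwardTriangles(Vector3[] verts, int[] tris)
{
    for (int t = 0; t < tris.Length; t += 3)
    {
        Vector3 a = verts[tris[t]];
        Vector3 b = verts[tris[t + 1]];
        Vector3 c = verts[tris[t + 2]];

        Vector3 faceNormal = Vector3.Cross(b - a, c - a); // winding-order normal
        Vector3 outward = (a + b + c) / 3f;               // center -> face direction

        if (Vector3.Dot(faceNormal, outward) < 0f)
        {
            // Winding points inward: swap two indices to flip the triangle.
            int tmp = tris[t + 1];
            tris[t + 1] = tris[t + 2];
            tris[t + 2] = tmp;
        }
    }
}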

I know this is a really old post, but here I am years later. I decided the best course of action for my specific project was to lean on the shader. I no longer need to stretch one texture over the whole sphere; instead I went with triplanar texturing, since I'm creating spherical terrain. After quite a bit of trial and error I'm able to apply as many textures as I need for any terrain I'm creating. Any heightmap data I need comes from evaluating the libnoise-utils noise at each vertex and storing that noise value in the UV. This way I bypass creating UVs entirely and can use one or both UV channels to store whatever data the shader needs.
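
Boiled down, the idea is just this (simplified sketch; sampleNoise stands in for however you evaluate your libnoise-utils module):

using System;
using System.Collections.Generic;
using UnityEngine;

public static class NoiseUV
{
    // Simplified sketch: store a per-vertex noise/height value in a UV channel
    // so the triplanar shader can read it, instead of using the UVs for mapping.
    // sampleNoise is a stand-in for however the libnoise-utils module is evaluated.
    public static void StoreNoiseInUV(Mesh mesh, Func<Vector3, float> sampleNoise)
    {
        Vector3[] verts = mesh.vertices;
        var data = new List<Vector2>(verts.Length);

        for (int i = 0; i < verts.Length; i++)
        {
            float h = sampleNoise(verts[i].normalized); // sample on the unit sphere
            data.Add(new Vector2(h, 0f));               // x = height, y left free for other data
        }

        mesh.SetUVs(0, data); // the shader reads this as TEXCOORD0.x
    }
}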

I posted on Unity here at this location about what I'm currently doing.