Is there a way to cast float to int in Unity shaders? I've tried all the usual methods (which compile fine, but don't seem to actually do anything).
Shader error: int or unsigned int type required (on d3d11)
The offending code:
col.r += int(sqrt((abs(i.vertex.y - (_ScreenParams.x/2)) ^ 2) + (abs(i.vertex.x - (_ScreenParams.y/2)) ^ 2)));
None of the functions I'm using seem to require int-only arguments, and I'm casting the final value properly.
Any help would be appreciated.
Answer by Zilppuri
Apr 19 at 09:11 AM
Have you tried wrapping the abs() calls in int()?
col.r += sqrt((int(abs(i.vertex.y - (_ScreenParams.x/2))) ^ 2) + (int(abs(i.vertex.x - (_ScreenParams.y/2))) ^ 2));
Thank you, it seems to have worked. Do you know what the original problem was?
Because i.vertex.y and _ScreenParams.x are float values, the float version of abs() is used, and it returns a float as well. So you can either wrap the whole abs() call in int(), or cast i.vertex.y and _ScreenParams.x themselves (so the int overload of abs() is used instead). With my limited knowledge of shader coding I can't say which is more efficient, although I doubt there is a big performance difference either way.
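For illustration, here is a rough sketch of both options inside a fragment function like the one in the question (the dy/dx/iy/ix names are just for this example). Note that in HLSL the ^ operator is bitwise XOR, which only accepts integer operands; that is what the "int or unsigned int type required" error is complaining about. If actual squaring was intended, x*x or pow(x, 2) would be the usual way to write it, and neither needs an int cast.
// Option 1: cast the result of abs() to int, so ^ (bitwise XOR) gets the integer operands it requires
int dy = int(abs(i.vertex.y - (_ScreenParams.x / 2)));
int dx = int(abs(i.vertex.x - (_ScreenParams.y / 2)));
col.r += sqrt((dy ^ 2) + (dx ^ 2));
// Option 2: cast the inputs first, so the int overload of abs() is picked
int iy = int(i.vertex.y) - int(_ScreenParams.x / 2);
int ix = int(i.vertex.x) - int(_ScreenParams.y / 2);
col.r += sqrt((abs(iy) ^ 2) + (abs(ix) ^ 2));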