So far Amplify has been great, but I ran into an issue the other day. I want to track the player in my water shader to create ripple effects (a top-down orthographic render texture with ripple particles as a mask), but so far no luck. (My reference is Mario Odyssey's water system, which seems to do something similar.)
I tried getting the player's position in a script, storing it in a Vector3, and using it as texture coordinates. Multiplying it by or adding it to the world position isn't working. Same goes for the vertex position/normal.
Is there a way for a texture to follow either the player or the render camera (I can't select a specific camera with the World Space Camera Pos node)? Example: https://youtu.be/aZJQuHZQakQ?t=11m43s (Don't mind the music haha)
No problem at all, that's a great song, and thank you for getting in touch!
It mostly depends on the type of effect you would like to create and the intended approach, as there are a few ways of tackling this. As a starting point, I would kindly suggest that you check out our ForceShield sample, which is located in the AmplifyShaderEditor/Examples/Community/ForceShield folder.
The impact effect node setup and its script will hopefully point you the right way, but don't hesitate to get in touch with us if you have any further questions, we'll be glad to assist you!
Thanks! Maybe this is a better explanation. Here's the material setup for the ripples: I want to use this in a particle effect, render it at the player's position to a render texture, and use it as a mask to drive the vertex Y offset.
In a nutshell: all I need is the output shown in the Uncharted GDC talk (11:54, snow and lava deformation). It uses a particle system, trailing the player, to mark where deformation is happening on the snow.
I tried to work with the ForceShield sample, but I still can't seem to add the ripple effect as a mask on top of the impact position output.
Thanks for pointing me to the ForceShield shader though! Definitely learned from it.
To clarify, you cannot use either a local position or a world position directly as a texture coordinate.
So, in order for them to be transformed into some type of UVs, you need to transform your position into normalized device coordinates relative to the camera rendering the mask. For that, you'll need to pass to your shader the model-view-projection matrix of the camera which is rendering the mask.
Inside the shader, you'll have to multiply the local vertex position by this matrix. This gives you the object's position as seen from that camera as values between -1 and 1, so you will then have to offset and scale those values to be between 0 and 1 (which are the normalized device coordinates you can use as UVs).
This value can then be used to fetch the render texture and apply the mask results. There might be some tweaking you need to do to further match how the render texture itself is being written, but this is the general idea.
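The steps above can be sketched numerically. This is a hypothetical NumPy stand-in for the shader-side math, not Amplify or Unity code: it assumes a simple orthographic top-down mask camera sitting at the origin, and the function and parameter names (`ortho_matrix`, `world_to_mask_uv`, `half_size`) are illustrative. In Unity you would instead pass the mask camera's view-projection matrix to the material from a script and do this per vertex in the shader.

```python
import numpy as np

def ortho_matrix(half_size, near, far):
    """Orthographic projection: maps the camera's box to clip space [-1, 1]."""
    return np.array([
        [1.0 / half_size, 0.0, 0.0, 0.0],
        [0.0, 1.0 / half_size, 0.0, 0.0],
        [0.0, 0.0, -2.0 / (far - near), -(far + near) / (far - near)],
        [0.0, 0.0, 0.0, 1.0],
    ])

def world_to_mask_uv(world_pos, view, proj):
    """Transform a world-space position into 0..1 UVs for the mask texture."""
    clip = proj @ view @ np.append(world_pos, 1.0)  # position * view * projection
    ndc = clip[:3] / clip[3]                        # perspective divide -> -1..1
    return ndc[:2] * 0.5 + 0.5                      # offset and scale -> 0..1

# Identity view = axis-aligned camera at the origin, to keep the sketch simple.
view = np.eye(4)
proj = ortho_matrix(half_size=10.0, near=0.1, far=100.0)

print(world_to_mask_uv(np.array([0.0, 0.0, 0.0]), view, proj))    # center -> [0.5 0.5]
print(world_to_mask_uv(np.array([10.0, 10.0, 0.0]), view, proj))  # corner -> [1. 1.]
```

The resulting 0..1 pair is what you would feed into the render texture sample; the remap in the last line of `world_to_mask_uv` is the offset-and-scale step described above.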
You may also check the following links for some additional information regarding this type of effect:
Thank you for getting back in touch, it's good to know that the information shared was helpful!
Have you had the chance to rate and review Amplify Shader Editor? It would be awesome if you could share your experience with the Unity community, the Unity Asset Store thrives on user interaction and direct feedback.
Every bit helps, your feedback is extremely valuable to us!
Feel free to get back in touch if you have further issues or questions, thanks!