Okay, so I made the Mesh Type of the sprites "Full Rect" and now it's working properly. The problem now is that when I'm adjusting the fade slider back and forth, I noticed one of my CPU cores (and only one) spiking to near 100% usage and it's causing stuttering in the effect. Is it supposed to be like that? Is there a way to optimize it?
Hello, thank you for getting in touch and for your support!
What Unity and ASE versions are you using? Does the CPU spike happen in the editor, in play mode, or in a build? Can you replicate it in a new project with just ASE imported?
Seeing that you might have stumbled upon an issue, could you please send us a sample with that issue present, including shaders and textures, so that we can debug it on our side? If you'd rather share it privately, please use firstname.lastname@example.org, thanks!
I'm on Unity 2017.4.7f1 and Amplify 1.5.6. It happens in editor and play mode. I made a Windows build, and it was completely fine. In play mode, there is a bit of an increase in the core utilization when I'm letting the script change the fade amount value, and a huge increase when I manually adjust the fade amount slider in the inspector.
The project only has ASE, the images, and the shader I made in it.
You can see here that my third core (though it varies) is spiking to 90% when I adjust the slider back and forth:
Hello, we've run a few tests on the sample that you kindly shared, using the same Unity version and Unity's Profiler with V-Sync off, but there seems to be no performance issue when running your project with the script toggled on. Have you tried forcing V-Sync off?
The CPU usage data you've shared is quite odd, but it's likely unrelated to ASE or the shader itself, since shaders run on the GPU; we would assume it's Unity related. Even with our editor canvas open, we didn't notice anything irregular.
Unfortunately, I don't think there's anything we can do on our side regarding this matter.
Thanks, Borba. I did some profiling of my own and found that it was the Unity editor. It's not too bad, so I can live with it, since it's less pronounced in play mode and non-existent in builds.
I've added a dissolve effect now to the shader and I've run into what I believe is a bug.
When the connection from my Append node to the alpha channel of the Lerp is a single line like that, the dissolve works (as you can see in the preview image), but it doesn't actually show up in the game. When I reconnect it and it becomes a triple line like so:
It works completely fine. Do you think it's a bug? It turns back into a single-line connection after I close the shader editor, and the effect stops working if I recompile it with that single link. I'm sending the project over to you via email in case you need it (I'm running Unity 2017.4.12f1 now). In that project, the dissolve slider shouldn't do anything until after you reconnect the alpha link between the Append and the Lerp and save the shader. Once you reopen the editor, the connection should be a single link again, and the effect should stop working after saving it.
No problem! The result you've shared is expected, as the node returns values from 0 to 1 per channel, and since you've connected the full RGBA output to the node's A input, the Step is being applied to each one of those channels, and the preview is showing all of them simultaneously.
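To make the per-channel behavior concrete, here's a small sketch in plain Python (not ASE or shader code; the pixel value and threshold are made up for illustration) showing the difference between applying an HLSL-style Step to a full RGBA value versus a single alpha channel:

```python
def step(edge, x):
    """HLSL-style step: returns 1.0 when x >= edge, else 0.0."""
    return 1.0 if x >= edge else 0.0

rgba = (0.2, 0.6, 0.9, 0.4)  # hypothetical pixel value
threshold = 0.5

# Connecting the full RGBA output to a single input applies the
# operation to every channel independently:
per_channel = tuple(step(threshold, c) for c in rgba)
# -> (0.0, 1.0, 1.0, 0.0)

# Connecting only the alpha channel applies it to that one value:
alpha_only = step(threshold, rgba[3])
# -> 0.0
```

That's why the preview looks different depending on how many components travel along the wire: a single-line connection carries one value, while a triple/quadruple line carries a vector that gets processed component by component.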
Please let me know if you have any further questions, thanks!