Float3 array values are different in the compute shader from what I sent from my C# script in Unity?
So I am using a compute shader in Unity to find groups of vertices that overlap (are the same as) other groups of vertices in a Vector3[].
I have a List<List<int>> called faces. Each inner list is a group of indices that point into the triangles array of a Mesh. This is so I can have multiple triangles that represent an N-sided face.
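To make that structure GPU-friendly I flatten it into a fixed-width, padded int matrix. Here is a simplified sketch of roughly what my ListListIntToComputeBufferMatrix helper does (not the exact implementation; the -1 padding value is just for illustration, though the width/height/computeBuffer fields are the ones used below):

    // Simplified sketch - not the exact helper. Padding with -1 is an assumption.
    struct ComputeBufferMatrix
    {
        public int width;                // longest inner list (padded row length)
        public int height;               // number of faces (rows)
        public ComputeBuffer computeBuffer;
    }

    ComputeBufferMatrix ListListIntToComputeBufferMatrix(List<List<int>> faces)
    {
        ComputeBufferMatrix matrix = new ComputeBufferMatrix();
        matrix.height = faces.Count;
        matrix.width = 0;
        foreach (List<int> face in faces)
            matrix.width = Mathf.Max(matrix.width, face.Count);

        // Flatten the jagged list row by row, padding short rows with -1.
        int[] flat = new int[matrix.width * matrix.height];
        for (int y = 0; y < matrix.height; y++)
            for (int x = 0; x < matrix.width; x++)
                flat[y * matrix.width + x] = x < faces[y].Count ? faces[y][x] : -1;

        matrix.computeBuffer = new ComputeBuffer(matrix.width * matrix.height, sizeof(int));
        matrix.computeBuffer.SetData(flat);
        return matrix;
    }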
After I do some C# preparation to get that faces matrix into something the GPU can understand, I send over all the mesh and faces data to the Compute Shader:
This code is for those who need to see the whole thing. I get more specific in the next section.
public void ClearOverlap()
{
    //faces
    ComputeBufferMatrix matrix = ListListIntToComputeBufferMatrix(faces);
    GlobalProperties.findOverlappingFaces.SetInt("width", matrix.width);
    GlobalProperties.findOverlappingFaces.SetInt("height", matrix.height);
    GlobalProperties.findOverlappingFaces.SetBuffer(0, "faces", matrix.computeBuffer);

    ComputeBuffer facesSharing = new ComputeBuffer(matrix.width * matrix.height, sizeof(int));
    int[] facesSharingArray = new int[matrix.width * matrix.height];
    facesSharing.SetData(facesSharingArray);
    GlobalProperties.findOverlappingFaces.SetBuffer(0, "facesSharing", facesSharing);

    //vertices
    ComputeBuffer vertices = new ComputeBuffer(mesh.vertices.Length, sizeof(float) * 3);
    vertices.SetData(mesh.vertices);
    GlobalProperties.findOverlappingFaces.SetBuffer(0, "vertices", vertices);

    //triangles
    ComputeBuffer triangles = new ComputeBuffer(mesh.triangles.Length, sizeof(int));
    vertices.SetData(mesh.triangles);
    GlobalProperties.findOverlappingFaces.SetBuffer(0, "triangles", triangles);

    //output
    ComputeBuffer output = new ComputeBuffer(matrix.height, sizeof(float) * 3);
    Vector3[] outputInfo = new Vector3[matrix.height];
    output.SetData(outputInfo);
    GlobalProperties.findOverlappingFaces.SetBuffer(0, "returnFaces", output);

    //dispatch
    GlobalProperties.findOverlappingFaces.Dispatch(0, matrix.height, 1, 1);

    //clear out overlapping faces
    output.GetData(outputInfo);
    for (int i = 0; i < outputInfo.Length; i += 1)
    {
        Debug.Log(outputInfo[i]);
    }
    Debug.Log(mesh.vertices.Length);
    Debug.Log(mesh.vertices[1000]);
    Debug.Log(outputInfo[0].x);
    Debug.Log(outputInfo[0].y);
    Debug.Log(outputInfo[1000].z);

    //dispose buffers
    matrix.computeBuffer.Dispose();
    vertices.Dispose();
    triangles.Dispose();
    output.Dispose();
    facesSharing.Dispose();
}
Here is the Next Section
This next piece of code is the part where I send the mesh vertices to the ComputeBuffer, which then backs the RWStructuredBuffer called "vertices" in the compute shader:
//vertices
ComputeBuffer vertices = new ComputeBuffer(mesh.vertices.Length, sizeof(float) * 3);
vertices.SetData(mesh.vertices);
GlobalProperties.findOverlappingFaces.SetBuffer(0, "vertices", vertices);
At no point do I change the vertex values after I set them in the compute buffer. Before this point the values are perfectly fine, and even after the compute shader is dispatched they are still fine. Only inside the compute shader are they bad.
I am expecting values like 3.5, 0.5, and 10.5 (I am working with block points, so they are offset 0.5 from the grid). However, I am getting values like 2.8054E-42, 3.4563E-45, etc.
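One way to confirm what is actually sitting in the buffer on the C# side is to read it straight back with GetData; a quick sanity-check sketch (not part of the shader path):

    // Sketch: read the vertex buffer back on the CPU to see what it really contains.
    Vector3[] check = new Vector3[mesh.vertices.Length];
    vertices.GetData(check);
    Debug.Log(check[0] + " / " + check[check.Length - 1]);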
Important Note
I am not changing the values in the compute shader. I am literally just writing them back out into another RWStructuredBuffer, which is set from a different (but same-sized) Vector3[] than mesh.vertices.
Summary
Vector3[] gets changed while going into the compute shader. Not before and not after.
Can anyone shed some light on this? I have searched so many forums and no one seems to have experienced this issue.
P.S. I even thought that maybe the bits of the floats were being flipped during the dispatch or something, so I went to this website (https://www.h-schmidt.net/FloatConverter/IEEE754.html) to see what the values would be with the bits unflipped, but that's not what's happening.
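The same bit-pattern check can be done in code rather than on the website; a small sketch (BitConverter needs using System):

    // Sketch: dump the raw IEEE-754 bit pattern of one of the bad values
    // so it can be compared against the original data.
    float bad = 2.8054E-42f;
    int bits = BitConverter.ToInt32(BitConverter.GetBytes(bad), 0);
    Debug.Log(bad + " -> bits 0x" + bits.ToString("X8"));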
Here Is the Compute Shader Code
The RWStructuredBuffer "vertices" holds the vertices, and the RWStructuredBuffer "returnFaces" is the other Vector3[] that returns whatever is in the vertices array. It's literally just sending values in and getting the same values out. But they are not the same values, which is the problem!
#pragma kernel CSMain
int width;
int height;
RWStructuredBuffer<int> faces;
RWStructuredBuffer<int> triangles;
RWStructuredBuffer<float3> vertices;
RWStructuredBuffer<int> facesSharing;
RWStructuredBuffer<float3> returnFaces;
[numthreads(1,1,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    returnFaces[id.x] = vertices[id.x];
}
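For completeness, here is a minimal round trip that isolates just this kernel (a sketch, not my production code; "shader" is assumed to be a ComputeShader field referencing the shader above, and only the buffers this kernel actually reads or writes are bound):

    // Sketch: feed known float3 values through the copy kernel and read them back.
    Vector3[] input = new Vector3[8];
    for (int i = 0; i < input.Length; i++)
    {
        input[i] = new Vector3(i + 0.5f, 0.5f, 10.5f);
    }

    ComputeBuffer inBuffer = new ComputeBuffer(input.Length, sizeof(float) * 3);
    ComputeBuffer outBuffer = new ComputeBuffer(input.Length, sizeof(float) * 3);
    inBuffer.SetData(input);

    int kernel = shader.FindKernel("CSMain");
    shader.SetBuffer(kernel, "vertices", inBuffer);
    shader.SetBuffer(kernel, "returnFaces", outBuffer);
    shader.Dispatch(kernel, input.Length, 1, 1); // one thread group per element, since numthreads is (1,1,1)

    Vector3[] result = new Vector3[input.Length];
    outBuffer.GetData(result);
    Debug.Log(result[0] + " ... " + result[result.Length - 1]); // expected to match the input

    inBuffer.Dispose();
    outBuffer.Dispose();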
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow