Reduce artifacts when resampling a brain MRI scan using scipy affine_transform
I have a brain MRI scan. It is grayscale with 20 slices, stored in a NumPy array with shape (20, 256, 256). I rotate and resample the array with scipy.ndimage.affine_transform as below.
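The original call is not shown in the question, but a minimal sketch of rotating and resampling such a volume with affine_transform might look like the following (the rotation angle and axis are assumptions for illustration):

```python
import numpy as np
from scipy.ndimage import affine_transform

vol = np.random.rand(20, 256, 256)  # stand-in for the MRI volume

angle = np.deg2rad(30)  # hypothetical rotation angle
# Rotation in the (slice, row) plane. affine_transform maps OUTPUT
# coordinates to INPUT coordinates via this matrix.
rot = np.array([
    [np.cos(angle), -np.sin(angle), 0.0],
    [np.sin(angle),  np.cos(angle), 0.0],
    [0.0,            0.0,           1.0],
])
# Rotate about the volume centre by compensating with an offset.
centre = (np.array(vol.shape) - 1) / 2
offset = centre - rot @ centre

rotated = affine_transform(vol, rot, offset=offset, order=3)
```

The `order` parameter here is the same spline interpolation order the question experiments with below.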
The dark stripes in the image are the artifact I want to reduce. They are caused by the relatively large gap between slices: in this example the in-plane pixel spacing is 0.85 mm, but the distance between slices is 7 mm.
I tried changing the interpolation order of the affine transform, but even order=5 shows the same artifacts. Below is order=0 (nearest neighbour)...
and you can see how the curvature of the skull compounds the problem. Are there any tricks to fix this? Maybe I should insert dummy data between the slices to equalize the spacing? Maybe I should use polar coordinates to eliminate the curvature? Any other ideas?
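The "dummy data" idea in the question corresponds to a standard preprocessing step: resample the slice axis to (near-)isotropic spacing before rotating, so the interpolator has evenly spaced samples to work with. A minimal sketch using scipy.ndimage.zoom, with the spacings taken from the question:

```python
import numpy as np
from scipy.ndimage import zoom

vol = np.random.rand(20, 256, 256)        # stand-in for the MRI volume
slice_spacing = 7.0                       # mm, from the question
pixel_spacing = 0.85                      # mm, from the question

# Upsample the slice axis so voxels become approximately isotropic.
z_factor = slice_spacing / pixel_spacing  # about 8.24
iso = zoom(vol, (z_factor, 1.0, 1.0), order=1)
```

After this step, any subsequent affine_transform rotation operates on a roughly isotropic grid, which removes the stripe pattern caused by the 7 mm gaps (though it cannot invent anatomy that was never scanned between slices).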
Solution 1:[1]
affine_transform will look terrible at any interpolation order. You need to do feature detection and Delaunay triangulation on both images first, and then use the interpolation variable as a morph parameter to move pixels between the corresponding features in adjacent slices. See https://devendrapratapyadav.github.io/FaceMorphing/
See also this video: https://www.youtube.com/watch?v=5FEr5SiXB1g
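As a hedged sketch of the triangulation step only: the landmark points below are made up (in practice they would come from a feature detector or manual annotation), but scipy.spatial.Delaunay provides the triangle mesh that a per-triangle affine morph between two adjacent slices would be built on:

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical matched landmarks in two adjacent slices, as (row, col).
pts_a = np.array([[10, 10], [10, 240], [240, 10], [240, 240], [128, 128]], float)
pts_b = np.array([[12, 11], [11, 238], [242, 12], [238, 241], [130, 126]], float)

t = 0.5  # morph parameter: 0 -> slice A, 1 -> slice B
pts_mid = (1 - t) * pts_a + t * pts_b

# Triangulate the intermediate point set.
tri = Delaunay(pts_mid)
# For each triangle in tri.simplices, one would compute the affine map
# from its vertices in pts_mid back to the corresponding vertices in
# pts_a and pts_b, warp both slices, and cross-dissolve with weight t.
```

This only sets up the mesh; the per-triangle warping and blending (as shown in the linked FaceMorphing write-up) is where the interpolated slices are actually synthesized.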
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Stack Overflow |


