Scaling pixels between -1 and 1 using cv2.normalize()

import cv2

def preprocess(self):

    # Import image
    pic1 = self.path
    raw_image = cv2.imread(pic1)
    #cv2.imshow('Raw image',raw_image)
    #cv2.waitKey(0)

    # Resize image
    dim = (320,180)
    resized = cv2.resize(raw_image, dim)
    #cv2.imshow('Resized Image',resized)
    #cv2.waitKey(0)

    # Scale pixel values to the range [-1, 1]
    scaled = cv2.normalize(resized, None, alpha=-1, beta=1, norm_type=cv2.NORM_MINMAX, dtype=cv2.CV_32F)
    #cv2.imshow('Scaled Image',scaled)
    #cv2.waitKey(0)

    return scaled

I'm trying to scale the pixel values of "raw_image" to the range -1 to 1 as part of preprocessing for identifying an object using machine learning. Essentially, a camera takes a picture, and the image is resized and scaled to the same size as the images in the dataset used for training and validation. That image is then passed to the model produced by model.fit() to detect what the object in the image actually is.

The question here is: "Is this scaling function correct for putting the pixel values in the range of -1 to 1?" The image appears SUPER dark when I view it with cv2.imshow, and I'm afraid the model isn't recognizing it properly.
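For what it's worth, the math behind cv2.normalize with NORM_MINMAX can be sketched in plain NumPy to check the output range. The sketch below uses a hypothetical helper, minmax_scale, and a tiny fake frame standing in for the resized camera image; it also shows why the displayed image looks dark: cv2.imshow maps float pixels from [0, 1] to [0, 255], so anything below 0 clips to black.

```python
import numpy as np

def minmax_scale(img, alpha=-1.0, beta=1.0):
    """Hypothetical NumPy stand-in for cv2.normalize(..., NORM_MINMAX):
    linearly stretch the array's own min/max onto [alpha, beta]."""
    img = img.astype(np.float32)
    lo, hi = float(img.min()), float(img.max())
    return (img - lo) / (hi - lo) * (beta - alpha) + alpha

# Fake 8-bit frame standing in for the resized camera image.
frame = np.array([[0, 64], [128, 255]], dtype=np.uint8)

scaled = minmax_scale(frame)      # values now span -1.0 .. 1.0

# cv2.imshow renders float images by mapping [0, 1] to [0, 255];
# negative values clip to black, which is why a [-1, 1] image looks
# very dark on screen. Remap to [0, 1] purely for display:
display = (scaled + 1.0) / 2.0    # values now span 0.0 .. 1.0
```

Note this darkness is only a display artifact: the data handed to the model is unchanged, so a dark cv2.imshow window does not by itself mean the scaling is wrong for inference.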



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
