Is it possible to tell filter2D to assume pixels outside of the image have a given value?

I'm using OpenCV's filter2D (in Python), and because my kernel is larger than 1×1, it will sometimes extend "outside" the image. So I assume that, under the hood, OpenCV uses some extrapolation method to assign values to these "virtual" pixels outside the image. The results I'm getting are consistent with it assuming a value of 0. I would like to tell it to assume a value of 255 instead. Can I do this?
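For reference, here's a minimal sketch of what I mean (the image and kernel are placeholders I made up just to show the effect):

```python
import numpy as np
import cv2

# Toy example: a small all-white image and a 3x3 averaging kernel.
img = np.full((5, 5), 255, dtype=np.uint8)
kernel = np.ones((3, 3), np.float32) / 9.0

out = cv2.filter2D(img, -1, kernel, borderType=cv2.BORDER_CONSTANT)

# Interior pixels stay 255, but border pixels come out darker,
# which is consistent with the out-of-image pixels being taken as 0.
print(out)
```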

The docs mention an argument called borderType which sets the "pixel extrapolation method". This sounds like what I need, but the linked documentation isn't enough for me to figure out how to use it to get what I want. The docs for the BorderTypes enum just say that BORDER_CONSTANT pads with a value i "with some specified i", but give no clue as to how to actually specify i.
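The only workaround I can come up with is to do the padding myself with copyMakeBorder (which, unlike filter2D, does take a value argument) and crop afterwards, but I'd prefer a direct option on filter2D if one exists:

```python
import numpy as np
import cv2

img = np.full((5, 5), 255, dtype=np.uint8)   # placeholder image
kernel = np.ones((3, 3), np.float32) / 9.0   # placeholder kernel

# Pad by the kernel radius with the constant I actually want (255),
# filter, then crop the padding back off.
r = kernel.shape[0] // 2  # assumes a square, odd-sized kernel
padded = cv2.copyMakeBorder(img, r, r, r, r,
                            borderType=cv2.BORDER_CONSTANT, value=255)
out = cv2.filter2D(padded, -1, kernel)[r:-r, r:-r]

# Every pixel is 255 now, as if the area outside the image were white.
print(out)
```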



Source: Stack Overflow, licensed under CC BY-SA 3.0.