Using Python and OpenCV to capture images and send them over UDP, getting a "select timeout" error
I am using an Omron Sentech USB3 camera to capture frames in a Python script with OpenCV. The frames are compressed to smaller files and sent over UDP to another Python application, which uses them to display the image. After a while I get a "select timeout" error in the script that captures the frames; that script runs on a Raspberry Pi 4.
The error happens after a seemingly random amount of time, and I am not sure what is causing it. Once the "select timeout" error appears, it prints at a regular interval and no new images arrive in my other Python application.
I have found some similar problems (with solutions), but I don't know how to apply them to my downloaded drivers; for example, this solution from 2013.
The code in my Raspberry Pi application (sends frames over UDP):
    import math
    import struct

    import cv2


    class FrameSegment(object):
        def __init__(self, sock, port, addr="169.254.226.73"):
            self.s = sock
            self.port = port
            self.addr = addr
            self.MAX_DGRAM = 2**16
            # Leave 64 bytes of headroom so the UDP datagram is not overflown
            self.MAX_IMAGE_DGRAM = self.MAX_DGRAM - 64

        def udp_frame(self, img):
            """Compress the image and break it down into data segments."""
            # compress_img = cv2.resize(img, None, fx=0.4, fy=0.4, interpolation=cv2.INTER_AREA)  # ~2 420 000 bytes
            compress_img = cv2.resize(img, None, fx=0.3, fy=0.3, interpolation=cv2.INTER_AREA)  # ~150 802 bytes
            compress_img = cv2.imencode(".jpg", compress_img)[1]
            dat = compress_img.tobytes()  # tostring() is deprecated in NumPy
            size = len(dat)
            num_of_segments = math.ceil(size / self.MAX_IMAGE_DGRAM)
            array_pos_start = 0
            while num_of_segments:
                array_pos_end = min(size, array_pos_start + self.MAX_IMAGE_DGRAM)
                self.s.sendto(
                    struct.pack("B", num_of_segments) +
                    dat[array_pos_start:array_pos_end],
                    (self.addr, self.port)
                )
                array_pos_start = array_pos_end
                num_of_segments -= 1
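The segmentation scheme in udp_frame can be exercised without a camera or a socket. Below is a small sketch of the same split-and-prefix logic as standalone functions (the names segment and reassemble are mine, not part of the code above); each datagram carries a one-byte counter of segments remaining, so the last segment is tagged 1:

```python
import math
import struct

MAX_IMAGE_DGRAM = 2**16 - 64  # same headroom as in FrameSegment


def segment(dat):
    """Split a byte string into datagrams, each prefixed with the
    number of segments remaining (the last one carries 1)."""
    num = math.ceil(len(dat) / MAX_IMAGE_DGRAM)
    out, pos = [], 0
    while num:
        end = min(len(dat), pos + MAX_IMAGE_DGRAM)
        out.append(struct.pack("B", num) + dat[pos:end])
        pos = end
        num -= 1
    return out


def reassemble(datagrams):
    """Concatenate payloads, stripping the one-byte counter prefix."""
    return b"".join(seg[1:] for seg in datagrams)


payload = bytes(range(256)) * 1000  # 256 000 bytes, roughly a JPEG frame
frames = segment(payload)
assert reassemble(frames) == payload
assert struct.unpack("B", frames[-1][0:1])[0] == 1  # last segment tagged 1
```

Note that the one-byte counter caps a frame at 255 segments, and if any datagram is lost the reassembled buffer silently corrupts the JPEG, which is why the receiver's decode can fail.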
The code in my other Python application (receives frames over UDP):
    import socket
    import struct

    import cv2
    import numpy as np

    import config  # shared module holding the latest frame


    def UDPCom():
        MAX_DGRAM = 2**16

        def dump_buffer(s):
            # Empty stale datagrams until the last segment of a frame is seen
            while True:
                seg, addr = s.recvfrom(MAX_DGRAM)
                print(seg[0])
                if struct.unpack('B', seg[0:1])[0] == 1:
                    print("finish emptying buffer")
                    break

        # Set up the UDP socket, then receive segments and concatenate
        # them before decoding and outputting the image
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.bind(('169.254.226.73', 20001))
        dat = b''
        dump_buffer(s)
        while True:
            seg, _ = s.recvfrom(MAX_DGRAM)
            if struct.unpack("B", seg[0:1])[0] > 1:
                # More segments of this frame still to come
                dat += seg[1:]
            else:
                # Last segment: decode the complete frame
                dat += seg[1:]
                img = cv2.imdecode(np.frombuffer(dat, dtype=np.uint8), 1)
                try:
                    config.rovCamera = img
                except Exception:
                    print("error (-215:Assertion failed)")
                if cv2.waitKey(20) & 0xFF == ord('q'):  # changed from waitKey(1)
                    print("Waitkey event entered!")
                    break
                dat = b''
        cv2.destroyAllWindows()
        s.close()
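One way to make the receiver fail loudly instead of blocking forever in recvfrom when frames stop arriving is a socket timeout. This is a sketch, not a fix for the underlying driver error (the 2-second value and the reaction to the timeout are assumptions):

```python
import socket

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.bind(("127.0.0.1", 0))   # any free local port, for demonstration only
s.settimeout(2.0)          # recvfrom() now raises socket.timeout after 2 s

try:
    seg, _ = s.recvfrom(2**16)
except socket.timeout:
    # Nothing arrived in time: log it, discard the partial frame buffer,
    # or signal the operator that the sender has stalled
    print("no datagram within 2 s, resetting frame buffer")
finally:
    s.close()
```

With a timeout in place, the receiver can at least distinguish "sender stalled" (the capture script is stuck in its "select timeout" loop) from "frames arriving but failing to decode".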
I have tested simply running and opening the frames in the Python application directly on the Raspberry Pi, and it does not crash, so my conclusion is that the problem lies in the communication. But as I understand UDP, frames that don't arrive are simply dropped, so the error is confusing.
I have also considered whether my program takes too long per frame, so that the receiving application cannot process the frames quickly enough and some timeout occurs.
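That hypothesis can be checked by timing the receive loop directly. Below is a minimal sketch using time.perf_counter; the helper name timed and the 50 ms budget are my own assumptions (pick a budget matching the sender's actual frame rate):

```python
import time

FRAME_BUDGET = 0.050  # assumed ~50 ms per frame, i.e. ~20 fps


def timed(fn, *args):
    """Run one processing step and report whether it fits the budget."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    if elapsed > FRAME_BUDGET:
        print(f"frame took {elapsed * 1000:.1f} ms, receiver is falling behind")
    return result, elapsed


# Example with a cheap stand-in for the real decode step (cv2.imdecode):
_, elapsed = timed(len, b"\x00" * 1_000_000)
assert elapsed >= 0.0
```

If the measured time per frame consistently exceeds the sender's frame interval, the OS receive buffer fills up and segments are dropped, which would explain corrupted or missing frames on the receiving side (though not the "select timeout" on the capture side).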
I think that if I could adjust the driver for my Omron Sentech camera, as in the previously mentioned example, this problem would disappear. However, I don't know how to find the installed driver, nor how and what to change in that file.
The camera's full name: STC-MCA504USB
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow