pycurl hook function breaks incoming data into pieces
I'm using pycurl to hook incoming data from a LAN-based device to a function.
The hook works in the sense that the function receives the data, but it only gets part of the complete packet on each call.
For instance, if the received data totals 4000 bytes, I get 1500, 1400 and 1100 bytes in three consecutive calls to my hook function.
Is there a setting to adjust the maximum data length, or a receive timeout, that I could be missing?
Here's my pycurl initialization:
CurlObj = pycurl.Curl()
MyDevice.CurlObj = CurlObj
CurlObj.setopt(pycurl.URL, Url)
CurlObj.setopt(pycurl.CONNECTTIMEOUT, 60)
# Keep the connection to the device alive between packets
CurlObj.setopt(pycurl.TCP_KEEPALIVE, 1)
CurlObj.setopt(pycurl.TCP_KEEPIDLE, 30)
CurlObj.setopt(pycurl.TCP_KEEPINTVL, 45)
# Digest authentication against the device
CurlObj.setopt(pycurl.HTTPAUTH, pycurl.HTTPAUTH_DIGEST)
CurlObj.setopt(pycurl.USERPWD, "user:pass")
# OnReceive is the hook that gets the incoming data
CurlObj.setopt(pycurl.WRITEFUNCTION, MyDevice.OnReceive)
self.CurlMultiObj.add_handle(CurlObj)
And this is the hook function:
def OnReceive(self, data):
    Data = data.decode("utf-8", errors="ignore")
    # ...process Data here...
I suppose I could accumulate the received segments into a larger buffer and then process it (there's a data-length indication at the beginning of the packet), but I'd rather receive the whole packet in a single call to OnReceive.
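For reference, this is roughly the accumulation I'd end up writing. It's only a sketch: self.RxBuffer and self.ReadPacketLength are made-up names, and the length parsing is illustrative since the real header format isn't shown here.

def OnReceive(self, data):
    # libcurl may call this several times per response, so buffer the chunks
    # until a complete packet (per the length field) has arrived.
    self.RxBuffer += data  # self.RxBuffer starts out as b""
    while True:
        # Stand-in for parsing the length field at the start of the packet;
        # returns None if the header isn't complete yet.
        PacketLen = self.ReadPacketLength(self.RxBuffer)
        if PacketLen is None or len(self.RxBuffer) < PacketLen:
            break  # incomplete packet, wait for the next chunk
        Packet = self.RxBuffer[:PacketLen]
        self.RxBuffer = self.RxBuffer[PacketLen:]
        Data = Packet.decode("utf-8", errors="ignore")
        # ...process Data here...

That would work, but it adds buffering state to the device object, which is why I'd prefer a single-call delivery if such an option exists.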