"Socket is closed" error in Python script (paramiko)

I am new to Python and wrote this script from references found online. It connects to an SFTP server outside our network, lists a directory on that server to collect .xml files (oldest to newest), and transfers the files one by one in a loop.

After a few minutes, the script ends the session when it hits the connection timeout or runs out of files to transfer, sleeps for 5 seconds, and connects again to get the next set of files, so it runs in near real time. The script worked fine for several months before we started getting a "Socket is closed" error roughly every 10-15 minutes. The script runs normally as expected and starts transferring files, then all of a sudden it hangs for 2-3 minutes and eventually throws the error below.

Sometimes, the moment the script connects to the SFTP server and starts transferring files, it ends up with the same error again after only a few files:

Error:

    return self._send(s, m)
  File "C:\ProgramData\Anaconda3\lib\site-packages\paramiko\channel.py", line 1198, in _send
    raise socket.error("Socket is closed")
OSError: Socket is closed

import os
import shutil
import paramiko
from time import sleep
from datetime import datetime 
import fnmatch
from lxml import etree

localpath=r"D:/Imported_Files/"
logpath=r"D:/Imported_Files/log/"
temppath=r"D:/Imported_Files/file_rename_temp/"

while True:
######### 
    try:                            #### providing server credentials and connection to the sftp server with username and private key
        host = "Hostname"
        port = 22
        transport = paramiko.Transport((host, port))
        username = "username"
        mykey = paramiko.RSAKey.from_private_key_file("C:/<PATH>",password='#########')
        transport.connect(username = username, pkey = mykey)
        sftp = paramiko.SFTPClient.from_transport(transport)
    except Exception as e:
        print(str(e))
        sleep(30)
        continue
    try:
        sftp.chdir("outbox")
        file_list=[x.filename for x in sorted(sftp.listdir_attr(), key=lambda f: f.st_mtime)]  ## list the directory, oldest files first, to process in order
    except Exception as e:  #### continue if there is an exception
        print(str(e))
        sleep(30)
        continue
    dttxt=str(datetime.now().strftime('%Y%m%d'))
    for file in file_list:                             #### getting only files with .xml extension
        if fnmatch.fnmatch(file,"*.xml"):
            tempfilepath=temppath+file
            localfilepath=localpath+file
            file_info=sftp.stat(file)
            file_mod_time=datetime.fromtimestamp(file_info.st_mtime)    ### assigning modified timestamp of file to variable
            file_acc_time=datetime.fromtimestamp(file_info.st_atime)    ### assigning last-accessed timestamp of file to variable
                                        
            try:
                sftp.get(file,tempfilepath)             ### performing sftp of the selected file from the list 
            except Exception:
                file_error_log = open(logpath+"files_not_processed"+dttxt+".log", "a")   #### writing info to log 
                file_error_log.write("Failed:"+file+"\n")
                file_error_log.close()
                print("getting file "+file+" failed!")
                continue
            
            try:
                sftp.remove(file)                   #### removing the file from sftp server after successful transfer
            except Exception:
                print("deleting file "+file+" failed")
                os.remove(tempfilepath)
                print("exception, moving on to next file")
                file_error_ftp_remove = open(logpath+"files_not_deleted_ftp"+dttxt+".log", "a")
                file_error_ftp_remove.write("Failed:"+file+"\n")
                file_error_ftp_remove.close()
                continue
            
            try:
                root = etree.parse(tempfilepath)                       #### parsing the file to extract a tag from the .xml
                system_load_id=root.find('system_load_id')
                if system_load_id.text is None:
                    system_load_id_text=""
                else:
                    system_load_id_text=system_load_id.text
                new_filename=localpath+os.path.splitext(os.path.basename(tempfilepath))[0]+"-"+system_load_id_text+os.path.splitext(os.path.basename(localfilepath))[1]
                sleep(0.3)
                os.rename(tempfilepath, new_filename)
            except Exception:
                sleep(0.3)
                os.rename(tempfilepath, localpath+file)
                print("Can't parse XML, hence moving the file as it is")
            ########### file moved to final location after parsing .xml. writing to log and processing next file in the list    
            file_processed_log = open(logpath+"files_processed"+ str(datetime.now().strftime('%Y%m%d'))+".log", "a")
            file_processed_log.write(str(datetime.now().strftime('%Y%m%d %H:%M:%S'))+" : "+file+","+str(file_mod_time)+","+str(file_acc_time)+"\n")
            file_processed_log.close()
    print(datetime.now())
    sleep(5)    ######## 1 session complete , sleeping and connecting to server again to get the next set of files

The issue is not consistent: some days we get this error about 5 times, and other days 100+ times.

I have researched online and am not sure where the issue is, since the script ran fine for several months and processed thousands of files per day in near real time without any issues.
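
One thing I am wondering about, though I have not actually changed the script yet, is whether I should be closing the SFTP session and the transport explicitly at the end of each cycle, and/or turning on SSH keepalives. A rough sketch of what I mean is below (host, port, username and mykey are the same variables as in the script above; the 30-second keepalive interval is just a guess on my part). Would something like this help with the "Socket is closed" error, or is the problem likely somewhere else?

import paramiko

transport = paramiko.Transport((host, port))
transport.connect(username=username, pkey=mykey)
transport.set_keepalive(30)      # send a keepalive packet every 30 seconds (interval is a guess)
sftp = paramiko.SFTPClient.from_transport(transport)

# ... transfer the files for this session ...

sftp.close()         # close the SFTP session at the end of the cycle
transport.close()    # close the underlying transport before sleeping and reconnecting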


