ONNX Runtime can't create session in subprocess on Jetson Nano

Hey guys, I have a problem with ONNX Runtime. I have Python code that works properly on its own, but when I run it via subprocess.Popen, the ONNX session can't find the GPU.

Here is the error:

Traceback (most recent call last):
  File "/home/datis/dl/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 206, in __init__
    self._create_inference_session(providers, provider_options)
  File "/home/datis/dl/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 231, in _create_inference_session
    sess.initialize_session(providers or [], provider_options or [])
RuntimeError: /home/nvidia/onnxruntime/onnxruntime/core/providers/cuda/cuda_call.cc:123 bool onnxruntime::CudaCall(ERRTYPE, const char*, const char*, ERRTYPE, const char*) [with ERRTYPE = cudaError; bool THRW = true] /home/nvidia/onnxruntime/onnxruntime/core/providers/cuda/cuda_call.cc:117 bool onnxruntime::CudaCall(ERRTYPE, const char*, const char*, ERRTYPE, const char*) [with ERRTYPE = cudaError; bool THRW = true] CUDA failure 100: no CUDA-capable device is detected ; GPU=127 ; hostname=datis-desktop ; expr=cudaSetDevice(device_id_); 



During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/home/datis/react_version/detector/auth_frame_multi.py", line 139, in recognize
    ext_model = FaceAnalysis(name='models', root="faceee/models")
  File "/home/datis/react_version/faceee/app/face_analysis.py", line 37, in __init__
    model = model_ex.get_model(onnx_file)
  File "/home/datis/react_version/faceee/model_ex/model_ex.py", line 63, in get_model
    model = router.get_model()
  File "/home/datis/react_version/faceee/model_ex/model_ex.py", line 21, in get_model
    session = onnxruntime.InferenceSession(self.onnx_file, None,providers=['CUDAExecutionProvider',])
  File "/home/datis/dl/lib/python3.6/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 209, in __init__
    print("EP Error using {}".format(self._providers))
AttributeError: 'InferenceSession' object has no attribute '_providers'
[[<detector.camera.Camera object at 0x7f61b1d748>, 1], [<detector.camera.Camera object at 0x7f61b1d6d8>, 1]]

The second worker (NoDaemonProcess-2) then fails with exactly the same traceback.

The device is an NVIDIA Jetson Nano 4GB.

I have tried exporting every environment variable the parent process uses into the subprocess's environment, but it didn't solve the problem.
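For reference, a minimal sketch of the pattern that fails (simplified: recognize, the worker body, and the model path are placeholders for my real code, which builds the FaceAnalysis models inside the worker):

```python
import multiprocessing as mp

def recognize(model_path):
    # In the real code this is where FaceAnalysis builds the ONNX
    # InferenceSession with CUDAExecutionProvider; that is where the
    # "CUDA failure 100: no CUDA-capable device is detected" is raised.
    # onnxruntime.InferenceSession(model_path, providers=["CUDAExecutionProvider"])
    return model_path

if __name__ == "__main__":
    # The default start method on Linux is "fork": the child inherits a
    # copy of the parent's memory, including any CUDA state that was
    # already initialized in the parent before the fork.
    print(mp.get_start_method())
    worker = mp.Process(target=recognize, args=("faceee/models/model.onnx",))
    worker.start()
    worker.join()
```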

Tell me if you need any other information. Can anyone tell me what I missed?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
