canaglia

Everything posted by canaglia

  1. canaglia

    Multi Instance problem

    I run this script:

        import cpuinfo  # pip install py-cpuinfo
        print(cpuinfo.get_cpu_info())

    What happens is that a new instance of the program is launched, and a JSON error appears when that new instance is closed:

        Traceback (most recent call last):
          File "<string>", line 3, in <module>
          File "..............\Python311\Lib\site-packages\cpuinfo\cpuinfo.py", line 2762, in get_cpu_info
            output = json.loads(output, object_hook = _utf_to_str)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "..............\Python311\Lib\json\__init__.py", line 359, in loads
            return cls(**kw).decode(s)
                   ^^^^^^^^^^^^^^^^^^^
          File "..............\Python311\Lib\json\decoder.py", line 337, in decode
            obj, end = self.raw_decode(s, idx=_w(s, 0).end())
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
          File "..............\Python311\Lib\json\decoder.py", line 355, in raw_decode
            raise JSONDecodeError("Expecting value", s, err.value) from None
        json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

    This does not happen when the script is run directly from Python. The problem is this line of cpuinfo.py, which creates the new instance:

        p1 = Popen(command, stdout=PIPE, stderr=PIPE, stdin=PIPE)

    I don't need the cpuinfo library myself, but it is used by the ultralytics library to determine whether a GPU is present. You can reproduce the same effect with this simple script:

        from ultralytics import YOLO

        model = YOLO("yolov8n.pt")
        results = model.predict(source="\\temp", show=True)
        print(results)

    For now I have patched the get_cpu_info() function in torch_utils.py of the ultralytics library, but that is not a good solution 😞 (a less invasive idea is sketched below). Does anyone have any ideas on how to avoid this problem?
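    One idea that might be cleaner than patching ultralytics, if I am reading my copy of cpuinfo.py correctly: the helper process seems to be spawned only when getattr(sys, "frozen", False) is false, and the Popen command is built from sys.executable, which is why the whole host program gets relaunched. Forcing that flag before anything imports cpuinfo should keep the call in-process. This is only a sketch based on how my installed py-cpuinfo behaves, not a documented API, and setting sys.frozen could confuse other libraries that check it:

        import sys

        # Assumption: cpuinfo takes its in-process code path (the one meant
        # for PyInstaller builds) when getattr(sys, "frozen", False) is
        # truthy, instead of relaunching sys.executable via Popen.
        # Forcing the flag here is a hack, not a documented py-cpuinfo API.
        sys.frozen = True

        import cpuinfo  # pip install py-cpuinfo
        print(cpuinfo.get_cpu_info())

    In theory, putting the same two lines before the ultralytics import should also stop the YOLO example above from relaunching the program, but I have not verified that there are no side effects.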