InvalidArgumentError: scores has incompatible shape [Op:CombinedNonMaxSuppression] #468

Open
@laumecha

Description

I am trying to run inference on a single image with a quantized model using the "detect.py" script. I quantized the original YOLOv4 model with QKeras and then converted it to SavedModel format. However, I am now getting the following error:


Traceback (most recent call last):
  File "/mnt/beegfs/gap/laumecha/conda-qkeras/tensorflow-yolov4-tflite/detect.py", line 134, in <module>
    app.run(main)
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/absl/app.py", line 312, in run
    _run_main(main, args)
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/absl/app.py", line 258, in _run_main
    sys.exit(main(argv))
  File "/mnt/beegfs/gap/laumecha/conda-qkeras/tensorflow-yolov4-tflite/detect.py", line 113, in main
    boxes, scores, classes, valid_detections = tf.image.combined_non_max_suppression(
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/tensorflow/python/util/dispatch.py", line 206, in wrapper
    return target(*args, **kwargs)
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/tensorflow/python/ops/image_ops_impl.py", line 5101, in combined_non_max_suppression
    return gen_image_ops.combined_non_max_suppression(
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/tensorflow/python/ops/gen_image_ops.py", line 358, in combined_non_max_suppression
    _ops.raise_from_not_ok_status(e, name)
  File "/mnt/beegfs/gap/laumecha/miniconda3/envs/qkeras_env/lib/python3.9/site-packages/tensorflow/python/framework/ops.py", line 6897, in raise_from_not_ok_status
    six.raise_from(core._status_to_exception(e.code, message), None)
  File "<string>", line 3, in raise_from
tensorflow.python.framework.errors_impl.InvalidArgumentError: scores has incompatible shape [Op:CombinedNonMaxSuppression]

Before converting to SavedModel format, I checked that the non-quantized and quantized models have the same layer formats and output shapes, so I assume the outputs should match, but I do not have much experience with SavedModel models.
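For reference, `tf.image.combined_non_max_suppression` expects `boxes` of shape `[batch, num_boxes, q, 4]` (with `q` equal to 1 or to the number of classes) and `scores` of shape `[batch, num_boxes, num_classes]`; the "scores has incompatible shape" error is typically raised when the `batch` or `num_boxes` dimensions of the two tensors disagree. A minimal sketch of that constraint, in plain Python so it can be run without TensorFlow, is below. The concrete shape values are hypothetical examples, not taken from the model in this issue:

```python
def check_cnms_shapes(boxes_shape, scores_shape):
    """Mirror the shape constraints of tf.image.combined_non_max_suppression.

    boxes:  [batch, num_boxes, q, 4], where q is 1 or num_classes
    scores: [batch, num_boxes, num_classes]
    Returns (ok, message).
    """
    if len(boxes_shape) != 4 or boxes_shape[3] != 4:
        return False, "boxes must be [batch, num_boxes, q, 4]"
    if len(scores_shape) != 3:
        return False, "scores must be [batch, num_boxes, num_classes]"
    batch_b, num_boxes_b, q, _ = boxes_shape
    batch_s, num_boxes_s, num_classes = scores_shape
    if batch_b != batch_s:
        return False, "batch dimensions differ"
    if num_boxes_b != num_boxes_s:
        return False, f"num_boxes differ: boxes={num_boxes_b}, scores={num_boxes_s}"
    if q not in (1, num_classes):
        return False, "q must be 1 or num_classes"
    return True, "ok"

# Hypothetical shapes: the second call shows the kind of mismatch
# that produces "scores has incompatible shape".
print(check_cnms_shapes((1, 2535, 1, 4), (1, 2535, 80)))  # compatible
print(check_cnms_shapes((1, 2535, 1, 4), (1, 80, 2535)))  # incompatible
```

Printing the static shapes of the two tensors passed to the op in detect.py (e.g. `print(boxes.shape, pred_conf.shape)` just before the call) should show which dimension the quantized SavedModel changed.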
