---
license: bsd-3-clause
language:
- en
base_model:
- deeplabv3plus_mobilenet
pipeline_tag: image-segmentation
tags:
- deeplabv3plus
---

# DeepLabv3Plus

This version of deeplabv3plus_mobilenet has been converted to run on the Axera NPU using **w8a16** quantization.

Compatible with Pulsar2 version: 5.0-patch1

## Conversion tool links

If you are interested in model conversion, you can export the axmodel through:

- [The original model repository](https://github.com/VainF/DeepLabV3Plus-Pytorch.git)
- [Pulsar2 documentation: how to convert ONNX to axmodel](https://pulsar2-docs.readthedocs.io/en/latest/pulsar2/introduction.html)
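The Pulsar2 `build` step takes a JSON config describing quantization. The fragment below is an illustrative sketch only: the field names follow the patterns in the Pulsar2 documentation, and the calibration dataset path, tensor name, and mean/std values are hypothetical placeholders — consult the Pulsar2 docs linked above for the exact schema of your toolchain version.

```json
{
  "model_type": "ONNX",
  "npu_mode": "NPU1",
  "quant": {
    "input_configs": [
      {
        "tensor_name": "DEFAULT",
        "calibration_dataset": "calib_images.tar",
        "calibration_size": 32,
        "calibration_mean": [123.675, 116.28, 103.53],
        "calibration_std": [58.395, 57.12, 57.375]
      }
    ],
    "calibration_method": "MinMax"
  }
}
```

A build invocation then looks roughly like `pulsar2 build --input deeplabv3plus_mobilenet.onnx --config config.json --output_dir output`; check `pulsar2 build --help` in your release for the exact flags.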

## Supported platforms

- AX650
  - [M4N-Dock(爱芯派Pro)](https://wiki.sipeed.com/hardware/zh/maixIV/m4ndock/m4ndock.html)
  - [M.2 Accelerator card](https://axcl-docs.readthedocs.io/zh-cn/latest/doc_guide_hardware.html)
- AX637

|Chips|Model|Inference time|
|--|--|--|
|AX650|deeplabv3plus_mobilenet_u16|13.4 ms|
|AX637|deeplabv3plus_mobilenet_u16|39.4 ms|

## How to use

Download all files from this repository to the device.

### Python environment requirements

#### pyaxengine

https://github.com/AXERA-TECH/pyaxengine

```
wget https://github.com/AXERA-TECH/pyaxengine/releases/download/0.1.3.rc2/axengine-0.1.3-py3-none-any.whl
pip install axengine-0.1.3-py3-none-any.whl
```

#### Others

Probably none beyond the wheel above.
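For reference, here is a minimal sketch of what an inference script built on pyaxengine might look like. The preprocessing (ImageNet mean/std normalization, NCHW layout) mirrors the original DeepLabV3Plus-Pytorch repo, but whether the axmodel expects normalized float input or raw uint8, and the exact `axengine` class and method names, are assumptions — treat `infer.py` in this repository as the authoritative version.

```python
import numpy as np

# Normalization constants from the original DeepLabV3Plus-Pytorch training code
# (ImageNet mean/std); whether the axmodel expects this or raw uint8 input is
# an assumption -- check infer.py for the authoritative preprocessing.
_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(img_rgb):
    """HxWx3 uint8 RGB image -> 1x3xHxW float32 tensor."""
    x = img_rgb.astype(np.float32) / 255.0
    x = (x - _MEAN) / _STD
    return np.transpose(x, (2, 0, 1))[None, ...]

def segment(model_path, img_rgb):
    # Hypothetical pyaxengine usage: the onnxruntime-like InferenceSession API
    # is assumed here and has not been verified against axengine 0.1.3.
    from axengine import InferenceSession
    sess = InferenceSession(model_path)
    inp = sess.get_inputs()[0]
    (logits,) = sess.run(None, {inp.name: preprocess(img_rgb)})
    return logits.argmax(axis=1)[0]  # HxW map of class ids
```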

#### Inference on an AX650 host, such as M4N-Dock(爱芯派Pro)

Input image:



Run:

```
python3 infer.py --img samples/1_image.png --model models-ax637/deeplabv3plus_mobilenet_u16.axmodel
```

Output image:


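The colorized mask in the output comes from mapping each predicted class id to a color. Below is a sketch of that postprocessing step, assuming the 21-class Pascal VOC palette used by the original DeepLabV3Plus-Pytorch repo (the palette in `infer.py` may differ):

```python
import numpy as np

def voc_palette(num_classes=21):
    """Generate the standard Pascal VOC color palette (bit-shuffle scheme)."""
    palette = np.zeros((num_classes, 3), dtype=np.uint8)
    for label in range(num_classes):
        c, shift = label, 7
        while c:
            palette[label, 0] |= ((c >> 0) & 1) << shift  # R
            palette[label, 1] |= ((c >> 1) & 1) << shift  # G
            palette[label, 2] |= ((c >> 2) & 1) << shift  # B
            c >>= 3
            shift -= 1
    return palette

def colorize(class_map):
    """HxW array of class ids -> HxWx3 RGB image."""
    return voc_palette()[class_map]
```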