Create README.md

README.md CHANGED
@@ -1,310 +1,12 @@
## Web Demo

- Integrated into [Huggingface Spaces 🤗](https://huggingface.co/spaces/akhaliq/yolov7) using Gradio. Try out the [Web Demo](https://huggingface.co/spaces/akhaliq/yolov7).

## Performance

MS COCO

| Model | Test Size | AP<sup>test</sup> | AP<sub>50</sub><sup>test</sup> | AP<sub>75</sub><sup>test</sup> | batch 1 fps | batch 32 average time |
| :-- | :-: | :-: | :-: | :-: | :-: | :-: |
| [**YOLOv7**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7.pt) | 640 | **51.4%** | **69.7%** | **55.9%** | 161 *fps* | 2.8 *ms* |
| [**YOLOv7-X**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x.pt) | 640 | **53.1%** | **71.2%** | **57.8%** | 114 *fps* | 4.3 *ms* |
|  |  |  |  |  |  |  |
| [**YOLOv7-W6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6.pt) | 1280 | **54.9%** | **72.6%** | **60.1%** | 84 *fps* | 7.6 *ms* |
| [**YOLOv7-E6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6.pt) | 1280 | **56.0%** | **73.5%** | **61.2%** | 56 *fps* | 12.3 *ms* |
| [**YOLOv7-D6**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6.pt) | 1280 | **56.6%** | **74.0%** | **61.8%** | 44 *fps* | 15.0 *ms* |
| [**YOLOv7-E6E**](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e.pt) | 1280 | **56.8%** | **74.4%** | **62.1%** | 36 *fps* | 18.7 *ms* |

## Installation

Docker environment (recommended)
<details><summary> <b>Expand</b> </summary>

``` shell
# create the docker container; adjust --shm-size if you have more shared memory available
nvidia-docker run --name yolov7 -it -v your_coco_path/:/coco/ -v your_code_path/:/yolov7 --shm-size=64g nvcr.io/nvidia/pytorch:21.08-py3

# apt install required packages
apt update
apt install -y zip htop screen libgl1-mesa-glx

# pip install required packages
pip install seaborn thop

# go to the code folder
cd /yolov7
```

</details>
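Before testing or training inside the container, it is worth confirming that the GPU is actually visible; a quick sanity check:

``` shell
nvidia-smi                                                    # the driver and GPU should be listed
python -c "import torch; print(torch.cuda.is_available())"    # should print True
```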

## Testing

[`yolov7.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7.pt) [`yolov7x.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x.pt) [`yolov7-w6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6.pt) [`yolov7-e6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6.pt) [`yolov7-d6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6.pt) [`yolov7-e6e.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e.pt)
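For example, to fetch the base checkpoint used in the command below (the same release URL linked above):

``` shell
wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7.pt
```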

``` shell
python test.py --data data/coco.yaml --img 640 --batch 32 --conf 0.001 --iou 0.65 --device 0 --weights yolov7.pt --name yolov7_640_val
```

You will get the results (the low `--conf 0.001` threshold is intentional: mAP evaluation needs near-exhaustive detections):

```
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.51206
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.69730
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.55521
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.35247
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.55937
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.66693
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.38453
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.63765
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.68772
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = 0.53766
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = 0.73549
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.83868
```

To measure accuracy, download the [COCO annotations for Pycocotools](http://images.cocodataset.org/annotations/annotations_trainval2017.zip) and place `instances_val2017.json` at `./coco/annotations/instances_val2017.json`.
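A minimal sketch of fetching and extracting just that file, assuming you run it from the repository root:

``` shell
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
mkdir -p ./coco/annotations
unzip -j annotations_trainval2017.zip annotations/instances_val2017.json -d ./coco/annotations/
```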

## Training

Data preparation

``` shell
bash scripts/get_coco.sh
```

* Download MS COCO dataset images ([train](http://images.cocodataset.org/zips/train2017.zip), [val](http://images.cocodataset.org/zips/val2017.zip), [test](http://images.cocodataset.org/zips/test2017.zip)) and [labels](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/coco2017labels-segments.zip). If you have previously used a different version of YOLO, we strongly recommend deleting the `train2017.cache` and `val2017.cache` files (see the sketch below) and re-downloading the [labels](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/coco2017labels-segments.zip).
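One way to clear the stale caches regardless of where they ended up (a hedged sketch; the exact cache location depends on your dataset layout):

``` shell
find ./coco -name "*.cache" -delete
```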

Single GPU training

``` shell
# train p5 models
python train.py --workers 8 --device 0 --batch-size 32 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml

# train p6 models
python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml
```

Multiple GPU training

``` shell
# train p5 models
python -m torch.distributed.launch --nproc_per_node 4 --master_port 9527 train.py --workers 8 --device 0,1,2,3 --sync-bn --batch-size 128 --data data/coco.yaml --img 640 640 --cfg cfg/training/yolov7.yaml --weights '' --name yolov7 --hyp data/hyp.scratch.p5.yaml

# train p6 models
python -m torch.distributed.launch --nproc_per_node 8 --master_port 9527 train_aux.py --workers 8 --device 0,1,2,3,4,5,6,7 --sync-bn --batch-size 128 --data data/coco.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6.yaml --weights '' --name yolov7-w6 --hyp data/hyp.scratch.p6.yaml
```

## Transfer learning

[`yolov7_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7_training.pt) [`yolov7x_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7x_training.pt) [`yolov7-w6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6_training.pt) [`yolov7-e6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6_training.pt) [`yolov7-d6_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-d6_training.pt) [`yolov7-e6e_training.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-e6e_training.pt)

Single GPU finetuning for custom dataset

``` shell
# finetune p5 models
python train.py --workers 8 --device 0 --batch-size 32 --data data/custom.yaml --img 640 640 --cfg cfg/training/yolov7-custom.yaml --weights 'yolov7_training.pt' --name yolov7-custom --hyp data/hyp.scratch.custom.yaml

# finetune p6 models
python train_aux.py --workers 8 --device 0 --batch-size 16 --data data/custom.yaml --img 1280 1280 --cfg cfg/training/yolov7-w6-custom.yaml --weights 'yolov7-w6_training.pt' --name yolov7-w6-custom --hyp data/hyp.scratch.custom.yaml
```
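The `data/custom.yaml` referenced above has to describe your own dataset. A minimal sketch written via a heredoc (the field names follow the `data/coco.yaml` convention; the paths, class count, and class names are placeholders):

``` shell
cat > data/custom.yaml <<'EOF'
train: ./custom/images/train   # directory (or list file) of training images
val: ./custom/images/val       # directory (or list file) of validation images
nc: 2                          # number of classes
names: ['class0', 'class1']    # one name per class
EOF
```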

## Re-parameterization

See [reparameterization.ipynb](tools/reparameterization.ipynb) for converting a trained checkpoint into its deployment (re-parameterized) form.
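To run the notebook locally, one option (assuming Jupyter is installed in your environment):

``` shell
jupyter notebook tools/reparameterization.ipynb
```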

## Inference

On video:
``` shell
python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source yourvideo.mp4
```

On image:
``` shell
python detect.py --weights yolov7.pt --conf 0.25 --img-size 640 --source inference/images/horses.jpg
```

<div align="center">
    <a href="./">
        <img src="./figure/horses_prediction.jpg" width="59%"/>
    </a>
</div>

## Export

**Pytorch to CoreML (and inference on MacOS/iOS)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7CoreML.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>

**Pytorch to ONNX with NMS (and inference)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7onnx.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>
```shell
python export.py --weights yolov7-tiny.pt --grid --end2end --simplify \
        --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640 --max-wh 640
```
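To sanity-check the exported graph before deploying it, one option (assuming the `onnx` Python package is installed; the output name `yolov7-tiny.onnx` follows from the weights name above):

``` shell
pip install onnx
python -c "import onnx; onnx.checker.check_model(onnx.load('yolov7-tiny.onnx'))"
```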

**Pytorch to TensorRT with NMS (and inference)** <a href="https://colab.research.google.com/github/WongKinYiu/yolov7/blob/main/tools/YOLOv7trt.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a>

```shell
wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt
python export.py --weights ./yolov7-tiny.pt --grid --end2end --simplify --topk-all 100 --iou-thres 0.65 --conf-thres 0.35 --img-size 640 640
git clone https://github.com/Linaom1214/tensorrt-python.git
python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16
```

**Pytorch to TensorRT another way** <a href="https://colab.research.google.com/gist/AlexeyAB/fcb47ae544cf284eb24d8ad8e880d45c/yolov7trtlinaom.ipynb"><img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"></a> <details><summary> <b>Expand</b> </summary>

```shell
wget https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-tiny.pt
python export.py --weights yolov7-tiny.pt --grid --include-nms
git clone https://github.com/Linaom1214/tensorrt-python.git
python ./tensorrt-python/export.py -o yolov7-tiny.onnx -e yolov7-tiny-nms.trt -p fp16

# Or use trtexec to convert ONNX to TensorRT engine
/usr/src/tensorrt/bin/trtexec --onnx=yolov7-tiny.onnx --saveEngine=yolov7-tiny-nms.trt --fp16
```

</details>

Tested with: Python 3.7.13, Pytorch 1.12.0+cu113

## Pose estimation

[`code`](https://github.com/WongKinYiu/yolov7/tree/pose) [`yolov7-w6-pose.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-w6-pose.pt)

See [keypoint.ipynb](https://github.com/WongKinYiu/yolov7/blob/main/tools/keypoint.ipynb).

<div align="center">
    <a href="./">
        <img src="./figure/pose.png" width="39%"/>
    </a>
</div>

## Instance segmentation (with NTU)

[`code`](https://github.com/WongKinYiu/yolov7/tree/mask) [`yolov7-mask.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-mask.pt)

See [instance.ipynb](https://github.com/WongKinYiu/yolov7/blob/main/tools/instance.ipynb).

<div align="center">
    <a href="./">
        <img src="./figure/mask.png" width="59%"/>
    </a>
</div>

## Instance segmentation

[`code`](https://github.com/WongKinYiu/yolov7/tree/u7/seg) [`yolov7-seg.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-seg.pt)

YOLOv7 for instance segmentation (YOLOR + YOLOv5 + YOLACT)

| Model | Test Size | AP<sup>box</sup> | AP<sub>50</sub><sup>box</sup> | AP<sub>75</sub><sup>box</sup> | AP<sup>mask</sup> | AP<sub>50</sub><sup>mask</sup> | AP<sub>75</sub><sup>mask</sup> |
| :-- | :-: | :-: | :-: | :-: | :-: | :-: | :-: |
| **YOLOv7-seg** | 640 | **51.4%** | **69.4%** | **55.8%** | **41.5%** | **65.5%** | **43.7%** |

## Anchor free detection head

[`code`](https://github.com/WongKinYiu/yolov7/tree/u6) [`yolov7-u6.pt`](https://github.com/WongKinYiu/yolov7/releases/download/v0.1/yolov7-u6.pt)

YOLOv7 with decoupled TAL head (YOLOR + YOLOv5 + YOLOv6)

| Model | Test Size | AP<sup>val</sup> | AP<sub>50</sub><sup>val</sup> | AP<sub>75</sub><sup>val</sup> |
| :-- | :-: | :-: | :-: | :-: |
| **YOLOv7-u6** | 640 | **52.6%** | **69.7%** | **57.3%** |

## Citation

```
@inproceedings{wang2023yolov7,
  title={{YOLOv7}: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors},
  author={Wang, Chien-Yao and Bochkovskiy, Alexey and Liao, Hong-Yuan Mark},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2023}
}
```

```
@article{wang2023designing,
  title={Designing Network Design Strategies Through Gradient Path Analysis},
  author={Wang, Chien-Yao and Liao, Hong-Yuan Mark and Yeh, I-Hau},
  journal={Journal of Information Science and Engineering},
  year={2023}
}
```

## Teaser

YOLOv7-semantic & YOLOv7-panoptic & YOLOv7-caption

<div align="center">
    <a href="./"><img src="./figure/tennis.jpg" width="24%"/></a>
    <a href="./"><img src="./figure/tennis_semantic.jpg" width="24%"/></a>
    <a href="./"><img src="./figure/tennis_panoptic.png" width="24%"/></a>
    <a href="./"><img src="./figure/tennis_caption.png" width="24%"/></a>
</div>

YOLOv7-semantic & YOLOv7-detection & YOLOv7-depth (with NTUT)

<div align="center">
    <a href="./"><img src="./figure/yolov7_city.jpg" width="80%"/></a>
</div>

YOLOv7-3d-detection & YOLOv7-lidar & YOLOv7-road (with NTUT)

<div align="center">
    <a href="./"><img src="./figure/yolov7_3d.jpg" width="30%"/></a>
    <a href="./"><img src="./figure/yolov7_lidar.jpg" width="30%"/></a>
    <a href="./"><img src="./figure/yolov7_road.jpg" width="30%"/></a>
</div>

## Acknowledgements

<details><summary> <b>Expand</b> </summary>

* [https://github.com/AlexeyAB/darknet](https://github.com/AlexeyAB/darknet)
* [https://github.com/WongKinYiu/yolor](https://github.com/WongKinYiu/yolor)
* [https://github.com/WongKinYiu/PyTorch_YOLOv4](https://github.com/WongKinYiu/PyTorch_YOLOv4)
* [https://github.com/WongKinYiu/ScaledYOLOv4](https://github.com/WongKinYiu/ScaledYOLOv4)
* [https://github.com/Megvii-BaseDetection/YOLOX](https://github.com/Megvii-BaseDetection/YOLOX)
* [https://github.com/ultralytics/yolov3](https://github.com/ultralytics/yolov3)
* [https://github.com/ultralytics/yolov5](https://github.com/ultralytics/yolov5)
* [https://github.com/DingXiaoH/RepVGG](https://github.com/DingXiaoH/RepVGG)
* [https://github.com/JUGGHM/OREPA_CVPR2022](https://github.com/JUGGHM/OREPA_CVPR2022)
* [https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose](https://github.com/TexasInstruments/edgeai-yolov5/tree/yolo-pose)

</details>

---
title: YoloCustom
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: gradio            # or "streamlit", "static", etc., depending on your app
sdk_version: "4.44.0"  # check the latest version if using Gradio
app_file: app.py       # replace with your main script (e.g., interfacetest.py?)
pinned: false
---

# YoloCustom Space

This is a custom YOLO implementation.
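To try the Space locally before pushing, a minimal sketch (assuming the Gradio version pinned in the metadata above, and that `app.py` is the declared entry point):

``` shell
pip install gradio==4.44.0   # match sdk_version above
python app.py                # Gradio serves on http://localhost:7860 by default
```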