ZIP: Scalable Crowd Counting via Zero-Inflated Poisson Modeling
The official implementation of the paper ZIP: Scalable Crowd Counting via Zero-Inflated Poisson Modeling.
Results
Variant | Size (M) | GFLOPs (HD input) | SHA (MAE) | SHA (RMSE) | SHA (NAE, %) | SHB (MAE) | SHB (RMSE) | SHB (NAE, %) | QNRF (MAE) | QNRF (RMSE) | QNRF (NAE, %) | NWPU-Val (MAE) | NWPU-Val (RMSE) | NWPU-Val (NAE, %) | NWPU-Test (MAE) | NWPU-Test (RMSE) | NWPU-Test (NAE, %) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
-P (Pico) | 0.81 | 6.46 | 71.18 | 109.60 | 16.69 | 8.23 | 12.62 | 6.98 | 96.29 | 161.82 | 14.40 | 66.94 | 223.52 | 14.97 | 79.91 | 327.17 | 21.42 |
-N (Nano) | 3.36 | 24.73 | 58.86 | 94.63 | 14.15 | 7.74 | 12.14 | 6.33 | 86.46 | 147.64 | 12.60 | 56.27 | 292.53 | 14.06 | 75.03 | 334.54 | 17.59 |
-T (Tiny) | 10.53 | 61.39 | 56.36 | 86.09 | 13.26 | 6.67 | 9.90 | 5.52 | 76.02 | 129.40 | 11.10 | 46.74 | 145.32 | 11.31 | 66.43 | 323.27 | 13.45 |
-S (Small) | 33.60 | 242.43 | 55.17 | 88.99 | 11.97 | 5.83 | 9.21 | 4.58 | 73.32 | 125.09 | 10.40 | 31.66 | 77.11 | 9.61 | 62.89 | 309.02 | 12.09 |
-B (Base) | 105.60 | 800.99 | 47.81 | 75.04 | 11.06 | 5.51 | 8.63 | 4.48 | 69.46 | 121.88 | 10.18 | 28.26 | 64.84 | 9.20 | 60.09 | 298.95 | 10.44 |
Step 1: Install Dependencies
pip install -r requirements.txt
Step 2: Download Processed Datasets
- ShanghaiTech A: sha.zip
- ShanghaiTech B: shb.zip
- UCF-QNRF: qnrf.zip, qnrf.z01
- NWPU-Crowd: nwpu.zip, nwpu.z01, nwpu.z02, nwpu.z03, nwpu.z04, nwpu.z05, nwpu.z06, nwpu.z07, nwpu.z08
To extract the split .zip files, 7-Zip is recommended. You can install 7-Zip and extract a dataset with the following commands:
sudo apt update
sudo apt install p7zip-full
7z x dataset.zip
Step 3: Run Training
Add the training command to run.sh and execute it:
sh run.sh
If you want to use the zero-inflated loss, set either --reg_loss or --aux_loss to zipnll. For example, you can set --reg_loss zipnll to use the zero-inflated loss for regression.
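For intuition, the zero-inflated Poisson negative log-likelihood of a cell with count y is -log(pi + (1 - pi) * exp(-lambda)) when y = 0 and -log((1 - pi) * lambda^y * exp(-lambda) / y!) when y >= 1. The PyTorch snippet below is a minimal sketch of that quantity, not necessarily the exact zipnll implementation in this repository; the tensor names (pi_logits, log_lam, counts) are placeholders.

```python
import torch
import torch.nn.functional as F

def zip_nll(pi_logits, log_lam, counts):
    """Illustrative zero-inflated Poisson NLL (sketch, not the repo's zipnll).

    pi_logits: logits of the zero-inflation probability pi, per cell
    log_lam:   log of the Poisson rate lambda, per cell
    counts:    non-negative integer ground-truth counts, per cell
    """
    lam = log_lam.exp()
    log_pi = F.logsigmoid(pi_logits)              # log pi
    log_one_minus_pi = F.logsigmoid(-pi_logits)   # log (1 - pi)

    # y = 0: log( pi + (1 - pi) * exp(-lambda) ), computed stably in log space
    log_p_zero = torch.logaddexp(log_pi, log_one_minus_pi - lam)

    # y >= 1: log(1 - pi) + y * log(lambda) - lambda - log(y!)
    log_p_pos = log_one_minus_pi + counts * log_lam - lam - torch.lgamma(counts + 1.0)

    nll = -torch.where(counts > 0, log_p_pos, log_p_zero)
    return nll.mean()
```

Working in log space (logsigmoid, logaddexp) keeps the y = 0 branch numerically stable when lambda is large or pi is close to 0 or 1.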
You can also use an auxiliary loss to improve performance. For example, to use the pre-defined multi-scale MAE loss, set --aux_loss msmae and --scales 1 2 4.
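As a rough illustration of the multi-scale idea, the sketch below compares region counts between the predicted and ground-truth density maps at several window sizes. The sum-pooling and uniform averaging over scales are assumptions for illustration, not necessarily how msmae is implemented here.

```python
import torch.nn.functional as F

def multi_scale_mae(pred_density, gt_density, scales=(1, 2, 4)):
    """Illustrative multi-scale MAE over pooled count maps (assumed form)."""
    loss = 0.0
    for s in scales:
        # Sum-pool so each cell holds the total count inside an s x s window.
        pred_counts = F.avg_pool2d(pred_density, kernel_size=s) * (s * s)
        gt_counts = F.avg_pool2d(gt_density, kernel_size=s) * (s * s)
        loss = loss + (pred_counts - gt_counts).abs().mean()
    return loss / len(scales)
```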
The DMCount loss can also be used together with the zero-inflated loss. For example, set --reg_loss zipnll --aux_loss dmcount to use both.
Step 4: Test the Model
Use test.py or test.sh to test the model. You can specify the dataset, weight path, input size, and other parameters.
To generate the predicted counts on the NWPU-Crowd test set, use test_nwpu.py instead.
To visualize the results, use the notebooks/model.ipynb notebook (see Step 5).
Trained weights are also provided. Make sure to use the processed datasets and the exact commands pre-defined in test.sh to reproduce the reported results.
Step 5: Visualize the Results
Use the notebooks/model.ipynb notebook to visualize the results.
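If you prefer a quick script over the notebook, a predicted density map can be displayed along the following lines. Model loading is omitted because it depends on the repository's API; the density tensor here is a random stand-in.

```python
import matplotlib.pyplot as plt
import torch

# Random stand-in for a predicted density map of shape (1, 1, H, W);
# in practice this would come from the trained model's output.
density = torch.rand(1, 1, 96, 128)

predicted_count = density.sum().item()  # total count = sum over the map
plt.imshow(density[0, 0].cpu().numpy(), cmap="jet")
plt.title(f"Predicted count: {predicted_count:.1f}")
plt.axis("off")
plt.show()
```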