|
--- |
|
license: cc-by-nc-4.0 |
|
language: |
|
- en |
|
pipeline_tag: depth-estimation |
|
library_name: coreml |
|
tags: |
|
- depth |
|
- relative depth |
|
base_model: |
|
- depth-anything/Depth-Anything-V2-Large |
|
--- |
|
|
|
# Depth Anything V2 Large (mlpackage) |
|
|
|
In this repo you can find: |
|
* The notebook that was used to convert [depth-anything/Depth-Anything-V2-Large](https://huggingface.co/depth-anything/Depth-Anything-V2-Large) into a Core ML package (a minimal conversion sketch follows this list).
|
* The mlpackage, which can be opened in Xcode and used for previewing and for developing macOS and iOS apps (a quick Python sanity check is sketched further below).
|
* A performance and compute-unit mapping report for this model, as measured on an iPhone 16 Pro Max and a MacBook Pro (Apple M3 Pro).
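
The sketch below illustrates the kind of PyTorch-to-Core ML conversion the bundled notebook performs; it is not a transcript of the notebook. The `DepthAnythingV2` class, checkpoint filename, 518×518 input size, and preprocessing values are assumptions based on the upstream Depth-Anything-V2 repository.

```python
import coremltools as ct
import torch

# Assumption: the upstream Depth-Anything-V2 repo is on PYTHONPATH and the ViT-L
# checkpoint has been downloaded locally.
from depth_anything_v2.dpt import DepthAnythingV2

model = DepthAnythingV2(encoder="vitl", features=256, out_channels=[256, 512, 1024, 1024])
model.load_state_dict(torch.load("depth_anything_v2_vitl.pth", map_location="cpu"))
model.eval()

# Trace the model so coremltools can convert it.
example = torch.rand(1, 3, 518, 518)
traced = torch.jit.trace(model, example)

# Convert to an ML Program (.mlpackage). The preprocessing here is a placeholder;
# the actual notebook may fold ImageNet mean/std normalization into the input differently.
mlmodel = ct.convert(
    traced,
    inputs=[ct.ImageType(name="image", shape=example.shape, scale=1 / 255.0)],
    minimum_deployment_target=ct.target.iOS17,
)
mlmodel.save("DepthAnythingV2Large.mlpackage")
```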
|
|
|
As a derivative work of Depth-Anything-V2-Large, this port is also licensed under CC BY-NC 4.0.
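
For a quick sanity check of the converted package from Python on macOS, something like the following can be used. The input name `"image"`, the output name, and the example filenames are assumptions; check the model's input/output descriptions in Xcode (or via `mlmodel.get_spec()`) if they differ.

```python
import numpy as np
import coremltools as ct
from PIL import Image

# Load the package and run a single prediction (coremltools prediction requires macOS).
mlmodel = ct.models.MLModel("DepthAnythingV2Large.mlpackage")

# Assumed input name and size; adjust to match the actual model description.
img = Image.open("example.jpg").convert("RGB").resize((518, 518))
outputs = mlmodel.predict({"image": img})

# Normalize the relative depth map to 0-255 for a quick visual check.
depth = np.asarray(next(iter(outputs.values()))).squeeze()
depth = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
Image.fromarray((depth * 255).astype(np.uint8)).save("depth_preview.png")
```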
|
|
|
 |
|
|
|
|
|
## Citation of original work |
|
|
|
If you find this project useful, please consider citing: |
|
|
|
```bibtex
@article{depth_anything_v2,
  title={Depth Anything V2},
  author={Yang, Lihe and Kang, Bingyi and Huang, Zilong and Zhao, Zhen and Xu, Xiaogang and Feng, Jiashi and Zhao, Hengshuang},
  journal={arXiv:2406.09414},
  year={2024}
}

@inproceedings{depth_anything_v1,
  title={Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data},
  author={Yang, Lihe and Kang, Bingyi and Huang, Zilong and Xu, Xiaogang and Feng, Jiashi and Zhao, Hengshuang},
  booktitle={CVPR},
  year={2024}
}
```