Connecting online mapping with hybrid navigation to enable interpretable autonomous driving.
Key insights:
# Download the OMA dataset to the data/oma directory using the Hugging Face CLI:
# Note: The dataset is still in compliance review. We will release it as soon as possible.
huggingface-cli download wanjiaxu/OMA --repo-type dataset --local-dir data/oma
# By script (Recommended)
# -p defaults to python and can be omitted
sh scripts/train.sh -p python -d oma -c oma-mt-v1m1-l -n oma-mt-v1m1-l
# Direct
export PYTHONPATH=./
python tools/train.py --config-file configs/oma/oma-mt-v1m1-l.py --options save_path=exp/oma/oma-mt-v1m1-l
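The script and direct invocations above are equivalent; a sketch of how the script flags are presumably expanded (`-p` interpreter, `-d` dataset, `-c` config, `-n` experiment name) into the direct command. This is an illustration of the mapping, not the actual contents of `scripts/train.sh`:

```shell
#!/bin/sh
# Assumed flag-to-command mapping for scripts/train.sh (illustrative only).
PYTHON=python          # -p
DATASET=oma            # -d
CONFIG=oma-mt-v1m1-l   # -c
NAME=oma-mt-v1m1-l     # -n
CMD="$PYTHON tools/train.py --config-file configs/$DATASET/$CONFIG.py --options save_path=exp/$DATASET/$NAME"
echo "$CMD"
```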
# By script (Based on experiment folder created by training script)
# -p defaults to python and can be omitted
# -w defaults to model_best and can be omitted
sh scripts/test.sh -p python -d oma -n oma-mt-v1m1-l -w model_best
# Direct
export PYTHONPATH=./
python tools/test.py --config-file configs/oma/oma-mt-v1m1-l.py --options save_path=exp/oma/oma-mt-v1m1-l weight=exp/oma/oma-mt-v1m1-l/model/model_best.pth
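As with training, the test script presumably expands its flags into the direct command, with `-w` selecting a checkpoint under the experiment folder created by training. A sketch of the assumed mapping (not the actual contents of `scripts/test.sh`):

```shell
#!/bin/sh
# Assumed flag-to-command mapping for scripts/test.sh (illustrative only).
PYTHON=python          # -p
DATASET=oma            # -d
NAME=oma-mt-v1m1-l     # -n (also names the config here)
WEIGHT=model_best      # -w
CMD="$PYTHON tools/test.py --config-file configs/$DATASET/$NAME.py --options save_path=exp/$DATASET/$NAME weight=exp/$DATASET/$NAME/model/$WEIGHT.pth"
echo "$CMD"
```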
Coming soon; awaiting the data compliance review.
This project is released under the MIT License.
This project is mainly based on the following projects:
This README is inspired by DeepEyes.
@article{wan2025driving,
title={Driving by Hybrid Navigation: An Online HD-SD Map Association Framework and Benchmark for Autonomous Vehicles},
author={Wan, Jiaxu and Wang, Xu and Xie, Mengwei and Chang, Xinyuan and Liu, Xinran and Pan, Zheng and Xu, Mu and Yuan, Ding},
journal={arXiv preprint arXiv:2507.07487},
year={2025}
}