Hands-On Demo - User Manual
1. Get the Source Code

# download the code:
2. Run the Evaluation

2.1. Run Locally

# create and activate a conda environment
conda create --name test-dc python=3.11; conda activate test-dc
# install the esmf dependency (not supported by poetry)
conda install -c conda-forge esmf esmpy
# install project dependencies using poetry
cd <LIB_PATH>
poetry lock; poetry install

# run the data challenge using the python script
DC1:
poetry run python <ROOT_PATH>/dc1-emulating-global-ocean/dc1/evaluate.py
The JSON file with the results of the evaluation will be saved to <LIB_PATH>/dc1_output/results

DC3:
poetry run python <ROOT_PATH>/dc3-sea-ice-forecasting/dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>

2.2. Run Using Edito Datalab

Open the link to the Edito service:
DC1: https://datalab.dive.edito.eu/launcher/service-playground/dc1-emulating-global-ocean?name=dc1-emulating-global-ocean
then run "python dc1/evaluate.py"
DC3: https://datalab.dive.edito.eu/launcher/service-playground/dc3-sea-ice-forecasting
then run "python dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>"

2.3. Run Using Docker Images

# prerequisite: install Docker Desktop
# Linux users, see: https://docs.docker.com/desktop/setup/install/linux/ubuntu/

DC1:
docker run --rm -p 8888:8888 --name dc1-lab ghcr.io/ocean-ai-data-challenges/dc1-emulating-global-ocean:0.1.0
then copy the http://127.0.0.1 .... link into your browser, open a terminal, and run "python dc1/evaluate.py"
DC3:
docker run --rm -p 8888:8888 --name dc3-lab ghcr.io/ocean-ai-data-challenges/dc-sea-ice-forecasting:edito-gpu-0.1.0
then copy the http://127.0.0.1 .... link into your browser, open a terminal, and run "python dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>"
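Whichever way the evaluation is run, it writes a results JSON file (for DC1, under <LIB_PATH>/dc1_output/results). The exact schema of that file is not documented here; assuming it is a flat mapping from metric names to numeric scores (a hypothetical layout, with made-up metric names), a quick sanity check of the output could look like this:

```python
import json
from pathlib import Path

# Hypothetical layout: we assume the results file maps metric names to
# numeric scores. The real schema produced by evaluate.py may differ.
sample = {"rmse_sst": 0.42, "rmse_ssh": 0.07, "mae_sst": 0.31}

# Mirror the output location stated in section 2.1 (relative path here).
results_dir = Path("dc1_output/results")
results_dir.mkdir(parents=True, exist_ok=True)
results_file = results_dir / "results.json"
results_file.write_text(json.dumps(sample))

# Load the file back and print the metrics, best (lowest) first.
metrics = json.loads(results_file.read_text())
for name, score in sorted(metrics.items(), key=lambda kv: kv[1]):
    print(f"{name}: {score:.3f}")
```

A check like this is handy before feeding the file to the leaderboard step in section 3, since a truncated or empty JSON would fail there with a less obvious error.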
3. Create the Leaderboard Tables

# prerequisite: install Quarto: https://quarto.org/docs/get-started/
# get the code
git clone https://github.com/ocean-ai-data-challenges/dc_leaderboard.git
# cd to the dc_leaderboard folder
cd <ROOT_PATH>/dc_leaderboard
# create the conda env
make -f Makefile.conda install
# (optional) check the environment install
make -f Makefile.conda test-env
# copy the results json file into dc_leaderboard/results
# create "fake" challenger result files
python gen_noisy_results.py results/results_glonet.json results/results_challenger_model_1.json challenger_model_1 0.08
python gen_noisy_results.py results/results_glonet.json results/results_challenger_model_2.json challenger_model_2 0.13
# compile the whole "website" (all pages)
make -f Makefile.conda all
# OR compile the leaderboard only
make -f Makefile.conda html
# (optional) clean temporary files
make -f Makefile.conda clean
# open the leaderboard in dc_leaderboard/_site/leaderboard.html
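The source of gen_noisy_results.py is not shown here. A minimal sketch of the idea — take a reference results file and perturb each score by a relative noise level (the trailing 0.08 / 0.13 arguments) to fabricate challenger entries — might look like the following. The function name, the flat {metric: score} JSON layout, the "model" key, and the use of Gaussian noise are all assumptions, not the script's actual implementation:

```python
import json
import random
from pathlib import Path

def make_noisy_results(src: Path, dst: Path, model_name: str,
                       noise: float, seed: int = 0) -> dict:
    """Write a 'fake' challenger result file by adding relative Gaussian
    noise to every numeric score in the reference file.

    Hypothetical sketch: the flat {metric: score} layout and the 'model'
    key are assumptions, not the schema used by gen_noisy_results.py.
    """
    rng = random.Random(seed)  # fixed seed so the fake results are reproducible
    scores = json.loads(src.read_text())
    noisy = {m: s * (1.0 + rng.gauss(0.0, noise)) for m, s in scores.items()}
    noisy["model"] = model_name
    dst.write_text(json.dumps(noisy, indent=2))
    return noisy

# Demo with a toy reference file standing in for results_glonet.json.
ref = Path("results_glonet.json")
ref.write_text(json.dumps({"rmse_sst": 0.42, "rmse_ssh": 0.07}))
out = make_noisy_results(ref, Path("results_challenger_model_1.json"),
                         "challenger_model_1", 0.08)
print(out["model"])
```

The point of these fake entries is simply to populate the leaderboard with more than one row so the ranking and table layout can be checked before real challenger submissions arrive.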