HANDS ON DEMO - USER MANUAL

 

1. Get the source code

For an overview of the repositories, see the GitHub organization page: https://github.com/ocean-ai-data-challenges

# download the code:


- DC1: git clone https://github.com/ocean-ai-data-challenges/dc1-emulating-global-ocean.git
- DC2: git clone https://github.com/ocean-ai-data-challenges/dc2-forecasting-global-ocean-dynamics.git
- DC3: git clone https://github.com/ocean-ai-data-challenges/dc3-sea-ice-forecasting.git
- DC4: git clone https://github.com/ocean-ai-data-challenges/dc4-forecasting-tropical-cyclones.git (implementation in progress)
- DC5: git clone https://github.com/ocean-ai-data-challenges/dc5-marine-biodiversity-prediction.git (implementation in progress)
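To fetch all challenges at once, the clone commands above can be generated with a short script. This is just a convenience sketch; the repository names are copied from the list above, and the output can be piped to a shell:

```python
# Print a `git clone` command for each data-challenge repository.
# Pipe the output to `sh` (or copy-paste the lines) to actually clone.
ORG = "https://github.com/ocean-ai-data-challenges"
REPOS = [
    "dc1-emulating-global-ocean",
    "dc2-forecasting-global-ocean-dynamics",
    "dc3-sea-ice-forecasting",
    "dc4-forecasting-tropical-cyclones",   # implementation in progress
    "dc5-marine-biodiversity-prediction",  # implementation in progress
]
for repo in REPOS:
    print(f"git clone {ORG}/{repo}.git")
```

Usage: `python clone_all.py | sh` (the script name is arbitrary).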


2. Run the Evaluation

2.1. Run Locally


# install poetry (install pipx first):


    sudo apt update
    sudo apt install pipx
    pipx ensurepath

    pipx install poetry

# create a conda environment:


conda create --name test-dc python=3.11
conda activate test-dc

# install esmf dependency (not supported by poetry)

conda install -c conda-forge esmf esmpy

# install project dependencies using poetry
cd <LIB_PATH>
poetry lock; poetry install
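After `poetry install`, a quick sanity check that the key dependencies actually resolved can save debugging time later. The helper below is a generic sketch; the exact package list differs per challenge, and `esmpy` is the one installed outside poetry above:

```python
import importlib.util

def missing_packages(names):
    """Return the subset of `names` that cannot be imported in this environment."""
    return [name for name in names if importlib.util.find_spec(name) is None]

# `esmpy` comes from conda-forge, the rest from poetry; adjust to your challenge.
print(missing_packages(["esmpy", "numpy", "xarray"]))
```

An empty list means the environment is ready; otherwise re-run the corresponding install step.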

# run the data challenge using the python script
DC1: poetry run python <ROOT_PATH>/dc1-emulating-global-ocean/dc1/evaluate.py
The JSON file with the evaluation results will be saved to <LIB_PATH>/dc1_output/results
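The structure of the results file is specific to each challenge; assuming it is a JSON mapping of metric names to scores, a small reader like this gives a quick summary (hypothetical sketch, not part of the repositories):

```python
import json
from pathlib import Path

def summarize_results(path):
    """Load a results JSON file and print each top-level entry."""
    results = json.loads(Path(path).read_text())
    for key, value in results.items():
        print(f"{key}: {value}")
    return results

# Point this at the JSON file written under <LIB_PATH>/dc1_output/results.
```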
 
DC3: poetry run python <ROOT_PATH>/dc3-sea-ice-forecasting/dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>

2.2. Run Using Edito Datalab

Open the link to the EDITO service:

DC1: https://datalab.dive.edito.eu/launcher/service-playground/dc1-emulating-global-ocean?name=dc1-emulating-global-ocean

then run "python dc1/evaluate.py"

 

DC3: https://datalab.dive.edito.eu/launcher/service-playground/dc3-sea-ice-forecasting

 

run "python dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>"

 

2.3. Run Using Docker Images

# prerequisite: install Docker Desktop

For Linux users, see: https://docs.docker.com/desktop/setup/install/linux/ubuntu/

DC1: docker run --rm -p 8888:8888 --name dc1-lab ghcr.io/ocean-ai-data-challenges/dc1-emulating-global-ocean:0.1.0

then copy the link (http://127.0.0.1...) into your browser, open a terminal, and run "python dc1/evaluate.py"

 

DC3: docker run --rm -p 8888:8888 --name dc3-lab ghcr.io/ocean-ai-data-challenges/dc-sea-ice-forecasting:edito-gpu-0.1.0

then copy the link (http://127.0.0.1...) into your browser, open a terminal, and run "python dc3/evaluate.py --logfile <LOG_PATH>/dc3.log --data_directory <TEMP_DATA_PATH>"
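If you lose the startup link, it can be recovered from the container output (`docker logs dc1-lab` or `docker logs dc3-lab`). A small regex helper like the one below pulls the first local URL out of that text; this is a generic sketch assuming Jupyter's standard startup message:

```python
import re

def first_local_url(log_text):
    """Return the first http://127.0.0.1 URL found in the logs, or None."""
    match = re.search(r"http://127\.0\.0\.1:\d+/\S*", log_text)
    return match.group(0) if match else None

# usage sketch:
#   logs = subprocess.run(["docker", "logs", "dc1-lab"],
#                         capture_output=True, text=True).stderr
#   print(first_local_url(logs))
```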

 

3. Create the leaderboard tables

# prerequisite: install Quarto: https://quarto.org/docs/get-started/

# get the code:
git clone https://github.com/ocean-ai-data-challenges/dc_leaderboard.git

# cd to the dc_leaderboard folder
cd <root_path>/dc_leaderboard

# create the Conda env
make -f Makefile.conda install

# (optional) Check the environment install
make -f Makefile.conda test-env
 
# copy the results JSON file into dc_leaderboard/results
 
# create "fake" challenger result files
python gen_noisy_results.py results/results_glonet.json results/results_challenger_model_1.json challenger_model_1 0.08
python gen_noisy_results.py results/results_glonet.json results/results_challenger_model_2.json challenger_model_2 0.13
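The two commands above take a reference results file and a relative noise amplitude (0.08 and 0.13). The core idea can be sketched as perturbing every numeric metric by a uniform relative noise; this is a hypothetical re-implementation to illustrate the transformation, not the actual `gen_noisy_results.py`:

```python
import random

def gen_noisy(value, amplitude, rng):
    """Recursively perturb every numeric leaf of a results structure by a
    uniform relative noise in [-amplitude, +amplitude]; leave other values
    (strings, booleans, ...) unchanged."""
    if isinstance(value, dict):
        return {k: gen_noisy(v, amplitude, rng) for k, v in value.items()}
    if isinstance(value, (int, float)) and not isinstance(value, bool):
        return value * (1 + rng.uniform(-amplitude, amplitude))
    return value
```

The real script presumably also records the challenger model name passed on the command line before writing the output JSON.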

# compile the whole website (all pages)
make -f Makefile.conda all

# OR compile the leaderboard only
make -f Makefile.conda html

#  (optional) Clean temporary files
make -f Makefile.conda clean

# open the leaderboard at dc_leaderboard/_site/leaderboard.html