FungiCLEF 2023

Schedule

  • December 2022: registration opens
  • 14 February 2023: training data release
  • 24 May 2023: deadline for submission of runs by participants
  • 27 May 2023: release of processed results by the task organizers
  • 7 June 2023: deadline for submission of working note papers by participants [CEUR-WS proceedings]
  • 30 June 2023: notification of acceptance of working note papers [CEUR-WS proceedings]
  • 7 July 2023: camera-ready copies of participants' working note papers and extended lab overviews by organizers
  • 18-21 September 2023: CLEF 2023, Thessaloniki, Greece

Results

All teams that provided runnable code were scored, and their scores are available on HuggingFace.
Official competition results and intermediate validation results are listed in the Private and Public leaderboards, respectively.

  • The 1st place team: meng18 (Huan Ren, Han Jiang, Wang Luo, Meng Meng and Tianzhu Zhang)
  • The 2nd place team: stefanwolf (Stefan Wolf and Jürgen Beyerer)
  • The 3rd place team: word2vector (Feiran Hu, Peng Wang, Yangyang Li, Chenlong Duan, Zijian Zhu, Yong Li and Xiu-Shen Wei)

Public Leaderboard Evaluation

[Leaderboard table: Track 1 Score (macro F1-score) per team]
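
Macro F1 averages the per-species F1-scores with equal weight, so rare species count as much as frequent ones. The following is a minimal sketch of the metric in Python using scikit-learn; it is not the organizers' exact evaluation code.

    # Minimal sketch of the public-leaderboard metric: macro-averaged F1.
    # Macro averaging gives every species equal weight regardless of frequency.
    from sklearn.metrics import f1_score

    y_true = [0, 0, 1, 2, 2, 2]   # ground-truth species ids (toy data)
    y_pred = [0, 1, 1, 2, 2, 0]   # top-1 predictions

    print(f1_score(y_true, y_pred, average="macro"))  # ~0.656 on this toy data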

Private Leaderboard Evaluation

[Leaderboard table: Track 1 Scores per team]

Motivation

Automatic recognition of fungi species assists mycologists, citizen scientists, and nature enthusiasts in identifying species in the wild, and its availability supports the collection of valuable biodiversity data. In practice, species identification typically does not depend solely on visual observation of the specimen but also on other information available to the observer, such as habitat, substrate, location, and time. Thanks to rich metadata, precise annotations, and baselines available to all competitors, the challenge provides a benchmark for image recognition with the use of additional information. Moreover, the toxicity of a mushroom can be crucial for a mushroom picker's decision. The competition therefore explores decision processes beyond the commonly assumed 0/1 cost function.
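
To make the last point concrete, the sketch below contrasts the usual 0/1 decision (predict the most probable species) with a cost-sensitive decision that penalizes mistaking a poisonous species for an edible one far more than the reverse. The cost values and toy probabilities are illustrative only, not the official competition costs.

    # Cost-sensitive decision sketch (illustrative costs, not the official ones).
    import numpy as np

    poisonous = np.array([False, False, True])   # toy setup: species 2 is poisonous
    probs = np.array([0.45, 0.15, 0.40])         # posterior over species

    # cost[true, predicted]: calling a poisonous species edible (risk of eating
    # it) costs far more than calling an edible species poisonous.
    cost = np.ones((3, 3)) - np.eye(3)           # plain 0/1 misclassification cost
    cost[np.ix_(poisonous, ~poisonous)] = 100.0  # poisonous mistaken for edible
    cost[np.ix_(~poisonous, poisonous)] = 5.0    # edible mistaken for poisonous

    print(probs.argmax())            # 0/1 loss picks species 0 (edible)
    print((probs @ cost).argmin())   # expected-cost rule picks species 2 (safer)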

Task Description

Given a set of real fungi species observations and corresponding metadata, the goal of the task is to create a classification model that returns a ranked list of predicted species for each observation (multiple photographs of the same individual plus its geographical location). The model must fit within a memory footprint limit (an ONNX model of at most 1 GB) and a prediction time limit (to be announced later), both measured on the submission server. It should also consider and minimize the danger to human life, i.e., the confusion between poisonous and edible species.
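
As a rough illustration of these constraints, the sketch below checks the 1 GB limit on an exported ONNX file and turns image-level probabilities into a ranked species list per observation. The file name, input name, and single-softmax-output layout are assumptions made for the example, not the official submission interface.

    # Minimal sketch, assuming an already-exported ONNX classifier with one
    # input ("input", NCHW float32) and one softmax output over species.
    import os
    import numpy as np
    import onnxruntime as ort

    MODEL_PATH = "fungi_classifier.onnx"          # hypothetical file name
    assert os.path.getsize(MODEL_PATH) <= 1024**3, "model exceeds the 1 GB limit"

    session = ort.InferenceSession(MODEL_PATH)

    def rank_species(images: list[np.ndarray], species: list[str]) -> list[str]:
        """One observation = several photographs of the same individual;
        fuse them by averaging probabilities, then rank species best-first."""
        batch = np.stack(images).astype(np.float32)     # (N, C, H, W)
        (probs,) = session.run(None, {"input": batch})  # (N, n_species)
        order = probs.mean(axis=0).argsort()[::-1]
        return [species[i] for i in order]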

Participation requirements

Data

The challenge dataset is primarily based on the Danish Fungi 2020 dataset, which contains 295,938 training images belonging to 1,604 species observed mostly in Denmark.
All training samples passed an expert validation process, guaranteeing high-quality labels. Rich observation metadata (habitat, substrate, time, location, EXIF, etc.) are provided.

The validation set contains 30,131 observations with 60,832 images and 2,713 species, covering the whole year and including observations collected across all substrate and habitat types. The public test set contains 30,130 observations with 60,225 images and a data distribution similar to that of the validation set.

Using additional data or metadata is permitted!
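
Since each observation bundles several photographs, a common first step is to group image records by their observation id before training or evaluation. The sketch below assumes a metadata CSV with one row per image; the file and column names are guesses based on the Danish Fungi 2020 release, not a guaranteed schema.

    # Minimal sketch, assuming per-image metadata rows linked by an
    # "observationID" column (file and column names are assumptions).
    import pandas as pd

    meta = pd.read_csv("DF20-train_metadata.csv")  # hypothetical file name
    obs = meta.groupby("observationID")
    print(f"{len(meta)} images in {obs.ngroups} observations")

    for obs_id, rows in obs:
        image_paths = rows["image_path"].tolist()  # all photos of one individual
        habitat = rows["Habitat"].iloc[0]          # shared observation metadata
        break                                      # just showing the access pattern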

Image Data & Metadata

Other resources

Publication Track

LifeCLEF 2023 is an evaluation campaign organized as part of the CLEF initiative labs. The campaign offers several research tasks and welcomes participation from research teams worldwide.

The results of the campaign appear in the working notes proceedings published by CEUR Workshop Proceedings (CEUR-WS.org).
Selected contributions from participants will be invited for publication the following year in Springer's Lecture Notes in Computer Science (LNCS), together with the annual lab overviews.

Context

This competition is held as part of the LifeCLEF 2023 evaluation campaign within the CLEF 2023 labs.

Participants are required to register for the LifeCLEF lab using the CLEF 2023 labs registration form (checking "FungiCLEF" under LifeCLEF).
Only registered participants may submit a working-note paper to the peer-reviewed LifeCLEF proceedings (CEUR-WS) after the competition ends.

This paper should provide sufficient information to reproduce the final submitted runs.
Only participants who submitted a working-note paper will be part of the officially published ranking used for scientific communication.

Organizers

Credits

  • ZCU-FAV
  • CTU in Prague
  • University of Copenhagen
  • PiVa AI

Acknowledgement

TAČR (Technology Agency of the Czech Republic)