Motivation

Lung cancer is the deadliest type of cancer worldwide for both men and women. Progress in improving lung cancer survival has been notoriously slow in contrast to other cancer types, mainly because the disease is diagnosed late. Low-dose computed tomography (CT) has long been suggested as a potential early screening tool, and a 20% reduction in lung cancer mortality has been demonstrated for lung cancer risk groups. Nevertheless, translating these screening programs to the general population has been challenging due to equipment and personnel costs as well as the complexity of the task. In particular, lung nodules present a wide range of shapes and characteristics, so identifying and characterizing these abnormalities is not trivial and is prone to high interobserver variability. Computer-aided diagnosis (CAD) systems can thus facilitate the adoption and generalization of screening programs by reducing the burden on clinicians and providing an independent second opinion.


Aims

The main goal of this challenge is the automatic classification of chest CT scans according to the 2017 Fleischner Society pulmonary nodule guidelines for patient follow-up recommendation. The Fleischner guidelines are widely used for patient management in the case of nodule findings and comprise four classes, taking into account the number of nodules (single or multiple), their volume (<100 mm³, 100-250 mm³ and ⩾250 mm³) and their texture (solid, part solid and ground glass opacities (GGO)). Furthermore, three additional sub-challenges will be held, related to the different tasks needed to calculate a Fleischner score. The challenge is thus made up of four different parts:

  • Main Challenge - Fleischner Classification: From chest CT scans, participants must predict the correct follow-up according to the 2017 Fleischner guidelines;
  • Sub-Challenge A - Nodule Detection: From chest CT scans, participants must detect pulmonary nodules;
  • Sub-Challenge B - Nodule Segmentation: Given a list of >3mm nodule centroids, participants must segment the nodules in the corresponding chest CT scans;
  • Sub-Challenge C - Nodule Texture Characterization: Given a list of nodule centroids, participants must classify nodules into three texture classes - solid, sub-solid and GGO.
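To make the classification factors concrete, the sketch below shows how the three inputs named above (nodule multiplicity, the stated volume bins, and texture) could be extracted from a scan's nodule findings. This is only an illustrative summary of the factors listed in this description, not the full 2017 Fleischner decision table; the `Texture` enum and the `(volume, texture)` nodule representation are hypothetical.

```python
from enum import Enum


class Texture(Enum):
    # Texture classes as named in the challenge description (hypothetical enum).
    SOLID = "solid"
    PART_SOLID = "part solid"
    GGO = "ground glass opacity"


def volume_class(volume_mm3: float) -> int:
    """Bin a nodule volume into the three classes stated in the guidelines:
    0 for <100 mm³, 1 for 100-250 mm³, 2 for >=250 mm³."""
    if volume_mm3 < 100:
        return 0
    if volume_mm3 < 250:
        return 1
    return 2


def guideline_inputs(nodules):
    """Summarise a scan's nodules into the three factors the guidelines use:
    whether nodules are multiple, the largest volume class present, and the
    set of textures. `nodules` is a list of (volume_mm3, Texture) pairs
    (a hypothetical input format, not the challenge's data format)."""
    multiple = len(nodules) > 1
    max_vol_class = max((volume_class(v) for v, _ in nodules), default=0)
    textures = {t for _, t in nodules}
    return multiple, max_vol_class, textures
```

A full Fleischner recommendation would additionally depend on how these factors combine, which is exactly what main-challenge participants must predict from the CT scans.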

Teams may choose to participate in the main challenge only, in one or more sub-challenges, or in all challenges. Each task will be evaluated separately, and a prize will be awarded to the best-performing method in each challenge.


Participation guidelines

This challenge is organized in association with the ICIAR 2020 conference. Participation in the challenge requires the submission of a paper describing the approach and the achieved results to the conference proceedings. Please read the Rules section.

Important dates

  • Challenge start and training set release: 20 November, 2019
  • Train/validation set prediction submission and ICIAR 2020 paper submission deadline: 10 February, 2020
  • Test set release: 11 February, 2020
  • Test set prediction submission deadline: 20 February, 2020
  • Results announcement: 20 February, 2020

Challenge organization

The challenge is organized by INESC TEC, Porto, Portugal in collaboration with the São João Hospital Centre, Porto, Portugal, the Faculty of Engineering of Universidade do Porto and the Faculty of Medicine of Universidade do Porto. The team has experience in machine learning and computer vision, as well as clinical expertise, and has previous experience in the organization of challenges.

For questions related to the challenge, please use the Forum page or send us an e-mail at iciar2020.lndbchallenge@gmail.com.

This work was financed by the European Regional Development Fund (ERDF) through the Operational Programme for Competitiveness - COMPETE 2020 Programme and by National Funds through the Portuguese funding agency, FCT - Fundação para a Ciência e a Tecnologia, within project PTDC/EEI-SII/6599/2014 (POCI-01-0145-FEDER-016673) and by the FCT grant contract SFRH/BD/120435/2016.