Precise and Efficient Learning using Attention Mechanisms (PRELUNA)

Type: National Project

Duration: from 2022 Jan 01 to 2024 Dec 31

Financed by: FCT

Prime Contractor: R - INESC-ID Lisboa (Other) - Lisboa, Portugal

Study and apply attention and self-attention mechanisms to improve the performance of deep neural networks in computer vision tasks. Two use cases will be used to assess the effectiveness of the approach: the diagnosis of ischemic stroke severity from brain computed tomographies (brain-CT) and the diagnosis of coronary artery stenosis from X-ray angiographies.
Our general strategy is to use attention to focus the networks on the regions of interest, which will improve the classification accuracy of the systems. In the process, we may also shed light on the important relations between the phenomena of attention, consciousness, and the human ability to learn from small samples, a phenomenon known as few-shot learning.
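To illustrate the kind of mechanism the project studies, the sketch below implements plain scaled dot-product self-attention over a set of feature vectors (e.g. flattened image patches) in NumPy. This is a generic, minimal version of the technique, not the project's actual model; all function names here are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Scaled dot-product self-attention over n feature vectors.

    X: (n, d) array, e.g. one feature vector per image patch.
    Returns the attended features (n, d) and the (n, n) weights.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)        # pairwise similarities
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ X, weights

# Toy example: 4 "patches" with 8-dimensional features.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
attended, weights = self_attention(X)
```

Each output vector is a weighted average of all inputs, with larger weights on the most similar (most "attended") patches; in the project's use cases, the intent is for such weights to concentrate on diagnostically relevant image regions.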


  • IT (Other)
  • R - INESC-ID Lisboa (Other) - Lisboa, Portugal

Principal Investigators