ContrastNER: Contrastive-based Prompt Tuning for Few-shot NER

Abstract

Prompt-based language models have produced encouraging results in numerous applications, including Named Entity Recognition (NER), the task of identifying entities in a sentence and assigning them types. However, the strong performance of most existing prompt-based NER approaches depends heavily on hand-crafting discrete prompts and a verbalizer that maps the model's predicted outputs to entity categories, both of which are complicated undertakings. To address these challenges, we present ContrastNER, a prompt-based NER framework that employs both discrete and continuous tokens in its prompts and uses contrastive learning to learn the continuous prompts and predict entity types. Experimental results demonstrate that ContrastNER achieves performance competitive with state-of-the-art NER methods in high-resource settings and outperforms state-of-the-art models in low-resource settings, without requiring extensive manual prompt engineering or verbalizer design.
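The contrastive learning idea in the abstract can be illustrated with a minimal sketch: a supervised contrastive loss over token embeddings, in which tokens sharing an entity label act as positives that are pulled together while others are pushed apart. The function name, the temperature value, and the toy embeddings below are illustrative assumptions, not the paper's exact objective or implementation.

```python
import math

def sup_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss over token embeddings.

    Tokens with the same entity label are treated as positive pairs;
    all other tokens serve as negatives. This is a sketch of the general
    technique, not ContrastNER's exact formulation.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def norm(u):
        return math.sqrt(dot(u, u)) or 1.0  # guard against zero vectors

    # Temperature-scaled cosine similarity between every pair of tokens.
    sims = [[dot(u, v) / (norm(u) * norm(v) * temperature)
             for v in embeddings] for u in embeddings]

    n = len(embeddings)
    total, count = 0.0, 0
    for i in range(n):
        positives = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not positives:
            continue  # anchors without a positive contribute nothing
        denom = sum(math.exp(sims[i][j]) for j in range(n) if j != i)
        for j in positives:
            # Higher similarity to the positive (relative to all others)
            # yields a lower loss for this anchor-positive pair.
            total += -math.log(math.exp(sims[i][j]) / denom)
            count += 1
    return total / max(count, 1)
```

As a sanity check, embeddings where same-label tokens are close together (e.g. two "PER" tokens near each other and a "LOC" token far away) should yield a lower loss than embeddings where a same-label pair is far apart.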
Category

Academic chapter

Language

English

Author(s)

  • Amirhossein Layegh
  • Amir Hossein Payberah
  • Ahmet Soylu
  • Titi Roman
  • Mihhail Matskin

Affiliation

  • SINTEF Digital / Sustainable Communication Technologies
  • Royal Institute of Technology
  • OsloMet - Oslo Metropolitan University

Year

2023

Publisher

IEEE (Institute of Electrical and Electronics Engineers)

Book

2023 IEEE 47th Annual Computers, Software, and Applications Conference (COMPSAC)

ISBN

9798350326970

View this publication at Norwegian Research Information Repository