On The Cross-Modal Transfer from Natural Language to Code through Adapter Modules

Abstract

Pre-trained neural language models (PTLMs), such as CodeBERT, have recently been used in software engineering as models pre-trained on large source code corpora. Although adapters, owing to their plug-and-play nature and parameter efficiency, are known to facilitate adaptation to many downstream tasks compared to full fine-tuning, which requires retraining all of a model's parameters, their usage in software engineering has not been explored.
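For context, the adapter design the abstract alludes to is typically a small bottleneck network inserted into each transformer layer while the pre-trained weights stay frozen. The sketch below is illustrative rather than the paper's implementation: the hidden size of 768 matches CodeBERT-base, while the bottleneck width of 64 and the module name are assumed example choices.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Residual bottleneck module (Houlsby-style adapter):
    down-project, non-linearity, up-project, plus a skip connection."""

    def __init__(self, hidden_size: int = 768, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.activation = nn.GELU()
        self.up = nn.Linear(bottleneck_size, hidden_size)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual connection keeps the frozen backbone's representation
        # and adds a small learned correction on top of it.
        return hidden_states + self.up(self.activation(self.down(hidden_states)))

# Parameter efficiency: only the adapters are trained; the backbone is frozen.
adapter = BottleneckAdapter()
trainable = sum(p.numel() for p in adapter.parameters())
print(f"{trainable:,} trainable parameters per adapter")
# ~99k per adapter, versus roughly 125M parameters for full fine-tuning
# of a CodeBERT-sized model.
```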

Publication
30th International Conference on Program Comprehension 2022