Ahmed Elnaggar, Wei Ding, Llion Jones, Tom Gibbs, Tamas Feher, Christoph Angerer, Silvia Severini, Florian Matthes, Burkhard Rost
A growing number of mature natural language processing applications now make people's lives more convenient. Such applications are built from source code - the language of software engineering. However, applications that understand source code, and could thereby ease the software engineering process, remain under-researched. At the same time, the transformer model, especially in combination with transfer learning, has proven to be a powerful technique for natural language processing tasks. These breakthroughs point to a promising direction for processing source code and tackling software engineering tasks. This paper describes CodeTrans - an encoder-decoder transformer model for tasks in the software engineering domain - and explores the effectiveness of encoder-decoder transformer models on six software engineering tasks comprising thirteen sub-tasks. Moreover, we investigate the effect of different training strategies: single-task learning, transfer learning, multi-task learning, and multi-task learning with fine-tuning. CodeTrans outperforms the state-of-the-art models on all tasks. To expedite future work in the software engineering domain, we have published our pre-trained CodeTrans models: https://github.com/agemagician/CodeTrans
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Program Synthesis | AlgoLisp | Accuracy | 90.31 | CodeTrans-MT-TF-Small |
| Source Code Summarization | Summarizing Source Code using a Neural Attention Model - C# | Smoothed BLEU-4 | 23.57 | CodeTrans-MT-Large |
| Source Code Summarization | Summarizing Source Code using a Neural Attention Model - Python | Smoothed BLEU-4 | 13.37 | CodeTrans-MT-Base |
| Source Code Summarization | Summarizing Source Code using a Neural Attention Model - SQL | Smoothed BLEU-4 | 19.98 | CodeTrans-MT-TF-Large |
| Git Commit Message Generation | CommitGen | BLEU-4 | 44.41 | CodeTrans-TF-Large |
| API Sequence Recommendation | DeepAPI | BLEU-4 | 73.39 | CodeTrans-MT-TF-Large |
| Code Documentation Generation | CodeSearchNet - Python | Smoothed BLEU-4 | 20.39 | CodeTrans-MT-Base |
| Code Documentation Generation | CodeSearchNet - Go | Smoothed BLEU-4 | 19.54 | CodeTrans-TF-Large |
| Code Documentation Generation | CodeSearchNet - JavaScript | Smoothed BLEU-4 | 18.98 | CodeTrans-TF-Large |
| Code Documentation Generation | CodeSearchNet - PHP | Smoothed BLEU-4 | 26.23 | CodeTrans-MT-Base |
| Code Documentation Generation | CodeSearchNet - Java | Smoothed BLEU-4 | 21.87 | CodeTrans-MT-Large |
| Code Documentation Generation | CodeSearchNet - Ruby | Smoothed BLEU-4 | 15.26 | CodeTrans-MT-Base |
| Code Comment Generation | DeepCom | Smoothed BLEU-4 | 39.5 | CodeTrans-TF-Large |
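Most of the scores above are reported in smoothed BLEU-4, which averages 1- to 4-gram precisions with smoothing so that a missing higher-order n-gram match does not zero out the score. The sketch below is a simplified illustration of that idea (add-one smoothing on the n-gram precisions, in the spirit of Lin & Och 2004); it is not the paper's exact evaluation script, and the helper names are our own.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def smoothed_bleu4(reference, candidate):
    """Sentence-level BLEU-4 with add-one smoothed n-gram precisions.

    A simplified sketch, not the official evaluation code.
    """
    precisions = []
    for n in range(1, 5):
        ref_counts = Counter(ngrams(reference, n))
        cand_counts = Counter(ngrams(candidate, n))
        # clipped n-gram matches between candidate and reference
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = sum(cand_counts.values())
        # add-one smoothing keeps higher-order precisions non-zero
        precisions.append((overlap + 1) / (total + 1))
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / 4)
    # brevity penalty punishes candidates shorter than the reference
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return 100 * bp * geo_mean

reference = "returns the sum of two numbers".split()
print(smoothed_bleu4(reference, reference))            # identical strings score 100
print(smoothed_bleu4(reference, "returns the sum".split()))  # partial match scores lower
```

An identical candidate scores 100, while a shortened candidate is penalized both by missing n-grams and by the brevity penalty.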