Standout Papers

  1. CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation (2021)
    Yue Wang, Weishi Wang, et al. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)

Immediate Impact

Cited by 1 paper from Nobel laureates and 70 standout papers.

Citing Papers

A Comprehensive Overview of Large Language Models (2025, standout)
When LLMs meet cybersecurity: a systematic literature review (2025, standout)
Plus 7 intermediate papers.

Works of Weishi Wang being referenced

CodeT5: Identifier-aware Unified Pre-trained Encoder-Decoder Models for Code Understanding and Generation (2021, standout)

Author Peers

Author Last Decade Papers Cites
Weishi Wang 190 248 133 5 368
Bas Cornelissen 122 1,257 117 13 353
Amy Moormann Zaremski 243 260 81 5 328
Eric Rescorla 133 198 81 12 405
James R. Lyle 53 273 248 12 348
Cathrin Weiß 206 188 83 7 354
Song Wang 59 1,254 219 14 344
Spyros T. Halkidis 116 1,202 86 8 319
Brian Foote 204 3,223 90 6 392
Lauren Wiener 182 3,187 97 5 352
Yan Shoshitaishvili 180 181 120 14 383
