Standout Papers

  1. ERNIE 2.0: A Continual Pre-Training Framework for Language Understanding (2020)
    Yu Sun, Shuohuan Wang et al. Proceedings of the AAAI Conference on Artificial Intelligence

Immediate Impact

1 citing paper from Science/Nature; 71 standout citing papers

Citing Papers

  1. Security and Privacy Challenges of Large Language Models: A Survey (2025, Standout)
  2. A Survey on Intelligent Network Operations and Performance Optimization Based on Large Language Models (2025, Standout)
Plus 3 intermediate papers.

Works of Shuohuan Wang being referenced

  1. ERNIE-M: Enhanced Multilingual Representation by Aligning Cross-lingual Semantics with Monolingual Corpora (2021)
  2. ERNIE 2.0: A Continual Pre-Training Framework for Language Understanding (2020, Standout)

Author Peers

Author Last Decade Papers Cites
Shuohuan Wang 262 67 1 35 4 299
Victor Sanh 263 59 1 39 3 310
Canwen Xu 258 60 28 8 280
John Hewitt 201 50 5 33 7 303
Julien Plu 256 51 32 8 285
Minh-Thang Luong 269 69 4 44 11 327
Bryan Wilie 222 21 10 43 10 313
Yejin Bang 186 20 10 31 8 295
Holy Lovenia 180 19 10 32 9 268
Gregor Heinrich 184 47 99 4 304
Ganggao Zhu 257 23 3 44 5 310

