Standout Papers

  1. Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing (2021)
    裕二 池谷, Robert Tinn et al. arXiv (Cornell University)

Immediate Impact

1 by Nobel laureates · 3 from Science/Nature · 67 standout
Sub-graph 1 of 21

Citing Papers

  1. Recent progress and applications of Raman spectrum denoising algorithms in chemical and biological analyses: A review (2024, Standout)
  2. A deep learning interpretable model for river dissolved oxygen multi-step and interval prediction based on multi-source data fusion (2024, Standout)
2 intermediate papers

Referenced Works of Michael Lucas

  1. Domain-Specific Language Model Pretraining for Biomedical Natural Language Processing (2021, Standout)

Author Peers

Author Last Decade Papers Cites
Michael Lucas 7 16 365 570 122 13 793
Naoto Usuyama 2 6 399 615 138 11 869
裕二 池谷 3 8 432 672 145 13 942
Robert Tinn 1 6 383 598 132 4 812
Qiang Wei 4 3 341 389 75 29 889
Samuel G. Finlayson 1 28 177 403 192 21 966
Alexis Allot 1 480 230 42 22 926
Shang Gao 10 23 192 388 43 37 788
Yingce Xia 7 31 202 529 126 39 885
Etsuko Ishii 4 12 56 544 243 9 986
Zhicheng Cui 4 41 96 562 34 10 885


Rankless by CCL
2026