Standout Papers

  1. Parameter-efficient fine-tuning of large-scale pre-trained language models (2023)
    Ning Ding, Yujia Qin, et al., Nature Machine Intelligence
  2. KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation (2021)
    Xiaozhi Wang, Tianyu Gao, et al., Transactions of the Association for Computational Linguistics

Immediate Impact

1 by Nobel laureates, 1 from Science/Nature, 55 standout

Citing Papers

Security and Privacy Challenges of Large Language Models: A Survey (2025, Standout)
Mineral-based electromagnetic wave absorbers and shields: Latest progress and perspectives (2025, Standout)

Works of Juanzi Li being referenced

A Survey of Knowledge Enhanced Pre-Trained Language Models (2023)
Enhanced mechanical and electromagnetic properties of polymer composite with 2.5D novel carbon/quartz fiber core-spun yarn woven fabric (2019)

Author Peers

Author Last Decade Papers Cites
Juanzi Li 1784 674 374 125 2.9k
Li Xiong 1649 603 148 145 3.7k
Zhibo Wang 1561 695 126 194 4.3k
Weinan Zhang 2008 643 221 143 3.9k
Ngoc Thanh Nguyên 1207 656 164 255 2.4k
Huanhuan Chen 1294 239 68 193 3.3k
Jundong Li 1952 703 267 120 3.7k
Yang Liu 1456 397 150 73 2.3k
Claudio Gutiérrez 1364 501 62 98 3.8k
Qiang Qu 868 688 194 166 3.2k
Zhili Zhou 898 479 240 188 4.1k

Rankless by CCL
2026