Ruidan He

1.8k total citations
14 papers, 874 citations indexed

About

Ruidan He is a scholar working on Artificial Intelligence, Information Systems and Computer Vision and Pattern Recognition. According to data from OpenAlex, Ruidan He has authored 14 papers receiving a total of 874 indexed citations (citations from papers that are themselves indexed), including 14 papers in Artificial Intelligence, 1 paper in Information Systems and 1 paper in Computer Vision and Pattern Recognition. Recurrent topics in Ruidan He's work include Topic Modeling (14 papers), Natural Language Processing Techniques (8 papers) and Sentiment Analysis and Opinion Mining (7 papers). Ruidan He is often cited by papers focused on Topic Modeling (14 papers), Natural Language Processing Techniques (8 papers) and Sentiment Analysis and Opinion Mining (7 papers). Ruidan He collaborates with scholars based in Singapore, the Cayman Islands and China. Ruidan He's co-authors include Hwee Tou Ng, Daniel Dahlmeier, Wee Sun Lee, Lidong Bing, Qingyu Tan, Luo Si, Zuozhu Liu, Yan Zhang and Kwan Hui Lim. Ruidan He has published in venues such as Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing and International Conference on Computational Linguistics.

In The Last Decade

14 papers receiving 841 citations

Peers

Peers ranked by citation overlap. Stage columns give citations by career stage (early → late); ratios (×) are relative to Ruidan He, the reference author.

Name | Country | h | Career cites | Stage 1 | Stage 2 | Stage 3 | Stage 4 | Papers | Cites
Ruidan He | Singapore | 13 | 814 | 91 | 78 | 39 | 34 | 14 | 874
Marianna Apidianaki | France | 11 | 1.1k (1.3×) | 105 (1.2×) | 55 (0.7×) | 60 (1.5×) | 24 (0.7×) | 42 | 1.1k
Shuohuan Wang | China | 7 | 527 (0.6×) | 75 (0.8×) | 131 (1.7×) | 26 (0.7×) | 21 (0.6×) | 15 | 605
Ziqiang Cao | China | 13 | 887 (1.1×) | 85 (0.9×) | 148 (1.9×) | 17 (0.4×) | 34 (1.0×) | 38 | 1.0k
Mandy Guo | United States | 10 | 486 (0.6×) | 57 (0.6×) | 125 (1.6×) | 28 (0.7×) | 12 (0.4×) | 11 | 566
Anastasia Shimorina | France | 6 | 396 (0.5×) | 50 (0.5×) | 67 (0.9×) | 16 (0.4×) | 27 (0.8×) | 11 | 478
Vítor Mangaravite | Brazil | 5 | 382 (0.5×) | 130 (1.4×) | 38 (0.5×) | 29 (0.7×) | 22 (0.6×) | 10 | 458
Marina Litvak | Israel | 11 | 418 (0.5×) | 97 (1.1×) | 32 (0.4×) | 39 (1.0×) | 22 (0.6×) | 52 | 493
Mihir Kale | United States | 7 | 942 (1.2×) | 85 (0.9×) | 220 (2.8×) | 14 (0.4×) | 34 (1.0×) | 9 | 1.0k
Yuanhe Tian | China | 16 | 702 (0.9×) | 51 (0.6×) | 76 (1.0×) | 15 (0.4×) | 25 (0.7×) | 37 | 742
Rodrigo Agerri | Spain | 14 | 462 (0.6×) | 58 (0.6×) | 26 (0.3×) | 37 (0.9×) | 39 (1.1×) | 47 | 536

Countries citing papers authored by Ruidan He


This map shows the geographic impact of Ruidan He's research: the number of citations coming from papers published by authors working in each country. You can also color the map by specialization and compare the citations Ruidan He receives from each country with the number expected given that country's size and research output (values larger than one mean the country cites Ruidan He more than expected).
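The "more than expected" comparison described above is an observed-over-expected ratio. A minimal sketch of that arithmetic, with illustrative numbers (the function name, the 5% output share, and the citation counts are assumptions, not Rankless's actual data or method):

```python
def citation_ratio(observed, country_share, author_total):
    """Ratio of a country's observed citations to the count expected
    if citations were spread in proportion to that country's share
    of world research output."""
    expected = country_share * author_total
    return observed / expected

# Toy example: a country producing 5% of world output contributes 60 of
# the author's 874 indexed citations; expected would be 0.05 * 874 = 43.7.
ratio = citation_ratio(observed=60, country_share=0.05, author_total=874)
print(round(ratio, 2))  # prints 1.37: above 1.0, i.e. more than expected
```

A ratio below 1.0 would read the other way: the country cites the author less than its research volume alone would predict.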

Fields of papers citing papers by Ruidan He


This network shows the impact of papers produced by Ruidan He. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by Ruidan He. The network helps show where Ruidan He may publish in the future.

Co-authorship network of co-authors of Ruidan He

This figure shows the co-authorship network connecting the top 25 collaborators of Ruidan He. A scholar is included among the top collaborators of Ruidan He based on the total number of citations received by their joint publications. Widths of edges represent the number of papers authors have co-authored together. Node borders signify the number of papers an author published with Ruidan He. Ruidan He is excluded from the visualization to improve readability, since they are connected to all nodes in the network.
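The edge weights described above (papers co-authored together) can be sketched by counting author pairs per paper. This is a toy reconstruction under stated assumptions: the three-paper list is illustrative sample data, and this is not Rankless's actual pipeline:

```python
from collections import Counter
from itertools import combinations

# Illustrative author lists; real data would come from the works list below.
papers = [
    ["Ruidan He", "Wee Sun Lee", "Hwee Tou Ng", "Daniel Dahlmeier"],
    ["Ruidan He", "Wee Sun Lee", "Hwee Tou Ng", "Daniel Dahlmeier"],
    ["Yan Zhang", "Ruidan He", "Zuozhu Liu", "Lidong Bing"],
]

edges = Counter()
for authors in papers:
    # every unordered pair of co-authors on a paper adds one to that edge
    for a, b in combinations(sorted(authors), 2):
        edges[(a, b)] += 1

# Drop the profiled scholar, as the figure does, since they touch every node.
filtered = {e: w for e, w in edges.items() if "Ruidan He" not in e}
print(filtered[("Hwee Tou Ng", "Wee Sun Lee")])  # prints 2 (two joint papers)
```

Edge width in the figure would then be drawn proportional to these counts.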

All Works

14 of 14 papers shown
1.
Liu, Linlin, Xin Li, Ruidan He, et al. (2022). Enhancing Multilingual Language Model with Massive Multilingual Knowledge Triples. 6878–6890. 10 indexed citations
2.
Tan, Qingyu, Ruidan He, Lidong Bing, & Hwee Tou Ng. (2022). Document-Level Relation Extraction with Adaptive Focal Loss and Knowledge Distillation. Findings of the Association for Computational Linguistics: ACL 2022. 1672–1681. 54 indexed citations
3.
Cheng, Liying, Lidong Bing, Ruidan He, et al. (2022). IAM: A Comprehensive and Large-Scale Dataset for Integrated Argument Mining Tasks. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2277–2287. 13 indexed citations
4.
Li, Xin, Ruidan He, Lidong Bing, et al. (2022). MELM: Data Augmentation with Masked Entity Language Modeling for Low-Resource NER. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2251–2262. 46 indexed citations
5.
He, Ruidan, Linlin Liu, Hai Ye, et al. (2021). On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation. 2208–2222. 73 indexed citations
6.
Zhang, Yan, Ruidan He, Zuozhu Liu, Lidong Bing, & Haizhou Li. (2021). Bootstrapped Unsupervised Sentence Representation Learning. 5168–5180. 19 indexed citations
7.
Zhang, Wenxuan, Ruidan He, Haiyun Peng, Lidong Bing, & Wai Lam. (2021). Cross-lingual Aspect-based Sentiment Analysis with Aspect Term Code-Switching. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing. 9220–9230. 26 indexed citations
8.
Zhang, Yan, Ruidan He, Zuozhu Liu, Kwan Hui Lim, & Lidong Bing. (2020). An Unsupervised Sentence Embedding Method by Mutual Information Maximization. 1601–1610. 75 indexed citations
9.
Ye, Hai, Qingyu Tan, Ruidan He, et al. (2020). Feature Adaptation of Pre-Trained Language Models across Languages and Domains with Robust Self-Training. 7386–7399. 27 indexed citations
10.
Li, Juntao, Ruidan He, Hai Ye, et al. (2020). Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model. 3672–3678. 14 indexed citations
11.
He, Ruidan, Wee Sun Lee, Hwee Tou Ng, & Daniel Dahlmeier. (2018). Effective Attention Modeling for Aspect-Level Sentiment Classification. International Conference on Computational Linguistics. 1121–1131. 94 indexed citations
12.
He, Ruidan, Wee Sun Lee, Hwee Tou Ng, & Daniel Dahlmeier. (2018). Exploiting Document Knowledge for Aspect-level Sentiment Classification. 579–585. 144 indexed citations
13.
He, Ruidan, Wee Sun Lee, Hwee Tou Ng, & Daniel Dahlmeier. (2018). Adaptive Semi-supervised Learning for Cross-domain Sentiment Classification. 3467–3476. 43 indexed citations
14.
He, Ruidan, Wee Sun Lee, Hwee Tou Ng, & Daniel Dahlmeier. (2017). An Unsupervised Neural Attention Model for Aspect Extraction. 388–397. 236 indexed citations

Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive bibliographic database. While OpenAlex provides broad and valuable coverage of the global research landscape, it—like all bibliographic datasets—has inherent limitations. These include incomplete records, variations in author disambiguation, differences in journal indexing, and delays in data updates. As a result, some metrics and network relationships displayed in Rankless may not fully capture the entirety of a scholar's output or impact.


Rankless by CCL
2026