Hit papers significantly outperform the citation benchmark for their cohort. A paper qualifies if it meets at least one of the following criteria: it has ≥500 total citations; it achieves ≥1.5× the top-1% citation threshold for papers in the same subfield and year (the threshold is the minimum citation count needed to enter the top 1%, not the average within it); or it reaches the top citation threshold in at least one of its specific research topics.
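The qualification rule above can be sketched as a small predicate. This is an illustrative sketch only; the function name, inputs, and threshold representation are assumptions, not part of Rankless itself.

```python
def is_hit_paper(total_citations: int,
                 top1_threshold: float,
                 topic_thresholds: list[float],
                 topic_citations: list[int]) -> bool:
    """Sketch of the hit-paper rule described above (illustrative only).

    A paper qualifies if ANY of the following hold:
      1. it has at least 500 total citations;
      2. its citations reach 1.5x the minimum count needed to enter
         the top 1% for its subfield and year;
      3. it reaches the top citation threshold in at least one of
         its research topics.
    """
    if total_citations >= 500:
        return True
    if total_citations >= 1.5 * top1_threshold:
        return True
    # topic_citations[i] is the paper's citation count counted toward
    # topic i; topic_thresholds[i] is that topic's top threshold.
    return any(c >= t for c, t in zip(topic_citations, topic_thresholds))
```

Because the criteria are alternatives, a paper with only 45 topic-level citations can still qualify if that count clears its topic's threshold.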
Learning Deep Features for Discriminative Localization
2016 · 6.5k citations · Bolei Zhou, Aditya Khosla et al.

Places: A 10 Million Image Database for Scene Recognition
2017 · 2.3k citations · Bolei Zhou, Àgata Lapedriza et al. · IEEE Transactions on Pattern Analysis and Machine Intelligence

Scene Parsing through ADE20K Dataset
2017 · 1.8k citations · Bolei Zhou, Antonio Torralba et al. · DSpace@MIT (Massachusetts Institute of Technology)
This map shows the geographic impact of Bolei Zhou's research: the number of citations coming from papers published by authors working in each country. You can also color the map by specialization, or compare the number of citations Bolei Zhou receives from each country with the number expected given that country's size and research output (values greater than one mean the country cites Bolei Zhou more than expected).
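The comparison described above amounts to an observed-over-expected ratio per country. A minimal sketch follows, assuming the expected count is proportional to the country's share of global research output; Rankless's exact normalization is not specified on this page, so both the function and the baseline are assumptions.

```python
def country_citation_ratio(citations_from_country: int,
                           total_citations: int,
                           country_output_share: float) -> float:
    """Observed / expected citation ratio for one country (illustrative).

    Assumes expected citations = country's share of global research
    output * the scholar's total citations -- one plausible baseline.
    Values above 1.0 mean the country cites the scholar more than
    expected; values below 1.0 mean less than expected.
    """
    expected = country_output_share * total_citations
    if expected <= 0:
        raise ValueError("expected citation count must be positive")
    return citations_from_country / expected
```

For example, a country producing 2% of world output that contributes 50 of a scholar's 1,000 citations would score 50 / 20 = 2.5, i.e. it cites the scholar 2.5× more than expected under this baseline.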
This network shows the impact of papers produced by Bolei Zhou. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes mark fields that tend to cite Bolei Zhou's papers; the network suggests where Bolei Zhou may publish in the future.
Co-authorship network of Bolei Zhou's co-authors
This figure shows the co-authorship network connecting the top 25 collaborators of Bolei Zhou. A scholar is included among the top collaborators based on the total number of citations received by their joint publications with Bolei Zhou. Edge widths represent the number of papers two authors have co-authored together, and node borders signify the number of papers an author published with Bolei Zhou. Bolei Zhou is excluded from the visualization to improve readability, since they would be connected to every node in the network.
Shen, Yujun, Jinjin Gu, Xiaoou Tang, & Bolei Zhou. (2020). Interpreting the Latent Space of GANs for Semantic Face Editing. 9240–9249. 562 indexed citations.
10. Yang, Ceyuan, Yinghao Xu, Jianping Shi, Bo Dai, & Bolei Zhou. (2020). Temporal Pyramid Network for Action Recognition. The HKU Scholars Hub (University of Hong Kong). 588–597. 279 indexed citations.
11. Zhu, Jiapeng, Deli Zhao, Bolei Zhou, & Bo Zhang. (2019). LIA: Latently Invertible Autoencoder with Adversarial Learning. arXiv (Cornell University). 5 indexed citations.
12. Bau, David, Jun-Yan Zhu, Hendrik Strobelt, et al. (2019). Visualizing and Understanding Generative Adversarial Networks (Extended Abstract). arXiv (Cornell University). 2 indexed citations.
13. Huang, Chaoqin, et al. (2019). DrivingStereo: A Large-Scale Dataset for Stereo Matching in Autonomous Driving Scenarios. IEEE Conference Proceedings. 899–908. 2 indexed citations.
Zhou, Bolei, Aditya Khosla, Àgata Lapedriza, Aude Oliva, & Antonio Torralba. (2016). Learning Deep Features for Discriminative Localization. 2921–2929. 6518 indexed citations.
18. Zhou, Bolei, Aditya Khosla, Àgata Lapedriza, Aude Oliva, & Antonio Torralba. (2015). Object Detectors Emerge in Deep Scene CNNs. DSpace@MIT (Massachusetts Institute of Technology). 275 indexed citations.
19. Zhou, Bolei. (2014). Research on Recognition EW and Cyberspace Operation Based on "OODA loop" Theory. 1 indexed citation.
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it—like all bibliographic datasets—has inherent limitations. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.