Hit papers significantly outperform the citation benchmark for their cohort. A paper qualifies
if it has ≥500 total citations, achieves ≥1.5× the top-1% citation threshold for papers in the
same subfield and year (this is the minimum needed to enter the top 1%, not the average
within it), or reaches the top citation threshold in at least one of its specific research
topics.
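The three qualifying criteria above amount to a simple boolean check. A minimal sketch follows; the function name, arguments, and example numbers are illustrative, not Rankless's actual implementation.

```python
def is_hit_paper(citations, top1_entry_threshold, topic_thresholds):
    """Illustrative check of the three 'hit paper' criteria.

    citations            -- the paper's total citation count
    top1_entry_threshold -- minimum citations needed to enter the top 1%
                            for the paper's subfield and year (hypothetical input)
    topic_thresholds     -- top citation thresholds for each of the paper's
                            specific research topics (hypothetical input)
    """
    # Criterion 1: at least 500 total citations.
    if citations >= 500:
        return True
    # Criterion 2: at least 1.5x the minimum needed to enter the top 1%
    # (the entry threshold, not the average within the top 1%).
    if citations >= 1.5 * top1_entry_threshold:
        return True
    # Criterion 3: reaches the top threshold in at least one topic.
    return any(citations >= t for t in topic_thresholds)
```

For example, a paper with 300 citations in a cohort whose top-1% entry threshold is 180 qualifies under the second criterion, since 300 ≥ 1.5 × 180 = 270.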
UC-Net: Uncertainty Inspired RGB-D Saliency Detection via Conditional Variational Autoencoders (2020). Saeed Anwar, Nick Barnes et al. ANU Open Research (Australian National University). 272 citations
Simultaneously Localize, Segment and Rank the Camouflaged Objects
This map shows the geographic impact of Nick Barnes's research: the number of citations from papers whose authors work in each country. You can also color the map by specialization, or compare the number of citations Nick Barnes receives from each country with the number expected from that country's size and research output (a ratio larger than one means the country cites Nick Barnes more than expected).
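The expected-citation comparison can be sketched as a ratio. The assumption below, that expected citations scale with a country's share of world research output, and all of the names are illustrative, not the site's actual methodology.

```python
def country_citation_ratio(citations_from_country,
                           scholar_total_citations,
                           country_share_of_world_output):
    """Ratio of observed to expected citations from one country.

    Expected citations are modeled here as the scholar's total citations
    multiplied by the country's share of world research output, which is
    a simplifying assumption for illustration only.
    """
    expected = scholar_total_citations * country_share_of_world_output
    return citations_from_country / expected
```

For example, 50 citations from a country producing 10% of world output, out of 400 total citations, gives 50 / 40 = 1.25: that country cites the scholar more than expected.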
This network shows the impact of papers produced by Nick Barnes. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by Nick Barnes. The network helps show where Nick Barnes may publish in the future.
Co-authorship network of co-authors of Nick Barnes
This figure shows the co-authorship network connecting the top 25 collaborators of Nick Barnes. A scholar is included among the top collaborators based on the total number of citations received by their joint publications with Nick Barnes. Edge widths represent the number of papers two authors have co-authored together, and node borders indicate the number of papers an author has published with Nick Barnes. Nick Barnes is excluded from the visualization to improve readability, since they are connected to all nodes in the network.
Naveed, Humza, Asad Ullah Khan, Shi Qiu, et al. (2025). A Comprehensive Overview of Large Language Models. ACM Transactions on Intelligent Systems and Technology. 16(5). 1–72. 72 indexed citations
Wang, Jianyuan, Kaihao Zhang, Nick Barnes, et al. (2023). Vicinity Vision Transformer. IEEE Transactions on Pattern Analysis and Machine Intelligence. 45(10). 12635–12649. 25 indexed citations
6. Qiu, Shi, Saeed Anwar, & Nick Barnes. (2021). Geometric Back-Projection Network for Point Cloud Classification. IEEE Transactions on Multimedia. 24. 1943–1955. 169 indexed citations
7. Qiu, Shi, Saeed Anwar, & Nick Barnes. (2021). PnP-3D: A Plug-and-Play for 3D Point Clouds. IEEE Transactions on Pattern Analysis and Machine Intelligence. 45(1). 1312–1319. 26 indexed citations
8. Scheerlinck, Cedric, Davide Scaramuzza, Tom Drummond, et al. (2020). How to Train Your Event Camera Neural Network. arXiv (Cornell University). 5 indexed citations
9. Ayton, Lauren N., Nick Barnes, Gislin Dagnelie, et al. (2019). An update on retinal prostheses. Clinical Neurophysiology. 131(6). 1383–1398. 124 indexed citations
10. Allen, Penelope J., David A. X. Nayagam, Stephanie B. Epp, et al. (2019). A 44 channel suprachoroidal retinal prosthesis: surgical approach, safety and stability. Investigative Ophthalmology & Visual Science. 60(9). 4983–4983. 1 indexed citation
11. Wang, Tao, Xuming He, & Nick Barnes. (2015). Glass object localization by joint inference of boundary and depth. ANU Open Research (Australian National University). 13 indexed citations
12. Ayton, Lauren N., Fleur O’Hare, Chris McCarthy, et al. (2015). A prototype suprachoroidal retinal prosthesis enables improvement in a tabletop object detection task. Investigative Ophthalmology & Visual Science. 56(7). 4782–4782. 1 indexed citation
13. Barnes, Nick, et al. (2015). Tactile acuity determined with vibration motors for use in a sensory substitution device for the blind. Investigative Ophthalmology & Visual Science. 56(7). 2920–2920. 2 indexed citations
Barnes, Nick, et al. (2011). Mobility Experiments With Simulated Vision and Sensory Substitution of Depth. Investigative Ophthalmology & Visual Science. 52(14). 4945–4945. 7 indexed citations
16. Barnes, Nick, et al. (2006). Insect Inspired Robots. Swinburne Research Bank (Swinburne University of Technology). 5 indexed citations
17. Walker, Janine, Nick Barnes, & Kaarin J. Anstey. (2006). Sign Detection and Driving Competency for Older Drivers with Impaired Vision. ANU Open Research (Australian National University). 3 indexed citations
18. Barnes, Nick, et al. (2004). Active Vision - Rectification and Depth Mapping. ANU Open Research (Australian National University). 8 indexed citations
19. Barnes, Nick, et al. (2004). Regular Polygon Detection as an Interest Point Operator for SLAM. ANU Open Research (Australian National University). 4 indexed citations
20. McCarthy, Chris & Nick Barnes. (2003). Performance of temporal filters for optical flow estimation in mobile robot corridor centring and visual odometry. Swinburne Research Bank (Swinburne University of Technology). 5 indexed citations
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it shares the inherent limitations of all bibliographic datasets. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.