Tim Moon

409 total citations
9 papers, 224 citations indexed

About

Tim Moon is a scholar working on Artificial Intelligence, Computer Vision and Pattern Recognition, and Molecular Biology. According to data from OpenAlex, Tim Moon has authored 9 papers receiving a total of 224 indexed citations (citations from other indexed papers that have themselves been cited), including 7 papers in Artificial Intelligence, 4 in Computer Vision and Pattern Recognition, and 1 in Molecular Biology. Recurrent topics in Tim Moon's work include Advanced Neural Network Applications (4 papers), Stochastic Gradient Optimization Techniques (3 papers), and Topic Modeling (2 papers), and Tim Moon is most often cited by papers on those same topics. Tim Moon collaborates with scholars based in the United States and Jamaica. Co-authors include Brian Van Essen, Nikoli Dryden, Sam Adé Jacobs, Marc Snir, Naoya Maruyama, Andy Yoo, Ian Karlin, Kevin McLoughlin, Derek Jones, and David Hysom, and Tim Moon has published in venues such as The International Journal of High Performance Computing Applications, OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information), and IEEE International Conference on High Performance Computing, Data, and Analytics.

In The Last Decade

9 papers receiving 216 citations

Peers — by citation overlap

The Career column shows each scholar's citations by career stage (early → late); the ×ratios give each value relative to Tim Moon's count for the same stage.

Name | Country | h | Career citations (early → late) | Papers | Cites
Tim Moon | United States | 6 | 164 · 88 · 51 · 39 · 26 | 9 | 224
Coleman Hooper | United States | 6 | 92 (0.6×) · 52 (0.6×) · 41 (0.8×) · 92 (2.4×) · 3 (0.1×) | 11 | 215
Chun‐Chen Tu | United States | 4 | 188 (1.1×) · 51 (0.6×) · 10 (0.2×) · 21 (0.5×) · 14 (0.5×) | 5 | 205
Aurko Roy | United States | 5 | 222 (1.4×) · 85 (1.0×) · 15 (0.3×) · 41 (1.1×) · 5 (0.2×) | 7 | 267
Jin Kyu Kim | United States | 6 | 103 (0.6×) · 75 (0.9×) · 133 (2.6×) · 31 (0.8×) · 4 (0.2×) | 7 | 227
Warit Sirichotedumrong | Japan | 5 | 117 (0.7×) · 264 (3.0×) · 11 (0.2×) · 13 (0.3×) · 15 (0.6×) | 7 | 304
Rawad Bitar | Germany | 8 | 159 (1.0×) · 26 (0.3×) · 104 (2.0×) · 32 (0.8×) · 14 (0.5×) | 41 | 222
Farzin Haddadpour | United States | 7 | 76 (0.5×) · 13 (0.1×) · 62 (1.2×) · 36 (0.9×) · 14 (0.5×) | 7 | 115
Sharan Narang | United States | 7 | 192 (1.2×) · 104 (1.2×) · 10 (0.2×) · 15 (0.4×) · 2 (0.1×) | 7 | 269
Vadim Sheinin | United States | 7 | 95 (0.6×) · 63 (0.7×) · 27 (0.5×) · 24 (0.6×) · 3 (0.1×) | 28 | 174
William S. Moses | United States | 8 | 43 (0.3×) · 23 (0.3×) · 89 (1.7×) · 14 (0.4×) · 7 (0.3×) | 17 | 192
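The ×ratios above appear to be each peer's stage citation count divided by Tim Moon's count for the same career stage, rounded to one decimal. A minimal sketch of that arithmetic (illustrative only, not Rankless's actual code), using the values from the table:

```python
# Career-stage citation ratios relative to a reference scholar.
# Stage values are taken from the table above (early -> late).
moon_stages = [164, 88, 51, 39, 26]     # Tim Moon
hooper_stages = [92, 52, 41, 92, 3]     # Coleman Hooper

# Divide stage by stage and round to one decimal, matching the "x" labels.
ratios = [round(peer / ref, 1) for peer, ref in zip(hooper_stages, moon_stages)]
print(ratios)  # [0.6, 0.6, 0.8, 2.4, 0.1]
```

The printed values match the ×multipliers shown in Coleman Hooper's row, which supports this reading of the table.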

Countries citing papers authored by Tim Moon


This map shows the geographic impact of Tim Moon's research: the number of citations coming from papers published by authors working in each country. You can also color the map by specialization and compare the citations Tim Moon receives from each country with the number expected from that country's size and research output (values greater than one mean the country cites Tim Moon more than expected).
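The specialization value described above is an observed/expected ratio. A minimal sketch of the idea (the expected counts here are hypothetical; Rankless's actual model is not specified):

```python
# Observed/expected citation ratio per country.
# A value > 1 means the country cites the scholar more than its
# research output alone would predict.
def citation_ratio(observed: int, expected: float) -> float:
    return observed / expected

# Hypothetical example: 30 observed citations vs. 12 expected.
print(citation_ratio(30, 12))  # 2.5
```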

Fields of papers citing papers by Tim Moon

Fields are grouped into four domains: Physical Sciences, Health Sciences, Life Sciences, and Social Sciences.

This network shows the impact of papers produced by Tim Moon. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by Tim Moon. The network helps show where Tim Moon may publish in the future.

Co-authorship network of co-authors of Tim Moon

This figure shows the co-authorship network connecting the top 25 collaborators of Tim Moon. A scholar is included among the top collaborators of Tim Moon based on the total number of citations received by their joint publications. Widths of edges represent the number of papers authors have co-authored together. Node borders signify the number of papers an author published with Tim Moon. Tim Moon is excluded from the visualization to improve readability, since they are connected to all nodes in the network.

All Works

9 of 9 papers shown
1. Moon, Tim, et al. (2022). Parallelizing Graph Neural Networks via Matrix Compaction for Edge-Conditioned Networks. 30. 386–395. 1 indexed citation
2. Moon, Tim, et al. (2021). SUPER: SUb-Graph Parallelism for TransformERs. 7 indexed citations
3. Jacobs, Sam Adé, Tim Moon, Kevin McLoughlin, et al. (2021). Enabling rapid COVID-19 small molecule drug design through scalable deep learning of generative models. The International Journal of High Performance Computing Applications. 35(5). 469–482. 21 indexed citations
4. Dryden, Nikoli, et al. (2019). Channel and filter parallelism for large-scale CNN training. 1–20. 21 indexed citations
5. Dryden, Nikoli, Naoya Maruyama, Tim Moon, et al. (2018). Aluminum: An Asynchronous, GPU-Aware Communication Library Optimized for Large-Scale Training of Deep Neural Networks on HPC Systems. 1–13. 32 indexed citations
6. Dobrev, Veselin, Jack Dongarra, Jed Brown, et al. (2017). CEED ECP Milestone Report: Identify initial kernels, bake-off problems (benchmarks) and miniapps. Zenodo (CERN European Organization for Nuclear Research). 3 indexed citations
7. Dryden, Nikoli, Sam Adé Jacobs, Tim Moon, & Brian Van Essen. (2016). Communication quantization for data-parallel training of deep neural networks. IEEE International Conference on High Performance Computing, Data, and Analytics. 1–8. 52 indexed citations
8. Dryden, Nikoli, Tim Moon, Sam Adé Jacobs, & Brian Van Essen. (2016). Communication Quantization for Data-Parallel Training of Deep Neural Networks. OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information). 1–8. 86 indexed citations
9. Moon, Tim, et al. (2014). Predicting the Diagnosis of Type 2 Diabetes Using Electronic Medical Records. 1 indexed citation

Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive bibliographic database. While OpenAlex provides broad and valuable coverage of the global research landscape, it—like all bibliographic datasets—has inherent limitations. These include incomplete records, variations in author disambiguation, differences in journal indexing, and delays in data updates. As a result, some metrics and network relationships displayed in Rankless may not fully capture the entirety of a scholar's output or impact.

Explore authors with similar magnitude of impact

Rankless by CCL
2026