This map shows the geographic impact of Greg Yang's research: the number of citations received from papers whose authors work in each country. You can also color the map by specialization, or compare the number of citations Greg Yang receives from each country with the number expected given that country's size and research output (values greater than one mean the country cites Greg Yang more often than expected).
This network shows the impact of papers produced by Greg Yang. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes mark fields that tend to cite Greg Yang's papers. The network also suggests where Greg Yang may publish in the future.
Co-authorship network of co-authors of Greg Yang
This figure shows the co-authorship network connecting the top 25 collaborators of Greg Yang. A scholar is included among the top collaborators based on the total number of citations received by their joint publications with Greg Yang. Edge widths represent the number of papers two authors have co-authored together, and node borders signify the number of papers an author published with Greg Yang. Greg Yang is excluded from the visualization to improve readability, since they are connected to all nodes in the network.
All Works
14 of 14 papers shown
1. Yang, Greg, et al. (2021). Tensor Programs IIb: Architectural Universality of Neural Tangent Kernel Training Dynamics. International Conference on Machine Learning. 11762–11772. 5 indexed citations.
Yang, Greg & J. Edward Hu. (2021). Tensor Programs IV: Feature Learning in Infinite-Width Neural Networks. 11727–11737. 13 indexed citations.
4. Salman, Hadi, Mingjie Sun, Greg Yang, Ashish Kapoor, & J. Zico Kolter. (2020). Denoised Smoothing: A Provable Defense for Pretrained Classifiers. Neural Information Processing Systems. 33. 21945–21957. 2 indexed citations.
5. Yang, Greg, et al. (2020). Randomized Smoothing of All Shapes and Sizes. 1. 10693–10705. 15 indexed citations.
6. Yang, Greg, et al. (2019). A Mean Field Theory of Batch Normalization. International Conference on Learning Representations. 13 indexed citations.
7. Yang, Greg. (2019). Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes. Neural Information Processing Systems. 32. 9947–9960. 15 indexed citations.
8. Salman, Hadi, Greg Yang, Huan Zhang, Cho‐Jui Hsieh, & Pengchuan Zhang. (2019). A Convex Relaxation Barrier to Tight Robustness Verification of Neural Networks. Neural Information Processing Systems. 32. 9832–9842. 16 indexed citations.
9. Yang, Greg. (2019). Tensor Programs I: Wide Feedforward or Recurrent Neural Networks of Any Architecture are Gaussian Processes. Neural Information Processing Systems. 1 indexed citation.
10. Salman, Hadi, Ilya Razenshteyn, Pengchuan Zhang, et al. (2019). Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers. arXiv (Cornell University). 32. 11289–11300. 29 indexed citations.
11. Gilboa, Dar, Bo Chang, Minmin Chen, et al. (2019). The Dynamics of Signal Propagation in Gated Recurrent Neural Networks. 1 indexed citation.
12. Yang, Greg, et al. (2018). Deep Mean Field Theory: Layerwise Variance and Width Variation as Methods to Control Gradient Explosion. International Conference on Learning Representations. 1 indexed citation.
Yang, Greg & Samuel S. Schoenholz. (2017). Mean Field Residual Networks: On the Edge of Chaos. Neural Information Processing Systems. 30. 7103–7114. 6 indexed citations.
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it—like all bibliographic datasets—has inherent limitations. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.