This map shows the geographic impact of Dan Alistarh's research: each country is shaded by the number of citations from papers published by authors based in that country. You can also color the map by specialization, or compare the number of citations Dan Alistarh received with the number expected given a country's size and research output (a value greater than one means the country cites Dan Alistarh more than expected).
This network shows the impact of papers produced by Dan Alistarh. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by Dan Alistarh. The network helps show where Dan Alistarh may publish in the future.
Co-authorship network of co-authors of Dan Alistarh
This figure shows the co-authorship network connecting the top 25 collaborators of Dan Alistarh. A scholar is counted among the top collaborators based on the total number of citations received by their joint publications. Edge widths represent the number of papers two authors have co-authored together, and node borders indicate the number of papers an author has published with Dan Alistarh. Dan Alistarh is excluded from the visualization to improve readability, since that node would be connected to all others in the network.
Peşte, Alexandra, et al. (2022). How Well Do Sparse ImageNet Models Transfer? 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 12256–12266. 11 indexed citations.
Faghri, Fartash, et al. (2020). Adaptive Gradient Quantization for Data-Parallel SGD. Neural Information Processing Systems, 33, 3174–3185. 3 indexed citations.
10. Alistarh, Dan, et al. (2020). WoodFisher: Efficient second-order approximations for model compression. arXiv (Cornell University). 4 indexed citations.
11. Gürel, Nezihe Merve, et al. (2018). Compressive Sensing with Low Precision Data Representation: Radio Astronomy and Beyond. arXiv (Cornell University). 1 indexed citation.
12. Alistarh, Dan, Zeyuan Allen-Zhu, & Jerry Li. (2018). Byzantine Stochastic Gradient Descent. arXiv (Cornell University), 31, 4613–4623. 38 indexed citations.
13. Gürel, Nezihe Merve, et al. (2018). Compressive Sensing with Low Precision Data Representation: Theory and Applications. arXiv (Cornell University). 1 indexed citation.
14. Zhang, Hantian, Jerry Li, Kaan Kara, et al. (2017). ZipML: Training Linear Models with End-to-End Low Precision, and a Little Bit of Deep Learning. International Conference on Machine Learning, 70, 4035–4043. 52 indexed citations.
15. Alistarh, Dan, et al. (2017). Communication-Efficient Stochastic Gradient Descent, with Applications to Neural Networks. London School of Economics and Political Science Research Online, 1669–1680. 4 indexed citations.
16. Alistarh, Dan, et al. (2016). QSGD: Communication-Optimal Stochastic Gradient Descent, with Applications to Training Neural Networks. arXiv (Cornell University). 1 indexed citation.
17. Alistarh, Dan, Jerry Li, Ryota Tomioka, & Milan Vojnović. (2016). QSGD: Randomized Quantization for Communication-Optimal Stochastic Gradient Descent. arXiv (Cornell University). 35 indexed citations.
Alistarh, Dan, et al. (2015). Streaming Min-max hypergraph partitioning. London School of Economics and Political Science Research Online, 28, 1900–1908. 6 indexed citations.
20. Alistarh, Dan. (2015). The Renaming Problem: Recent Developments and Open Questions. Bulletin of the European Association for Theoretical Computer Science, 3(117). 1 indexed citation.
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it, like all bibliographic datasets, has inherent limitations. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.