James J. Davis

782 total citations
29 papers, 434 citations indexed

About

James J. Davis is a scholar working on Hardware and Architecture, Electrical and Electronic Engineering, and Artificial Intelligence. According to data from OpenAlex, he has authored 29 papers receiving a total of 434 indexed citations (citations from other indexed papers that have themselves been cited), including 11 papers in Hardware and Architecture, 11 in Electrical and Electronic Engineering, and 9 in Artificial Intelligence. Recurrent topics in his work include Advanced Neural Network Applications (8 papers), Parallel Computing and Optimization Techniques (8 papers), and Adversarial Robustness in Machine Learning (5 papers), and his work is most often cited by papers in these same topics. He collaborates with scholars based in the United Kingdom, the United States, and China. His co-authors include Bruce Kapferer, Peter Y. K. Cheung, Erwei Wang, George A. Constantinides, Paul Markham, Joshua M. Levine, Rongxuan Zhao, Xinyu Niu, Edward Stott, and Ho-Cheung Ng, and he has published in journals such as ACM Computing Surveys, Computer, and IEEE Transactions on Computers.

In The Last Decade

James J. Davis: 27 papers receiving 386 citations

Peers

Peers selected by citation overlap. The career columns show citations by career stage (early → late); × values give each peer's stage count as a ratio of James J. Davis's count for that stage.

Name | Country | h | Citations by career stage (early → late) | Papers | Cites
James J. Davis | United Kingdom | 10 | 165, 128, 103, 100, 59 | 29 | 434
Peter Sutton | Australia | 8 | 19 (0.1×), 51 (0.4×), 61 (0.6×), 64 (0.6×), 12 (0.2×) | 23 | 218
Jinyoung Choi | United States | 10 | 146 (0.9×), 39 (0.3×), 154 (1.5×), 11 (0.1×), 64 (1.1×) | 43 | 448
Hiroshi Ueda | Japan | 11 | 48 (0.3×), 80 (0.6×), 68 (0.7×), 39 (0.4×), 67 (1.1×) | 63 | 461
Phillip Conrad | United States | 13 | 108 (0.7×), 39 (0.3×), 34 (0.3×), 26 (0.3×), 59 (1.0×) | 38 | 548
Cody Coleman | United States | 9 | 25 (0.2×), 56 (0.4×), 149 (1.4×), 22 (0.2×), 10 (0.2×) | 15 | 444
Shampa Chakraverty | India | 12 | 20 (0.1×), 75 (0.6×), 143 (1.4×), 21 (0.2×), 26 (0.4×) | 52 | 359
Jae‐Eun Namkoong | United States | 7 | 35 (0.2×), 54 (0.4×), 28 (0.3×), 201 (2.0×), 74 (1.3×) | 15 | 384
Caroline Sporleder | Germany | 19 | 23 (0.1×), 95 (0.7×), 1.0k (9.8×), 42 (0.4×), 18 (0.3×) | 57 | 1.2k
Sébastien George | France | 10 | 49 (0.3×), 46 (0.4×), 67 (0.7×), 5 (0.1×), 99 (1.7×) | 67 | 514
Gareth Humphreys | United States | 8 | 8 (0.0×), 108 (0.8×), 34 (0.3×), 8 (0.1×), 41 (0.7×) | 14 | 252

Countries citing papers authored by James J. Davis


This map shows the geographic impact of James J. Davis's research: the number of citations coming from papers whose authors work in each country. You can also color the map by specialization, comparing the citations James J. Davis receives from each country with the number expected given that country's size and research output (values greater than one mean the country cites James J. Davis more than expected).
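The comparison described above is an observed-versus-expected ratio. As a minimal sketch (with illustrative numbers and a hypothetical `citation_ratio` helper, not actual Rankless code or data), the specialization value for one country might be computed like this:

```python
def citation_ratio(observed: int, country_share: float, total_citations: int) -> float:
    """Ratio of citations actually received from a country to the number
    expected if citations were spread in proportion to that country's
    share of global research output."""
    expected = country_share * total_citations
    return observed / expected

# Illustrative only: a country producing 5% of world research output would be
# expected to contribute about 21.7 of 434 indexed citations. Receiving 43
# citations from it yields a ratio near 2, i.e. twice the expected rate.
print(round(citation_ratio(observed=43, country_share=0.05, total_citations=434), 2))
```

A ratio above 1 means the country cites the author more than its research volume alone would predict; below 1, less.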

Fields of papers citing papers by James J. Davis

(Fields are grouped into Physical Sciences, Health Sciences, Life Sciences, and Social Sciences.)

This network shows the fields influenced by James J. Davis's papers. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes mark fields that tend to cite James J. Davis's papers. The network also suggests where James J. Davis may publish in the future.

Co-authorship network of co-authors of James J. Davis

This figure shows the co-authorship network connecting the top 25 collaborators of James J. Davis. A scholar is included among the top collaborators based on the total number of citations received by their joint publications with James J. Davis. Edge widths represent the number of papers two authors have co-authored together, and node borders indicate the number of papers an author has published with James J. Davis. James J. Davis is excluded from the visualization to improve readability, since they are connected to every node in the network.

All Works

20 of 20 papers shown
1. Wang, Erwei, et al. (2023). Enabling Binary Neural Network Training on the Edge. ACM Transactions on Embedded Computing Systems, 22(6), 1–19. 6 indexed citations.
2. Wang, Erwei, et al. (2023). Logic Shrinkage: Learned Connectivity Sparsification for LUT-Based Neural Networks. ACM Transactions on Reconfigurable Technology and Systems, 16(4), 1–25. 2 indexed citations.
3. Wang, Erwei, James J. Davis, Piotr Zieliński, et al. (2021). Enabling Binary Neural Network Training on the Edge. Spiral (Imperial College London), 37–38. 8 indexed citations.
4. Wang, Erwei, James J. Davis, Peter Y. K. Cheung, & George A. Constantinides. (2020). LUTNet: Learning FPGA Configurations for Highly Efficient Neural Network Inference. IEEE Transactions on Computers, 69(12), 1795–1808. 31 indexed citations.
5. Li, He, James J. Davis, John Wickerson, & George A. Constantinides. (2019). ARCHITECT: Arbitrary-Precision Hardware With Digit Elision for Efficient Iterative Compute. IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 28(2), 516–529. 8 indexed citations.
6. Wang, Erwei, James J. Davis, Rongxuan Zhao, et al. (2019). Deep Neural Network Approximation for Custom Hardware: Where We've Been, Where We're Going. arXiv (Cornell University). 15 indexed citations.
7. Wang, Erwei, James J. Davis, Peter Y. K. Cheung, & George A. Constantinides. (2019). LUTNet: Rethinking Inference in FPGA Soft Logic. Spiral (Imperial College London), 26–34. 41 indexed citations.
8. Wang, Erwei, James J. Davis, & Peter Y. K. Cheung. (2018). A PYNQ-Based Framework for Rapid CNN Prototyping. 223–223. 32 indexed citations.
9. Zhao, Rongxuan, Shuanglong Liu, Ho-Cheung Ng, et al. (2018). Hardware Compilation of Deep Neural Networks: An Overview. Spiral (Imperial College London), 1–8. 8 indexed citations.
10. Balsamo, Domenico, et al. (2018). An Application- and Platform-agnostic Runtime Management Framework for Multicore Systems. 195–204. 4 indexed citations.
11. Li, He, James J. Davis, John Wickerson, & George A. Constantinides. (2018). Digit Elision for Arbitrary-accuracy Iterative Computation. Spiral (Imperial College London), 62, 107–114. 2 indexed citations.
12. Davis, James J., Joshua M. Levine, Edward Stott, et al. (2017). STRIPE: Signal selection for runtime power estimation. Spiral (Imperial College London), 1–8. 4 indexed citations.
13. Xia, Fei, James J. Davis, Joshua M. Levine, et al. (2017). Voltage, Throughput, Power, Reliability, and Multicore Scaling. Computer, 50(8), 34–45. 11 indexed citations.
14. Yang, Sheng, Rishad Shafik, Geoff V. Merrett, et al. (2015). Adaptive energy minimization of embedded heterogeneous systems using regression-based learning. ePrints Soton (University of Southampton), 103–110. 35 indexed citations.
15. Davis, James J., & Peter Y. K. Cheung. (2014). Achieving low-overhead fault tolerance for parallel accelerators with dynamic partial reconfiguration. 6, 1–6. 6 indexed citations.
16. Davis, James J., & Peter Y. K. Cheung. (2013). Datapath fault tolerance for parallel accelerators. 366–369. 2 indexed citations.
17. Davis, James J. (1995). Experimenting with Adult Second Language Learners: A Case with Spanish Language at Northwestern High School in Prince George's County, Maryland. 1 indexed citation.
18. Davis, James J., & Paul Markham. (1991). Student Attitudes Toward Foreign Language Study at Historically and Predominantly Black Institutions. Foreign Language Annals, 24(3), 227–237. 26 indexed citations.
19. Davis, James J. (1989). Foreign Language Study and Afro-Americans: An Annotated Bibliography, 1931-1988. The Journal of Negro Education, 58(4). 1 indexed citation.
20. Davis, James J. (1988). Teaching Cultural Gestures in the Foreign Language Classroom: A Review of Literature. 1 indexed citation.

Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive bibliographic database. While OpenAlex provides broad and valuable coverage of the global research landscape, it—like all bibliographic datasets—has inherent limitations. These include incomplete records, variations in author disambiguation, differences in journal indexing, and delays in data updates. As a result, some metrics and network relationships displayed in Rankless may not fully capture the entirety of a scholar's output or impact.


Rankless by CCL
2026