This map shows the geographic impact of F. E. Mcgarry's research: each country is shaded by the number of citations coming from papers published by authors working there. You can also color the map by specialization and compare the number of citations F. E. Mcgarry received with the number expected from a country's size and research output (values larger than one mean the country cites F. E. Mcgarry more than expected).
This network shows the impact of papers produced by F. E. Mcgarry. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by F. E. Mcgarry. The network helps show where F. E. Mcgarry may publish in the future.
Co-authorship network of co-authors of F. E. Mcgarry
This figure shows the co-authorship network connecting the top 25 collaborators of F. E. Mcgarry.
A scholar is included among the top collaborators of F. E. Mcgarry based on the total number of
citations received by their joint publications. Edge widths represent the number of papers
two authors have co-authored together. Node borders signify the number of papers an author
published with F. E. Mcgarry. F. E. Mcgarry is excluded from the visualization to improve
readability, since they are connected to all nodes in the network.
All Works
20 of 20 papers shown
1.
Adler, Paul S., et al. (2005). Enabling Process Discipline: Lessons from the Journey to CMM Level 5. Journal of the Association for Information Systems. 4(1), 3. 17 indexed citations
2.
Basili, Victor R. & F. E. Mcgarry. (1997). The experience factory. 643–644. 21 indexed citations
3.
Mcgarry, F. E., Rose Pajerski, Gerald Page, et al. (1994). Software Process Improvement in the NASA Software Engineering Laboratory. 32 indexed citations
4.
Mcgarry, F. E., et al. (1993). Profile of NASA software engineering: Lessons learned from building the baseline. 1 indexed citation
5.
Mcgarry, F. E., et al. (1993). Process improvement as an investment: Measuring its worth.
6.
Mcgarry, F. E. (1992). Experimental software engineering: Seventeen years of lessons in the SEL. NASA Technical Reports Server (NASA). 1 indexed citation
Mcgarry, F. E. & Rose Pajerski. (1990). Towards understanding software: 15 years in the SEL. NASA STI Repository (National Aeronautics and Space Administration). 8 indexed citations
Agresti, William W. & F. E. Mcgarry. (1988). The Minnowbrook workshop on software reuse: a summary report. IEEE Computer Society Press eBooks. 33–40. 16 indexed citations
13.
Mcgarry, F. E. & William W. Agresti. (1988). Measuring Ada for software development in the software engineering laboratory. 302–309. 1 indexed citation
14.
Mcgarry, F. E. & D.N. Card. (1985). Studies and experiments in the Software Engineering Lab (SEL). NASA Technical Reports Server (NASA). 1 indexed citation
15.
Card, David N., Gerald Page, & F. E. Mcgarry. (1985). Criteria for software modularization. International Conference on Software Engineering. 372–377. 32 indexed citations
16.
Page, Gerald, F. E. Mcgarry, & David N. Card. (1985). A practical experience with independent verification and validation. NASA Technical Reports Server (NASA). 5 indexed citations
17.
Selby, Richard W., et al. (1984). Evaluating software testing strategies. 2 indexed citations
18.
Agresti, William W., et al. (1984). Managers Handbook for Software Development. NASA Technical Reports Server (NASA). 8 indexed citations
19.
Mcgarry, F. E., et al. (1984). An approach to software cost estimation. NASA STI Repository (National Aeronautics and Space Administration). 1 indexed citation
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it—like all bibliographic datasets—has inherent limitations. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.