Hit papers significantly outperform the citation benchmark for their cohort. A paper qualifies
if it meets at least one of three criteria: it has ≥500 total citations; it achieves ≥1.5× the
top-1% citation threshold for papers in the same subfield and year (this threshold is the minimum
citation count needed to enter the top 1%, not the average within it); or it reaches the top
citation threshold in at least one of its specific research topics.
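The three criteria above can be sketched as a simple any-of check. This is an illustrative reading of the rule, not Rankless's actual implementation; the function name and threshold inputs are assumptions.

```python
# Hedged sketch of the hit-paper rule: a paper qualifies if ANY criterion holds.
# Threshold values must be supplied per subfield/year cohort (not computed here).

def is_hit_paper(citations: int,
                 top1_threshold: int,
                 topic_top_thresholds: list[int]) -> bool:
    """Return True if the paper meets at least one of the three criteria."""
    return (
        citations >= 500                                      # absolute count
        or citations >= 1.5 * top1_threshold                  # 1.5x the top-1% entry bar
        or any(citations >= t for t in topic_top_thresholds)  # top in >=1 topic
    )

# Example: 300 citations against a cohort whose top-1% entry bar is 180
print(is_hit_paper(300, 180, [400, 500]))  # True: 300 >= 1.5 * 180 = 270
```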
This map shows the geographic impact of Dan Brickley's research: the number of citations coming from papers whose authors work in each country. You can also color the map by specialization, or compare the citations Dan Brickley receives from each country with the number expected given that country's size and research output (a ratio larger than one means the country cites Dan Brickley more than expected).
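The over/under-citation ratio described above can be illustrated with a simple proportional baseline. This is an assumed model for exposition only; Rankless's actual expected-citation normalization may differ.

```python
# Illustrative sketch: "expected" citations from a country are modeled as its
# share of global research output times the scholar's total citations.

def citation_ratio(observed: int, country_share: float, total_citations: int) -> float:
    """Ratio > 1 means the country cites the scholar more than expected."""
    expected = country_share * total_citations
    return observed / expected

# Example: a country producing 5% of world output accounts for 40 of 500 citations
print(round(citation_ratio(40, 0.05, 500), 2))  # 1.6 -> cites more than expected
```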
This network shows the impact of papers produced by Dan Brickley. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes show fields that tend to cite the papers produced by Dan Brickley. The network helps show where Dan Brickley may publish in the future.
Co-authorship network of co-authors of Dan Brickley
This figure shows the co-authorship network connecting the top 25 collaborators of Dan Brickley.
A scholar is included among the top collaborators based on the total number of citations received
by their joint publications with Dan Brickley. Edge widths represent the number of papers two
authors have co-authored together, and node borders indicate the number of papers an author
published with Dan Brickley. Dan Brickley is excluded from the visualization to improve
readability, since they would be connected to every node in the network.
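The network's two visual encodings (edge width and node border) can be computed from a paper list in one pass. A minimal sketch, assuming each paper is represented as the set of its non-focal co-authors; the author names and counts are made up for illustration.

```python
# Sketch of the co-authorship network's data model: edge weight = number of
# jointly authored papers; node "border" = papers co-authored with the focal
# scholar (who appears on every paper and is excluded from the graph itself).
from collections import defaultdict
from itertools import combinations

papers = [          # each paper: set of non-focal co-authors on it
    {"A", "B"},
    {"A", "B", "C"},
    {"B", "C"},
]

edge_weight = defaultdict(int)  # drawn as edge width in the figure
with_focal = defaultdict(int)   # drawn as node border thickness

for authors in papers:
    for a in authors:
        with_focal[a] += 1      # every paper also includes the focal scholar
    for a, b in combinations(sorted(authors), 2):
        edge_weight[(a, b)] += 1

print(edge_weight[("A", "B")])  # 2 joint papers -> thicker edge
print(with_focal["B"])          # 3 papers with the focal scholar
```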
All Works
20 of 20 papers shown
1.
Brickley, Dan, Yuvraj Agarwal, Laura Daniele, et al. (2016). IoT and Schema.org: Getting Started. TNO Repository.
2.
Guha, Ramanathan V., et al. (2016). Schema.org. Communications of the ACM. 59(2). 44–51. 191 indexed citations
Keizer, Johannes, et al. (2010). AGRIS - from a bibliographic database to a semantic data service on agricultural research information. 3(1). 26–30. 3 indexed citations
5.
Brickley, Dan, Vinay K. Chaudhri, Harry Halpin, & Deborah L. McGuinness. (2010). Linked data meets artificial intelligence: papers from the AAAI Spring Symposium. 1 indexed citation
6.
Brickley, Dan, et al. (2009). Designing AGRIS 2010: information linking and agricultural research. International Conference on Dublin Core and Metadata Applications. 145–146. 1 indexed citation
Miles, Alistair, Brian Matthews, Michael Wilson, & Dan Brickley. (2005). SKOS core: simple knowledge organisation for the web. International Conference on Dublin Core and Metadata Applications. 3–10. 84 indexed citations
9.
Brickley, Dan. (2004). FOAF Vocabulary Specification. Medical Entomology and Zoology. 364 indexed citations
10.
McBride, Brian, et al. (2004). Resource Description Framework (RDF) standard recommendation, World Wide Web Consortium. 31 indexed citations
Eysenbach, Günther, et al. (2001). A metadata vocabulary for self- and third-party labeling of health web-sites. STM:n Hallinnonalan avoin julkaisuarkisto (Julkari). 6 indexed citations
14.
Brickley, Dan. (2000). RDF Site Summary (RSS) 1.0. Medical Entomology and Zoology. 2 indexed citations
15.
Eysenbach, Günther, et al. (2000). MedCERTAIN: quality management, certification and rating of health information on the Net. PubMed. 230–4. 25 indexed citations
Brickley, Dan, et al. (1998). DESIRE - Development of a European Service for Information on Research and Education. Bristol Research (University of Bristol). 1 indexed citation
20.
Brickley, Dan, et al. (1990). Restructuring a Comprehensive High School. Educational Leadership. 47(7). 23–26. 12 indexed citations
Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive
bibliographic database. While OpenAlex provides broad and valuable coverage of the global
research landscape, it—like all bibliographic datasets—has inherent limitations. These include
incomplete records, variations in author disambiguation, differences in journal indexing, and
delays in data updates. As a result, some metrics and network relationships displayed in
Rankless may not fully capture the entirety of a scholar's output or impact.