Won-Chul Bang

1.4k total citations
48 papers, 978 citations indexed

About

Won-Chul Bang is a scholar working on Human-Computer Interaction, Computer Vision and Pattern Recognition, and Radiology, Nuclear Medicine and Imaging. According to data from OpenAlex, Won-Chul Bang has authored 48 papers receiving a total of 978 indexed citations (citations by other indexed papers that have themselves been cited), including 18 papers in Human-Computer Interaction, 17 in Computer Vision and Pattern Recognition, and 13 in Radiology, Nuclear Medicine and Imaging. Recurrent topics in Won-Chul Bang's work include Hand Gesture Recognition Systems (13 papers), Medical Image Segmentation Techniques (6 papers), and Ultrasound Imaging and Elastography (6 papers). Won-Chul Bang is often cited by papers focused on these same topics: Hand Gesture Recognition Systems (13 papers), Medical Image Segmentation Techniques (6 papers), and Ultrasound Imaging and Elastography (6 papers). Won-Chul Bang collaborates with scholars based in South Korea, China, and Bulgaria. Co-authors include Zeungnam Bien, D. Stefanov, Ja-Yeon Jeong, Moon Ho Park, Seokmin Han, Wonsik Kim, Sung-Jung Cho, Wook Chang, Jing Yang, and Eun-Seok Choi. Won-Chul Bang has published in venues such as SHILAP Revista de lepidopterología, IEEE Transactions on Biomedical Engineering, and Fuzzy Sets and Systems.

In The Last Decade

44 papers receiving 914 citations

Peers (Enhanced Table)

Peers ranked by citation overlap. The career columns show citations received in five career stages (early→late); for each peer, a stage value is followed by its ratio (×) to the corresponding value for Won-Chul Bang, the reference row.

Name · Country · h · Career stage cites (early→late, ×ratio vs Bang) · Papers · Cites
Won-Chul Bang South Korea 14 321 292 290 269 161 48 978
Md. Rabiul Islam Bangladesh 16 217 0.7× 146 0.5× 259 0.9× 90 0.3× 155 1.0× 47 1.2k
Amr Elchouemi United States 13 248 0.8× 188 0.6× 234 0.8× 104 0.4× 125 0.8× 80 935
Brandon Rothrock United States 17 505 1.6× 86 0.3× 408 1.4× 378 1.4× 105 0.7× 37 1.3k
Avinash G. Keskar India 17 482 1.5× 229 0.8× 318 1.1× 40 0.1× 115 0.7× 109 1.1k
Cemal Köse Türkiye 18 423 1.3× 405 1.4× 174 0.6× 74 0.3× 109 0.7× 75 1.0k
Igi Ardiyanto Indonesia 15 348 1.1× 226 0.8× 310 1.1× 61 0.2× 57 0.4× 142 945
Mohammad Farukh Hashmi India 19 552 1.7× 264 0.9× 273 0.9× 37 0.1× 110 0.7× 94 1.2k
Jinghui Chu China 12 243 0.8× 49 0.2× 135 0.5× 148 0.6× 116 0.7× 27 565
David Windridge United Kingdom 18 477 1.5× 88 0.3× 259 0.9× 49 0.2× 63 0.4× 91 1.1k
AKM Azad Australia 19 272 0.8× 212 0.7× 364 1.3× 45 0.2× 122 0.8× 86 1.4k

Countries citing papers authored by Won-Chul Bang


This map shows the geographic impact of Won-Chul Bang's research: the number of citations coming from papers by authors based in each country. You can also color the map by specialization, which compares the number of citations Won-Chul Bang receives from each country with the number expected given that country's size and research output (ratios greater than one mean the country cites Won-Chul Bang more than expected).

Fields of papers citing papers by Won-Chul Bang

Field legend: Physical Sciences · Health Sciences · Life Sciences · Social Sciences

This network shows the field-level impact of papers produced by Won-Chul Bang. Nodes represent research fields, and links connect fields that are likely to share authors. Colored nodes mark fields that tend to cite the papers produced by Won-Chul Bang, suggesting fields in which Won-Chul Bang may publish in the future.

Co-authorship network of co-authors of Won-Chul Bang

This figure shows the co-authorship network connecting the top 25 collaborators of Won-Chul Bang. A scholar is included among the top collaborators based on the total number of citations received by their joint publications with Won-Chul Bang. Edge widths represent the number of papers two authors have co-authored together, and node borders indicate the number of papers an author published with Won-Chul Bang. Won-Chul Bang is excluded from the visualization to improve readability, since he would be connected to every node in the network.

All Works

20 of 20 papers shown
3.
Lee, Min Woo, Hyun Jeong Park, Tae Wook Kang, et al. (2017). Image Fusion of Real-Time Ultrasonography with Computed Tomography: Factors Affecting the Registration Error and Motion of Focal Hepatic Lesions. Ultrasound in Medicine & Biology. 43(9). 2024–2032. 9 indexed citations
4.
Ik, Dong, Min Woo Lee, Tae Wook Kang, et al. (2017). Comparison Between CT and MR Images as More Favorable Reference Data Sets for Fusion Imaging-Guided Radiofrequency Ablation or Biopsy of Hepatic Lesions: A Prospective Study with Focus on Patient's Respiration. CardioVascular and Interventional Radiology. 40(10). 1567–1575. 5 indexed citations
5.
Han, Seokmin, et al. (2017). A deep learning framework for supporting the classification of breast lesions in ultrasound images. Physics in Medicine and Biology. 62(19). 7714–7728. 269 indexed citations
6.
Ik, Dong, Min Woo Lee, Tae Wook Kang, et al. (2017). Automatic image fusion of real-time ultrasound with computed tomography images: a prospective comparison between two auto-registration methods. Acta Radiologica. 58(11). 1349–1357. 13 indexed citations
7.
Ik, Dong, Min Woo Lee, Kyoung Doo Song, et al. (2017). A prospective comparison between auto-registration and manual registration of real-time ultrasound with MR images for percutaneous ablation or biopsy of hepatic lesions. Abdominal Radiology. 42(6). 1799–1808. 11 indexed citations
9.
Oh, Young-Taek, et al. (2013). Patient-specific liver deformation modeling for tumor tracking. Proceedings of SPIE. 8671. 86711N. 1 indexed citation
11.
Bang, Won-Chul, et al. (2006). A 3D Hand-drawn Gesture Input Device Using Fuzzy ARTMAP-based Recognizer. SHILAP Revista de lepidopterología. 10 indexed citations
12.
Cho, Sung-Jung, et al. (2005). Two-stage Recognition of Raw Acceleration Signals for 3-D Gesture-Understanding Cell Phones. HAL (Le Centre pour la Communication Scientifique Directe). 27 indexed citations
13.
Choi, Eun-Seok, Wook Chang, Won-Chul Bang, et al. (2004). Development of the gyro-free handwriting input device based on inertial navigation system (INS) theory. Society of Instrument and Control Engineers of Japan. 2. 1176–1181. 7 indexed citations
14.
Stefanov, D., Zeungnam Bien, & Won-Chul Bang. (2004). The smart house for older persons and persons with physical disabilities: structure, technology arrangements, and perspectives. IEEE Transactions on Neural Systems and Rehabilitation Engineering. 12(2). 228–250. 228 indexed citations
15.
Cho, Sung-Jung, et al. (2004). Inertial Sensor Based Recognition of 3-D Character Gestures with an Ensemble of Classifiers. 112–117. 23 indexed citations
16.
Chang, Wook, Jing Yang, Eun-Seok Choi, et al. (2003). A miniaturized attitude estimation system for a gesture-based input device with fuzzy logic approach. Korean Institute of Intelligent Systems International Conference Proceedings. 616–620. 3 indexed citations
17.
Bien, Zeungnam, et al. (2002). Machine intelligence quotient: its measurements and applications. Fuzzy Sets and Systems. 127(1). 3–16. 34 indexed citations
18.
Bang, Won-Chul, et al. (2001). Classification of Arrhythmia Based on Discrete Wavelet Transform and Rough Set Theory. Institute of Control, Robotics and Systems International Conference Proceedings. 135–137. 4 indexed citations
19.
Kim, Jungbae, Kwang-Hyun Park, Won-Chul Bang, Jong Sung Kim, & Zeungnam Bien. (2001). Continuous Korean Sign Language Recognition using Automata-based Gesture Segmentation and Hidden Markov Model. Institute of Control, Robotics and Systems International Conference Proceedings. 822–825. 5 indexed citations
20.
Bang, Won-Chul & Zeungnam Bien. (1997). Inductive Learning Algorithm using Rough Set Theory. Korean Institute of Intelligent Systems Conference Proceedings. 7(2). 331–337. 1 indexed citation

Rankless uses publication and citation data sourced from OpenAlex, an open and comprehensive bibliographic database. While OpenAlex provides broad and valuable coverage of the global research landscape, it—like all bibliographic datasets—has inherent limitations. These include incomplete records, variations in author disambiguation, differences in journal indexing, and delays in data updates. As a result, some metrics and network relationships displayed in Rankless may not fully capture the entirety of a scholar's output or impact.


Rankless by CCL
2026