政大機構典藏-National Chengchi University Institutional Repository(NCCUR):Item 140.119/155798
RC Version 6.0 © Powered By DSPACE, MIT. Enhanced by NTU Library IR team.
    Please use this identifier to cite or link to this item: https://nccur.lib.nccu.edu.tw/handle/140.119/155798


    Title: Large-Scale Hierarchical Medical Image Retrieval Based on a Multilevel Convolutional Neural Network
    Authors: 羅崇銘
    Lo, Chung-Ming;Hsieh, Cheng-Yeh
    Contributors: 圖檔所
    Keywords: Medical image;content-based medical image retrieval;multilevel convolutional neural network;hierarchical training
    Date: 2024-11
    Issue Date: 2025-02-24 15:55:38 (UTC+8)
    Abstract: Presently, with advancements in medical imaging modalities, various imaging methods are widely used in clinics. To efficiently assess and manage these images, this paper proposes a content-based medical image retrieval (CBMIR) system as a clinical tool. A global medical image database is established from data collected in more than ten countries and dozens of sources, schools, and laboratories. The database contains more than 536,294 medical images spanning 14 imaging modalities, 40 organs, and 52 diseases. A multilevel convolutional neural network (MLCNN) using hierarchical progressive feature learning is then proposed to perform hierarchical medical image retrieval across multiple levels: image modality, organ, and disease. At each classification level, a dense block is trained through labeled classification. As the epochs increase, four training stages are performed to train the three levels simultaneously with different weights in the loss function. The trained features are then used in the CBMIR system. The results show that the MLCNN achieves a mAP of 0.86 on a representative dataset, higher than the 0.71 achieved by ResNet152 in the literature. Applying hierarchical progressive feature learning yields a 12%-16% performance improvement in CNNs and outperforms a vision Transformer with only 63% of the training time. The proposed representative image selection and multilevel architecture improve the efficiency and precision of retrieval from large-scale medical image databases.
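    The abstract describes two mechanisms that can be sketched concretely: staged training that shifts the loss weights across the three classification levels (modality, organ, disease), and mean average precision (mAP) as the retrieval metric. The sketch below is illustrative only and is not the authors' code: the function names, stage boundaries, and weight values are assumptions, not values reported in the paper.

    ```python
    def stage_weights(epoch, epochs_per_stage=10):
        """Return (w_modality, w_organ, w_disease) for the current stage.

        Four stages, as in the paper's hierarchical progressive scheme:
        early stages emphasize the coarse modality level, later stages
        shift weight toward the fine-grained disease level.  The specific
        weights here are hypothetical.
        """
        schedule = [
            (1.0, 0.0, 0.0),  # stage 1: modality level only
            (0.5, 0.5, 0.0),  # stage 2: add the organ level
            (0.3, 0.4, 0.3),  # stage 3: add the disease level
            (0.2, 0.3, 0.5),  # stage 4: emphasize the disease level
        ]
        stage = min(epoch // epochs_per_stage, len(schedule) - 1)
        return schedule[stage]

    def combined_loss(loss_modality, loss_organ, loss_disease, epoch):
        """Weighted sum of the three per-level classification losses."""
        w_m, w_o, w_d = stage_weights(epoch)
        return w_m * loss_modality + w_o * loss_organ + w_d * loss_disease

    def average_precision(ranked_relevance):
        """AP of one ranked retrieval list.

        ranked_relevance[i] is True if the item returned at rank i+1 is
        relevant to the query; AP averages precision at each relevant hit.
        """
        hits, precisions = 0, []
        for rank, relevant in enumerate(ranked_relevance, start=1):
            if relevant:
                hits += 1
                precisions.append(hits / rank)
        return sum(precisions) / len(precisions) if precisions else 0.0

    def mean_average_precision(all_rankings):
        """mAP: the mean of AP over all queries' ranked relevance lists."""
        return sum(average_precision(r) for r in all_rankings) / len(all_rankings)
    ```

    For example, a query whose top-3 results are relevant, irrelevant, relevant gives `average_precision([True, False, True])` = (1/1 + 2/3) / 2 = 5/6.
    
    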
    Relation: IEEE Transactions on Emerging Topics in Computational Intelligence, pp.1-11
    Data Type: article
    DOI link: https://doi.org/10.1109/TETCI.2024.3502404
    DOI: 10.1109/TETCI.2024.3502404
    Appears in Collections:[Graduate Institute of Library, Information and Archival Studies] Periodical Articles

    Files in This Item:

    File: index.html | Size: 0Kb | Format: HTML | View/Open


    All items in 政大典藏 are protected by copyright, with all rights reserved.



    著作權政策宣告 Copyright Announcement
    1. The digital content of this website is part of the National Chengchi University Institutional Repository. It is provided free of charge for academic research, public education, and other non-commercial uses. Please use the content in a proper and reasonable manner and respect the rights of copyright owners. For commercial use, please obtain authorization from the copyright owner in advance.

    2. Every effort has been made in building this website to avoid infringing the rights of copyright owners. If you believe that any digital content on this website infringes copyright, please notify the site maintainers (nccur@nccu.edu.tw); the work will be removed from the repository immediately and your claim investigated.