
Training deep neural networks with noisy clinical labels: toward accurate detection of prostate cancer in US data

  • Original Article
  • International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose:

Ultrasound is the standard of care for guiding systematic biopsy of the prostate. During the biopsy procedure, up to 12 biopsy cores are randomly sampled from six zones within the prostate, and the histopathology of those cores is used to determine the presence and grade of cancer. Histopathology reports only provide statistical information on the presence of cancer and do not normally contain fine-grained information on the distribution of cancer within each core. This limitation hinders the development of machine learning models that detect cancer in ultrasound, which could allow biopsies to be better targeted at highly suspicious prostate regions.

Methods:

In this paper, we tackle this challenge by training with noisy labels derived from histopathology. Noisy labels often cause the model to overfit to the training data, limiting its generalizability. To avoid overfitting, we focus on the generalization of the model's features and present an iterative label refinement algorithm that gradually amends the labels. We simultaneously train two classifiers with the same architecture and automatically stop training at the first sign of overfitting. We then use a confident learning approach to clean the data labels and continue training. This process is applied iteratively to the training data and labels until convergence.
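
For concreteness, the loop below is a minimal sketch of this refinement procedure, not the paper's implementation: scikit-learn MLPs stand in for the two deep networks, built-in early stopping approximates the overfitting detector, and a simplified class-threshold rule stands in for the full confident-learning machinery. All function and variable names are illustrative.

```python
# Minimal sketch of iterative label refinement with co-trained classifiers
# and a simplified confident-learning cleaning step. Illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_predict

def confident_label_issues(labels, pred_probs):
    """Flag likely label errors: an example is suspect when another class's
    probability exceeds that class's average self-confidence threshold."""
    n_classes = pred_probs.shape[1]
    # Per-class threshold: mean predicted probability over examples carrying that label.
    thresholds = np.array([pred_probs[labels == k, k].mean() for k in range(n_classes)])
    best = pred_probs.argmax(axis=1)
    return (best != labels) & (pred_probs[np.arange(len(labels)), best] >= thresholds[best])

def refine_labels(X, y, max_rounds=10):
    """Iteratively amend noisy labels y (integer array) for features X."""
    y = y.copy()
    for _ in range(max_rounds):
        # Two identically structured classifiers; early_stopping halts each
        # one when its validation score stops improving (a sign of overfitting).
        clf_a = MLPClassifier(hidden_layer_sizes=(64, 32), early_stopping=True, max_iter=300)
        clf_b = MLPClassifier(hidden_layer_sizes=(64, 32), early_stopping=True, max_iter=300)
        # Out-of-fold probabilities, so each example is scored by models that never saw it.
        probs_a = cross_val_predict(clf_a, X, y, cv=5, method="predict_proba")
        probs_b = cross_val_predict(clf_b, X, y, cv=5, method="predict_proba")
        probs = 0.5 * (probs_a + probs_b)
        issues = confident_label_issues(y, probs)
        if not issues.any():
            break  # labels have converged
        y[issues] = probs[issues].argmax(axis=1)  # amend suspect labels
    return y
```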

Results:

We evaluate the proposed method on prostate cancer classification using a dataset of ultrasound images from 353 biopsy cores obtained from 90 patients. We achieve an area under the curve, sensitivity, specificity, and accuracy of 0.73, 0.80, 0.63, and 0.69, respectively.
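
As an aside on the reported metrics, the snippet below shows how core-level AUC, sensitivity, specificity, and accuracy of this kind can be computed with scikit-learn; the arrays are placeholders, not the study's data.

```python
# Illustrative computation of AUC, sensitivity, specificity, and accuracy.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])                    # core labels (cancer = 1)
y_score = np.array([0.9, 0.2, 0.7, 0.4, 0.3, 0.6, 0.8, 0.1])   # predicted cancer probabilities
y_pred = (y_score >= 0.5).astype(int)                          # hard decisions at 0.5

auc = roc_auc_score(y_true, y_score)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = accuracy_score(y_true, y_pred)
print(f"AUC={auc:.2f} sens={sensitivity:.2f} spec={specificity:.2f} acc={accuracy:.2f}")
```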

Conclusion:

Our approach can provide clinicians with a visualization of regions that likely contain cancerous tissue, enabling more accurate biopsy sampling. The results demonstrate that the proposed method achieves superior accuracy compared to state-of-the-art methods.

Author information

Corresponding author

Correspondence to Golara Javadi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

G. Javadi and S. Samadi contributed equally to this work.

About this article

Cite this article

Javadi, G., Samadi, S., Bayat, S. et al. Training deep neural networks with noisy clinical labels: toward accurate detection of prostate cancer in US data. Int J CARS 17, 1697–1705 (2022). https://doi.org/10.1007/s11548-022-02707-y
