Generating medical opinions and classification according to Bethesda using deep learning

E.V. Bobrova, A.Z. Makanov, S.S. Osnovin, E.V. Diuldin, B.M. Shifman, K.S. Zaytsev

Abstract


The purpose of this article is to study approaches to the intelligent processing (NLP) of Russian-language medical texts of cytological descriptions, in order to solve the problems of classification, generation of observation text, and augmentation of descriptions when they are in acute shortage. For a decade, the field of biomedicine in our country has changed little: the analysis of patient cases is in most instances based on manual processing and the expert knowledge of physicians. The paper considers the creation of a machine learning pipeline covering the full cycle of data preprocessing and model training for diagnosing thyroid diseases according to the Bethesda reporting system. Recurrent and transformer neural networks were used to design the architectures of the deep learning models. Approaches to cleaning and preprocessing "raw" medical descriptions that require classification are also considered. The results show that recurrent neural networks are of great value on small data sets, while transformer architectures are superior to others when generating medical opinions on large data sets. The solution obtained in the experiments can be used in practice as an additional reference tool in the work of a cytologist diagnosing the thyroid gland.
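The classification step described in the abstract — assigning a Bethesda category to a free-text cytological description — can be sketched as a minimal bag-of-words baseline. All training examples, tokens, and identifiers below are invented for illustration and are not the authors' data or code; the paper's actual models are recurrent and transformer networks trained on real reports.

```python
# Illustrative sketch only: a naive Bayes bag-of-words classifier that maps a
# toy cytological description to a hypothetical Bethesda category.
from collections import Counter, defaultdict
import math

# Hypothetical (invented) training pairs: (description, Bethesda category).
TRAIN = [
    ("colloid and benign follicular cells, abundant colloid", "II"),
    ("benign follicular epithelium, macrofollicular pattern", "II"),
    ("follicular neoplasm, microfollicular crowded groups", "IV"),
    ("microfollicular pattern, scant colloid, follicular neoplasm", "IV"),
    ("papillary carcinoma, nuclear grooves, pseudoinclusions", "VI"),
    ("malignant cells, papillary architecture, nuclear inclusions", "VI"),
]

def tokenize(text):
    # A real pipeline would normalize Russian morphology; this just lowercases
    # and splits on whitespace after dropping commas.
    return text.lower().replace(",", " ").split()

class NaiveBayes:
    def __init__(self, examples):
        self.word_counts = defaultdict(Counter)  # per-class token counts
        self.class_counts = Counter()            # class priors
        self.vocab = set()
        for text, label in examples:
            self.class_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)

    def predict(self, text):
        toks = tokenize(text)
        best, best_lp = None, float("-inf")
        total = sum(self.class_counts.values())
        for label, cc in self.class_counts.items():
            # log prior + Laplace-smoothed log likelihoods
            lp = math.log(cc / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in toks:
                lp += math.log((self.word_counts[label][tok] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best

clf = NaiveBayes(TRAIN)
print(clf.predict("abundant colloid with benign follicular cells"))  # "II"
```

Such a baseline is useful mainly as a sanity check on small data; the recurrent and transformer models discussed in the paper replace the bag-of-words representation with learned contextual embeddings.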

Full Text:

PDF (Russian)





ISSN: 2307-8162