DOI: 10.26102/2310-6018/2025.49.2.023
The number of AI-based software products used in radiology has grown rapidly in recent years, and the effectiveness of such AI services should be carefully assessed to ensure the quality of the developed algorithms. Manual assessment of such systems is a labor-intensive process. An urgent task is therefore to develop a specialized unified platform for automated testing of AI algorithms that analyze medical images. The proposed platform consists of three main modules: a testing module that interacts with the software under test and collects data processing results; a viewing module that provides tools for visually evaluating the obtained graphic series and structured reports; and a metrics calculation module for computing the diagnostic performance characteristics of artificial intelligence algorithms. The platform was developed using Python 3.9, Apache Kafka, PACS and Docker, and has been successfully tested on real data. The results indicate the potential of the developed platform to improve the quality and reliability of AI services in radiation diagnostics, as well as to facilitate their implementation in clinical practice.
Keywords: platform, diagnostic imaging, testing, medical images, artificial intelligence
DOI: 10.26102/2310-6018/2025.49.2.028
Creating high-quality distractors for test items is a labor-intensive task that plays a crucial role in the accurate assessment of knowledge. Existing approaches often produce implausible alternatives or fail to reflect typical student errors. This paper proposes an AI-based algorithm for distractor generation. It employs a large language model (LLM) to first construct a correct chain of reasoning for a given question and answer, and then introduces typical misconceptions to generate incorrect but plausible answer choices, aiming to capture common student misunderstandings. The algorithm was evaluated on questions from the Russian-language datasets RuOpenBookQA and RuWorldTree. Evaluation was conducted using both automatic metrics and expert assessment. The results show that the proposed algorithm outperforms baseline methods (such as direct prompting and semantic modification), generating distractors with higher levels of plausibility, relevance, diversity, and similarity to human-authored reference distractors. This work contributes to the field of automated assessment material generation, offering a tool that supports the development of more effective evaluation resources for educators, educational platform developers, and researchers in natural language processing.
Keywords: distractor generation, artificial intelligence, large language models, knowledge assessment, test items, automated test generation, NLP
DOI: 10.26102/2310-6018/2025.49.2.024
When constructing regression models, it is often necessary to resort to nonlinear transformations of the explanatory variables, using both elementary and non-elementary functions. This is done because many patterns in nature are complex and poorly described by linear dependencies. Usually, the transformations of explanatory variables in a regression model are the same for all observations in the sample. This work is devoted to constructing nonlinear regressions with switching transformations of a selected explanatory variable. The least absolute deviations method is used to estimate the unknown regression parameters, and the integer "floor" function is used to form the rule for switching transformations. A mixed 0–1 integer linear programming problem is formulated whose solution yields both optimal estimates of the nonlinear regression and a rule for switching transformations based on the values of the explanatory variables. The method is applied to the problem of modeling the weight of aircraft fuselages. The nonlinear regression constructed with the proposed switching transformations turned out to be more accurate than the model using constant transformations over the entire sample. An advantage of the developed mechanism is that, because the switching rule is known explicitly, the resulting regression can be used for forecasting.
Keywords: regression analysis, nonlinear regression, least absolute deviations method, mixed 0–1 integer linear programming problem, integer function «floor», weight model of aircraft fuselage
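As an illustration of the switching mechanism described above, the sketch below shows how the integer "floor" function can map the value of an explanatory variable to one of several candidate transformations. The band origin, band width and the transformation set here are hypothetical; in the article they are determined jointly with the regression coefficients by solving the mixed 0–1 integer linear programming problem under the least-absolute-deviations criterion.

```python
import math

# Candidate transformations of the explanatory variable (illustrative set,
# not the one identified in the article).
TRANSFORMS = [
    lambda x: x,             # identity
    lambda x: math.log(x),   # logarithm
    lambda x: math.sqrt(x),  # square root
]

def switch_index(x, a, h, k):
    """Map x to a transformation index via the integer 'floor' function.

    The half-open interval [a, a + k*h) is split into k bands of width h;
    band number floor((x - a) / h), clipped to [0, k-1], selects which
    transformation is applied for this observation.
    """
    i = math.floor((x - a) / h)
    return min(max(i, 0), k - 1)

def transformed(x, a, h):
    """Apply the transformation chosen by the switching rule to x."""
    return TRANSFORMS[switch_index(x, a, h, len(TRANSFORMS))](x)
```

Because the rule is an explicit function of the explanatory variable, the same switching logic can be reused on new observations, which is what makes the resulting regression usable for forecasting.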
DOI: 10.26102/2310-6018/2025.49.2.026
The paper describes the use of a balanced ternary number system for calculating the elements of the inverse of ill-conditioned matrices. The condition number of a matrix characterizes how strongly the solution of a system of linear equations can change in response to small perturbations in the data: the higher the condition number, the more sensitive the matrix is to small changes. As an example of an ill-conditioned matrix, the three-by-three Hilbert matrix is considered. Based on the known closed-form expression, the true values of the elements of the inverse Hilbert matrix are calculated. The errors in computing these elements are assessed for varying degrees of calculation accuracy in the binary number system (on a computer, with a software implementation in C) and in the balanced ternary number system (calculations performed manually). The results are compared in the decimal number system. It is shown that the balanced ternary number system reduces the calculation error of the elements of ill-conditioned matrices several-fold (by a factor of 3 or more for low-precision data and by a factor of 1.5 or more for more precise data).
Keywords: inverse matrix, Hilbert matrix, balanced ternary number system, ill-conditioned matrix, calculation errors
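For readers unfamiliar with the representation itself, the following sketch converts integers to and from balanced ternary digits {-1, 0, 1}. It illustrates only the number system; it does not reproduce the article's manual high-precision matrix computations.

```python
def to_balanced_ternary(n):
    """Represent integer n with digits {-1, 0, 1}, least significant first."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:        # a remainder of 2 becomes digit -1 with a carry
            r = -1
            n += 1
        digits.append(r)
        n //= 3
    return digits

def from_balanced_ternary(digits):
    """Evaluate a balanced-ternary digit list back to an integer."""
    return sum(d * 3 ** i for i, d in enumerate(digits))
```

A useful property of this system is that truncating a representation rounds toward the nearest value rather than toward zero, which is one reason balanced ternary arithmetic can accumulate smaller rounding errors than binary arithmetic of comparable precision.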
DOI: 10.26102/2310-6018/2025.49.2.020
The relevance of the study stems from the need to improve the reliability and efficiency of physical protection systems of protected facilities in the face of growing security threats. This calls for more sensitive and selective methods of intruder identification, including the method developed here: electromagnetic detection and recognition of biological objects (BO). The purpose of the work is to study the bifurcation process of interaction between an external electromagnetic field in the radio-wave range and the electromagnetic shell of a living organism, in order to substantiate, evaluate and calculate informative features for the electromagnetic detection and recognition of BO, with the subsequent formation of a dictionary of typical features. The study is based on a previously developed mathematical model of a BO, refined and supplemented through an analysis of the scientific literature on bioradioinformative technology and bioelectromagnetism. The conditions and operating modes of a biological medium generating electromagnetic radiation are determined and described, depending on the combination of the energy and frequency parameters of the external field with the characteristics of this medium. A set of the most informative features for electromagnetic recognition is proposed and substantiated: bifurcation parameters characterizing the mass, dimensions and electrodynamic properties of a biological object. Analytical expressions for calculating the classification features of BO are derived and confirmed by the results of a computational experiment. A dictionary of intruder attributes is developed, enabling informed decisions about the presence of an object in the controlled space, its membership in a certain class, and its motion parameters. The presented results can be used in the development of intruder identification tools for security and territory monitoring systems.
Keywords: informative features, information interaction, biological object, electromagnetic fields, strength, bioelectromagnetism, intruder identification, bifurcation parameters, feature dictionary
DOI: 10.26102/2310-6018/2025.49.2.016
In the context of the active implementation of artificial intelligence (AI) technologies in healthcare, ensuring stable, controlled and high-quality operation of such systems at the operational stage is particularly relevant. Monitoring of AI systems is enshrined in law: for three years after the introduction of a medical device, including AI systems, regular reports must be provided to regulatory authorities. The aim of the study is to develop methods for assessing the reliability and effectiveness of medical artificial intelligence for radiation diagnostics. The proposed methods were tested on chest radiography data collected in 2023 within the Moscow Experiment on the use of innovative computer vision technologies. The developed methods take into account a set of parameters: emerging technological defects, study processing time, the degree of physician agreement with the analysis results, and other indicators. The proposed approach can be adapted to various types of medical studies and become the basis for a comprehensive assessment of AI systems as part of the monitoring of medical devices with artificial intelligence. The implementation of these methods can increase the trust of the medical community not only in specific AI-based solutions, but also in intelligent technologies in healthcare in general.
Keywords: artificial intelligence, reliability, efficiency, artificial intelligence system, radiology, radiation diagnostics, monitoring
DOI: 10.26102/2310-6018/2025.49.1.018
The article discusses approaches and tools for improving the intelligent management of the nomenclature component in the drug supply system using optimization problems. At issue is the relationship between the list of drugs and their quantitative distribution, formulated so that the degree of balance between supply and demand is taken into account. The problem lies in insufficient coordination of drug flows, imbalances in stocks and inefficient distribution of resources; these factors increase costs and reduce the availability of vital drugs for end consumers. Effective management of the nomenclature-volume balance makes it possible to avoid shortages and excess stocks and to increase the sustainability of the drug supply system, ensuring optimal stocks and availability of drugs. The main attention is paid to the use of optimization problems, with expert assessment of their parameters, in managing the digital interaction of suppliers and consumers, which increases the accuracy of controlling the assortment and demand. Here, control means minimizing shortages or excess stocks, guaranteeing the availability of the necessary drugs to the end consumer. The results of the study were used to develop an intelligent subsystem for supporting management decisions, promoting balanced resource management and increasing drug availability.
Keywords: organizational system, drug provision, management, optimization, expert assessment
DOI: 10.26102/2310-6018/2025.49.2.025
One of the key issues in organizing information security is assessing compliance with infrastructure protection requirements, as well as responding to current threats and risks; this assessment is provided by an appropriate audit. Domestic and international standards specify various methods for conducting an information security audit and provide conceptual models for constructing the assessment process. However, these standards have drawbacks: they cannot be deeply adapted to individual information systems, and they partially or completely lack a numerical assessment of security parameters, which can undermine the objectivity of the assessment and fail to reflect real threats. In turn, adapting numerical methods to the analysis of the maturity level of information security processes makes it possible to solve a number of important problems, for example, automating the assessment process, providing a more accurate indicator for identifying vulnerable components of the information infrastructure, and integrating the obtained values with other processes aimed at neutralizing current security threats from intruders. The purpose of this work is to analyze the possibility of using a numerical assessment of the maturity level of information security, as well as the use of fuzzy sets in the audit.
Keywords: information security, audit, maturity level assessment, information security tools, numerical assessment, fuzzy sets, fuzzy logic, security criteria, risks
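A minimal sketch of how fuzzy sets can attach a numerical maturity assessment to an audit score is shown below. The maturity terms, their breakpoints and the 0–5 scale are illustrative assumptions, not taken from any standard or from the article.

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Illustrative maturity terms on a 0..5 scale (hypothetical breakpoints).
LEVELS = {
    "initial":   (0.0, 0.0, 2.0),
    "managed":   (1.0, 2.5, 4.0),
    "optimized": (3.0, 5.0, 5.0),
}

def maturity_memberships(score):
    """Degree of membership of a numeric audit score in each maturity term."""
    return {name: tri(score, *abc) for name, abc in LEVELS.items()}
```

A score near a band boundary then belongs partially to two terms at once, which is exactly the graded, non-binary assessment that a purely checklist-based audit cannot express.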
DOI: 10.26102/2310-6018/2025.49.2.015
This paper examines the features of building digital identity systems for managing information technology processes in an enterprise, with an architecture that relies on decentralized data registries (blockchains). The paper considers blockchains as weighted graphs and formulates a number of statements about the specifics of how such distributed networks function in real information technology enterprises. The features of various network topologies are considered, along with possible architectural vulnerabilities and flaws that can affect the operation of the entire network: centralization of mining, centralization of staking, and various attacks on a functioning network (topological attacks and the 51 % attack). Blockchains using various consensus algorithms are considered, taking their features into account. The paper formulates the problem of finding a minimum cover in a graph and emphasizes the importance of applying it to the described digital identity system in order to increase the reliability of the blockchain network by analyzing its topology. Various methods of finding a minimum cover in a graph are considered: exact and heuristic algorithms. The paper analyzes an application that implements the ant colony algorithm to solve the problem, and provides numerical characteristics of the algorithm and its formal description.
Keywords: digital identity system, blockchain, distributed systems, graphs, minimum coverage search
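A minimal sketch of an ant colony heuristic for the minimum vertex cover problem is given below. The pheromone update scheme, the degree-based desirability heuristic and all parameter values are illustrative assumptions and do not reproduce the application analyzed in the article.

```python
import random

def ant_colony_vertex_cover(edges, n_ants=10, n_iter=50, rho=0.1, seed=1):
    """Heuristic minimum vertex cover via a simple ant colony scheme.

    Pheromone is kept per vertex; each ant builds a cover by repeatedly
    picking a vertex with probability proportional to its pheromone times
    the number of still-uncovered incident edges.
    """
    rng = random.Random(seed)
    vertices = sorted({v for e in edges for v in e})
    tau = {v: 1.0 for v in vertices}   # pheromone trail per vertex
    best = set(vertices)               # start from the trivial cover
    for _ in range(n_iter):
        for _ in range(n_ants):
            uncovered = set(edges)
            cover = set()
            while uncovered:
                cand, weights = [], []
                for v in vertices:
                    if v in cover:
                        continue
                    deg = sum(1 for e in uncovered if v in e)
                    if deg:
                        cand.append(v)
                        weights.append(tau[v] * deg)
                v = rng.choices(cand, weights)[0]
                cover.add(v)
                uncovered = {e for e in uncovered if v not in e}
            if len(cover) < len(best):
                best = cover
        # evaporate everywhere, deposit on the best cover found so far
        for v in vertices:
            tau[v] = (1 - rho) * tau[v] + (rho if v in best else 0.0)
    return best
```

For a star graph the heuristic quickly concentrates pheromone on the hub, mirroring the article's point that topology analysis highlights the vertices most critical to network reliability.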
DOI: 10.26102/2310-6018/2025.49.2.022
One of the important tasks of statistical analysis is testing statistical hypotheses, and within this group the most promising subgroup is that of nonparametric rank criteria, which are very robust for work with small samples, when the hypothetical distribution law cannot be reliably justified. This, in turn, makes it necessary to abandon asymptotic approximations and to have exact critical values of the criteria (or, in modern literature, p-values). At present, analytical solutions are available only for a very limited class of criteria (the sign, Wilcoxon, runs, and Ansari-Bradley tests). For all others, an exact solution requires a computerized enumeration of a huge number of possible permutations of ranks. The focus of the present work is the creation of a universal algorithm for obtaining exact distributions of rank criteria quickly. The algorithm, implemented in the open-source programming languages C++, JavaScript and Python, is based on a well-known combinatorics problem, permutations with repetitions, adapted to the task of hypothesis testing by rank criteria. The following criteria are considered: Kruskal-Wallis, Mood, Lehmann-Rosenblatt, as well as the group of normal-scores criteria: Fisher-Yates, Capon, Klotz, and van der Waerden. The algorithm is also adapted to other possible ranking problems of nonparametric statistics.
Keywords: statistical hypothesis testing, nonparametric criteria, rank criteria, exact distributions of rank criteria, permutations with repetitions, permutation algorithms, C++ programs for permutations
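The combinatorial idea behind the approach, enumerating permutations with repetitions of group labels over the ranks, can be sketched for the two-sample rank-sum statistic as follows. The restriction to the rank-sum case and the function names are illustrative; the article's algorithm is more general and more heavily optimized.

```python
from itertools import combinations
from math import comb

def exact_rank_sum_distribution(n, m):
    """Exact null distribution of the rank-sum statistic for samples n, m.

    Under H0 every assignment of the n 'first-sample' labels to the
    N = n + m ranks is equally likely -- a permutation with repetitions
    of n identical labels of one kind and m of the other.
    """
    N = n + m
    counts = {}
    for positions in combinations(range(1, N + 1), n):
        w = sum(positions)                 # rank sum of the first sample
        counts[w] = counts.get(w, 0) + 1
    total = comb(N, n)
    return {w: c / total for w, c in sorted(counts.items())}

def exact_p_value(w_obs, n, m):
    """One-sided exact p-value P(W >= w_obs) under H0."""
    dist = exact_rank_sum_distribution(n, m)
    return sum(p for w, p in dist.items() if w >= w_obs)
```

Because only C(N, n) label arrangements exist rather than N! raw permutations, enumerating the multiset directly is what keeps exact small-sample distributions computationally feasible.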
DOI: 10.26102/2310-6018/2025.49.2.019
The purpose of this article is to assess potential cybersecurity threats arising from the development of quantum algorithms. The text analyzes existing quantum algorithms, such as Shor's algorithm and Grover's algorithm, and explores their potential application to compromising existing cryptographic systems. The research approach includes a literature review and an examination of the core mechanisms underlying quantum computers, along with an assessment of their capability to run algorithms potentially affecting various cryptographic systems, both symmetric and asymmetric. Additionally, the paper discusses the prospects for developing quantum-resistant cryptographic algorithms aimed at protecting against cryptanalysis based on quantum computations. Based on the analysis of existing quantum algorithms and their potential impact on widely used cryptographic systems, the authors conclude that, at present, there is no compelling evidence of a real possibility of compromising asymmetric or symmetric cryptographic algorithms with quantum computations in the near future. However, given the ongoing development of quantum technologies and the need to protect information whose confidentiality must remain assured far into the future, quantum-resistant cryptographic methods need to be developed and actively deployed now.
Keywords: post-quantum cryptography, Shor's algorithm, Grover's algorithm, asymmetric cryptography, symmetric cryptography, quantum computers, preservation of information confidentiality
DOI: 10.26102/2310-6018/2025.49.2.017
This paper investigates the dynamics of interaction between two species competing for a limited resource using a mathematical model that is an autonomous system of ordinary differential equations in normal form. The model is based on Gause's principle, Volterra's hypotheses, Tilman's theory of resource competition, and the Michaelis-Menten equation to describe population growth. The system of nonlinear ordinary differential equations is analyzed for stability at stationary points using the first approximation analytical method proposed by A.A. Lyapunov, which is suitable for the study of systems consisting of two or more equations, and analytically and numerically solved for various values of model parameters. The results show that species survival and coexistence depend on the level of the limiting resource, the ratio of fertility and mortality rates and intraspecific competition, and substrate concentration. Numerical simulations correspond to scenarios of extinction of one species, dominance of one species, or their coexistence depending on environmental conditions. The results obtained in this work are consistent with natural ecological relationships and emphasize the importance of considering anthropogenic factors, such as eutrophication, when predicting changes in ecological systems.
Keywords: population dynamics, limiting resource, mathematical model, Lyapunov method, simulation, eigenvalues, stability of equilibrium state
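The dynamics described above can be sketched numerically. The following explicit Euler integration of two species with Michaelis-Menten growth competing for one supplied resource is a simplified illustration: the parameter values, dilution-rate formulation and unit yield coefficients are assumptions, not the article's model or numbers.

```python
def simulate_competition(mu, K, d, S_in, D, dt=0.01, T=200.0):
    """Euler integration of two species competing for one limiting resource.

    Growth of species i follows Michaelis-Menten kinetics mu[i]*S/(K[i]+S);
    the substrate S is replenished at dilution rate D toward inflow S_in
    and consumed in proportion to growth (yield coefficients taken as 1).
    """
    n = [1.0, 1.0]   # initial population densities
    S = S_in         # initial substrate concentration
    for _ in range(int(T / dt)):
        g = [mu[i] * S / (K[i] + S) for i in range(2)]
        dS = D * (S_in - S) - g[0] * n[0] - g[1] * n[1]
        for i in range(2):
            n[i] += dt * n[i] * (g[i] - d[i])
        S = max(S + dt * dS, 0.0)
    return n, S
```

With the break-even resource level R*_i = d_i * K_i / (mu_i - d_i), Tilman's theory predicts that the species with the lower R* excludes its competitor; choosing K = [1.0, 4.0] with equal mu and d makes species 1 the winner, and the simulation reproduces this competitive exclusion.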
DOI: 10.26102/2310-6018/2025.49.2.014
The insufficient explainability of machine learning models has long constituted a significant challenge in the field. Specialists across various domains of artificial intelligence (AI) application have endeavored to develop explicable and reliable systems. To address this challenge, DARPA formulated a contemporary approach to explainable AI (XAI). Subsequently, Bellucci et al. expanded DARPA's XAI concept by proposing a novel methodology predicated on semantic web technologies. Specifically, they employed OWL2 ontologies for the representation of user-oriented expert knowledge. This system enhances confidence in AI decisions through the provision of more profound explanations. Nevertheless, XAI systems continue to encounter difficulties when confronted with incomplete and imprecise data. We propose a novel approach that utilizes fuzzy logic to address this limitation. Our methodology is founded on the integration of fuzzy logic and machine learning models to imitate human thinking. This new approach more effectively interfaces with expert knowledge to facilitate deeper explanations of AI decisions. The system leverages expert knowledge represented through ontologies, maintaining full compatibility with the architecture proposed by Bellucci et al. in their work. The objective of this research is not to enhance classification accuracy, but rather to improve the trustworthiness and depth of explanations generated by XAI through the application of "explanatory" properties and fuzzy logic.
Keywords: explainable artificial intelligence, explainability, ontology, fuzzy system, fuzzy clustering
DOI: 10.26102/2310-6018/2025.49.2.011
The purpose of the study is to develop a methodology for cognitive determination of the parameters of medical halftone images based on dual spectral scanning methods. A mathematical model of radiopaque images of vessels is described, and based on this model a method for determining vessel parameters using spectral scanning is developed. The model represents oriented brightness differences using Walsh functions. Convolving the vessel model with wavelets based on the first Walsh functions yields extrema at the points of brightness differences, which serve as an informative parameter for the presence of a vessel contour. Information from many such parameters in a local area is aggregated into an averaged characteristic of that area. This significantly decreases the influence of noise on the final result, at the cost of an acceptable decrease in the resolution of localizing significant arterial occlusions. The averaged convolution results are recommended to be calculated using a two-dimensional spectral Walsh transform in a sliding window with subsequent frequency selection. The method is illustrated by classifying the boundary contour of a vessel model and of a real radiopaque image of an artery with a high noise level. Theoretical and practical approaches to detecting artery contours are compared. Experimental studies of the proposed method have shown that informative parameters can be estimated even when analyzing images with unsatisfactory contrast and a low signal-to-noise ratio. The use of the dual spectral scanning method in systems for automatic analysis of radiopaque angiographic images makes it possible to obtain informative parameters under high image noise.
Keywords: spectral analysis, informative parameters, image of a vessel, radiopaque angiography, Walsh functions
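The core idea, that convolution with a Walsh-based wavelet produces an extremum at a brightness step, can be sketched in one dimension. The four-point kernel, the 1-D setting and the function names are simplifying assumptions; the article works with two-dimensional sliding-window Walsh transforms.

```python
def convolve_valid(signal, kernel):
    """Plain 'valid' 1-D sliding-window correlation (no kernel flipping)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# First non-constant Walsh function sampled on 4 points, used as an edge
# wavelet: its response has an extremum where the local brightness steps.
WALSH_WAVELET = [1, 1, -1, -1]

def edge_position(profile):
    """Index of the strongest brightness difference in a 1-D profile."""
    response = convolve_valid(profile, WALSH_WAVELET)
    peak = max(range(len(response)), key=lambda i: abs(response[i]))
    return peak + len(WALSH_WAVELET) // 2   # centre of the window
```

Because the kernel sums each half of the window before differencing, local noise is partially averaged out, which is the same noise-robustness mechanism the abstract attributes to aggregating many local parameters.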
DOI: 10.26102/2310-6018/2025.49.2.013
In the context of increasing informatization of various production areas, when most technological processes and information flows are automated and controlled by computer technology, the choice of measures to ensure the security of information (SI) of critical information infrastructure objects (CIIO) becomes a pressing issue. The article discusses existing methods and approaches to assessing the risk of implementing SI threats to CIIO, which include automated process control systems, information systems, and information and telecommunication networks. These approaches help SI specialists assess the risks associated with possible cyberattacks and data leaks. A method is proposed for quantitatively assessing the degree of danger of implementing SI threats based on the intelligent analysis of data stored in the CIIO logging subsystem. The method allows for a quantitative assessment of the degree of danger of implementing SI threats by potential violators with respect to a specific CIIO. The developed method complements the available assessments of SI specialists by forming expert assessments from additionally involved specialists - professionals in the field of technological processes and information flows of CIIO. The results of the study are recommended for use in modeling SI threats and developing requirements for information security tools in the CIIO.
Keywords: information security, critical information infrastructure, automated control system, technological process, threat, violator, potential, danger of threat realization, risk, damage
DOI: 10.26102/2310-6018/2025.49.2.012
The widespread use of additive technologies makes the creation and implementation of optimal bio-inspired designs highly relevant, because a number of technological restrictions on geometry and the shaping of surfaces are removed. This article presents the results of developing control system algorithms for an articulated robot operating as part of technological equipment for multi-axis printing of parts by fused deposition modeling. For non-solid filling of the internal volume of parts, a bio-inspired tree-like structure was chosen and formally described as a fractal in the trajectory planning problem. The geometry of the printed object is represented in a cylindrical coordinate system, from which a layer-by-layer trajectory consisting of a set of concentric circles can be created using a simplified coordinate recalculation procedure. The results of this work form part of a hardware and software complex in a robotic cell for manufacturing parts from PLA and ABS thermoplastics. Trajectory planning is carried out in a simulator whose program code is written in C and uses the Raylib library for mathematical operations with vectors, matrices and quaternions. The robot's movement along the planned trajectory is controlled by an STM32H743VIT6 microcontroller running the FreeRTOS real-time operating system.
Keywords: additive manufacturing, bio-inspired structures, tree-like fractal, six-axis articulated robot, kinematics simulation, trajectory planning
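The concentric-circle layer planning described above can be sketched as follows: points are generated in cylindrical coordinates (r, phi, z) and recalculated to Cartesian coordinates. The radial step, sampling density and function name are illustrative assumptions, not the article's planner.

```python
import math

def layer_trajectory(r_max, dr, z, points_per_circle=12):
    """Concentric-circle fill of one layer, planned in cylindrical coordinates.

    Returns a list of (x, y, z) points: circles of radius dr, 2*dr, ...,
    up to r_max, each sampled uniformly in the angular coordinate phi and
    converted to Cartesian coordinates for the robot controller.
    """
    path = []
    r = dr
    while r <= r_max + 1e-9:
        for k in range(points_per_circle):
            phi = 2.0 * math.pi * k / points_per_circle
            path.append((r * math.cos(phi), r * math.sin(phi), z))
        r += dr
    return path
```

Keeping the plan in cylindrical coordinates makes the layer-to-layer recalculation trivial: only z changes between layers, while r and phi sampling are reused.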
DOI: 10.26102/2310-6018/2025.49.2.002
This study examines different ways to optimize a system designed to generate source code from an image. The system consists of two parts: an autoencoder that processes images and extracts the necessary features from them, and a text-processing component based on LSTM blocks. Recently, many new approaches have been proposed for improving both image processing and text processing and prediction. In this study, ResNet architectures were chosen to improve the image processing part, and the Transformer architecture to improve the text prediction part. In the experiments, systems consisting of various combinations of the original architecture, ResNet and Transformers were compared, and prediction quality was evaluated using the BLEU and chrF++ metrics as well as functional tests. The experiments showed that the combination of ResNet and Transformer architectures gives the best result in generating source code from an image, but this combination also requires the longest training time.
Keywords: code generation, image, machine learning, ResNet, transformers
DOI: 10.26102/2310-6018/2025.49.2.007
The relevance of the study is determined by the need to increase the level of self-organization of urban systems through the involvement of the population in the processes of management and optimization of infrastructure, which corresponds to the concept of «The right to the city». In this regard, this article aims to identify effective methods of organizing feedback between residents and city authorities through multiplatform online surveys with geospatial reference. The leading approach to the study of this problem is the development of a client-server system that combines a web client, a Telegram bot and other platforms, which allows for a comprehensive review of the features of data collection, analysis and visualization in real time. The article presents the architecture and functionality of the system, reveals the principles of its operation, identifies the advantages of a multiplatform approach compared with traditional survey methods, and substantiates the importance of geospatial mapping for localization of problem areas. It has been experimentally confirmed that using multiple channels of interaction increases the activity of respondents and the representativeness of data: 6022 publications from 94 participants were collected in four months. The materials of the article are of practical value for city administrations, researchers in the field of urban studies and developers of civic engagement platforms focused on creating adaptive management systems for the urban environment.
Keywords: system management, feedback, multi-agent system, self-organization, urban studies
DOI: 10.26102/2310-6018/2025.48.1.041
This article presents an algorithm for calculating the time parameters and optimizing the resources of a network graph whose activity durations are estimated by an expert group as fuzzy triangular numbers. To account for the variation in expert assessments, the examination results are first summarized as fuzzy interval-valued numbers and then converted into fuzzy triangular numbers based on the risk attitude of the decision maker. The use of fuzzy interval-valued numbers makes it possible to take into account not only the uncertainty of expert opinions regarding activity durations, but also the differences among expert opinions when forming the membership function of the fuzzy triangular numbers. The network planning algorithm is based on the classical critical-path algorithm, with special methods for calculating the early and late times of events when activity durations are given as fuzzy triangular numbers. Instead of the maximum and minimum operations in finding the early and late times of events, a probabilistic comparison of fuzzy numbers is used. Based on the calculated fuzzy triangular estimates of the early and late occurrence of events, fuzzy estimates of the early and late start and finish of each activity are computed, along with the probability that each activity is being performed at each moment of time. The resulting probabilities make it possible to estimate the project's resource requirements at any given time. The paper also proposes a mathematical model for optimizing the project's resource profile by shifting the start of each activity between its early and late start times.
Keywords: network graph of the project, fuzzy triangular and interval-valued representation, duration of the project work, fuzzy time parameters of the project work, resource optimization of the project
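Two ingredients of such an algorithm, addition of triangular fuzzy durations along a path and pairwise comparison of fuzzy times, can be sketched as follows. The possibility index used here is a standard comparison of triangular fuzzy numbers and stands in for the article's probabilistic comparison, which may differ; a triple (l, m, r) denotes the left endpoint, modal value and right endpoint.

```python
def tri_add(a, b):
    """Sum of two triangular fuzzy numbers (l, m, r), componentwise."""
    return tuple(x + y for x, y in zip(a, b))

def tri_ge_possibility(a, b):
    """Possibility that triangular fuzzy number a >= b.

    Returns 1 when the modal value of a is at least that of b; 0 when the
    supports do not overlap in a's favor; otherwise the height of the
    intersection of a's falling side with b's rising side.
    """
    if a[1] >= b[1]:
        return 1.0
    if b[0] >= a[2]:
        return 0.0
    return (a[2] - b[0]) / ((a[2] - a[1]) + (b[1] - b[0]))
```

In a fuzzy critical-path computation, `tri_add` accumulates durations along a path, while an index such as `tri_ge_possibility` replaces the crisp max/min when choosing among predecessor events.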
DOI: 10.26102/2310-6018/2025.49.2.004
This work focuses on the development of a C to Promela translator for the automated verification of programs written by programming students. The goal is to create a tool that allows checking the correctness of intermediate program execution steps using Model Checking. The proposed translator is geared towards sequential programs with a single main function, operating on integers and arrays. It analyzes the student’s C code, input data range requirements, and a hierarchical LTL specification (goal tree) that describes the expected program behavior. The translation process utilizes clang to construct the syntax tree and creates additional variables to track array accesses. The generated Promela code contains variable declarations, a main process that includes non-deterministic variable input and the translated C code, and an LTL properties block. The resulting Promela model is verified using Spin against the LTL properties specified by the instructor. If a violation is detected, a counterexample is generated, demonstrating the program execution trace containing the error. The result of this work is a command-line utility written in Python that generates a .pml file with Promela code and LTL properties, as well as a .json file containing the annotated goal tree and the counterexample. Future plans include automating the generation of LTL properties from natural language requirement descriptions and generating hints for students based on the counterexamples.
Keywords: intelligent tutoring systems, programming learning, formal verification, model checking, code translation
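The shape of the generated Promela model described above (declarations, nondeterministic input, translated body, LTL block) can be illustrated with a small generator. This is a hypothetical sketch of the skeleton only, not the actual translator; the variable names and bounds are invented.

```python
def to_promela(vars_, lo, hi, body, ltl):
    """Emit a minimal Promela skeleton of the kind the abstract describes:
    int declarations, nondeterministic input of each variable in [lo, hi]
    via select, the translated statement body, and an LTL property block."""
    decls = "\n".join(f"int {v};" for v in vars_)
    inputs = "\n  ".join(f"select({v} : {lo} .. {hi});" for v in vars_)
    return (f"{decls}\n"
            f"active proctype main_proc() {{\n  {inputs}\n  {body}\n}}\n"
            f"ltl goal {{ {ltl} }}\n")

model = to_promela(["x", "y"], 0, 10, "y = x + 1;", "<> (y > x)")
print(model)
```

A file produced this way would then be passed to Spin, which reports a counterexample trail if the LTL property is violated.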
DOI: 10.26102/2310-6018/2025.49.2.009
In recent years, there has been a surge of interest among gardeners in growing plants both on farms and at home. The aim of the study is to develop a method for the combined application of neural networks for plant identification from photos and mivar technologies for providing personalized recommendations. A residual convolutional neural network, ResNet20, pre-trained on a plant dataset, is used for image classification. The mivar expert system provides a personalized recommendation based on the growing conditions and the plant parameters determined by the neural network. A model describing the recommendation process is created, which returns the desired result to the user in the form of the plant's name. A method combining neural and mivar networks is developed to generate logically sound plant-care recommendations depending on environmental conditions and user preferences. The experimental results indicate that, to increase image classification accuracy, the number of neural network layers must be increased by about 1.5 times when the number of recognized plants grows from 3 to 9. The combined application of convolutional neural networks and mivar technologies achieves high plant-recognition accuracy and provides high-quality recommendations for users.
Keywords: intelligent system, convolutional neural network, mivar, providing recommendations, mivar networks, mivar expert systems
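The two-stage pipeline described above (neural classification followed by rule-based recommendation) can be sketched in miniature. A classifier stub stands in for ResNet20 inference, and a plain rule table stands in for the mivar knowledge base; all plant names, conditions, and rules are invented for illustration.

```python
# Toy sketch of the neural + mivar pipeline from the abstract.

RULES = {  # hypothetical care rules keyed by (plant, light level)
    ("ficus", "low"): "move closer to a window",
    ("ficus", "high"): "water twice a week",
    ("cactus", "low"): "add a grow lamp",
}

def classify(image_scores):
    # stand-in for ResNet20 inference: pick the label with the top score
    return max(image_scores, key=image_scores.get)

def recommend(image_scores, light):
    plant = classify(image_scores)
    return plant, RULES.get((plant, light), "no rule matched")

print(recommend({"ficus": 0.91, "cactus": 0.09}, "low"))
```

In the real system the rule table is replaced by a mivar network, whose inference engine chains rules rather than performing a single lookup.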
DOI: 10.26102/2310-6018/2025.49.2.010
The problem of efficient automation of visually rich document processing is an important part of computer vision research. This paper is devoted to the development of a computer vision model for region detection in visually rich documents, with an emphasis on receipt processing using reinforcement learning. In the context of the growing volume of paper documentation and the need to automate data processing, efficient identification of key receipt elements (such as amounts, dates, and product names) is becoming especially relevant. The paper presents a model architecture based on convolutional neural networks (CNN), trained on a variety of datasets that include receipt images of different formats and qualities. The information extraction methods and the reinforcement learning algorithm are considered; the latter uses a trimmed loss function and the reinforcement learning loop introduced in SpanIE-Recur. The data preprocessing stages are described, including sample augmentation and image normalization, which help increase detection accuracy. The experimental results show the high efficiency of the proposed model, which achieves high precision and recall in identifying regions of interest. Possible applications of this technology in accounting automation, financial analysis, and electronic document management are also discussed. In conclusion, the importance of further research on improving image processing algorithms and extending the model to other types of documents is emphasized.
Keywords: visually rich document, computer vision, reinforcement learning, object detection, receipt processing, automation, document areas, data preprocessing, electronic document management
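The "trimmed" loss mentioned in the abstract can be illustrated with a minimal sketch: only the fraction of samples with the smallest per-sample losses contributes to the average, which damps the effect of noisy region labels. The keep ratio and the loss values below are assumptions, not the paper's settings.

```python
# Sketch of a trimmed loss: average only the smallest keep_ratio fraction
# of per-sample losses, discarding likely-mislabeled outliers.

def trimmed_loss(per_sample_losses, keep_ratio=0.8):
    kept = sorted(per_sample_losses)[: max(1, int(len(per_sample_losses) * keep_ratio))]
    return sum(kept) / len(kept)

losses = [0.1, 0.2, 0.15, 3.0, 0.05]  # one outlier from a mislabeled receipt region
print(round(trimmed_loss(losses), 3))  # the 3.0 outlier is excluded
```

In training, the same idea is applied per batch, so that gradient updates come from the confidently labeled regions.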
DOI: 10.26102/2310-6018/2025.48.1.045
The article provides an analysis of a number of gas-dynamic processes affecting the efficiency of the gas-dynamic temperature stratification device. The relevance of the study is due to the need for a more accurate description of the processes of gas-dynamic temperature stratification in energy separation devices, which is important for improving the efficiency of heat exchange and aerodynamic systems. This article is aimed at identifying patterns of energy redistribution in the flow, taking into account the Bernoulli law and the Joule-Thomson effect, as well as analyzing their impact on temperature gradients inside the gas-dynamic temperature stratification device. The study employs mathematical modeling conducted within the STAR-CCM+ framework, enabling a thorough exploration of gas flow characteristics, as well as variations in velocity, pressure, and temperature throughout the system. The article presents the results of a numerical experiment, reveals the mechanisms of influence of the main gas-dynamic effects on temperature stratification, identifies key dependencies between the input parameters of the device and the flow characteristics, and substantiates the possibility of targeted optimization of energy separation. Mathematical models are derived, supplemented by equations that take into account the role of Bernoulli's law and the Joule-Thomson effect. The corresponding equations are considered. The materials of the article are of practical value for the development and improvement of energy separation devices, optimization of working processes in gas-dynamic systems and increasing the efficiency of temperature stratification in aerodynamic installations for use in the real sector of the economy.
Keywords: gas dynamic temperature stratification, energy separation device, mathematical modeling, STAR-CCM+, Bernoulli's law, Joule-Thomson effect
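The two effects named in the abstract have standard textbook forms, quoted below for reference; these are the classical relations, not the article's full model, which supplements them with flow-specific terms.

```latex
% Bernoulli's law along a streamline of an incompressible flow
p + \frac{\rho v^2}{2} + \rho g z = \mathrm{const}

% Joule-Thomson coefficient: temperature change with pressure at constant enthalpy
\mu_{JT} = \left( \frac{\partial T}{\partial p} \right)_{H}
```

In an energy separation device, the first relation governs the velocity-pressure exchange between the subsonic and supersonic channels, while the sign of \(\mu_{JT}\) determines whether throttling cools or heats the gas.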
DOI: 10.26102/2310-6018/2025.49.2.008
The relevance of the study is due to the need to improve methods for solving multi-criteria transportation problems, which represent an important class of optimization problems with a wide range of practical applications. Traditional approaches often fail to handle the computational complexity of such problems, while existing heuristic methods require additional adaptation and parameter tuning. In this regard, this paper aims to identify the most effective configurations of evolutionary algorithms for solving multi-criteria transportation problems in terms of both solution quality and speed. The leading approach to studying this problem is the comparative analysis of various configurations of evolutionary algorithms on a large set of test problems (about 85 thousand unique problems with 4 criteria), allowing for a comprehensive examination of the features of each algorithm under different parameters. The paper presents the results of analyzing the effectiveness of about 50 configurations of evolutionary algorithms, reveals patterns of how various parameters influence solution quality and speed, identifies optimal configurations for each type of algorithm, and justifies the advantage of a combined approach to problem-solving. The materials of the paper are of practical value for software developers in the field of logistics and transportation systems, as well as for researchers working on optimization and evolutionary design issues, as they enable the creation of more efficient automated systems for solving multi-criteria transportation problems.
Keywords: optimization, evolutionary algorithms, travelling salesman problem, transportation problem, multicriteria problems
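The kind of configuration being compared (population size, selection scheme, mutation rate, generation count) can be made concrete with a minimal genetic algorithm on a single route-length criterion. This is an illustrative sketch only: the distance matrix is invented, and the real study uses four criteria and far larger instances.

```python
# Minimal GA sketch: truncation selection plus swap mutation on route permutations.
import random

def route_len(route, dist):
    return sum(dist[route[i]][route[(i + 1) % len(route)]] for i in range(len(route)))

def mutate(route):
    r = route[:]
    i, j = random.sample(range(len(r)), 2)
    r[i], r[j] = r[j], r[i]  # swap two cities
    return r

def evolve(dist, pop_size=20, generations=100, seed=0):
    random.seed(seed)
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda r: route_len(r, dist))
        pop = pop[: pop_size // 2]                                # keep the best half
        pop += [mutate(random.choice(pop)) for _ in range(pop_size - len(pop))]
    return min(pop, key=lambda r: route_len(r, dist))

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
best = evolve(dist)
print(route_len(best, dist))
```

Varying `pop_size`, `generations`, and the selection/mutation scheme produces exactly the space of configurations whose quality-versus-speed trade-offs the paper measures.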
DOI: 10.26102/2310-6018/2025.48.1.040
The article presents a study on the application of the nnU-Net (v2) framework for automatic segmentation and classification of liver space-occupying lesions on abdominal computed tomography. The main attention is paid to the effect of batch size and of using data from different contrast phases on the classification accuracy for lesions such as cysts, hemangiomas, carcinomas, and focal nodular hyperplasia (FNH). In the experiments, batch sizes of 2, 3, and 4 were used, as well as data from two contrast phases: arterial and venous. The results showed that the optimal batch size is 3 or 4, depending on the pathology, and that using data from two contrast phases significantly improves the accuracy and sensitivity of space-occupying lesion classification, especially for carcinomas and cysts. The best sensitivity achieved was 100% for carcinomas, 94% for cysts, 81% for hemangiomas, and 84% for FNH. The paper confirms the effectiveness of nnU-Net v2 for medical image segmentation and classification problems and highlights the importance of choosing the right training parameters and data to achieve the best results in medical diagnostics.
Keywords: nnU-Net v2, CT images, liver pathologies, batch size, segmentation, classification, medical images, contrast phases, carcinoma
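The sensitivity figures quoted above are per-class recall values; the computation behind them can be sketched as follows. The labels and counts below are invented for illustration, not taken from the study's data.

```python
# Per-class sensitivity (recall): of all cases truly of class `cls`,
# the fraction the model identified as that class.

def sensitivity(true, pred, cls):
    tp = sum(1 for t, p in zip(true, pred) if t == cls and p == cls)
    fn = sum(1 for t, p in zip(true, pred) if t == cls and p != cls)
    return tp / (tp + fn)

true = ["cyst", "cyst", "carcinoma", "hemangioma", "cyst", "carcinoma"]
pred = ["cyst", "fnh", "carcinoma", "hemangioma", "cyst", "carcinoma"]
print(round(sensitivity(true, pred, "cyst"), 2))       # 2 of 3 cysts found
print(round(sensitivity(true, pred, "carcinoma"), 2))  # both carcinomas found
```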
DOI: 10.26102/2310-6018/2025.49.2.005
The article discusses a method for designing a wireless local area network using a digital twin of a room. It considers the capabilities of a digital twin to simplify the storage and synchronization of data on the structure of the room and the location of devices. It describes the developed building structure, which includes floors storing information on the coordinates and models of access points, user devices, and obstacles. The information on each floor is divided into corresponding layers, which allows quick access to any data by coordinates. The article also describes the implementation of an algorithm for the automated placement of access points in the room. The algorithm is built as a system of "agents", where each access point acts as a separate entity trying to satisfy the specified conditions in its area. Depending on the number of iterations set by the designer, the initial number of access points, the required accuracy, and the imposed constraints, different results can be obtained. The algorithm's output thus allows the designer to evaluate various scenarios and choose the most suitable arrangement of access points in the room, taking all the specified conditions into account. Using the developed tools, the designer can clearly see where the algorithm placed the access points, how the signal from each point propagates through the room, and whether all the conditions for the devices located in the room are met.
Keywords: wireless local area network, wireless local area network design, BIM-technologies, digital twin, automation algorithm, placement of access points
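The agent idea can be sketched in two dimensions: each access point repeatedly steps toward the centroid of the devices that no point currently covers. The coverage radius, step rule, and coordinates below are illustrative assumptions, not the article's actual algorithm, which also accounts for obstacles and signal attenuation.

```python
# Toy sketch of agent-based access point placement on a 2D floor plan.

def covered(dev, aps, radius):
    # a device is covered if some access point lies within `radius` of it
    return any((dev[0] - a[0]) ** 2 + (dev[1] - a[1]) ** 2 <= radius ** 2 for a in aps)

def place(aps, devices, radius=5.0, iterations=50, step=0.5):
    aps = [list(a) for a in aps]
    for _ in range(iterations):
        missed = [d for d in devices if not covered(d, aps, radius)]
        if not missed:
            break
        # the nearest agent moves a fraction of the way toward the uncovered centroid
        cx = sum(d[0] for d in missed) / len(missed)
        cy = sum(d[1] for d in missed) / len(missed)
        a = min(aps, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
        a[0] += step * (cx - a[0])
        a[1] += step * (cy - a[1])
    return aps, all(covered(d, aps, radius) for d in devices)

aps, ok = place([(0.0, 0.0)], [(1, 1), (3, 2), (6, 4)])
print(ok)
```

Running the loop with different iteration counts and initial agent counts yields the alternative placements the designer compares in the digital twin.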
DOI: 10.26102/2310-6018/2025.49.2.026
Based on existing research, the need to integrate virtual reality technologies into the educational process of medical university students studying in the specialty "Therapy" is considered. Existing software solutions in the field of educational medicine are analyzed, a shortage of VR simulators focused on the general therapeutic specialty is revealed. The importance of developing a virtual simulator that allows students to practice therapeutic skills in a realistic and safe virtual environment is substantiated. A comparative analysis of modern VR headsets and controllers is carried out, based on which suitable devices for implementing the task are determined: Oculus Quest 2 and Valve Index. Requirements for the technical characteristics of the equipment are formulated and development technologies are selected: the Unity game engine, the XR Interaction Toolkit and OpenXR libraries, as well as software frameworks for creating 3D models and interactive scenes. The process of designing a virtual environment for a therapist's office is described, including the development of 3D models of medical equipment and a virtual patient using the Blender editor. The game scene has been assembled and mechanics of interaction with the environment have been implemented, such as movement, object grabbing, a dialogue system and a results assessment system. The dialogue system has been developed in C# using the .NET Framework platform. It is expected that the introduction of the VR simulator into the educational process will improve the level of training of student therapists, thanks to the development of professional skills in an interactive and safe virtual environment.
Keywords: virtual reality, therapy, interactive virtual environment, game engines, medical education
DOI: 10.26102/2310-6018/2025.49.2.001
The article presents the results of experimental studies aimed at modeling five basic breathing patterns of newborns using an electrical impedance tomograph and a simplified physical model of the neonatal mediastinum. The study covers such patterns as normal breathing (eupnea), periodic breathing, tachypnea, breathing with retractions, and central apnea. The previously developed simplified physical model of the neonatal mediastinum is equipped with a controlled air filling system, which allows reproducing various volumes and modes of ventilation. Experimental studies confirmed the possibility of modeling and recording each of the five breathing patterns using electrical impedance tomography. The developed technique allows research and testing of new data processing algorithms in the field of electrical impedance tomography of the lungs of newborns. The results confirm that electrical impedance tomography is a promising tool for diagnosing and monitoring respiratory disorders in newborns. The proposed solutions can be used to develop new approaches to the diagnosis and treatment of respiratory diseases in neonatology.
Keywords: electrical impedance tomography, newborns, patterns, diagnostics, monitoring, lungs
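The controlled air filling system drives the model with pattern-specific volume waveforms; three of the five patterns can be illustrated with a toy generator. The rates and amplitudes below are assumed values for illustration, not the parameters of the physical model.

```python
# Illustrative volume waveforms (ml vs. seconds) for three breathing patterns.
import math

def breathing(pattern, t):
    if pattern == "eupnea":        # ~40 breaths/min, normal tidal volume
        return 6.0 * max(0.0, math.sin(2 * math.pi * (40 / 60) * t))
    if pattern == "tachypnea":     # faster and shallower breathing
        return 3.0 * max(0.0, math.sin(2 * math.pi * (90 / 60) * t))
    if pattern == "apnea":         # central apnea: no airflow
        return 0.0
    raise ValueError(pattern)

samples = [breathing("apnea", t / 10) for t in range(10)]
print(max(samples))  # apnea produces a flat zero trace
```

Periodic breathing and breathing with retractions would be composed from these primitives (bursts of eupnea separated by apnea pauses, and asymmetric waveforms, respectively).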
DOI: 10.26102/2310-6018/2025.48.1.043
The article explores the application of an optimization approach to managerial decision-making within a digitalized organizational system for consumer order fulfillment. It is demonstrated that, when constructing a model of interaction between consumers and producers, the characteristics of the human-machine environment elements must be taken into account. Such consideration enables the optimization of the interaction between ergatic and non-ergatic elements based on performance, reliability, and cost indicators. The optimization model is formed by introducing alternative variables that characterize the choice of the number of ergatic elements interacting with a specific non-ergatic element. The extremal requirement is the maximization of the performance of the consumer order fulfillment process in the digitalized organizational system, while the boundary requirements are the specified levels of reliability and cost. A transition to an equivalent unconstrained optimization function is implemented. The algorithmic procedure for managerial decision-making follows the structure of the equivalent optimization function and includes several stages: automatic generation of feasible solutions in a randomized environment, iterative adjustment of the variables, verification of the stopping condition for the iterative process, and expert selection of the final solution.
Keywords: organizational system, digitalization, management, human-machine environment, optimization, expert evaluation
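The transition to an equivalent unconstrained function and the randomized generation of solutions can be sketched with a penalty formulation: performance is maximized, and violations of the reliability and cost bounds are subtracted with a large penalty weight. All numbers, bounds, and the binary encoding below are illustrative assumptions.

```python
# Sketch: penalty-based equivalent objective plus random search over
# binary assignment vectors (which ergatic elements are engaged).
import random

def equivalent_objective(x, perf, rel, cost, rel_min, cost_max, penalty=1e3):
    p = sum(pi for pi, xi in zip(perf, x) if xi)                 # total performance
    r = min((ri for ri, xi in zip(rel, x) if xi), default=0.0)   # weakest-link reliability
    c = sum(ci for ci, xi in zip(cost, x) if xi)                 # total cost
    return p - penalty * max(0.0, rel_min - r) - penalty * max(0.0, c - cost_max)

def random_search(n, score, trials=200, seed=1):
    random.seed(seed)
    return max((tuple(random.randint(0, 1) for _ in range(n)) for _ in range(trials)),
               key=score)

perf, rel, cost = [5, 3, 4], [0.9, 0.6, 0.95], [2, 1, 3]
score = lambda x: equivalent_objective(x, perf, rel, cost, rel_min=0.8, cost_max=5)
best = random_search(3, score)
print(best)
```

In the article's procedure this random stage is followed by iterative refinement of the variables and a final expert choice among the surviving candidates.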
DOI: 10.26102/2310-6018/2025.48.1.044
This article is devoted to the development of software for managing the activities of a large IT company by estimating the start times of individual project tasks and assigning specialists to them. Optimizing the solution of these two interrelated problems is one of the key factors in the effective functioning of an IT company. Beyond industry-specific features such as the differing qualifications of specialists and the need to rework tasks after their completion, a key planning factor is the periodic occurrence of unplanned events that increase project duration (for example, revising certain tasks after agreement with the customer, or new tasks emerging during discussion). This calls for new algorithms that take these nuances into account, and hence for software that implements the main management mechanisms of IT companies and allows a prompt response to random factors that change the previously computed characteristics of an IT project. The software combines a management system, client applications that record everything related to individual tasks (their implementation, changes in customer requirements, corrections, etc.), and a database containing all data on project tasks, their interdependencies, specialists, and so on. As a result, a software structure has been obtained that manages the activities of an IT company by planning the start times of individual tasks, assigning specialists to them, and monitoring execution through subsystems for planning, correction, and estimation of stochastic parameters.
Keywords: project management, IT company management, software, planning, schedule adjustment
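The replanning idea can be made concrete with a minimal earliest-start scheduler over task dependencies, recomputed when an unplanned event stretches one task. The task names, durations, and the delay are invented for illustration; the real system also assigns specialists and tracks stochastic parameters.

```python
# Toy sketch: earliest-start schedule over a dependency graph, then replanning
# after an unplanned event lengthens one task.

def schedule(durations, deps):
    start = {}
    for task in durations:  # tasks listed in topological order
        start[task] = max((start[d] + durations[d] for d in deps.get(task, [])),
                          default=0)
    return start

durations = {"spec": 3, "code": 5, "test": 2}
deps = {"code": ["spec"], "test": ["code"]}
print(schedule(durations, deps)["test"])  # earliest start of "test" is 8

durations["code"] = 7                     # customer requested rework of "code"
print(schedule(durations, deps)["test"])  # replanned start shifts to 10
```

In the described software, such recomputation is triggered by the client applications reporting task changes, and the correction subsystem propagates the new start times.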