ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)


Next issue: №4 (publication date: 09 December 2024)

Journal articles №2 2024

1. Calculating inverse distribution functions: Algorithms and programs [№2, 2024]
Authors: Agamirov L.V. (mmk@mati.ru) - MATI (Russian National Research University), Ph.D; Agamirov V.L. (avl095@mail.ru) - Moscow Aviation Institute (National Research University), Ph.D; Vestyak V.A. (kaf311@mai.ru) - Moscow Aviation Institute (National Research University), Ph.D;
Abstract: In applied problems of mathematical statistics, the exact calculation of some distribution functions of random variables causes significant computational difficulties due to infinite integration limits, the need to minimize objective functions, and the lack of satisfactory approximations. To solve these problems, the authors have obtained exact analytical relations that allow numerical integration of the distribution functions of the variation coefficient and the non-central Student distribution, reducing them to the calculation of single integrals. Inverse distribution functions are determined by minimization based on the Nelder–Mead simplex method. The problem of accurately calculating the numerical characteristics of order statistics is solved in a similar way. The paper describes the developed algorithms and open-source JavaScript programs that implement these computational tasks. The calculations are illustrated by graphs and tables presenting the results of computing non-central Student t-distribution quantiles for sample sizes from 3 to 50, probabilities from 0.01 to 0.99, and confidence probabilities of 0.9, 0.95 and 0.99. The calculation time over the full range of all parameters is no more than 10-15 seconds on an average-performance computer, and the calculation accuracy is about 10⁻⁵. It is noted that the main time expenditure is numerical integration due to the infinite integration limits, while minimization is quick (no more than 20–30 iterations). The paper also presents calculation results for quantiles of the relative variation coefficient for sample sizes of 3–10, general variation coefficients of 0.05, 0.3 and 0.5, and probabilities in the range from 0.01 to 0.99. There are also comparative calculations of the numerical characteristics of normal and Weibull order statistics obtained by direct integration and by the corresponding approximations. The programs under consideration use only the simplest and fairly accurate approximations of standard distributions: the normal distribution, the gamma function, and the incomplete gamma function. The developed algorithms are suitable for a wide class of continuous distributions whose inverse functions have no acceptable approximations.
Keywords: Weibull distribution, normal distribution, order statistics distribution, variation coefficient exact distribution, non-central Student distribution, JavaScript, algorithms and programs
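
The abstract does not include the authors' JavaScript code; as a minimal sketch of the general approach it describes (inverting a distribution function by Nelder–Mead minimization of the residual between the CDF and the target probability), the Python fragment below uses SciPy's non-central t-distribution. The function names, starting point and tolerances are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' JavaScript code): inverting a distribution
# function by Nelder-Mead minimization, illustrated on the non-central
# Student t-distribution using SciPy.
import numpy as np
from scipy import stats, optimize

def inverse_cdf(cdf, p, x0=0.0):
    """Find x such that cdf(x) = p by minimizing the squared residual."""
    objective = lambda x: (cdf(x[0]) - p) ** 2
    res = optimize.minimize(objective, x0=[x0], method="Nelder-Mead",
                            options={"xatol": 1e-8, "fatol": 1e-12})
    return res.x[0]

if __name__ == "__main__":
    df, nc, p = 9, 1.5, 0.95            # degrees of freedom, non-centrality, probability
    q = inverse_cdf(lambda x: stats.nct.cdf(x, df, nc), p, x0=nc)
    print(q, stats.nct.ppf(p, df, nc))  # the two values should agree to roughly 1e-5
```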

2. Global optimization based on hybridization of locust swarm and spider colony algorithms [№2, 2024]
Authors: Rodzin, S.I. (srodzin@sfedu.ru) - Southern Federal University (Associate Professor, Professor), Ph.D;
Abstract: Metaheuristics inspired by nature are a promising approach to global optimization problems. They are non-deterministic algorithms that explore the solution search space and learn during the search; they are not tied to a specific task, although they do not guarantee exact solutions. The purpose of this study is to develop an effective algorithm for solving applied problems of global optimization of multidimensional multi-extremal functions arising in computational phylogenetics, electrical circuit design, building engineering safety calculations, calibration of radio propagation models, and other areas. To achieve this goal, the author proposes a hybrid algorithm that simulates behavior patterns of a locust swarm and a spider colony. The paper focuses on reducing the probability of premature convergence of the hybrid algorithm and on maintaining a balance between the algorithm's convergence rate and the diversification of the solution search space (intensification/diversification). The paper presents the stages of the modified spider colony and locust swarm algorithms that model various patterns of their behavior, which reduces the effect of very good or very bad solutions on the search process. The algorithms are hybridized by their sequential combination (preprocessor/postprocessor). The algorithm was tested on seven well-known multidimensional functions. The results were compared with the competing particle swarm, differential evolution, and bee colony algorithms. The proposed algorithm provides the best results for all considered functions. Verification of the results using the Wilcoxon rank-sum test for independent samples showed that the algorithm's results are statistically significant. The developed software application is intended for use in a university course on machine learning and bioinspired optimization, as well as for solving a wide range of scientific and applied search optimization problems.
Keywords: Wilcoxon test, global optimum, agent, locust swarm, spider colony, behavior pattern, search intensification, search diversification, test function, algorithm
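
The abstract does not give the locust-swarm and spider-colony operators themselves; the Python sketch below only illustrates the sequential preprocessor/postprocessor hybridization scheme and the Wilcoxon rank-sum check, using the Rastrigin test function and simple placeholder search phases standing in for the two bioinspired algorithms. All parameters are illustrative assumptions.

```python
# Illustrative sketch of the sequential (preprocessor/postprocessor) hybridization
# scheme. The real locust-swarm and spider-colony operators are not reproduced
# here; two simple placeholder search phases stand in for them.
import numpy as np
from scipy.stats import ranksums

def rastrigin(x):
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def phase(pop, f, iters, step, rng):
    """Placeholder population-based search: random perturbation with elitism."""
    for _ in range(iters):
        cand = pop + rng.normal(scale=step, size=pop.shape)
        better = np.array([f(c) < f(p) for c, p in zip(cand, pop)])
        pop[better] = cand[better]
    return pop

def optimize(f, hybrid=True, dim=10, pop_size=30, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.12, 5.12, size=(pop_size, dim))
    if hybrid:
        pop = phase(pop, f, iters=200, step=0.5, rng=rng)   # coarse "preprocessor" stage
        pop = phase(pop, f, iters=200, step=0.05, rng=rng)  # fine "postprocessor" stage
    else:
        pop = phase(pop, f, iters=400, step=0.5, rng=rng)   # single-stage baseline
    return min(f(p) for p in pop)

if __name__ == "__main__":
    hybrid_runs = [optimize(rastrigin, hybrid=True, seed=s) for s in range(10)]
    single_runs = [optimize(rastrigin, hybrid=False, seed=s) for s in range(10)]
    print(ranksums(hybrid_runs, single_runs))  # Wilcoxon rank-sum significance check
```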

3. Developing a specialized ontology to represent the economic concept of a cluster [№2, 2024]
Authors: Napolskikh, D.L. (NapolskihDL@yandex.ru) - Volga State University of Technology (Associate Professor), Ph.D;
Abstract: The object of the research is a specialized ontology as a form of representing the economic concept of a cluster for use in intelligent systems. The subject of the research is the hierarchy of concepts (classes) of the “Clusters” domain ontology and the structure of relations between them. The methodological tools of the research are the OWL 2 ontology language, the Protégé ontology editor and framework for building knowledge bases, and software tools for working with ontologies. The paper proposes a block diagram describing the relationship between the tools used in the study. It also describes the sequence of development stages for a specialized ontology representing an economic concept: defining the list of the ontology's main concept classes; forming the taxonomic hierarchy of the subject ontology; developing the structure of composite concepts included in the ontology; and defining the relations between ontology elements. The paper presents a list of the main classes of the “Clusters” domain ontology and the developed taxonomic hierarchy of clusters and cluster-type economic systems. The study also involved arranging the relations used in the ontology. In addition to Protégé's standard universal relations between an object and a class (IsA) and between a subclass and a class (AKO), the paper identifies 17 types of relations necessary for a complete representation of the cluster concept. The proposed ontology of the “Clusters” domain is the basis for the intelligent analysis of various data on clusters within research and cluster policy tasks. The results presented in the paper form the basis for further studies of the integration processes of innovation clusters, digital platforms and ecosystems in the context of regional development management problems.
Keywords: Web Ontology Language, Protégé, thesaurus, specialized ontology, semantic technologies, knowledge graph, region management digitalization, clusterization
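
The paper's ontology is built in OWL 2 with Protégé; the fragment below is a hypothetical Python illustration (using rdflib) of how a small piece of such a class hierarchy and one custom object property could be encoded. The namespace, class and property names are invented for illustration and are not taken from the “Clusters” ontology.

```python
# Hypothetical fragment (not the paper's ontology): encoding a small "Clusters"
# class hierarchy and one custom relation in OWL using rdflib.
from rdflib import Graph, Namespace, RDF, RDFS, OWL

CL = Namespace("http://example.org/clusters#")   # illustrative namespace
g = Graph()
g.bind("cl", CL)

# Taxonomic hierarchy: Cluster and a subclass InnovationCluster.
g.add((CL.Cluster, RDF.type, OWL.Class))
g.add((CL.InnovationCluster, RDF.type, OWL.Class))
g.add((CL.InnovationCluster, RDFS.subClassOf, CL.Cluster))
g.add((CL.Enterprise, RDF.type, OWL.Class))

# One custom relation between ontology elements.
g.add((CL.includesParticipant, RDF.type, OWL.ObjectProperty))
g.add((CL.includesParticipant, RDFS.domain, CL.Cluster))
g.add((CL.includesParticipant, RDFS.range, CL.Enterprise))

print(g.serialize(format="turtle"))
```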

4. Applying deep learning in brain-computer interfaces for motion recognition [№2, 2024]
Authors: Pavlenko, D.V. (pavlenkoprog@gmail.com) - V.I. Vernadsky Crimean Federal University (Programmer); Tataris, Sh.E. (Tataris.shevkhie1@gmail.com) - V.I. Vernadsky Crimean Federal University (Laboratory Assistant); Ovcharenko, V.V. (rk_vladimir@mail.ru) - V.I. Vernadsky Crimean Federal University (Associate Professor, Head of the Research Center), Ph.D;
Abstract: Promising areas of machine learning application are electroencephalogram (EEG) analysis and the development of neural interfaces, which help people with disabilities and are used in rehabilitation procedures. Artificial neural networks make it possible to significantly simplify the development of such devices. Neural networks enable automatic identification of EEG patterns associated with certain states and complex signal processing in real time. One of the important aspects of creating neural interfaces is developing software capable of detecting and classifying user movements. Within the framework of this study, the authors solved the task of classifying patterns of EEG sensorimotor rhythms associated with imagined and real voluntary movements of the upper extremities. The developed model is similar to the EEGNet architecture and is designed for real-time EEG analysis using a domestically produced NVX52 encephalograph. For this purpose, the authors collected two datasets with recordings of healthy people using an incomplete international 10-10 electrode placement scheme with 32 channels. During the study, the authors achieved a classification accuracy of up to 80% for a number of imagined movements and up to 78% for recognizing signs of real movements, exceeding the recognition rates of the original model by more than 10%. In the future, it is planned to use this model in corrective training with a complex consisting of a non-invasive brain-computer interface and hand exoskeletons. Such complexes are used as part of rehabilitation measures for children with cerebral palsy.
Keywords: sensorimotor rhythm, convolutional neural network, deep learning, EEG, neurointerface, brain-computer interface
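
The exact layer configuration of the authors' EEGNet-like model is not given in the abstract; the PyTorch sketch below shows a simplified EEGNet-style classifier for 32-channel EEG epochs (temporal convolution, depthwise spatial convolution across channels, pooling, linear classifier). The filter counts, kernel sizes and number of classes are illustrative assumptions, not the authors' configuration.

```python
# Simplified EEGNet-style classifier sketch in PyTorch (illustrative sizes,
# not the authors' configuration). Input shape: (batch, 1, channels, time_samples).
import torch
import torch.nn as nn

class SimpleEEGNet(nn.Module):
    def __init__(self, n_channels=32, n_samples=512, n_classes=4, f1=8, depth=2, f2=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, f1, (1, 64), padding=(0, 32), bias=False),              # temporal conv
            nn.BatchNorm2d(f1),
            nn.Conv2d(f1, f1 * depth, (n_channels, 1), groups=f1, bias=False),   # depthwise spatial conv
            nn.BatchNorm2d(f1 * depth),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),
            nn.Dropout(0.5),
            nn.Conv2d(f1 * depth, f2, (1, 16), padding=(0, 8), bias=False),      # second temporal conv
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),
            nn.Dropout(0.5),
        )
        with torch.no_grad():
            n_flat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_flat, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(start_dim=1))

if __name__ == "__main__":
    model = SimpleEEGNet()
    out = model(torch.randn(2, 1, 32, 512))   # two fake 32-channel EEG epochs
    print(out.shape)                          # -> torch.Size([2, 4])
```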

5. Compression methods for tabular data: Comparative analysis [№2, 2024]
Authors: Garev, K.V. (garev.kv@gmail.com, kv@garev.ru) - Russian Center for Science Information, Joint Supercomputer Center of RAS (Head of the Department, Research Associate);
Abstract: This paper presents a comparative analysis of tabular data compression methods used within the Research Data Infrastructure platform (RDI Platform) operated by the Russian Center for Science Information. The main purpose of the study was to identify the most suitable data compression methods that can be integrated into the RDI Platform to enhance data exchange and management functionality. The author has carried out a thorough analysis of the five most popular data compression methods available for implementation with Python software tools: Deflate (gzip), LZMA, Bzip2, Brotli, and Snappy. The author has analyzed the advantages and disadvantages of each compression technology, taking into account the specifics of tabular data processing, including compression ratio, processing speed, and the degree of information preservation. The results of this study can contribute to the practice of exchanging, storing, processing, and analyzing tabular data within the RDI Platform. Particular attention was paid to analyzing the advantages and disadvantages of each compression method, which made it possible to form recommendations for choosing the technologies that best meet strict requirements for performance, compression efficiency, and data preservation reliability. An important aspect of the study is its focus on optimizing processes within the RDI Platform, which increases its efficiency as a tool for working with scientific data. Moreover, this work helps deepen the understanding of the potential application of modern data compression methods in scientific data exchange practice and opens prospects for further research and development in this area.
Keywords: comparative analysis, Snappy, Brotli, Bzip2, LZMA, Deflate, compression algorithms, compression methods
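
As a minimal sketch of such a comparison, the Python fragment below measures compression ratio and time for the standard-library codecs (Deflate via gzip, LZMA, Bzip2) and, when the third-party brotli and python-snappy packages are installed, for Brotli and Snappy as well. The input file name is a placeholder, and the metrics are simpler than those analyzed in the paper.

```python
# Minimal benchmark sketch: compare compression ratio and time for the codecs
# discussed in the paper. "data.csv" is a placeholder input file.
import bz2, gzip, lzma, time

codecs = {
    "Deflate (gzip)": gzip.compress,
    "LZMA": lzma.compress,
    "Bzip2": bz2.compress,
}
try:
    import brotli                      # pip install brotli
    codecs["Brotli"] = brotli.compress
except ImportError:
    pass
try:
    import snappy                      # pip install python-snappy
    codecs["Snappy"] = snappy.compress
except ImportError:
    pass

with open("data.csv", "rb") as f:
    data = f.read()

for name, compress in codecs.items():
    start = time.perf_counter()
    packed = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:15s} ratio={len(data) / len(packed):6.2f} time={elapsed:.3f}s")
```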

6. The problem of cognitive representation of the design space of multilayer radiation shields: Information compression algorithms [№2, 2024]
Authors: Zinchenko L.A. (lyudmillaa@mail.ru) - Bauman Moscow State Technical University (Professor), Ph.D; Kazakov V.V. (kazakov.VADIM.2012@yandex.ru) - Bauman Moscow State Technical University (Master Student); Karyshev, B.V. (boris.karyshev@gmail.com) - Bauman Moscow State Technical University (Student);
Abstract: The paper focuses on the application of dimensionality reduction algorithms to the design of multilayer radiation shields that protect electronic equipment in outer space. The considered algorithms project the studied data on multilayer shields from a high-dimensional space into a low-dimensional space while preserving data semantics, which allows visualizing large sets of high-dimensional information, simplifies visual analysis by the user, and enables the application of some algorithms and approaches in an automated mode. The paper analyzes the application of the following dimensionality reduction algorithms: principal component analysis (PCA), kernel principal component analysis (KernelPCA), t-distributed stochastic neighbour embedding (t-SNE), uniform manifold approximation and projection (UMAP), autoencoder (AE), and variational autoencoder (VAE). For the neural network compression approaches, the paper presents the network architectures used in computation and testing. Moreover, following the proposed methodology, the authors investigate the feasibility of combining several dimensionality reduction algorithms applied in a chain. Based on the conducted research, the authors draw conclusions about the efficiency of the mentioned algorithms, as well as of their combinations, for further processing or visualization. There is a brief description of the software that implements one of the proposed approaches to analyzing and processing information about multilayer radiation shields for electronic equipment used in outer space. Based on the conducted research, the UMAP algorithm is recommended. To analyze configurations with a sufficiently large number of parameters, it is recommended to use the t-SNE algorithm with pre-compression by the UMAP algorithm, which simplifies the initial data set and thus improves the t-SNE result.
Keywords: projection, dimensionality reduction, data compression, cognitive visualization
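
A minimal sketch of the recommended combination (UMAP pre-compression followed by t-SNE), shown next to a plain PCA baseline, can be written with scikit-learn and the umap-learn package. The synthetic array below merely stands in for a set of shield configuration parameters and is not the authors' data.

```python
# Sketch of the recommended chain: UMAP pre-compression followed by t-SNE,
# compared with plain PCA. Synthetic data stands in for shield configurations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE
import umap  # pip install umap-learn

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))        # 500 hypothetical shield configurations, 40 parameters

# Baseline: linear projection to 2-D with PCA.
X_pca = PCA(n_components=2).fit_transform(X)

# Recommended combination: UMAP compresses to an intermediate dimension,
# then t-SNE produces the final 2-D embedding.
X_mid = umap.UMAP(n_components=10, random_state=0).fit_transform(X)
X_2d = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X_mid)

print(X_pca.shape, X_2d.shape)        # (500, 2) (500, 2)
```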

7. Algorithm for segmentation of non-executable files for exploit detection [№2, 2024]
Authors: Arkhipov, A.N. (diskpart111@mail.ru) - Bauman Moscow State Technical University (Teaching Assistant); Kondakov, S.E. (sergeikondakov@list.ru) - Bauman Moscow State Technical University (Associate Professor), Ph.D;
Abstract: The paper solves the applied task of segmenting non-executable files in order to identify information security threats implemented in the form of exploits (malicious code). The relevance of the study is due to the need to solve the scientific problem of detecting exploits, including those created using obfuscation technologies, and to increase the effectiveness of the anti-virus information protection subsystem by increasing the sensitivity and specificity of exploit detection. The aim of the work is to develop an algorithm for segmenting non-executable files that represents them as blocks (fragments) ensuring the maximum probability of exploit elements being contained within them. Segmenting non-executable files enables subsequent in-depth analysis for malicious code not of the entire file, but of its fragments. The subject of the study is the variety of methods, techniques, models and algorithms for segmenting non-executable files in order to identify information security threats implemented in the form of exploits (malicious code). The research uses the scientific methods of analysis, measurement and comparison. The authors have developed a segmentation algorithm that represents a non-executable file as blocks (fragments) of the optimal sizes necessary to identify exploit elements in their composition. The proposed algorithm is based on a mathematical model, developed by the authors, of an exploit embedded in a non-executable file; the model mathematically describes the structure, constituent elements and characterizing indicators. The proposed algorithm can be used to create new methods, techniques, models, algorithms and tools aimed at improving the effectiveness of protecting information from malicious code distributed in the form of exploits, including those created using program code obfuscation technologies.
Keywords: malware, computer virus, anti-virus information protection, obfuscation technologies, segmentation algorithm, exploit detection, information security system, computer attack
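
The optimal block sizes in the paper follow from its mathematical exploit model, which the abstract does not reproduce; the Python fragment below only illustrates the generic idea of splitting a non-executable file into overlapping fixed-size blocks and attaching a simple per-block feature (byte entropy) for later analysis. The block size, overlap and file name are placeholders, not the paper's values.

```python
# Generic illustration only: split a non-executable file into overlapping
# blocks and compute per-block byte entropy as a feature for later analysis.
# Block size and overlap are placeholders, not the paper's optimal values.
import math
from collections import Counter

def byte_entropy(block: bytes) -> float:
    counts = Counter(block)
    n = len(block)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def segment(path: str, block_size: int = 4096, overlap: int = 512):
    with open(path, "rb") as f:
        data = f.read()
    step = block_size - overlap
    for offset in range(0, max(len(data) - overlap, 1), step):
        block = data[offset:offset + block_size]
        yield offset, block, byte_entropy(block)

if __name__ == "__main__":
    for offset, block, entropy in segment("document.pdf"):   # placeholder file name
        print(f"offset={offset:8d} size={len(block):5d} entropy={entropy:.2f}")
```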

8. Psychodiagnostics software with computer vision feedback [№2, 2024]
Authors: Ivaschenko A.V. (anton-ivashenko@yandex.ru) - Samara State Aerospace University, Ph.D; Aleksandrova, M.V. (margarita.alexandrowa@mail.ru) - SEC “Open code” (Project Manager); Zheikov, D.S. (d.s.zhejkov@samsmu.ru) - Samara State Medical University (Lecturer); Mazankina, E.V. (e.v.mazankina@samsmu.ru) - Samara State Medical University (Director of the Center for Psychology at Samara State Medical University); Zakharova, E.V. (e.v.zakharova@samsmu.ru) - Samara State Medical University (Associate Professor, Head of Chair), Ph.D; Kolsanov A.V. (avkolsanov@mail.ru) - Innovative Development Institute of the Samara State Medical University (Professor, Director), Ph.D;
Abstract: The paper presents psychological diagnostic software that takes into account the involvement of respondents when taking tests. The research focuses on managing the depth of user immersion in interactive user interfaces. The paper proposes a formal model of an immersive environment intended to implement a system for managing user immersion in human-computer interaction processes by monitoring and controlling the actions performed in response to emerging events. Such events are real-world effects on a user as well as audiovisual stimuli generated automatically in the immersive environment. Response behavior includes action chains automatically recorded by computer vision and oculography tools. Unlike its analogues, the developed software package for psychological diagnostics and medical rehabilitation implements feedback based on monitoring and controlling patient involvement. For this purpose, the system includes a computer vision subsystem with a video camera and software for video monitoring of the patient's motor activity. The software tracks the total volume of facial and head movements and also identifies the current emotional state using an artificial neural network. The user interface and tests are personalized to ensure high user involvement. The practical significance of the work lies in monitoring the involvement of patients undergoing psychological testing and using this information to personalize medical services. Analyzing the involvement of the software package users makes it possible to supplement psychological testing results and to adapt the sequence and content of tests while maintaining the user's interest and reducing the influence of external factors.
Keywords: accented visualization, immersive reality, user interfaces, automation of psychological testing, human-computer interaction
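
The package's computer vision subsystem itself is not described in code; as a rough sketch of the video-monitoring idea, the OpenCV fragment below detects the face in a webcam stream and accumulates frame-to-frame motion as a crude involvement proxy. The neural-network emotion recognition mentioned in the abstract is not reproduced, and the thresholds and window names are illustrative.

```python
# Rough sketch of the video-monitoring idea: detect the face and accumulate
# frame-to-frame motion as a crude involvement proxy. The paper's neural
# network emotion recognition is not reproduced here.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)              # default webcam
prev_gray, total_motion = None, 0.0

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if prev_gray is not None:
        total_motion += float(cv2.absdiff(gray, prev_gray).mean())  # motion volume proxy
    prev_gray = gray
    print(f"faces: {len(faces)}  accumulated motion: {total_motion:.1f}")
    cv2.imshow("monitoring", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```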

9. Intelligent tutoring system with automatic assignment generation based on the results of using an assignment bank [№2, 2024]
Authors: Sychev, O.A. (oasychev@gmal.com) - Volgograd State Technical University (Associate Professor), Ph.D; Prokudin, A.A. (prokudin@vstu.ru) - Volgograd State Technical University (Postgraduate Student); Denisov, M.E. (denisov@vstu.ru) - Volgograd State Technical University (Postgraduate Student);
Abstract: An educational process includes two feedback loops: learner feedback directs the actual learning of a given individual (the small loop), while teacher feedback about the quality of the learning assignment bank (the big loop) allows improving the learning process over time by increasing the number of assignments, since assignments are often reused and learners can share solutions. Automating the big feedback loop is an important step in developing intelligent tutoring systems, as it relieves course authors of routine work on creating and enhancing learning assignment banks. Modern advances in methods of learning assignment generation, and in their classification for use in the learning process, allow implementing an intelligent tutoring system that can generate new learning assignments in the background when necessary. The paper describes the architecture of an intelligent tutoring system with background learning assignment generation and the CompPrehension system, which implements this architecture. We describe all major functional components of the system (a training aid, a learning assignment bank, an assignment generator, and a domain) and their interaction with the assignment database and external semantic reasoners for solving learning assignments. Practical evaluation of the described approach showed that the system can selectively improve an initially unbalanced learning assignment bank, taking into account the history of assignment requests. In less than 5 days, the system generated several thousand additional learning assignments for each concept with a low assignment count. This makes the probability of receiving the same assignment low, even when testing hundreds of students. The study results are significant for educational institutions seeking to improve the learning process.
Keywords: learning assignment bank, CompPrehension, learning system development, distance education, question generation, intelligent tutoring systems
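
The CompPrehension generation API is not given in the abstract; the schematic Python sketch below only illustrates the idea of the big feedback loop: comparing the per-concept stock of assignments with the request history and planning background generation for under-stocked concepts. The function, thresholds and concept names are invented for illustration.

```python
# Schematic sketch of the "big loop": top up the assignment bank in the
# background for concepts whose stock is low relative to demand.
# The planning rule and names are placeholders, not the CompPrehension API.
from collections import Counter

def plan_generation(bank_counts: dict, request_history: list, min_per_request: int = 5):
    """Return how many extra assignments to generate per concept."""
    demand = Counter(request_history)
    plan = {}
    for concept, requests in demand.items():
        target = requests * min_per_request
        shortfall = target - bank_counts.get(concept, 0)
        if shortfall > 0:
            plan[concept] = shortfall
    return plan

if __name__ == "__main__":
    bank = {"loops": 1200, "operator_precedence": 40}
    history = ["operator_precedence"] * 30 + ["loops"] * 10
    for concept, n in plan_generation(bank, history).items():
        print(f"generate {n} assignments for concept '{concept}' in the background")
```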

10. Digital simulator for training sheet glass forming operators [№2, 2024]
Authors: Meshalkin V.P. (clogist@muctr.ru) - D. Mendeleev University of Chemical Technology of Russia, Ph.D; Chistyakova T.B. (-) - Saint Petersburg State Institute of Technology (Technical University) (Professor), Ph.D; Petrov D.Yu. (iac_sstu@mail.ru) - Yuri Gagarin State Technical University of Saratov, Ph.D;
Abstract: The paper analyzes flat glass forming by the float method and determines the types of abnormal technological process situations. It presents a cluster analysis of a set of emergency situations; each situation is described by a set-theoretic model. To simplify the digital simulator interface, business process models for operator training and certification have been developed. The paper shows the logical and information models of the prospective business processes “Test development” and “Employee certification”, developed in BPMN 2.0 notation based on an analysis of the existing business processes of an enterprise. When developing the architecture, operation modes, and software and information support of the digital simulator, the authors used a model-based systems engineering methodology implemented in the IBM Rhapsody visual modeling environment based on UML and SysML notations. The authors developed UML use case and class diagrams of the digital simulator, a software structure, and a training system database structure that includes criteria for grouping situations, situation characteristics, and functions of system users. The main implemented functions are: training and testing of operators; maintaining a personnel database for the digital simulator; compiling reports on test results; and editing the emergency situation database based on a preliminary analysis of technologists' and operators' experience. The digital simulator is implemented in the PascalABC language, and the results are recorded in a text database. The user interface of the digital simulator has also been developed.
Keywords: emergency situation, class diagram, use case diagram, operator, business process of forming flat glass, business process, training, digital simulator
