ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)


Next issue: No. 3. Publication date: 13 September 2024.

Latest issue articles


1. Calculating inverse distribution functions: Algorithms and programs [No. 2 of the year]
Authors: Agamirov, L.V., Agamirov, V.L., Vestyak, V.A.
In applied problems of mathematical statistics, the exact calculation of some distribution functions of random variables causes significant computational difficulties due to infinite integration limits, the need to minimize objective functions, and the lack of satisfactory approximations. To solve these problems, the authors have obtained exact analytical relations that reduce the distribution functions of the variation coefficient and the non-central Student distribution to single integrals suitable for numerical integration. Inverse distribution functions are determined by minimization based on the Nelder–Mead simplex method. The problem of accurately calculating the numerical characteristics of order statistics is solved in a similar way. The paper describes the developed algorithms and open-source JavaScript programs that implement these computational tasks. The calculations are illustrated by graphs and tables presenting non-central Student t-distribution quantiles for sample sizes from 3 to 50, probabilities from 0.01 to 0.99, and confidence probabilities of 0.9, 0.95 and 0.99. The calculation time over the full range of all parameters is no more than 10–15 seconds on an average-performance computer, and the calculation accuracy is about 10⁻⁵. The main time expenditure is numerical integration due to the infinite integration limits, while minimization is quick (no more than 20–30 iterations). The paper also presents calculated quantiles of the relative variation coefficient for sample sizes of 3–10, general variation coefficients of 0.05, 0.3 and 0.5, and probabilities from 0.01 to 0.99, as well as comparative calculations of the numerical characteristics of normal and Weibull order statistics obtained by direct integration and by the corresponding approximations.
The programs under consideration use only the simplest and fairly accurate approximations of standard functions: the normal distribution, the gamma function, and the incomplete gamma function. The developed algorithms are suitable for a wide class of continuous distributions whose inverse functions have no acceptable approximations.
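As a minimal illustration of the inversion technique described above, the sketch below inverts the standard normal CDF by bisection; it stands in for the paper's Nelder–Mead minimization of the non-central Student distribution, and the function names are illustrative rather than taken from the authors' JavaScript programs.

```python
import math

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function (math.erf)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_quantile(p: float, tol: float = 1e-10) -> float:
    """Invert the CDF numerically: find x such that norm_cdf(x) = p."""
    lo, hi = -10.0, 10.0          # probability mass outside is negligible
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

The same shrink-the-bracket idea carries over to any monotone distribution function that can only be evaluated numerically.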

2. Global optimization based on hybridization of locust swarm and spider colony algorithm [No. 2 of the year]
Author: Rodzin, S.I.
A promising approach to global optimization problems is metaheuristics inspired by nature. These are non-deterministic algorithms that explore the solution search space and learn during the search; they are not tied to a specific task, although they do not guarantee exact solutions. The purpose of this study is to develop an effective algorithm for solving applied problems of global optimization of multidimensional multi-extremal functions in computational phylogenetics, electrical circuit design, building engineering safety calculations, calibration of radio propagation models, and other areas. To achieve this goal, the paper proposes a hybrid algorithm that simulates behavior patterns of a locust swarm and a spider colony. The paper focuses on reducing the probability of premature convergence of the hybrid algorithm and on maintaining a balance between the convergence rate and the diversification of the solution search space (intensification/diversification). The paper presents the stages of modified algorithms for a spider colony and a locust swarm that model various patterns of their behavior, which reduces the effect of very good or very bad solutions on the search process. The algorithms are hybridized by sequential combination (preprocessor/postprocessor). The algorithm was tested on seven well-known multidimensional functions. The results were compared with competing algorithms: particle swarm, differential evolution, and bee colony. The proposed algorithm provides the best results for all considered functions. Verification of the results using the Wilcoxon rank-sum test for independent samples showed that they are statistically significant. The developed software application is intended for use in a university course on machine learning and bioinspired optimization, as well as for solving a wide range of scientific and applied search optimization problems.
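The sequential (preprocessor/postprocessor) hybridization scheme can be sketched as follows; broad random sampling and a shrinking-step local search stand in for the locust swarm and spider colony phases, so the sketch shows only the diversification-then-intensification structure, not the paper's algorithm.

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def global_phase(f, dim, n_samples, rng, bound=5.0):
    """Preprocessor stand-in: broad random sampling (diversification)."""
    return min(([rng.uniform(-bound, bound) for _ in range(dim)]
                for _ in range(n_samples)), key=f)

def local_phase(f, x, rng, iters=2000, step=0.5):
    """Postprocessor stand-in: shrinking random perturbations (intensification)."""
    x, fx = list(x), f(x)
    for _ in range(iters):
        cand = [v + rng.gauss(0.0, step) for v in x]
        fc = f(cand)
        if fc < fx:                # keep only improving moves
            x, fx = cand, fc
        step *= 0.999              # gradually shift from exploration to refinement
    return x, fx

rng = random.Random(42)
seed_point = global_phase(sphere, dim=5, n_samples=200, rng=rng)
best_x, best_f = local_phase(sphere, seed_point, rng)
```

The preprocessor supplies a good starting region; the postprocessor refines it, which is the balance the abstract describes.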

3. Developing a specialized ontology to represent the economic concept of a cluster [No. 2 of the year]
Author: Napolskikh, D.L.
The object of the research is a specialized ontology as a form of representing the economic concept of a cluster for use in intelligent systems. The subject of the research is the hierarchy of concepts (classes) of the “Clusters” domain ontology and the structure of relations between them. The methodological tools of the research are the OWL 2 ontology language, the Protégé ontology editor and framework for building knowledge bases, and software tools for working with ontologies. The paper proposes a block diagram describing the relationship of the tools used in the study. It also describes a sequence of development stages for a specialized ontology representing an economic concept: defining the list of the ontology's main classes; forming a taxonomic hierarchy of the subject ontology; developing the structure of composite concepts included in the ontology; defining relations between ontology elements. The paper presents a list of the main ontology classes of the “Clusters” domain and the developed taxonomic hierarchy of clusters and cluster-type economic systems. The study involved arranging the relations used in the ontology. In addition to the standard Protégé universal relations between an object and a class (IsA) and between a subclass and a class (AKO), the paper identifies 17 types of relations necessary for a complete representation of the cluster concept. The proposed ontology of the “Clusters” domain is the basis for intelligent analysis of various data on clusters in research tasks and cluster policy. The results presented in the paper are the basis for further studies of the integration processes of innovation clusters, digital platforms and ecosystems in the context of regional development management problems.
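The IsA/AKO structure described above can be sketched in plain Python; all class, individual, and relation names here are hypothetical, and the actual ontology is built in OWL 2 with Protégé.

```python
# AKO: subclass-of edges of the taxonomy (child -> parent)
ako = {
    "InnovationCluster": "Cluster",
    "IndustrialCluster": "Cluster",
    "Cluster": "EconomicSystem",
}

# IsA: instance-of assertions (individual -> class); individual is invented
isa = {"ExampleCluster": "InnovationCluster"}

# One of the domain-specific relation types (name is illustrative)
relations = {("ExampleCluster", "locatedIn"): "Region"}

def ancestors(cls):
    """All superclasses reachable through AKO links, nearest first."""
    out = []
    while cls in ako:
        cls = ako[cls]
        out.append(cls)
    return out

def instance_of(ind, cls):
    """True if the individual belongs to cls directly or via the hierarchy."""
    direct = isa.get(ind)
    return direct == cls or cls in ancestors(direct)
```

Queries over such a hierarchy (e.g. "is this individual a Cluster?") are what an OWL reasoner answers over the full ontology.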

4. Applying deep learning in brain-computer interfaces for motion recognition [No. 2 of the year]
Authors: Pavlenko, D.V., Tataris, Sh.E., Ovcharenko, V.V.
Promising areas of machine learning application are electroencephalogram (EEG) analysis and the development of neural interfaces that help people with disabilities and serve in rehabilitation procedures. Artificial neural networks make it possible to significantly simplify the development of such devices: they enable automatic identification of EEG patterns associated with certain states and complex signal processing in real time. One of the important aspects of creating neural interfaces is developing software capable of detecting and classifying user movements. Within the framework of this study, the authors solved the task of classifying patterns of EEG sensorimotor rhythms associated with imaginary and real voluntary movements of the upper extremities. The developed model is similar to the EEGNet architecture and is designed for real-time EEG analysis using a domestic NVX52 encephalograph. For this purpose, the authors collected two datasets with records of healthy people using an incomplete international 10-10 electrode placement scheme with 32 channels. During the study, the authors achieved an accuracy of up to 80% in classifying a number of imaginary movements and up to 78% in recognizing signs of real movements, exceeding the recognition rates of the original model by more than 10%. In the future, it is planned to use this model in correctional training with a complex consisting of a non-invasive brain-computer interface and hand exoskeletons. Such complexes are used as part of rehabilitation measures for children suffering from cerebral palsy.
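Real-time EEG classification of this kind typically begins by cutting the multichannel stream into fixed-length overlapping windows fed to the network. A minimal epoching sketch, with the 32-channel count taken from the abstract and all other parameters assumed:

```python
def epoch(samples, window, step):
    """Split a multichannel recording into overlapping windows (epochs).

    samples: list of per-sample channel vectors (here 32 values each).
    window:  epoch length in samples; step: hop between epoch starts.
    """
    return [samples[i:i + window]
            for i in range(0, len(samples) - window + 1, step)]

# 500 samples of a hypothetical 32-channel recording (zeros as a stand-in)
recording = [[0.0] * 32 for _ in range(500)]
epochs = epoch(recording, window=250, step=125)   # 50% overlap
```

Each epoch is then a fixed-shape array (window x channels), which is the input shape an EEGNet-style model expects.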

5. Compression methods for tabular data: Comparative analysis [No. 2 of the year]
Author: Garev, K.V.
This paper presents a comparative analysis of tabular data compression methods used within the Research Data Infrastructure platform (RDI Platform) operated by the Russian Center for Science Information. The main purpose of the study was to identify the most suitable data compression methods that can be integrated into the RDI Platform to enhance data exchange and management. The author carried out a thorough analysis of the five most popular data compression methods available for implementation with Python software tools: Deflate (gzip), LZMA, Bzip2, Brotli, and Snappy. The author analyzed the advantages and disadvantages of each compression technology, taking into account the specifics of tabular data processing, including compression ratio, processing speed, and the degree of information preservation. The results of this study can contribute to the practice of exchanging, storing, processing, and analyzing tabular data within the RDI Platform. Particular attention was paid to analyzing the advantages and disadvantages of each compression method, which allowed forming recommendations for choosing technologies that meet strict requirements for performance, compression efficiency, and data preservation reliability. An important aspect of the study is its focus on optimizing processes within the RDI Platform, which increases its efficiency as a tool for working with scientific data. Moreover, this work helps deepen the understanding of the potential application of modern data compression methods in scientific data exchange practice and opens prospects for further research and development in this area.
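Three of the five methods named above (gzip, Bzip2, LZMA) ship with the Python standard library, so their compression ratios can be compared directly; Brotli and Snappy require third-party packages and are omitted from this sketch. Synthetic CSV-like data stands in for real tables:

```python
import bz2
import gzip
import lzma

def compare(data: bytes):
    """Compression ratio (compressed size / original size) per codec."""
    codecs = {"gzip": gzip.compress, "bzip2": bz2.compress, "lzma": lzma.compress}
    return {name: len(fn(data)) / len(data) for name, fn in codecs.items()}

# A highly repetitive stand-in for a CSV-like table
sample = b"id;value;flag\n" + b"1;3.14;true\n" * 10_000
ratios = compare(sample)

# All three codecs are lossless: a round trip restores the data exactly
restored = gzip.decompress(gzip.compress(sample))
```

On real tables, ratio and speed trade off differently per codec, which is exactly the comparison the paper carries out.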

6. The problem of cognitive representation of the project space of multilayer radiation shields: Information compression algorithms [No. 2 of the year]
Authors: Zinchenko, L.A., Kazakov, V.V., Karyshev, B.V.
The paper focuses on the application of dimensionality reduction algorithms to the design of multilayer radiation shields for protecting electronic equipment in outer space. The considered algorithms project the studied data on multilayer shields from a high-dimensional space into a low-dimensional one while preserving data semantics, which allows visualizing large sets of high-dimensional information, simplifies visual analysis by the user, and enables the application of some algorithms and approaches in an automated mode. The paper analyzes the application of the following dimensionality reduction algorithms: principal component analysis (PCA), kernel principal component analysis (KernelPCA), t-distributed stochastic neighbor embedding (t-SNE), uniform manifold approximation and projection (UMAP), autoencoder (AE), and variational autoencoder (VAE). For the neural network compression architectures, the paper presents the network architectures used in computation and testing. Moreover, according to the proposed methodology, the authors investigate the feasibility of combining several dimensionality reduction algorithms applied in a chain. Based on the conducted research, the authors draw a conclusion about the efficiency of the mentioned algorithms, as well as their combinations, for further processing or visualization. There is a brief description of the software that implements one of the proposed approaches to analyzing and processing information about multilayer radiation shields for electronic equipment used in outer space. Based on the research, the UMAP algorithm is recommended. To analyze configurations with a sufficiently large number of parameters, it is recommended to use the t-SNE algorithm with pre-compression by UMAP, which simplifies the initial data set and thus improves the result of t-SNE.
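Of the listed algorithms, PCA is the simplest to sketch: the leading principal component is the dominant eigenvector of the data's covariance matrix, which power iteration finds without any external libraries. A self-contained illustration, not the paper's implementation:

```python
def first_principal_component(points, iters=200):
    """Leading eigenvector of the covariance matrix via power iteration."""
    dim, n = len(points[0]), len(points)
    mean = [sum(p[j] for p in points) / n for j in range(dim)]
    centered = [[p[j] - mean[j] for j in range(dim)] for p in points]
    # covariance matrix of the centered data
    cov = [[sum(c[i] * c[j] for c in centered) / n for j in range(dim)]
           for i in range(dim)]
    v = [1.0] * dim                      # arbitrary non-orthogonal start vector
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Points lying on the line y = 2x: the first component must align with (1, 2)
pts = [(t, 2.0 * t) for t in range(-5, 6)]
pc1 = first_principal_component(pts)
```

Projecting each point onto `pc1` gives the one-dimensional PCA embedding; t-SNE, UMAP, and the autoencoders in the paper pursue the same goal with nonlinear mappings.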

7. Algorithm for segmentation of non-executable files in terms of exploit detection [No. 2 of the year]
Authors: Arkhipov, A.N., Kondakov, S.E.
The paper solves the applied task of segmenting non-executable files in order to identify information security threats implemented in the form of exploits (malicious code). The relevance of the study is due to the need to solve the scientific problem of detecting exploits, including those processed by obfuscation technologies, as well as to increase the effectiveness of the anti-virus information protection subsystem by increasing the sensitivity and specificity of exploit detection. The aim of the work is to develop an algorithm for segmenting non-executable files that represents them as blocks (fragments) maximizing the probability that exploit elements fall within a block. Segmentation of non-executable files enables subsequent in-depth analysis for malicious code of file fragments rather than the entire file. The subject of the study is the variety of methods, techniques, models, and algorithms for segmenting non-executable files in order to identify information security threats implemented in the form of exploits. The research uses the scientific methods of analysis, measurement, and comparison. The authors have developed a segmentation algorithm that represents a non-executable file as blocks (fragments) of the optimal sizes necessary to identify exploit elements in their composition. The proposed algorithm is based on a mathematical model, developed by the authors, of an exploit embedded in a non-executable file; the model mathematically describes the structure, constituent elements, and indicators that characterize the algorithm. The proposed algorithm can be used to create new methods, techniques, models, algorithms, and tools aimed at improving the effectiveness of protecting information from malicious code distributed in the form of exploits, including those created using program code obfuscation technologies.
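The general idea of block segmentation can be sketched as fixed-size blocks with an overlap, so that a short exploit pattern crossing a block boundary still falls wholly inside some block; the block and overlap sizes below are arbitrary, not the optimal sizes derived from the authors' model.

```python
def segment(data: bytes, block: int, overlap: int):
    """Split a file's bytes into fixed-size blocks with a given overlap.

    Consecutive blocks share `overlap` bytes, so any pattern shorter than
    the overlap that straddles a boundary is fully contained in one block.
    """
    step = block - overlap
    return [data[i:i + block]
            for i in range(0, max(len(data) - overlap, 1), step)]

payload = bytes(range(256)) * 4          # 1024-byte stand-in for a document file
blocks = segment(payload, block=256, overlap=64)
```

Each block can then be scanned independently by the in-depth analyzer the abstract describes.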

8. Psychodiagnostics software with computer vision feedback [No. 2 of the year]
Authors: Ivaschenko, A.V., Aleksandrova, M.V., Zheikov, D.S., Mazankina, E.V., Zakharova, E.V., Kolsanov, A.V.
The paper presents psychological diagnostic software that considers the involvement of respondents when taking tests. The research focuses on managing the depth of user immersion in interactive user interfaces. The paper proposes a formal model of an immersive environment intended to implement a system for managing user immersion in human-computer interaction processes by monitoring and controlling the actions performed in response to emerging events. Such events are effects on a user in the real world, as well as audiovisual stimuli generated automatically in an immersive environment. Response behavior includes action chains automatically recorded by computer vision and eye-tracking (oculography) means. Unlike its analogues, the developed software package for psychological diagnostics and medical rehabilitation implements feedback based on monitoring and control of patient involvement. For this purpose, the system includes a computer vision subsystem with a video camera and software for video monitoring of the patient's motor activity. The software tracks the total volume of facial and head movements and also identifies the current emotional state using an artificial neural network. The user interface and tests are personalized to ensure high user involvement. The practical significance of the work lies in monitoring the involvement of patients undergoing psychological testing and using this information to personalize medical services. Analyzing the involvement of the software package's users makes it possible to supplement psychological testing results and to adapt the sequence and content of tests, maintaining the user's interest and reducing the influence of external factors.
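A "total volume of movement" signal of the kind the video-monitoring subsystem tracks can be approximated by frame differencing. A toy sketch on flattened grayscale frames; the actual system uses a camera and a neural network, and all numbers here are illustrative:

```python
def motion_volume(frames):
    """Mean absolute per-pixel difference between consecutive frames:
    a simple proxy for the total volume of movement in a video stream."""
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        diffs.append(sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur))
    return sum(diffs) / len(diffs)

still  = [[10] * 64, [10] * 64, [10] * 64]   # identical frames: no movement
moving = [[10] * 64, [30] * 64, [10] * 64]   # large frame-to-frame change
```

Thresholding such a signal over time is one simple way to turn raw video into an involvement indicator.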

9. Intelligent tutoring system with automatic assignment generation according to the results of using an assignment bank [No. 2 of the year]
Authors: Sychev, O.A., Prokudin, A.A., Denisov, M.E.
An educational process includes two feedback loops: learner feedback directs the actual learning of a given individual (the small loop), while teacher feedback about the quality of the bank of learning problems (the big loop) improves the learning process over time by increasing the number of assignments, which otherwise are reused so often that learners can share solutions. Automating the big feedback loop is an important step in developing intelligent tutoring systems, as it relieves course authors of routine work on creating and enhancing learning assignment banks. Modern advances in methods of learning assignment generation and classification for use in a learning process allow implementing an intelligent tutoring system that can generate new learning assignments in the background when necessary. The paper describes the architecture of an intelligent tutoring system with background learning assignment generation and the CompPrehension system, which implements this architecture. We describe all major functional system components (a training aid, a learning assignment bank, an assignment generator, and a domain) and their interaction with the assignment database and external semantic reasoners for solving learning assignments. Practical evaluation of the described approach showed that the system can selectively improve an initially unbalanced learning assignment bank, taking into account the history of assignment requests. In less than 5 days, the system generated several thousand additional learning assignments for each concept with a low assignment count. This ensures a low probability of receiving the same assignments even when testing hundreds of students. The study results are significant for educational institutions seeking to improve the learning process.
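The background-generation policy implied by the abstract can be sketched as: find concepts whose assignment count is below a target and refill the most requested ones first. All names and thresholds here are illustrative, not taken from CompPrehension:

```python
def concepts_to_refill(bank_counts, requests, target):
    """Concepts whose assignment count is below target, ordered so the most
    frequently requested (and thus most reused) concepts come first."""
    low = [c for c, n in bank_counts.items() if n < target]
    return sorted(low, key=lambda c: requests.get(c, 0), reverse=True)

# Hypothetical bank state and request history
bank = {"loops": 12, "pointers": 2, "operators": 40, "arrays": 1}
reqs = {"loops": 90, "pointers": 70, "arrays": 250}
queue = concepts_to_refill(bank, reqs, target=10)   # generator's work order
```

A background generator draining such a queue is how the bank is rebalanced without author intervention.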

10. Digital simulator for training sheet glass forming operators [No. 2 of the year]
Authors: Meshalkin, V.P., Chistyakova, T.B., Petrov, D.Yu.
The paper analyzes flat glass forming by the float method and determines the types of abnormal technological process situations. It presents a cluster analysis of a set of emergency situations; each situation is described as a set-theoretic model. To simplify the digital simulator interface, business process models for operator training and certification were developed. The paper shows logical and information models of the prospective business processes "Test development" and "Employee certification" developed in BPMN 2.0 notation based on the analysis of the enterprise's existing business processes. When developing the architecture, modes of operation, and software and information support of the digital simulator, the authors used a model-based systems engineering methodology implemented in the IBM Rhapsody visual modeling environment based on UML and SysML notations. The authors developed use case and class diagrams of the digital simulator in UML, a software structure, and a training system database structure that includes criteria for grouping situations, situation characteristics, and functions of system users. The main implemented functions are: training and testing of operators; maintaining a database of digital simulator personnel; compiling reports on test results; and editing the emergency database based on a preliminary analysis of technologists' and operators' experience. The digital simulator is implemented in the PascalABC language, with results recorded in a text database. A user interface for the digital simulator has also been developed.
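The set-theoretic description of emergency situations can be illustrated by matching an observed feature set against known situations, for example by Jaccard overlap; the situation and feature names below are invented for illustration (the paper's simulator is written in PascalABC).

```python
# Each abnormal situation is modeled as a set of observed features
situations = {
    "glass_thickness_drift": {"thickness_out_of_range", "speed_stable"},
    "tin_bath_overheat":     {"bath_temp_high", "thickness_out_of_range"},
}

def match(observed, known):
    """Rank known situations by overlap with the observed feature set
    (Jaccard index) and return the best match."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(known, key=lambda s: jaccard(observed, known[s]))

best = match({"bath_temp_high", "thickness_out_of_range"}, situations)
```

A trainee's diagnosis can then be scored against the matched situation, which is the kind of automated grading a training simulator needs.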
