ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - K1 quartile
Russian Science Citation Index (RSCI)

Articles of journal issue № 3, 2018.

11. A mathematical model approach for evaluation of oil recovery changes caused by wettability during in-situ combustion [№ 3, 2018]
Authors: I.V. Afanaskin, A.V. Korolev, V.A. Yudin
Visitors: 4455
The paper considers a problem arising in the simulation of thermal EOR methods: wettability changes caused by formation heating. Thermal EOR methods may raise the formation temperature up to 100–500 °C above the initial value. As a result of such an artificial temperature distribution, wettability may change in time and space, so the fluid flow in the formation and the outcome of applying thermal EOR methods may change dramatically. Investigating temperature-induced wettability changes requires complicated experiments that account for the temperature dependence of a large number of physical processes affecting wettability. Before conducting such experiments, it is necessary to estimate how important this factor is when forecasting the results of thermal EOR methods by computer simulation. The article accounts for temperature-induced wettability changes by making the critical points of the relative permeability curves, both for oil and for water, temperature dependent. The authors consider the problem of multi-phase multi-component non-isothermal flow with chemical reactions that takes into account the temperature dependence of relative phase permeabilities. They propose an original finite-difference scheme that is implicit in pressure and explicit in temperature and concentrations. The calculations showed that if wettability changes due to the temperature profile varying in space and time are neglected, considerable errors may occur in the estimates of cumulative production and oxygen breakthrough time when predicting in-situ combustion results by computer simulation. These errors may reach 20 % in cumulative production and 10 % in breakthrough time. To take this effect into account, it is necessary to study experimentally how wettability (and relative permeability) changes with increasing formation temperature.
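For illustration, here is a minimal Python sketch of the kind of temperature dependence involved: Corey-type relative permeability curves whose critical points (residual saturations) shift with temperature. The linear interpolation between "cold" and "hot" endpoint values and all numeric parameters are assumptions for illustration, not the authors' correlations.

```python
# Hypothetical sketch: Corey-type water relative permeability whose critical
# points (residual saturations) depend on temperature, standing in for the
# temperature-dependent wettability effect discussed in the paper.
# The linear interpolation between "cold" and "hot" endpoints and all numbers
# are illustrative assumptions, not the authors' data.

def lerp(a, b, w):
    """Linear interpolation between a and b with weight w in [0, 1]."""
    return a + (b - a) * w

def corey_krw(sw, T, T0=20.0, T1=300.0,
              swc0=0.25, swc1=0.15,   # connate water saturation at T0 / T1
              sor0=0.30, sor1=0.20,   # residual oil saturation at T0 / T1
              krw_max=0.4, nw=2.0):
    """Water relative permeability with temperature-dependent endpoints."""
    w = min(max((T - T0) / (T1 - T0), 0.0), 1.0)
    swc, sor = lerp(swc0, swc1, w), lerp(sor0, sor1, w)
    s = min(max((sw - swc) / (1.0 - swc - sor), 0.0), 1.0)  # normalized saturation
    return krw_max * s ** nw

if __name__ == "__main__":
    for T in (20.0, 150.0, 300.0):
        print(f"T = {T:5.1f} C, krw(Sw=0.5) = {corey_krw(0.5, T):.4f}")
```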

12. Research on compression of raster images using artificial neural networks [№ 3, 2018]
Authors: A.A. Genov, K.D. Rusakov, A.A. Moiseev, V.V. Osipov
Visitors: 10484
The modern growth rate of information stored on hard disks and transferred over the Internet and local enterprise networks has made it necessary to solve the problems of compressing, transferring and storing data. Most of the transferred data is multimedia content. Nowadays, algorithms for compressing visual information based on neural networks are becoming more popular. Unlike classical algorithms, which are based on the elimination of redundancy, these algorithms are based on artificial neural networks. The field is relevant due to the development of mathematical algorithms for network learning, which will improve existing compression methods in the future. The analysis of publications showed that there is currently no definite information about the influence of artificial neural network architecture on the learning process and on the quality of their work on real multimedia content. An important task is to select the network topology most suitable for compressing visual information. The purpose of the article is to describe the capabilities of one type of artificial neural network, the multilayer perceptron, in compressing and recovering images of an arbitrary type. The paper analyzes topologies of artificial neural networks, algorithms for their learning, and the efficiency of their work. It also describes the "bottleneck" architecture, which is most often used in solving the problem of image compression and recovery. The authors give one way of encoding and decoding the data obtained during network operation. The paper describes a computational experiment and gives its results. The experiment showed that using a multilayer perceptron with an input vector of more than eight values turned out to be less effective. As a result, the authors propose the network architecture most suitable for use in practice.
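As a rough illustration of the "bottleneck" perceptron idea, the following sketch trains a one-hidden-layer autoencoder on synthetic image blocks with plain numpy; the block size, bottleneck width and training data are assumptions, not the configuration used in the paper.

```python
# Minimal sketch (assumptions: synthetic data, a single sigmoid bottleneck layer,
# plain gradient descent). It illustrates the "bottleneck" multilayer perceptron
# idea: an image is split into small blocks, each block is pushed through a
# narrow hidden layer, and only the hidden activations are stored as the
# compressed representation.
import numpy as np

rng = np.random.default_rng(0)
block = 16      # pixels per block (e.g. 4x4 patches), the input/output size
bottleneck = 6  # hidden units, i.e. compressed values stored per block

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Encoder/decoder weights of the autoencoder-style perceptron.
W1 = rng.normal(0, 0.1, (block, bottleneck))
W2 = rng.normal(0, 0.1, (bottleneck, block))

X = rng.random((2000, block))  # synthetic "image blocks" in [0, 1]

lr = 0.5
for epoch in range(200):
    H = sigmoid(X @ W1)        # compressed representation (what would be stored)
    Y = sigmoid(H @ W2)        # reconstructed block
    err = Y - X
    # Backpropagation of the mean squared reconstruction error.
    dY = err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY / len(X)
    W1 -= lr * X.T @ dH / len(X)

print("compression ratio:", block / bottleneck)
print("final MSE:", float(np.mean(err ** 2)))
```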

13. Methods of developing graphics subsystem drivers [№ 3, 2018]
Authors: I.A. Efremov, K.A. Mamrosenko, V.N. Reshetnikov
Visitors: 9040
The paper describes software development problems related to the interaction between systems-on-a-chip (SoC) and the Linux operating system (OS). The OS architecture provides various instruments for creating a driver, i.e. a component that enables data exchange with a device through a software interface. Developing drivers for an open source OS is difficult due to continuous changes in kernel functions and structure. The paper describes the structure and components of the graphics subsystem. The subsystem is a set of components located in different address spaces of OS virtual memory; the components interact through a system call interface. The graphics engine is programmed by filling a command buffer. Each application has a graphics engine context that contains its own command buffer and all data used by the graphics engine for rendering and calculations: coordinates, normal vectors, colors, textures. There are several approaches to setting the graphics mode; the most reasonable solution is using the KMS (Kernel Mode Setting) module, which is commonly used by key manufacturers of microprocessors and graphics cards. It is necessary to ensure the interaction between OS kernel modules and user space by creating specific system calls. These system calls regulate low-level operations with the device and allow taking full advantage of the graphics unit capabilities. Using FPGA-based prototyping platforms allows verifying software functionality, obtaining performance characteristics and finding errors in the SoC hardware design at early stages. Debugging kernel modules is time-consuming due to limitations imposed both by the prototyping platform and by the OS. In addition, errors in kernel code are difficult to reproduce, which also complicates debugging of kernel modules. The paper considers some approaches to implementing the Linux KMS module and graphics subsystem components, which provide correct interaction between the OS and the SoC display controller.

14. Models of enterprise information system support in lifecycle stages [№ 3, 2018]
Author: Yu.M. Lisetskiy
Visitors: 9434
The article considers an enterprise as a complex organizational system that requires a modern management information system for effective functioning. Such a system enables information collection, storage, and processing to increase the relevance and timeliness of decisions. The problem might be solved through comprehensive automation of all industrial and technological processes and management of the required resources. The paper shows that the information system description is formed based on a lifecycle model, which defines the order of development stages and the criteria for transition between stages. An information system lifecycle model is a structure that defines the sequence and interconnection of processes, actions and tasks throughout the life cycle. The structure of the information system life cycle is based on three groups of processes: primary (acquisition, supply, development, operation, maintenance), supporting (documenting, configuration management, quality assurance, verification, attestation, assessment, audit, problem resolution) and organizational (project infrastructure building, project management, definition, life cycle assessment and improvement, training). The paper describes the most widespread life cycle models: waterfall, iterative and incremental (a stage-by-stage model with intermediate control), and spiral. It demonstrates that the enterprise information system appears to be a passive category in the processes of study and design; its functioning can be described using support life cycle models, including composition, functioning and development models. Developing these three models provides an additional informational factor that enables structuring the process of enterprise information system creation and functioning.

15. Modeling brain activity recognizing anagrammatically distorted words [№ 3, 2018]
Author: Z.D. Usmanov
Visitors: 6300
The object of research is natural language texts whose words have been corrupted by random letter transpositions. The paper analyzes the ability of the human brain to accurately recognize the meaning of such distorted texts and offers mathematical models of how the brain solves this problem. The described model covers the cases when a) the first, b) the last, c) the first and last letters of words remain in their places and all others are rearranged arbitrarily, and, finally, the most general case d), when no letter is fixed and all letters within a word can be placed in any order. The explanation is based on the concept of a word anagram (in the broad sense, the set of its letters arranged in any sequence) as well as on the concept of an anagram prototype. A simplified mathematical model assumes that the brain perceives each anagram separately and recognizes it correctly if it has a single prototype. When there are several such prototypes, the brain automatically selects the one with the highest frequency of occurrence in texts. The acceptability of this model was tested on English, Lithuanian, Russian and Tajik, as well as on an artificial language, Esperanto. For all languages, the rate of correct recognition of distorted text was at the level of 97–98 %. If higher rates are required, one can turn to an extended model in which the brain takes into account pairs, and possibly triples, of neighboring letter sets.
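A toy sketch of the simplified model, assuming a small invented frequency dictionary: a distorted word is reduced to its anagram key (its letters sorted alphabetically), and the most frequent prototype with the same key is chosen.

```python
# Toy sketch of the simplified model from the abstract: each distorted word is
# treated as an anagram (its multiset of letters), and the model picks the
# prototype with the highest frequency of occurrence among dictionary words that
# share the same multiset. The tiny frequency dictionary is invented for
# illustration only.
from collections import defaultdict

freq = {"listen": 120, "silent": 80, "enlist": 15, "cat": 500, "act": 300}

# Index words by their anagram key (letters sorted alphabetically).
prototypes = defaultdict(list)
for word, f in freq.items():
    prototypes["".join(sorted(word))].append((f, word))

def recognize(distorted):
    """Return the most frequent prototype for a distorted word, if any."""
    candidates = prototypes.get("".join(sorted(distorted)))
    if not candidates:
        return None                      # no prototype: recognition fails
    return max(candidates)[1]            # highest-frequency prototype wins

print(recognize("tsilen"))  # -> 'listen' (beats 'silent' and 'enlist' by frequency)
print(recognize("tca"))     # -> 'cat'
```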

16. Modeling nanoporous structures of silica-resorcinol-formaldehyde aerogels [№ 3, 2018]
Authors: I.V. Lebedev, A.Yu. Tyrtyshnikov, S.I. Ivanov, N.V. Menshutina
Visitors: 6271
The paper is dedicated to investigating and modeling the structure of silica-resorcinol-formaldehyde aerogels. It presents experimental research on the production of hybrid silica-resorcinol-formaldehyde aerogels under varying production conditions (reagent ratio, amount of solvent, etc.). The structural characteristics considered were specific surface area and pore size distribution. Generating structures that correspond to real ones makes it possible to model various properties of aerogels in silico, which in turn saves resources on costly experiments. The authors have studied the existing methods of generating porous structures of silica-resorcinol-formaldehyde aerogels. To model such aerogel structures, they have chosen the Diffusion-Limited Cluster Aggregation (DLCA) method. The paper considers the computational experiments conducted to generate model structures and compares them with experimental ones according to the selected criteria (pore size distribution and specific surface area). The results of a number of computational experiments showed good agreement between the experimental and simulated structures of hybrid silica-resorcinol-formaldehyde aerogels. To implement the method, a C# algorithm was developed in the Microsoft Visual Studio development environment. The created software requires Microsoft Windows 7 or later and at least 2 GB of RAM. The paper presents the results of the computational experiments and the algorithm for generating silica-resorcinol-formaldehyde aerogel structures. The developed software allows obtaining structures of silica-resorcinol-formaldehyde aerogels that correspond to real ones with given structural characteristics.
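Below is a highly simplified Python sketch of on-lattice DLCA for illustration (the authors' implementation is in C#); the 2D lattice, particle count and movement rules are assumptions chosen only to show the cluster-aggregation principle.

```python
# Highly simplified 2D on-lattice sketch of Diffusion-Limited Cluster
# Aggregation (DLCA). The lattice size, particle count and single-step random
# cluster moves are illustrative assumptions, not the authors' algorithm.
import random

random.seed(1)
L, N = 40, 120                      # lattice size and number of particles

# Place particles on distinct sites; each starts as its own cluster.
sites = random.sample([(x, y) for x in range(L) for y in range(L)], N)
clusters = [{p} for p in sites]

def neighbors(p):
    x, y = p
    return [((x + dx) % L, (y + dy) % L) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]

while len(clusters) > 1:
    i = random.randrange(len(clusters))
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    moved = {((x + dx) % L, (y + dy) % L) for x, y in clusters[i]}
    # Reject the move if the cluster would overlap another one.
    others = set().union(*(c for j, c in enumerate(clusters) if j != i))
    if moved & others:
        continue
    clusters[i] = moved
    # Merge with every cluster now touching the moved one.
    touching = {q for p in moved for q in neighbors(p)}
    merged, rest = clusters[i], []
    for j, c in enumerate(clusters):
        if j != i and (c & touching):
            merged |= c
        elif j != i:
            rest.append(c)
    clusters = rest + [merged]

print("final aggregate size:", len(clusters[0]))  # == N: one ramified cluster
```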

17. Modeling traffic flows in AnyLogic [№ 3, 2018]
Authors: Ya.I. Shamlichky, A.S. Okhota, S.N. Mironenko
Visitors: 10522
The article proposes a technique for modeling traffic flows in a simulation environment. The goal was to simulate a section of the road network in Krasnoyarsk, which involved the following tasks: gathering data on traffic flow intensity at the site and developing a simulation model of an intersection. The authors have chosen the AnyLogic modeling environment to solve these problems. The simulation experiment requires input parameters; in this case, they are the vehicle arrival intensity and the distribution of vehicles by direction. The developed simulation model has the following structural elements: road network elements, a model agent generation system, vehicle traffic logic blocks, model parameter control elements, and an agent statistics gathering module. The model execution mode displays an animation, which is a two-dimensional plan of the simulated system with moving vehicles; there is also functionality for switching between two-dimensional and three-dimensional views. The statistics take into account the time it takes a vehicle to pass a road network segment, as well as the total traffic capacity of the intersection. The experimental procedure starts with preliminary tuning of the simulation model to the average traffic capacity of the intersection (this usually occurs after the number of vehicles exiting the model through the Sink component reaches 20 thousand or more). Then the simulation model is run repeatedly while varying the traffic light timing. After a series of runs, the difference in mean delay between fixed-time and adaptive regulation is calculated, graphs are plotted, and conclusions are drawn. As a result, we get a simulation model and an experimental technique that can be useful in determining the maximum traffic capacity of intersections, planning road infrastructure, etc.
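The delay-comparison step can be illustrated with a toy model outside AnyLogic: the sketch below (plain Python, invented arrival rates and signal settings) contrasts the mean vehicle delay of fixed-time control with a simple adaptive rule that serves the longer queue.

```python
# Toy sketch (not AnyLogic) of the "difference in mean delay" metric: mean
# vehicle delay under fixed-time signal control versus a simple adaptive rule
# that gives green to the longer queue. Arrival rates, saturation flow and the
# green split are invented parameters used only for illustration.
import random

random.seed(42)

def simulate(adaptive, steps=20000, lam=(0.30, 0.22), service=0.9, green=(30, 30)):
    queues = [0.0, 0.0]            # vehicles waiting on the two approaches
    delay = 0.0                    # accumulated waiting time (vehicle-steps)
    arrived = 0
    phase, timer = 0, 0
    for _ in range(steps):
        for i in (0, 1):           # simple random arrivals (at most 1 per step)
            if random.random() < lam[i]:
                queues[i] += 1
                arrived += 1
        if adaptive:
            phase = 0 if queues[0] >= queues[1] else 1
        else:                      # fixed-time control: switch after the green time
            timer += 1
            if timer >= green[phase]:
                phase, timer = 1 - phase, 0
        queues[phase] = max(0.0, queues[phase] - service)  # departures on green
        delay += queues[0] + queues[1]
    return delay / max(arrived, 1)  # mean delay per vehicle, in time steps

fixed, adapt = simulate(False), simulate(True)
print(f"mean delay, fixed-time: {fixed:.2f}  adaptive: {adapt:.2f}  difference: {fixed - adapt:.2f}")
```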

18. A nested model of radio-electronic systems for estimation of temporary reliability [№ 3, 2018]
Authors: S.V. Ignatev, Yu.A. Plaksa, A.V. Krasnikov, A.V. Drozhin
Visitors: 4322
The effective intended application of special-purpose complexes based on radio-electronic systems requires a proper choice of optimal operation methods, as well as the organization of maintenance, first-line repair and supply of the systems with replacement tools and supplies to keep the systems highly ready for their intended use. For this purpose, there is a maintenance system whose operating effectiveness depends on the relative position of the radio-electronic systems in the terrain. The essence of the article is the construction of a nested model of radio-electronic systems intended for developing environment tools that allow estimating temporal reliability characteristics of the nested radio-electronic systems, as well as studying them taking into account the relative position of nest elements and the temporal relations between them. The construction of the nested model has two stages. The first is the construction of a terrain transport network represented as a combination of a graph and a reachability matrix; this approach allows obtaining all possible routes between transport network elements. The second stage is a description of nested radio-electronic systems by highlighting special-type vertices in the transport network, which include nested radio-electronic system elements. Then a spatio-temporal nested model of radio-electronic systems is constructed, representing a combination of a graph and reachability sub-matrices, with each route associated with temporal characteristics (route travel time). The spatio-temporal nested model is implemented in C#. It allows calculating temporal reliability indicators taking into account the influence of various factors and estimating the degree of their influence on the availability factor.
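The first stage can be illustrated with a short sketch that computes a reachability matrix from a graph of the transport network; Warshall's transitive closure is used here as one standard way to obtain it, and the example network is invented (the authors' tool itself is written in C#).

```python
# Sketch of the first stage described above: building a reachability matrix for
# a terrain transport network given as a directed graph. Warshall's algorithm is
# one standard way to obtain reachability; the example adjacency matrix is
# invented for illustration.
def reachability(adj):
    n = len(adj)
    r = [[bool(adj[i][j]) or i == j for j in range(n)] for i in range(n)]
    for k in range(n):                      # Warshall's transitive closure
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# Hypothetical 5-node transport network (directed edges between terrain nodes).
adj = [
    [0, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
for row in reachability(adj):
    print(["1" if x else "0" for x in row])
```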

19. Multi-agent simulation of epidemics' distribution on supercomputers [№ 3, 2018]
Author: S.Yu. Lapshina
Visitors: 5868
The paper considers the possibility of using modern supercomputers to solve resource-intensive problems of multi-agent simulation of the spread of mass epidemics based on the theory of percolation cluster growth. For determining quarantine zones during the spread of an epidemic, the multi-agent percolation model involves forming an interaction grid of population members, modeling the disease distribution medium, collecting information on the population size, implementing a parallel algorithm for multiple labeling of percolation clusters with a tagging mechanism, and visualizing the results. The article describes an improved variant of the Hoshen-Kopelman algorithm for multiple labeling of percolation clusters on a multiprocessor system, as well as a working prototype of its implementation developed at the JSCC RAS (Branch of SRISA). This algorithm can be used in any area as a tool for differentiating clusters on large lattices, since its input format is application-independent. The paper demonstrates the possibility of revealing how the latent periods of epidemic spread depend on the probability of infecting groups of population members, and of determining threshold values for the transition of local epidemics into large-scale pandemics. Given the latent period, the infection probability and the source of infection, one can determine the set of cities where infection can be expected; this information is used to determine the radius of a quarantine zone. If a hotbed of disease is found in some city and the latent period has already ended, this tool can help to determine the zone to be isolated from the outside world. The article also provides estimates of the execution time of the Hoshen-Kopelman multiple-labeling algorithm for different values of the input parameters on two high-performance computing systems installed at the JSCC RAS: MVS-100K and MVS-10P.
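For reference, here is a sketch of the classic sequential Hoshen-Kopelman labeling on a 2D occupancy lattice with union-find for label equivalences; the paper's contribution is an improved parallel multiple-labeling variant, which this single-process toy does not reproduce, and the lattice and occupation probability below are invented for the example.

```python
# Sketch of classic (sequential) Hoshen-Kopelman cluster labeling on a 2D
# occupancy lattice, using union-find for label equivalences. This is only the
# underlying idea; the paper describes an improved parallel variant.
import random

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def hoshen_kopelman(grid):
    rows, cols = len(grid), len(grid[0])
    labels = [[0] * cols for _ in range(rows)]
    parent = [0]                         # parent[0] unused; labels start at 1
    for i in range(rows):
        for j in range(cols):
            if not grid[i][j]:
                continue
            up = labels[i - 1][j] if i > 0 else 0
            left = labels[i][j - 1] if j > 0 else 0
            if not up and not left:      # new cluster label
                parent.append(len(parent))
                labels[i][j] = len(parent) - 1
            elif up and left:            # merge the two neighbouring clusters
                ru, rl = find(parent, up), find(parent, left)
                parent[max(ru, rl)] = min(ru, rl)
                labels[i][j] = min(ru, rl)
            else:
                labels[i][j] = find(parent, up or left)
    # Second pass: replace every label by its root so each cluster has one id.
    return [[find(parent, l) if l else 0 for l in row] for row in labels]

random.seed(3)
grid = [[random.random() < 0.55 for _ in range(12)] for _ in range(8)]
for row in hoshen_kopelman(grid):
    print(" ".join(f"{l:2d}" for l in row))
```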

20. A neural network method for detecting malicious programs on the Android platform [№ 3, 2018]
Authors: T.M. Tatarnikova, A.M. Zhuravlev
Visitors: 5507
The continuous growth in the number of malicious programs targeting the Android operating system makes the problem of detecting them very important. The lack of a centralized mechanism for distributing applications and the insufficient effectiveness of existing solutions exacerbate the problem. The paper demonstrates experimentally the lack of effectiveness of the existing mechanisms for detecting malicious programs in the Android operating system. The purpose of the work was determined by the need for an automated solution that increases the probability of detecting malicious programs in the Android operating system, including previously unknown, modified and obfuscated versions of already known programs. The authors used the methods of neural network classification, static code analysis and analysis of program behavior in a virtual environment. The novelty of the proposed solution is the use of classification features obtained both by static code analysis and by analysis of program behavior on a virtual device. The proposed approach eliminates the shortcomings of existing solutions to the problem under consideration. The ability of the proposed solution to detect malicious programs that were not used in neural network training, as well as obfuscated instances, has been proven experimentally. The practical significance of the proposed solution lies in its use for building malware detection systems for the Android operating system, which can be applied in Android application stores.
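A schematic sketch of the classification step under stated assumptions: static-analysis feature counts are concatenated with behavioral counts from a virtual device and fed to a neural network classifier; the feature set, the synthetic data and the use of scikit-learn's MLPClassifier are illustrative, not the authors' system.

```python
# Schematic sketch: static features (e.g. counts of requested permissions or
# suspicious API references found by code analysis) are concatenated with
# dynamic features (e.g. counts of network connections or file operations
# observed in a virtual environment) and fed to a neural network classifier.
# All data here is synthetic and the feature layout is a hypothetical example.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 600

static = rng.poisson(3, (n, 4))          # 4 static-analysis feature counts per app
dynamic = rng.poisson(2, (n, 3))         # 3 behavioral feature counts per app
X = np.hstack([static, dynamic]).astype(float)

# Synthetic ground truth: apps with many "suspicious" static + dynamic events
# are labeled malicious, so the toy classifier has something learnable.
y = (static[:, 0] + dynamic[:, 0] > 6).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X[:500], y[:500])
print("held-out accuracy:", clf.score(X[500:], y[500:]))
```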
