ISSN 0236-235X (P)
ISSN 2311-2735 (E)


Next issue

Publication date:
16 March 2021

Articles of journal issue No. 4, 2019.

Order results by:
Publication date | Title | Authors

1. Algorithm for identifying parameters of a liquid heating device [No. 4, 2019]
Authors: V.V. Lgotchikov, T.S. Larkina
Visitors: 2849
The paper discusses an algorithm for identifying the parameters of a liquid heating device used to prepare, pasteurize and preserve agricultural products. The control part of the device uses a microcontroller, which provides new consumer properties, namely improved quality of the processed product. The liquid heating device parameters are identified programmatically. The authors selected two identification parameters: the active electric power released in the secondary body and the heat capacity of the liquid medium. These parameters cannot be measured directly with sensors. Identification is performed using the Eickhoff algorithm adapted to the process. The performance of the algorithm is confirmed by MATLAB simulation results. The model identifies subsystems that solve the equations of electromagnetic and thermal balance for individual elements of the device, the control loops of the temperature control system, and the parameter identification process. It was found that a mathematical model with lumped parameters is a sufficient basis for improving the algorithm of the device's control part, which was implemented on the basis of a microcontroller. The proposed modification of the Eickhoff identification algorithm performed well with identifiable quantities of different magnitudes. Regression dependencies were obtained that allow implementing a strategy for adjusting the software part. The choice of the gain coefficients of the residual signal amplification curves for unobservable process parameters was made easier for known quantization periods of the observed feedback signals.
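The idea of recovering unmeasurable parameters (power and heat capacity) from observed temperatures can be illustrated with a minimal sketch. This is not the authors' Eickhoff-based algorithm: it is a simple least-squares grid search over the same two parameters for a lumped-parameter model C·dT/dt = P − k·(T − T_amb), with all numeric values chosen purely for illustration.

```python
def simulate(P, C, k=2.0, T_amb=20.0, T0=20.0, dt=1.0, steps=60):
    """Euler integration of the lumped model C*dT/dt = P - k*(T - T_amb)."""
    T, out = T0, []
    for _ in range(steps):
        T += dt * (P - k * (T - T_amb)) / C
        out.append(T)
    return out

def identify(observed, P_grid, C_grid):
    """Pick (P, C) minimizing the squared residual against observations."""
    best = None
    for P in P_grid:
        for C in C_grid:
            model = simulate(P, C)
            err = sum((m - o) ** 2 for m, o in zip(model, observed))
            if best is None or err < best[0]:
                best = (err, P, C)
    return best[1], best[2]

# "Measured" data produced with assumed true parameters P=100 W, C=400 J/K.
observed = simulate(100.0, 400.0)
P_hat, C_hat = identify(observed,
                        P_grid=[80.0, 90.0, 100.0, 110.0],
                        C_grid=[300.0, 350.0, 400.0, 450.0])
```

A recursive scheme such as Eickhoff's would update the estimates sample by sample instead of searching a grid, but the residual-minimization principle is the same.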

2. An algorithm for forming multiple variants of a supercomputer structure for solving military applied problems [No. 4, 2019]
Authors: Ya.N. Gusenitsa, D.O. Petrich
Visitors: 2257
The paper justifies the necessity of creating and applying supercomputers for military applied problems. It presents the rationale for developing a qualitatively new scientific and methodological apparatus for substantiating supercomputer architectures based on the principles of unification, complexation and integration, which provides maximum performance. There is a description of a supercomputer architecture using the model of a computer association, including its structure and algorithm. It is shown that a supercomputer structure should be a software-indivisible hardware resource in the organization of the computing process. Therefore, it is advisable to form a supercomputer structure according to the principle of modular scalability from identical basic modules (computing clusters) that, on the one hand, have structural limitations, and on the other hand, must be formed for specific military applied tasks. Each computing cluster includes a balanced number of elementary processors and memory units combined using a fully accessible fast channel system and capable of implementing basic sets of various tasks. The presented algorithm, unlike existing ones, is based on using the graph spectrum when generating multiple communication structures between computing clusters, as well as between elementary processors within computing clusters. This significantly reduces the computational complexity of forming multiple variants of a supercomputer structure. As a result, the proposed algorithm can be used to justify architectural decisions when creating new supercomputers, as well as to justify the allocation of computing resources for solving military applied problems on existing supercomputers.
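A sketch of how a graph spectrum can cheaply discard duplicate candidate interconnect structures. This is an illustration only: the paper's generation algorithm is not reproduced here, and cospectral non-isomorphic graphs can still collide, so the spectrum is a filter rather than a complete isomorphism test.

```python
import math

def jacobi_eigenvalues(A, max_rot=200, eps=1e-10):
    """Eigenvalues of a small symmetric matrix via classical Jacobi rotations."""
    n = len(A)
    a = [row[:] for row in A]
    for _ in range(max_rot):
        # locate the largest off-diagonal entry
        p, q, amax = 0, 1, 0.0
        for i in range(n):
            for j in range(i + 1, n):
                if abs(a[i][j]) > amax:
                    amax, p, q = abs(a[i][j]), i, j
        if amax < eps:
            break
        # rotation angle that zeroes a[p][q]
        theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
        c, s = math.cos(theta), math.sin(theta)
        for k in range(n):  # update rows p and q
            apk, aqk = a[p][k], a[q][k]
            a[p][k], a[q][k] = c * apk - s * aqk, s * apk + c * aqk
        for k in range(n):  # update columns p and q
            akp, akq = a[k][p], a[k][q]
            a[k][p], a[k][q] = c * akp - s * akq, s * akp + c * akq
    return sorted(a[i][i] for i in range(n))

def spectrum_key(n, edges, ndigits=6):
    """Rounded sorted adjacency spectrum, usable as a deduplication key."""
    A = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = A[j][i] = 1.0
    return tuple(round(x, ndigits) for x in jacobi_eigenvalues(A))

# The complete graph K4 and two non-isomorphic 4-node trees.
k4_key = spectrum_key(4, [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)])
path_key = spectrum_key(4, [(0, 1), (1, 2), (2, 3)])
star_key = spectrum_key(4, [(0, 1), (0, 2), (0, 3)])
```

Bucketing candidate structures by `spectrum_key` avoids comparing every pair of generated graphs directly, which is the kind of complexity reduction the abstract refers to.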

3. System model verification based on equational characteristics of CTL formulas [No. 4, 2019]
Authors: Yu.P. Korablin, A.A. Shipov
Visitors: 2938
The paper proposes and examines the RTL notation, which is based on systems of recursive equations and the standard semantic definitions of Linear Temporal Logic (LTL) and Computational Tree Logic (CTL). When this notation was still called RLTL, the authors' previous works showed that it enables easy formulation and verification of LTL properties with respect to system models, including models that are themselves specified in the RLTL notation. The authors then expanded the capabilities of the RLTL notation so that it became possible to formulate both LTL and CTL expressions; this produced the first version of the RTL notation. This article presents the second version of RTL, the result of refining and simplifying the notation's semantic definitions, which increased the clarity and readability of its expressions. The purpose of the article is to demonstrate the possibility of using the RTL notation as a tool to formulate and verify properties defined by formulas of both LTL and CTL using common axioms and rules. This lets RTL become a single, universal notation for these logics. At the same time, RTL can incorporate the expressiveness of other temporal logics through minor additions to its basic definitions. This means that in the future RTL could become a full-fledged universal temporal logic that has all the necessary tools and means for implementing all stages of verification.
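The flavor of verifying a CTL property through a recursive equation can be shown with a minimal sketch. The property EF p ("some path eventually reaches a p-state") is the least solution of the equation EF p = p ∨ EX EF p, computed here by fixpoint iteration over a tiny hand-made Kripke structure; this is textbook CTL model checking, not the RTL notation itself.

```python
def ef(states, trans, labeled):
    """States satisfying EF p: least fixpoint of  EF p = p or EX EF p."""
    reach = set(labeled)
    changed = True
    while changed:
        changed = False
        for s in states:
            # add s if some successor already satisfies EF p
            if s not in reach and any(t in reach for t in trans.get(s, [])):
                reach.add(s)
                changed = True
    return reach

# Toy Kripke structure: 0 -> 1 -> 2 (self-loop); 3 only loops on itself.
states = {0, 1, 2, 3}
trans = {0: [1], 1: [2], 2: [2], 3: [3]}
result = ef(states, trans, labeled={2})   # states from which state 2 is reachable
```

State 3 is excluded because no path from it ever reaches the labeled state, exactly as the least-fixpoint semantics requires.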

4. Earth's surface visualization in simulation systems [No. 4, 2019]
Authors: A.M. Giatsintov, K.A. Mamrosenko, P.S. Bazhenov
Visitors: 2469
Planet visualization is used in many fields: the development of geographic information systems, multimedia systems, simulation systems and simulators. The present paper describes approaches to displaying the Earth's surface that provide real-time visualization. It also lists the problems that arise during the visualization of extended landscapes, related to performance, coordinate system transformation and visualization accuracy. The paper presents an approach to the Earth's surface visualization based on clipmaps. It simplifies data preparation for various parts of the Earth's surface and reduces the number of prepared data sets for both the underlying surface textures and the height data. To implement the proposed approach, the architecture of the component used to generate and visualize the Earth's surface was developed. The component is built into existing visualization systems. This architecture has the following advantages: it is not necessary to develop a specialized visualization system from scratch; existing visualization systems, both open and proprietary, can be used; and the computing load is distributed across the available threads in the thread pool. The creation of drawing calls to be sent to the visualization system is one example of multi-threaded processing. The algorithm for processing incoming drawing calls depends on how integration with the particular visualization system is implemented. The component's architecture provides the ability to set the time during which data will be processed inside the component. After each operation finishes, the time taken to complete it is calculated and the ability to continue processing other tasks is determined. If the time limit is exceeded, task processing is suspended until the next component update call.
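The core of a clipmap scheme is that each nested level covers twice the extent of the previous one at half the texel density, so the level for a given viewer distance follows from a base-2 logarithm. The sketch below illustrates only that level-selection rule; the parameter names (`base_extent`, `num_levels`) are illustrative assumptions, not taken from the paper.

```python
import math

def clip_level(distance, base_extent=64.0, num_levels=8):
    """Clipmap level (0 = finest) appropriate for a viewer distance.

    Level k covers base_extent * 2**k world units, so the needed level
    grows logarithmically with distance and is clamped to the pyramid size.
    """
    if distance <= base_extent:
        return 0
    level = int(math.floor(math.log2(distance / base_extent))) + 1
    return min(level, num_levels - 1)

level_near = clip_level(10.0)    # well inside the finest ring
level_far = clip_level(1e9)      # clamped to the coarsest level
```

In a full implementation each level is a fixed-size window ("clip") into the corresponding mip level, re-centered around the viewer as it moves.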

5. Smart data collection from distributed data sources [No. 4, 2019]
Author: M.S. Efimova
Visitors: 3754
The paper describes collecting and analysing data from distributed data sources using the example of analysing heterogeneous distributed financial information, and it analyses and compares existing approaches to information collection and analysis. Most of the existing approaches to this problem require all data to be collected in a single repository before analysis can be performed. However, such methods imply a delay from the moment the data is generated until the moment analysis methods are applied to it, due to the need to transfer the data from the source to the storage location. This significantly reduces decision-making efficiency and increases network traffic. In addition, collecting data from all sources can lead to significant costs if access to some of the sources is not free or is limited by a tariff plan. The considered approaches include data warehouses, ETL (extraction, transformation and loading) tools, lambda architectures, cloud computing, fog computing, and distributed data analysis based on the actor model. It is concluded that these approaches do not take into account the cost and priorities of data sources and do not allow accessing them dynamically; therefore, they do not meet all the requirements. The paper proposes and describes a method of smart information collection with dynamic reference to data sources depending on current need, cost and source priority. The proposed method allows reducing network traffic, speeding up data analysis and reducing the costs associated with accessing data sources.
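The principle of querying only the sources that are currently needed, weighted by priority and constrained by access cost, can be sketched as a greedy selection. The field names and the greedy policy are illustrative assumptions, not the paper's exact method.

```python
def select_sources(sources, needed_fields, budget):
    """Greedily pick sources covering needed_fields, preferring high
    priority and low cost, without exceeding the access-cost budget."""
    remaining = set(needed_fields)
    chosen, spent = [], 0.0
    for src in sorted(sources, key=lambda s: (-s["priority"], s["cost"])):
        gain = remaining & set(src["fields"])      # what this source adds
        if gain and spent + src["cost"] <= budget:
            chosen.append(src["name"])
            spent += src["cost"]
            remaining -= gain
        if not remaining:
            break
    return chosen, remaining

# Hypothetical financial sources; names, costs and priorities are made up.
sources = [
    {"name": "exchange_feed", "fields": ["price"], "cost": 5.0, "priority": 3},
    {"name": "paid_api", "fields": ["price", "volume"], "cost": 8.0, "priority": 2},
    {"name": "free_mirror", "fields": ["volume"], "cost": 0.0, "priority": 1},
]
chosen, missing = select_sources(sources, ["price", "volume"], budget=6.0)
```

Here the expensive `paid_api` is skipped because the budget is exhausted, and the remaining field is covered by the free source, mirroring the cost-aware behavior the abstract describes.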

6. Investigation of the optimal number of processor cores for parallel cluster multiple labeling on supercomputers [No. 4, 2019]
Authors: S.Yu. Lapshina, A.N. Sotnikov, V.E. Loginova, C.Yu. Yudintsev
Visitors: 3264
The article considers the optimal number of processor cores for launching the Parallel Cluster Multiple Labeling Technique in the course of simulation experiments on multi-agent modeling of the spread of mass epidemics on modern supercomputer systems installed at the JSCC RAS. The algorithm can be used in any field as a tool for differentiating large lattice clusters, because it accepts input in a format independent of the application. At the JSCC RAS, this tool was used to study the problem of the spread of epidemics, for which an appropriate multi-agent model was developed. The model considers an abstract disease transmitted by contact. During the simulation, the threshold value of the probability of infection is determined (i.e., the probability of infection itself is a variable parameter) at which the percolation effect appears on the disease distribution grid. If this value is close to the contagiousness index of a particular disease, then there is every chance of an epidemic spreading on a planetary scale. In the course of the simulation experiments, a variant of the Parallel Cluster Multiple Labeling Technique for Hoshen-Kopelman percolation clusters, based on the tag-linking mechanism, was improved for use on a multiprocessor system. The article provides an estimate of the execution time of the Parallel Cluster Multiple Labeling Technique for Hoshen-Kopelman percolation clusters for various values of the input parameters on high-performance computing systems installed at the JSCC RAS: MVS-10P MP2 KNL, MVS-10P OP, MVS-10P Tornado, MVS-100K.
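The underlying Hoshen-Kopelman idea can be shown with a minimal single-threaded sketch: scan the lattice once, give each occupied cell a provisional label, and merge labels with union-find when a cell touches two differently labeled neighbors. The paper's contribution is a parallel multiple-labeling variant, which this illustration does not reproduce.

```python
def find(parent, x):
    """Union-find root lookup with path halving."""
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def label_clusters(grid):
    """Label occupied (truthy) cells; cells sharing an edge get one label."""
    rows, cols = len(grid), len(grid[0])
    parent, labels, nxt = {}, {}, 0
    for i in range(rows):
        for j in range(cols):
            if not grid[i][j]:
                continue
            up = labels.get((i - 1, j)) if i and grid[i - 1][j] else None
            left = labels.get((i, j - 1)) if j and grid[i][j - 1] else None
            if up is None and left is None:
                parent[nxt] = nxt          # start a new provisional cluster
                labels[(i, j)] = nxt
                nxt += 1
            elif up is not None and left is not None:
                ru, rl = find(parent, up), find(parent, left)
                parent[ru] = rl            # merge the two clusters
                labels[(i, j)] = rl
            else:
                labels[(i, j)] = up if up is not None else left
    # resolve provisional labels to their roots
    return {cell: find(parent, lab) for cell, lab in labels.items()}

grid = [[1, 1, 0],
        [0, 1, 0],
        [1, 0, 1]]
labels = label_clusters(grid)
num_clusters = len(set(labels.values()))
```

The parallel variants studied in the article partition the lattice across cores and reconcile labels along the strip boundaries, which is where the optimal core count becomes a non-trivial question.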

7. A component in the Kotlin programming language for integrating executable programs into Internet resources [No. 4, 2019]
Authors: A.V. Prendota, P.V. Balakshin
Visitors: 1824
The paper provides an overview of tasks, methods and tools for training and working with the Kotlin programming language. This language was created as an alternative to other popular programming languages based on the JVM (Java Virtual Machine), as well as to the languages used to write various Android and iOS applications. Because Kotlin eliminated inconveniences of competing programming languages, it has been an official tool for the Android operating system since 2017. The importance of creating and supporting an online programming environment as a full-fledged training resource is shown. Using the online environment in training courses makes it possible to create sample programs for studying the stylistic features of the language in the form of code that is executed directly in the browser. This approach allows the development of new projects even in the absence of a full-fledged development environment on the developer's computer. Syntax highlighting and code completion also attract new users. The paper presents a brief analysis of existing online development platforms and highlights their shortcomings: integration problems with third-party sites, a small number of illustrative examples, and the lack of syntax highlighting and code completion. Furthermore, the paper provides information on how to solve a number of issues associated with writing and executing code in the Kotlin language. The paper considers the use of the Kotlin Playground library, which converts HTML blocks into specific code editors, making it possible to run the created editors directly in the browser. Kotlin Playground features such as code execution and compilation for various platforms, markup, code completion and highlighting capabilities, test script creation and execution, and working with the API are also considered. These features allow users to adapt every component of the online environment to their needs and to correctly integrate the resulting environment into Internet resources. The paper gives examples of the Kotlin Playground library functions and presents its syntax, the rules for installing and loading it using the Node Package Manager, and the use of the library. The paper concludes with a link to the documentation and identifies common Russian and international educational online platforms on which Kotlin training courses have been or are being implemented using the Kotlin Playground library.

8. A method for the automatic synthesis of fuzzy controllers [No. 4, 2019]
Authors: V.V. Ignatyev, V.V. Solovev, A.A. Vorotova
Visitors: 3298
The paper presents a method for the automatic synthesis of fuzzy controllers based on measured data. In the course of developing fuzzy controllers for technical facility management systems, issues arise related to choosing the number of linguistic variable terms, determining the type of membership functions, and creating the rule base. These issues are solved with the help of experts, but this process is quite labour-intensive and time-consuming. One possible solution is the automatic creation of fuzzy controllers based on measured data, which can be taken from a real management system or from a simulation model. The authors developed a control system structure in MATLAB Simulink that captures the input and output signals of the controller during simulation and saves them to a file as an array. They also developed an approach to analyzing the data arrays in order to determine the parameters of the input and output variables of a fuzzy controller, and a data clustering mechanism that allows creating a database of fuzzy rules. After analyzing the data arrays, the rules in the database can either be complete duplicates or have the same antecedents and different consequents, which leads to uncertainty. In this regard, an algorithm is proposed for eliminating complete duplicates from the rule base and for averaging the rules with different consequents. Software has been developed in the MATLAB environment that takes the initial data from a technical facility management system with a PI control law, performs clustering and parameterization of the input and output signals, and creates and reduces a rule base. The suggested method of automatic fuzzy controller synthesis can be used to create controllers that replace traditional control laws with intelligent ones.
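The rule-base reduction step described above (dropping exact duplicates and averaging consequents of rules that share an antecedent) can be sketched in a few lines. The rule representation here, (antecedent terms, numeric consequent), and the term names are illustrative assumptions, not the authors' data format.

```python
def reduce_rules(rules):
    """Drop duplicate rules; average consequents sharing an antecedent."""
    groups = {}
    for antecedent, consequent in rules:
        groups.setdefault(tuple(antecedent), []).append(consequent)
    return [(list(a), sum(cs) / len(cs)) for a, cs in sorted(groups.items())]

rules = [
    (["error_neg", "delta_pos"], 0.2),
    (["error_neg", "delta_pos"], 0.2),   # exact duplicate: collapses to one
    (["error_pos", "delta_neg"], 0.8),
    (["error_pos", "delta_neg"], 0.4),   # same antecedent: consequents averaged
]
reduced = reduce_rules(rules)
```

After reduction the base contains one rule per distinct antecedent, removing the uncertainty of conflicting consequents that the abstract mentions.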

9. A method and software for the intellectual support of logistics decision making [No. 4, 2019]
Authors: V.V. Borisov, A.V. Ryazanov
Visitors: 1768
The paper proposes an intellectual support method for making logistics decisions that solves the following complex of problems: determining the required resources when distributing orders by territorial zones; dividing the logistics service territory into zones based on genetic clustering; distributing orders by zones according to their purpose; and fuzzily assessing and assigning logistics facilities to fulfill orders based on the modified G. Kuhn method. To implement the stages of the method, a set of procedures has been developed, namely: procedures to divide the service territory into zones based on genetic clustering; to determine the required number of logistics facilities (resources) when distributing orders by zones based on a moving time window; and to distribute orders by zones and assign logistics facilities to fulfill orders based on the modified G. Kuhn method. The evaluation of order distribution by zones and the assignment of logistics facilities are based on an integral indicator of the degree of compliance between logistics facilities and orders, which is considered in the paper. To determine the indicator, a cascading fuzzy production model has been developed that takes into consideration characteristics of logistics facilities of different quality and various strategies for order distribution. Algorithms and software have been developed implementing the proposed method. The software includes: (1) a subsystem for dividing the territory into zones, consisting of genetic clustering and zonal division program modules; (2) a subsystem for order distribution and logistics facility assignment, consisting of program modules to determine the required number of logistics facilities for zones, to assess conformity between logistics facilities and orders, and to assign logistics facilities.
A comparative assessment was carried out, confirming the improved quality of logistics decisions made using the proposed method and software.
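The assignment stage above is an instance of the classic assignment problem, which the paper solves with a modified Kuhn (Hungarian) method over a fuzzy compliance indicator. As a tiny illustration of the problem itself, this sketch finds the optimal assignment by exhaustive search over permutations, which is feasible only for small instances; the compliance values are made up.

```python
import itertools

def best_assignment(compliance):
    """compliance[i][j]: degree to which facility i complies with order j.
    Returns (best total compliance, perm) with order j served by facility perm[j]."""
    n = len(compliance)
    best_score, best_perm = -1.0, None
    for perm in itertools.permutations(range(n)):
        score = sum(compliance[perm[j]][j] for j in range(n))
        if score > best_score:
            best_score, best_perm = score, perm
    return best_score, best_perm

# Hypothetical 3x3 facility-order compliance degrees in [0, 1].
compliance = [
    [0.9, 0.2, 0.3],
    [0.4, 0.8, 0.1],
    [0.3, 0.3, 0.7],
]
score, assignment = best_assignment(compliance)
```

The Hungarian method reaches the same optimum in polynomial time rather than the factorial time of this brute-force sketch, which is why it scales to realistic numbers of facilities and orders.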

10. A method for identifying the technical condition of radio engineering equipment using artificial neural network technologies [No. 4, 2019]
Authors: R.V. Dopira, A.A. Shvedun, D.V. Yagolnikov, I.E. Yanochkin
Visitors: 2752
Because modern military-grade radio equipment is becoming functionally and technologically more complex, the task of creating systems for the functional control and identification of the technical state of radio equipment is becoming more urgent. Nowadays, there are no effective, fully automatic systems for identifying the technical state of various types of radio equipment. One way to solve the problem is to create systems for identifying the technical state of radio equipment based on machine learning principles. A distinctive feature of applying trained artificial neural networks to the identification problem is the development of a prototype of the observed situations, generalization over the predominance and similarity in a variety of radio equipment of the same type, as well as high efficiency and reliability in solving this problem. The paper presents a method for identifying the technical state of radio equipment using case-based principles of machine learning of artificial neural networks. It allows solving the problem of identifying the current classes of the technical condition of radio equipment based on the measurement results of the main controlled system parameters in real time. Taking into account the specifics of the problem, the choice of a multilayer feedforward neural network with three hidden layers is substantiated. The number of input layer neurons is determined by the number of controlled parameters of the technical condition of the main systems of a particular type of radio equipment. The number of output layer neurons is determined by the number of possible classes of the radio equipment's technical condition. The elementary converters of this network have a sigmoid activation function. To train the artificial neural network, the authors used a heuristic modification of the Levenberg-Marquardt algorithm.
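The forward pass of the kind of network described (a feedforward net with sigmoid activations mapping controlled-parameter measurements to condition classes) can be sketched as follows. The layer sizes and weights are illustrative toy values, not the paper's trained network, and the training step (Levenberg-Marquardt) is not reproduced.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, layers):
    """layers: list of (weights, biases); weights[i][j] connects input j
    to neuron i. A sigmoid is applied after every layer."""
    for weights, biases in layers:
        x = [sigmoid(sum(w * v for w, v in zip(row, x)) + b)
             for row, b in zip(weights, biases)]
    return x

# Toy network: 2 controlled parameters -> 2 hidden neurons -> 2 condition classes.
layers = [
    ([[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0]),
    ([[2.0, -2.0], [-2.0, 2.0]], [0.0, 0.0]),
]
scores = forward([0.9, 0.1], layers)                       # measurement vector
predicted_class = max(range(len(scores)), key=lambda k: scores[k])
```

In the paper's setting the input width equals the number of controlled parameters and the output width equals the number of condition classes, with the largest output taken as the identified class, as in the last line here.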
