ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK) - К1 quartile
Russian Science Citation Index (RSCI)



Articles of journal issue № 4, 2019.


11. Web-robot detection method based on user’s navigation graph [№ 4, 2019]
Authors: A.A. Menshchikov, Yu.A. Gatchin
Visitors: 7428
According to reports of web security companies, every fifth request to a typical website comes from a malicious automated system (a web robot). Web robots already exceed ordinary users of web resources in traffic volume. They threaten data privacy and copyright, perform unauthorized information gathering, distort site statistics, and degrade performance, so there is a need to detect and block their sources. Existing methods and algorithms rely on syntactic and analytical processing of web server logs to detect web robots. Such approaches cannot reliably identify web robots that hide their presence and imitate the behavior of legitimate users.

This article proposes a web robot detection method based on the characteristics of the page web graph. The characteristics of the analyzed sessions include not only the features of the user's web graph, but also parameters of each node the user visited (in and out degrees, centrality measures, and others). To calculate these characteristics, a connectivity graph of pages is constructed. Based on the analysis of these parameters, as well as the characteristics of the web robot's behavioral graph, the authors decide how to classify the session.

The authors analyze different behavioral patterns, describe the basic principles of extracting the necessary data from web server logs, the method of constructing the connectivity graph, and the most significant features. The paper considers the detection procedure and the selection of an appropriate classification model. For each studied model, the authors select optimal hyperparameters and cross-validate the results. The analysis of detection accuracy and precision shows that using the XGBoost library yields an F1 measure of 0.96.
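
As an illustration only (the abstract does not spell out the exact feature set or pipeline), the following sketch assumes that a session is described by graph metrics of the visited pages aggregated over a connectivity graph, and that a labeled training set of sessions is available; the feature choices, hyperparameters and function names are assumptions, not the authors' implementation.

    # Illustrative sketch: build a page connectivity graph from log transitions
    # and classify sessions by aggregated node features using XGBoost.
    import networkx as nx
    import numpy as np
    from xgboost import XGBClassifier

    def build_page_graph(transitions):
        """transitions: iterable of (from_url, to_url) pairs extracted from logs."""
        g = nx.DiGraph()
        g.add_edges_from(transitions)
        return g

    def session_features(graph, visited_pages):
        """Aggregate node-level graph metrics over the pages of one session."""
        pagerank = nx.pagerank(graph)          # one possible centrality measure
        in_deg = [graph.in_degree(p) for p in visited_pages if p in graph]
        out_deg = [graph.out_degree(p) for p in visited_pages if p in graph]
        pr = [pagerank.get(p, 0.0) for p in visited_pages]
        return [np.mean(in_deg or [0]), np.mean(out_deg or [0]),
                np.mean(pr or [0.0]), len(visited_pages)]

    def train_detector(graph, sessions, labels):
        """sessions: list of page lists; labels: 1 = robot, 0 = legitimate user."""
        X = np.array([session_features(graph, s) for s in sessions])
        model = XGBClassifier(n_estimators=200, max_depth=4)
        model.fit(X, np.array(labels))
        return model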

12. The method for translating first-order logic formulas into positively constructed formulas [№ 4, 2019]
Authors: A.V. Davydov, A.A. Larionov, E.A. Cherkashin
Visitors: 4793
The paper considers the logic calculus of positively constructed formulas (the PCF calculus) and an automated theorem proving (ATP) method based on it. The PCF calculus was developed and described as a first-order logic formalism in the works of S.N. Vassilyev and A.K. Zherlov as a result of formalizing and solving problems of control theory. There are examples of describing and solving control theory problems that are effectively handled with the PCF calculus (in terms of language expressiveness and the efficiency of the theorem proving means), for example, controlling a group of lifts, directing a telescope at the center of a planet in an incomplete phase, and mobile robot control.

Compared to other logical means for formalizing subject domains and searching for logical inference, the PCF calculus combines expressiveness with compactness of knowledge representation, natural parallelism of processing, large block size and lower combinatorial complexity of inference, high compatibility with heuristics, and strong support for interactive proof. The selected class of formulas makes it possible to build constructive proofs. This class of formulas is much wider than the class of Horn clauses used in Prolog: there are no restrictions in the logical formalization of the axiomatic base of the subject domain, and the target statement is a conjunction of queries (in Prolog terms).

To test the ATP software system (prover) based on the PCF calculus, the authors used the TPTP (Thousands of Problems for Theorem Provers) library. The TPTP format has become a standard in the automated reasoning community, so there is a natural need for the developed prover to accept problems in this format as input. Thus the problem arises of translating first-order predicate logic formulas presented in the TPTP format into the PCF format. This problem is nontrivial due to the special structure of PCF calculus formulas.

The paper proposes a translation method for the first-order predicate calculus language that is more efficient than the algorithm in the first implementation of the PCF-based prover and that preserves the original heuristic knowledge structure, as well as a simplified version for problems presented in the language of clauses. Efficiency here means the number of translation steps and the length of the resulting formulas. The proposed method was implemented as a software system, a translator of first-order TPTP logic formulas into the PCF calculus language. The paper presents test results of the developed method, which imply that there is a class of first-order formulas that is not treated as special by existing ATP systems, while the PCF calculus has special strategies that increase the efficiency of inference search for this class of formulas.
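
The abstract does not describe the translation algorithm itself, and the PCF construction is not reproduced here; purely as a loose illustration of the kind of formula manipulation such a translator performs, below is a minimal sketch of a first-order formula AST with negations pushed inward (negation normal form). The class names and the choice of this preprocessing step are assumptions, not the authors' method.

    # Illustrative only: a tiny propositional/first-order formula AST and a
    # negation-normal-form pass, a typical step before structural translations
    # such as TPTP -> PCF (the actual PCF construction is not shown here).
    from dataclasses import dataclass

    @dataclass
    class Atom:
        name: str

    @dataclass
    class Not:
        arg: object

    @dataclass
    class And:
        left: object
        right: object

    @dataclass
    class Or:
        left: object
        right: object

    def nnf(f):
        """Push negations down to atoms using De Morgan's laws."""
        if isinstance(f, Atom):
            return f
        if isinstance(f, And):
            return And(nnf(f.left), nnf(f.right))
        if isinstance(f, Or):
            return Or(nnf(f.left), nnf(f.right))
        if isinstance(f, Not):
            g = f.arg
            if isinstance(g, Atom):
                return f
            if isinstance(g, Not):
                return nnf(g.arg)
            if isinstance(g, And):
                return Or(nnf(Not(g.left)), nnf(Not(g.right)))
            if isinstance(g, Or):
                return And(nnf(Not(g.left)), nnf(Not(g.right)))
        raise TypeError(f"unsupported node: {f!r}")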

13. Method of forming a priority list of automated control equipment in special purpose systems and its software implementation [№ 4, 2019]
Authors: V.L. Lyaskovsky, I.B. Bresler, M.A. Alasheev
Visitors: 4112
The paper considers a method of forming a priority list of control equipment for distributed information management systems (DIMS) designed for special and military applications that have to be equipped with automation tools, as well as its software implementation as part of a decision making support system.

The need to develop and apply this method arises from the fact that DIMS are, as a rule, created in several stages over a long time. This is mainly due to the high complexity and cost of developing, manufacturing and supplying automation equipment complexes, as well as to the limited financial resources, technological and production capabilities of all participants of this process. At the same time, it is intuitively clear that equipping some controls with automation tools can contribute more to the efficiency of the entire system than automating others. However, until now there has been no formalized method to substantiate the sequence of equipping controls with automation facilities based on their most significant parameters and characteristics. In this regard, developing a method for forming the priority list of DIMS control equipment is an important and practically significant task.

The essence of the proposed method lies in the consistent assessment of every unit of control equipment (CE) in accordance with the developed system of classification criteria. All classification criteria are hierarchically interconnected, and their importance decreases from the first to the last one. Applying the method requires collecting, storing and processing arrays of initial data. To make the method more convenient to use, and to reduce the information processing time and the number of errors associated with the human factor, the authors developed software implementing the method as an integral part of the developed decision making support system. The method can be used by contractors and research organizations to substantiate the sequence of work in the course of DIMS development and elaboration.
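
The abstract states that the classification criteria are hierarchically ordered with strictly decreasing importance; one natural reading, assumed here for illustration only, is a lexicographic ordering of per-unit criterion scores. The unit names and scores below are invented.

    # Illustrative sketch: rank control equipment (CE) units by a tuple of
    # criterion scores, where earlier criteria strictly dominate later ones
    # (lexicographic ordering is an assumption about the method).
    def priority_list(units):
        """units: list of (unit_name, [score_criterion_1, score_criterion_2, ...]),
        higher scores meaning higher priority for automation."""
        return sorted(units, key=lambda u: u[1], reverse=True)

    ce_units = [
        ("command post A", [3, 1, 2]),
        ("command post B", [3, 2, 0]),
        ("radar site C",   [2, 5, 5]),
    ]
    for rank, (name, scores) in enumerate(priority_list(ce_units), start=1):
        print(rank, name, scores)   # B before A (second criterion decides), C last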

14. Methods and tools for modeling supercomputer job management system [№ 4, 2019]
Authors: A.V. Baranov, D.S. Lyakhovets
Visitors: 6572
The paper discusses methods and tools for modeling supercomputer job management systems (JMS), such as SLURM, PBS, Moab, and the domestic management system of parallel job passing. The highlighted JMS modeling methods include modeling with a real supercomputer system, JMS modeling on virtual nodes, and simulation modeling. The authors also consider methods and tools for constructing a model job stream.

The example of the management system of parallel job passing shows that it is impossible to accurately reproduce a full-scale experiment on a real supercomputer. The paper investigates the adequacy of a JMS model in the broad and narrow senses. It is shown that a JMS model that is adequate in the narrow sense ensures compliance only with interval indicators and cannot be used as a forecast model. To determine adequacy in the broad sense, the authors consider a numerical estimate of the proximity of two event streams: the stream of real supercomputer events and the stream of events produced by the JMS model. The normalized Euclidean distance between two vectors corresponding to the compared streams is proposed as the measure of proximity. The dimension of the vectors equals the number of processed jobs, and the vector components are the job residence times in the JMS.

The method of determining adequacy is based on comparing real supercomputer statistics with the results of JMS modeling. The reference value of the adequacy measure is determined as the normalized Euclidean distance between the vectors of job residence times in the real system and in the JMS model.
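
A minimal sketch of the proposed proximity measure, assuming the normalization is by the vector dimension (the abstract does not specify the exact normalization); the residence times in the example are illustrative.

    # Illustrative sketch: proximity of two event streams as the normalized
    # Euclidean distance between vectors of job residence times
    # (component i = residence time of job i). Dividing by sqrt(n) is an assumption.
    import numpy as np

    def stream_proximity(real_times, model_times):
        a = np.asarray(real_times, dtype=float)
        b = np.asarray(model_times, dtype=float)
        if a.shape != b.shape:
            raise ValueError("both streams must cover the same set of jobs")
        return np.linalg.norm(a - b) / np.sqrt(a.size)

    # Residence times (seconds) of the same five jobs in the real system
    # and in the JMS model.
    print(stream_proximity([120, 300, 45, 610, 90], [130, 280, 50, 600, 110]))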

15. Modeling and analysis of programs for multidimensional interval-logic controllers [№ 4, 2019]
Authors: A.F. Antipin, E.V. Antipina
Visitors: 4908
The paper examines special software designed for modeling the operation of multidimensional fuzzy interval-logic controllers and for analyzing their programs for programmable logic controllers. Such controllers can be used at chemical, oil and oil refining enterprises when developing automatic control systems for technological processes and objects that have no adequate mathematical models. The relevance of the software arises from the lack of application programs designed to simulate the operation of fuzzy controllers from available experimental data.

The software described in the paper calculates the necessary and sufficient number of production rules and the number of critically important rules that make up a production system. In addition, from the available initial data obtained experimentally, it can create fuzzy models of multidimensional fuzzy interval-logic controllers.

The paper presents the results of a computational experiment on creating a fuzzy model of multidimensional fuzzy interval-logic controllers and analyzing their programs for programmable logic controllers, during which the main parameters of this type of controller were calculated: the maximum number of production rules that make up the production system; the total number of terms and the number of critically important terms for each variable; the total number of variable groups and the number of critically important variable groups; and the maximum number of production rules, the number of critical production rules for each variable group, and the actual number of critical rules that make up the production system. Based on the calculation results, conclusions were drawn about the complexity of a production system for multidimensional fuzzy interval-logic controllers and about how to achieve the required calculation accuracy.
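
As a hedged illustration of one of the calculated parameters: for a complete rule base, the maximum number of production rules is typically the product of the numbers of interval terms of the input variables. Treating that combinatorial bound as an assumption about the paper's calculation, a minimal sketch:

    # Illustrative sketch: upper bound on the size of a complete production
    # rule base as the product of term counts of the input variables
    # (this reading of "maximum number of production rules" is an assumption).
    from math import prod

    def max_rule_count(terms_per_variable):
        """terms_per_variable: number of interval terms for each input variable."""
        return prod(terms_per_variable)

    # e.g. three inputs with 5, 3 and 4 interval terms
    print(max_rule_count([5, 3, 4]))   # 60 candidate production rules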

16. Model and algorithm of selecting software architecture for IoT systems [№ 4, 2019]
Author: Yu.V. Yadgarova
Visitors: 2772
The paper presents an analytical model for cost estimation and an algorithm for selecting a basic software architecture pattern and design tactics for IoT (Internet of Things) systems. It generalizes the concept of IoT technologies, reviews software quality parameters, identifies the parameters most significant for IoT systems and proposes methods to achieve them. The required quality parameters of a software system are achieved by implementing the basic software architecture pattern and the related design tactics.

The paper introduces an analytical model of how project labor intensity depends on the software architecture elements used; the labor intensity is calculated according to the COCOMO II method. A search algorithm for the basic architecture pattern and design tactics is presented. The algorithm is built on local search for solving a constraint satisfaction problem while minimizing the labor intensity function. User preferences are also taken into account when selecting a pattern.

The model and the algorithm make it possible to select the most appropriate architecture patterns and tactics for a particular type of project at the early design stages. This approach reduces errors in software architecture design at the initial stage of IoT architecture pattern selection. The paper considers the implementation of the approach in a project developing a flexible workspace management system. It is advisable to use the approach to achieve the required system quality parameters and to minimize errors when selecting the software architecture at the initial stages of a project, ultimately reducing the project cost. The approach can also be used to develop fully functional prototypes on a tight schedule.
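
Neither the constraint model nor the effort coefficients are given in the abstract; the sketch below combines a COCOMO II-style effort formula (Effort = A * Size^E * product of effort multipliers) with a brute-force search over pattern/tactic combinations that stands in for the paper's local search at this toy problem size. All patterns, multipliers and quality attributes are invented for the example.

    # Illustrative sketch: pick an architecture pattern plus design tactics,
    # minimizing a COCOMO II-style effort estimate subject to quality constraints.
    import itertools

    A, E = 2.94, 1.1   # nominal COCOMO II constants (illustrative values)

    patterns = {       # effort multiplier, provided quality attributes (invented)
        "broker":        (1.10, {"scalability", "interoperability"}),
        "pub_sub":       (1.00, {"scalability", "modifiability"}),
        "client_server": (0.95, {"interoperability"}),
    }
    tactics = {
        "caching":    (1.05, {"performance"}),
        "heartbeat":  (1.08, {"availability"}),
        "encryption": (1.10, {"security"}),
    }

    def effort(size_ksloc, multipliers):
        m = 1.0
        for x in multipliers:
            m *= x
        return A * size_ksloc ** E * m

    def select(size_ksloc, required_qualities):
        best = None
        for p_name, (p_mult, p_q) in patterns.items():
            for k in range(len(tactics) + 1):
                for combo in itertools.combinations(tactics.items(), k):
                    qualities, mults = set(p_q), [p_mult]
                    for t_name, (t_mult, t_q) in combo:
                        qualities |= t_q
                        mults.append(t_mult)
                    if not required_qualities <= qualities:
                        continue                     # constraint not satisfied
                    e = effort(size_ksloc, mults)
                    if best is None or e < best[0]:
                        best = (e, p_name, [t for t, _ in combo])
        return best

    print(select(50, {"scalability", "availability"}))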

17. Fuzzy expert evaluations of adverse external impacts on the marine search and rescue operations effectiveness [№ 4, 2019]
Authors: V.V. Kochnev, A.S. Rekhov, V.E. Sorokin, E.V. Taranukha
Visitors: 3552
The paper considers the use of fuzzy expert evaluations of adverse external impacts on the effectiveness of various marine search and rescue operations.

According to the existing methodology, the effectiveness of a search and rescue operation is first calculated with a mathematical model under ideal external conditions. It is then adjusted by the probabilities of pairwise independent and joint events of adverse external influences for the various types of activities of the forces and means of the operation, each characterized by certain unfavorable factors with empirically determined weights (significances) that reduce the efficiency. The absence of gradations for most adverse factors makes the evaluation significantly rough. To overcome this, it is proposed to abandon weights and assess the level of exposure to adverse factors using fuzzy multi-valued verbal expert evaluations. To implement this transition, linguistic variables of adverse factor impacts are introduced. Their term sets are determined on the basis of a person's known ability to distinguish gradations in verbal evaluations. The base-scale values of the term-set carriers (the points at which the membership functions reach their maxima) are correlated with the available weights (significances) of the adverse factors.

The paper provides a practical example of fuzzy expert evaluation of the influence of adverse factors on search and rescue operation effectiveness under the assumptions that adjacent gradations of a person's verbal evaluations are equidistant and that the value on the carrier's base scale is zero in the absence of adverse factors. Using widespread normalized triangular membership functions makes it possible to effectively evaluate and adequately adjust the values of search and rescue operation effectiveness under the influence of adverse factors, even with a small amount of initial data in the form of adverse factor weights (significances). Defuzzification of the expert choice becomes significantly simpler provided that a value on the base scale of the carrier can refer to no more than two adjacent terms of the linguistic variable. On the other hand, the proposed approach to using fuzzy expert evaluations can easily be adapted to more detailed or extended initial data in the form of adverse factor weights, and can also serve as the basis for further development of methods for assessing search and rescue operation effectiveness.
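
A minimal sketch of the membership machinery described above: normalized triangular membership functions with equidistant peaks, arranged so that any base-scale value belongs to at most two adjacent terms, plus a simple weighted-average defuzzification (the specific defuzzification rule and the term names are assumptions).

    # Illustrative sketch: terms of a linguistic variable "adverse factor impact"
    # as normalized triangular membership functions; each base-scale point is
    # covered by at most two adjacent terms.
    def triangular(a, b, c):
        """Triangle with support [a, c] and peak (membership 1.0) at b."""
        def mu(x):
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
        return mu

    # Peaks correspond to weights (significances) of the impact gradations.
    peaks = [0.0, 0.25, 0.5, 0.75, 1.0]   # none, weak, moderate, strong, extreme
    terms = [triangular(p - 0.25, p, p + 0.25) for p in peaks]

    def defuzzify(x):
        """x: value on the base scale; at most two adjacent terms are non-zero."""
        memberships = [mu(x) for mu in terms]
        total = sum(memberships)
        return sum(m * p for m, p in zip(memberships, peaks)) / total if total else 0.0

    print(defuzzify(0.6))   # lies between "moderate" (0.5) and "strong" (0.75)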

18. Peculiarities of porting the Robot Operating System framework onto Elbrus platform [№ 4, 2019]
Authors: A.A. Tachkov, A.Yu. Vukolov, A.V. Kozov
Visitors: 4914
The Robot Operating System (ROS) is the most widespread helper framework for developing mobile robot control systems. However, it fully supports only Ubuntu/Debian Linux-based software, which limits the possibility of using computing hardware developed natively in Russia in the control systems being designed. The authors ported ROS Melodic Morenia onto the Elbrus platform developed natively in Russia (a computer based on the Elbrus-4C CPU).

This paper describes the main peculiarities of the porting process associated with the differences between the Elbrus operating system and most Linux distributions. Version matching for the Elbrus integrated software on which ROS depends is also considered. The authors attempted to determine the ROS build dependency libraries that could be fully installed on Elbrus using only system packages and to repack these libraries into deb packages for further installation onto similar computers. In addition, the paper describes the developed deployment scenarios for a ready-out-of-the-box ROS.

There is a description of testing the prepared distribution on the task of building a multilayered passability map. This task was solved on a control system with LIDAR point cloud input. The authors report the test results as processing times for identical point clouds and map layer refreshment. The tests were run on systems based on Elbrus-4C, Intel Core i3-3220 and Intel Core i7-6700HQ CPUs using the same ROS version. The authors conclude that ROS Melodic Morenia deployed onto the Elbrus platform is fully operational.

19. Features of using neural network models to classify short text messages [№ 4, 2019]
Authors: M.I. Dli, O.V. Bulygina
Visitors: 12767
Nowadays, public authorities are actively developing technologies for electronic interaction with organizations and citizens. One of the key tasks in this area is classifying incoming messages for their operational processing. However, the features of such messages (small size, lack of a clear structure, etc.) do not allow using traditional approaches to the analysis of textual information.

To solve this problem, it is proposed to use neural network models (artificial neural networks and a neuro-fuzzy classifier), which can find hidden patterns in documents written in a natural language. The choice of a specific method is determined by the approach to forming thematic headings: convolutional neural networks (for unambiguous definition of rubrics); recurrent neural networks (when word order in the rubric title is significant); a neuro-fuzzy classifier (for intersecting rubric thesauri).
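
As a sketch of the convolutional option mentioned above (architecture, dimensions and the number of rubrics are placeholders, not the authors' model), a small 1-D convolutional classifier over token indices:

    # Illustrative sketch: a small 1-D convolutional classifier for short text
    # messages (token indices -> rubric logits).
    import torch
    import torch.nn as nn

    class ShortTextCNN(nn.Module):
        def __init__(self, vocab_size=20000, embed_dim=64, n_rubrics=10):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            self.conv = nn.Conv1d(embed_dim, 128, kernel_size=3, padding=1)
            self.fc = nn.Linear(128, n_rubrics)

        def forward(self, token_ids):                    # (batch, seq_len)
            x = self.embed(token_ids).transpose(1, 2)    # (batch, embed_dim, seq_len)
            x = torch.relu(self.conv(x))
            x = x.max(dim=2).values                      # global max pooling over time
            return self.fc(x)                            # rubric logits

    model = ShortTextCNN()
    logits = model(torch.randint(1, 20000, (4, 30)))     # 4 messages, 30 tokens each
    print(logits.shape)                                  # torch.Size([4, 10])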

20. Domain-specific languages for testing web applications [№ 4, 2019]
Authors: V.G. Fedorenkov, P.V. Balakshin
Visitors: 6210
The desire to release a high quality product with minimal errors often raises many testing problems for developers of both large and smaller projects. This work is devoted to searching for solutions to these problems.

The paper compares the main methods, as well as the existing software tools, for creating and supporting domain-specific languages (DSLs) aimed at working with test scripts for testing web application interfaces. It also considers existing tools for working with Selenium, reviews methodologies for writing a DSL (with further selection of the most appropriate one), shows how to implement a DSL prototype based on Selenium and how to test and assess the applicability of the prototype. It describes the advantages of using a DSL in testing, its functional and non-functional requirements, and shows the developed DSL in a simplified form together with the language structure (Java packages).

One of the main criteria for all of the above is the involvement of non-technical specialists at each testing stage (solving the so-called translation problem), which is important for implementing comprehensive testing of a software product. One of the key features of the article is the demonstration of implementing a DSL prototype based on Selenium, followed by testing and evaluating the applicability of the implemented prototype. The paper shows a method of creating an additional metaprogramming tool to further simplify the creation, support, and modification of the developed test scripts.
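
The paper's DSL is structured as Java packages; purely to illustrate the general idea of a thin, readable DSL layer over Selenium, here is a minimal sketch in Python with invented method names (not the paper's language):

    # Illustrative sketch (not the paper's DSL): a fluent wrapper over Selenium
    # WebDriver so that test scripts read closer to natural language.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    class Page:
        def __init__(self, driver):
            self.driver = driver

        def open(self, url):
            self.driver.get(url)
            return self

        def click(self, css):
            self.driver.find_element(By.CSS_SELECTOR, css).click()
            return self

        def type(self, css, text):
            self.driver.find_element(By.CSS_SELECTOR, css).send_keys(text)
            return self

        def should_see(self, css, expected):
            actual = self.driver.find_element(By.CSS_SELECTOR, css).text
            assert expected in actual, f"expected {expected!r}, got {actual!r}"
            return self

    # A scenario readable by non-technical specialists:
    # Page(webdriver.Firefox()).open("https://example.org/login") \
    #     .type("#user", "qa").type("#password", "secret") \
    #     .click("#submit").should_see(".welcome", "Hello")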
