ISSN 0236-235X (P)
ISSN 2311-2735 (E)


Articles of journal issue No. 2, 2016.


1. An approach to data normalization in the Internet of Things for security analysis [No. 2, 2016]
Authors: A.I. Pechenkin, M.A. Poltavtseva, D.S. Lavrova
Visitors: 5119
This paper analyzes the concept of the Internet of Things with the aim of developing a system for detecting security incidents. Developing such a system requires effective methods and algorithms for preprocessing and storing high volumes of data from heterogeneous devices. The authors define the notion of a “thing” and describe the basic information that a “thing” delivers to an information system. In this paper the “thing” is considered as a data source with its own characteristics. The authors apply ETL (Extract, Transform, Load) technology to preprocessing Internet of Things data. Heterogeneous devices produce a great number of different data types. To analyze this information in a SIEM system, information and command messages from the Internet of Things must be represented in the event space. For this purpose they must be normalized to a common form, since a researcher often needs to analyze various events together, and one event may comprise several messages. This problem can be solved using metadata. The authors propose an approach that applies hierarchical directories to normalize high volumes of heterogeneous data from the Internet of Things. Hierarchical directories contain information about data sources and their contents. The paper describes the basic metadata directories.
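
As a rough illustration of the metadata-driven normalization described above, the following Python sketch maps messages from two device types onto a common event form; the directory layout, device types and all field names are invented for the example and are not the paper's actual directories.

    # Hypothetical hierarchical directory: device type -> field mapping and
    # event type. Heterogeneous messages are mapped onto one event schema.
    METADATA_DIRECTORY = {
        "thermostat": {
            "source_fields": {"temp_c": "value", "ts": "timestamp"},
            "event_type": "measurement",
        },
        "door_sensor": {
            "source_fields": {"state": "value", "time": "timestamp"},
            "event_type": "state_change",
        },
    }

    def normalize(device_type: str, message: dict) -> dict:
        """Map a raw device message onto the common event schema."""
        meta = METADATA_DIRECTORY[device_type]
        event = {"device_type": device_type, "event_type": meta["event_type"]}
        for raw_field, event_field in meta["source_fields"].items():
            event[event_field] = message[raw_field]
        return event

    # Two heterogeneous messages end up in one comparable event space.
    print(normalize("thermostat", {"temp_c": 21.5, "ts": 1459872000}))
    print(normalize("door_sensor", {"state": "open", "time": 1459872042}))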

2. Soil & Environment as a tool for the evaluation of soil environmental functions [No. 2, 2016]
Authors: Ángeles Gallegos-Tavera, Francisco Bautista, Dubrovina I.A.
Visitors: 9545
Soil degradation is part of the overall ecological crisis, since soil is the linking element of any ecosystem. Under comprehensive loads, soil loses its environmental functions (EF). One of the key topics in nature protection over the last decade has been the evaluation of, and accounting for, ecosystem services in human economic activity. Therefore, the search for and development of spatial planning tools based on EF is very important. The article considers software for evaluating EF using TUSEC (Technique for Soil Evaluation and Categorization) algorithms. The technique provides a score-based evaluation of the basic environmental functions of natural and anthropogenic soils. EF evaluation makes it possible to keep a balance of benefits and losses in spatial planning by reducing environmental impacts on soil functions. The central component of the software is the relational DBMS Apache Derby; the application is written in Java using the Eclipse IDE. Data on the site, field descriptions and analyses of soil profiles are stored in the database using input tools. Intermediate calculations and EF evaluation are performed on the input data by the TUSEC algorithms. A forecasting tool allows calculating the change of EF ranks under different types of land use. The EF evaluation results and predictive models can be presented as graphs. Tabular and graphical information can be exported, as can spatially referenced data for use in a GIS. A user-friendly interface for data input, output and database management is designed for users who do not know the SQL query language.
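
The following minimal Python sketch illustrates the described architecture of storing soil profile data in a relational database and computing score-based EF ranks from it. SQLite stands in for Derby here, and the table layout and scoring rule are hypothetical placeholders, not the actual TUSEC tables.

    import sqlite3

    # Profile data in a relational store; an EF score computed from inputs.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE soil_profile (
        site TEXT, horizon TEXT, humus_pct REAL, clay_pct REAL)""")
    conn.executemany(
        "INSERT INTO soil_profile VALUES (?, ?, ?, ?)",
        [("site-1", "A", 4.2, 18.0), ("site-1", "B", 1.1, 30.0)],
    )

    def ef_score(humus_pct: float, clay_pct: float) -> int:
        """Toy 1..5 rank standing in for a TUSEC-style lookup table."""
        return min(5, max(1, round((humus_pct + clay_pct / 10) / 2)))

    for site, horizon, humus, clay in conn.execute("SELECT * FROM soil_profile"):
        print(site, horizon, ef_score(humus, clay))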

3. A classification algorithm based on random forest principles for a forecasting problem [No. 2, 2016]
Authors: Kartiev S.B., Kureichik V.M.
Visitors: 13792
This article considers methods of constructing ensembles of models to solve a forecasting problem. One of the major forecasting stages is classification, which contains the basic logic of predictive models. The article describes the “random forest” classification method and presents the pros and cons of the methods used. The authors justify the choice of this method for use in the forecasting system under development. The paper presents an algorithm for random forest construction based on a combination of decision-making elements and training methods for the generated data structures using a modified random forest (MRF) training algorithm. The fundamental difference of this method is that it finds the optimal class to which the object in question belongs in the forecasting task. The paper describes the software implementation in Java using the principles of generic programming and presents the basic data structure as a UML diagram. The article defines the place of the developed module in a diagnostic system for complex technical systems for software system maintenance, using modeling principles based on temporal logic. Experimental research shows the efficiency of the described method compared to existing ones: classification quality improved by approximately 5% compared to previous experiments.
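
For readers unfamiliar with the baseline method, a compact Python sketch of the generic random forest idea follows: bootstrap samples, one decision tree per sample, majority voting. This is the standard technique the article builds on, not the authors' modified MRF algorithm; the hyperparameters are illustrative.

    import numpy as np
    from collections import Counter
    from sklearn.tree import DecisionTreeClassifier

    class SimpleRandomForest:
        def __init__(self, n_trees=25, max_features="sqrt", seed=0):
            self.n_trees = n_trees
            self.max_features = max_features
            self.rng = np.random.default_rng(seed)
            self.trees = []

        def fit(self, X, y):
            n = len(X)
            for _ in range(self.n_trees):
                idx = self.rng.integers(0, n, n)      # bootstrap sample
                tree = DecisionTreeClassifier(max_features=self.max_features)
                self.trees.append(tree.fit(X[idx], y[idx]))
            return self

        def predict(self, X):
            votes = np.array([t.predict(X) for t in self.trees])
            # Majority vote per sample across all trees.
            return np.array([Counter(col).most_common(1)[0][0] for col in votes.T])

    # Usage:
    # from sklearn.datasets import load_iris
    # X, y = load_iris(return_X_y=True)
    # print(SimpleRandomForest().fit(X[:120], y[:120]).predict(X[120:]))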

4. Algorithms for equipment reliability testing in an automatic control system [No. 2, 2016]
Authors: Rusin A.Yu., Abdulkhamed M., Baryshev Ya.V.
Visitors: 9941
The economic efficiency of an equipment reliability testing system can be improved by reducing running time or by decreasing the number of specimens. When running time is reduced, the sample censoring rate increases; decreasing the number of specimens reduces the number of recorded equipment running times. Test requirements may be relaxed only if the information processing methods ensure the validity of the calculated reliability characteristics. Testing produces small censored samples of mean time between equipment failures, and reliability is estimated from such samples by the maximum likelihood method. The article presents experimental studies of the precision of maximum likelihood estimates of the exponential distribution parameter on small singly right-censored samples. In their studies the authors used computer simulation of censored samples similar to those formed in equipment reliability testing. The experimental data show that the majority of maximum likelihood estimates obtained from small singly right-censored samples deviate significantly from the true values. The work includes regression models that relate the deviation of a maximum likelihood estimate from the true value to parameters characterizing the sample structure; they allow calculating and applying corrections to maximum likelihood estimates. The paper also includes experimental studies of the results of their usage: after applying the developed models and corrections, the accuracy of the maximum likelihood estimates increases. Software has also been developed to apply the regression models in practice.
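
The maximum likelihood estimate for a singly right-censored exponential sample has a standard closed form: total accumulated operating time (observed failure times plus the censoring time for each unfailed unit) divided by the number of failures. A small Python simulation in the spirit of the described experiments is sketched below; the sample size, censoring time and number of runs are illustrative, not the paper's.

    import random

    def mle_exponential_censored(failures, n_total, t_censor):
        """theta_hat for a singly right-censored exponential sample."""
        r = len(failures)
        return (sum(failures) + (n_total - r) * t_censor) / r

    def simulate(theta=100.0, n=10, t_censor=80.0, runs=2000, seed=1):
        random.seed(seed)
        estimates = []
        for _ in range(runs):
            times = [random.expovariate(1 / theta) for _ in range(n)]
            failures = [t for t in times if t <= t_censor]
            if failures:  # skip the rare samples with no observed failures
                estimates.append(mle_exponential_censored(failures, n, t_censor))
        return sum(estimates) / len(estimates)

    print(f"true theta = 100, mean estimate = {simulate():.1f}")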

5. An analysis of activity and development trends of “file locker-encoder” malware [No. 2, 2016]
Author: Drobotun E.B.
Visitors: 12071
All companies engaged in the development of anti-virus software have noted that since the middle of 2013 there has been a surge of computer infections by malware that encrypts user information. Such programs are the most dangerous type of malware of the Ransomware class (extortion programs): they not only block victims' access to the computer, but also block access to files using various encryption algorithms. Usually such malicious programs encrypt popular types of user files that may be of value: documents, spreadsheets, database files, photos, video and audio files, etc. To decrypt the files, the user is asked to pay a ransom through an Internet payment service or in cryptocurrency (usually bitcoins). The first versions of such malicious programs appeared in 2006–2007. However, at that time these programs used weak encryption algorithms, small encryption keys and extremely inefficient distribution methods, so they did not become widespread. Modern malicious programs do not have these shortcomings: they use fairly strong encryption algorithms (AES or RSA), rather large encryption keys and effective infection methods (infected web pages and malicious spam e-mails). Based on an analysis of several of the most widespread malicious programs of this type, the paper describes the main trends in their development and offers possible ways of eliminating the consequences of their activity.

6. Parallel programming in mathematical suites [No. 2, 2016]
Author: Chernetsov A.M.
Visitors: 11970
Recently, parallel programming tools and features have been used for computationally demanding tasks. Programming models for shared and distributed memory are well known; hybrid models appeared later. However, all these tools assume fairly low-level programming in which the source code is modified significantly. A significant share of mathematical computations is performed not in algorithmic languages (C/C++, Fortran), but in special mathematical suites such as MATLAB, Maple, Mathematica and Mathcad. The paper discusses parallel programming tools in modern mathematical suites, giving a short review of the development of parallel programming tools in these well-known suites. It briefly describes the main parallel programming primitives in MATLAB and their analogs in MPI, and mentions other parallel programming operators. It describes the parallelism features of Maple (thread programming, the high-level Task Programming Model, parallel programming) and some basic parallel programming constructs of the Wolfram Language in Mathematica. The paper gives various examples. The available features depend on the suite; however, any of the discussed problems can be solved in each of these suites (except Mathcad).
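
As a point of comparison for readers without these suites, the data-parallel loop primitive (MATLAB's parfor, Mathematica's ParallelMap/ParallelTable) has a rough analogue in a standard Python process pool; the workload function below is an invented stand-in.

    from multiprocessing import Pool

    def heavy(x: int) -> int:
        """Stand-in for an expensive, independent per-element computation."""
        return sum(i * i for i in range(x))

    if __name__ == "__main__":
        data = [200_000 + i for i in range(8)]
        with Pool() as pool:                 # one worker per CPU core by default
            results = pool.map(heavy, data)  # cf. parfor / ParallelMap
        print(results[:3])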

7. A hybrid desktop/cloud platform for design space exploration [No. 2, 2016]
Authors: Prokhorov A.A., Nazarenko A.M., Perestoronin N.O., Davydov A.V.
Visitors: 8527
Modern engineering practice shows that simulation-driven design is arguably the most promising method to reduce lead time and development costs. However, its application involves a number of methodological and operational difficulties, so it remains limited and is generally not available to smaller companies that lack the required resources. The high entry barrier of this method is a consequence of the high complexity and cost of implementing the simulation models required for solving modern multidisciplinary engineering problems. Development of such models requires a high level of expertise in many subject domains, as well as the use of various specialized software products that are usually available on a commercial basis only. Moreover, performing large-scale simulations incurs additional costs for developing and maintaining a high-performance computing system. The paper considers the main issues of performing large-scale automated simulations, which are required when computational methods are applied at early design stages to support the search for new design decisions. By contrast, the more common practice is to use simulation experiments only at the later stage of design validation, which does not require mass computations. The paper discusses ways of lowering the entry barrier, paying attention to the existing practice of developing integrated solutions accessible to a wide range of users, as well as to the possibility of at least partially moving simulation experiments into the cloud, which would lower simulation costs. The authors also consider developing hybrid integrated applications based on both cloud and desktop software. The paper formulates the corresponding requirements for a process integration and automation platform that supports both cloud and desktop components, so as to allow developing hybrid integrated applications aimed at solving classes of similar tasks. It then describes the software architecture developed with regard to these requirements, which minimizes the resources required for implementation thanks to the fact that its main components can be used in both the cloud and desktop versions.
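
A minimal sketch of the architectural idea: the same study can be dispatched to a local solver or to a cloud service behind one interface, so the workflow code does not change between desktop and cloud components. All class, method and parameter names below are hypothetical illustrations, not the platform's actual API.

    from abc import ABC, abstractmethod

    class ExecutionBackend(ABC):
        @abstractmethod
        def run(self, model: str, params: dict) -> dict: ...

    class DesktopBackend(ExecutionBackend):
        def run(self, model, params):
            # Would invoke a locally installed solver here.
            return {"model": model, "ran_on": "desktop", **params}

    class CloudBackend(ExecutionBackend):
        def __init__(self, endpoint: str):
            self.endpoint = endpoint  # hypothetical job-submission endpoint
        def run(self, model, params):
            # Would submit the job to self.endpoint and poll for results here.
            return {"model": model, "ran_on": "cloud", **params}

    def run_study(backend: ExecutionBackend, designs: list[dict]) -> list[dict]:
        """The workflow code is identical for desktop and cloud components."""
        return [backend.run("wing_section", d) for d in designs]

    print(run_study(DesktopBackend(), [{"aoa": 2.0}, {"aoa": 4.0}]))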

8. Fuzzy logic in a sensorless valve electric drive [No. 2, 2016]
Authors: Lgotchikov V.V., Gorchakov D.V.
Visitors: 11491
The paper considers a six-step algorithm for sensorless control of a valve electric drive. An analysis of transient processes in the drive's power circuit showed decreased performance of the valve electric drive in dynamic operating modes when the drive is controlled by a sensorless algorithm that determines the switching moment from the integral of the counter-electromotive force signal. In some cases the whole system can become unstable. It is necessary to compensate for the increase of current and magnetic circuit saturation by dynamically changing the control system parameters, so that the system operates in a stable condition. To achieve this, the article proposes using a fuzzy logic controller. Such a controller adjusts the commutation moment of the drive phases using a database of fuzzy rules. The fuzzy logic control system uses the normalized values of motor current and rotation speed as input parameters. A fuzzy logic controller in the electric drive control system increases control performance in dynamic modes, such as load application or velocity setpoint changes. Simulink tests show that the designed system with fuzzy logic control provides high-quality motor control over a wide range of motor speeds. The proposed control system also increases the load-carrying capability of the whole system.
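
A minimal Python sketch of the kind of controller described above: normalized current and speed in, a commutation-timing correction out. The membership functions, rule base and output levels are illustrative assumptions, not the authors' tuned values.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_correction(current: float, speed: float) -> float:
        """current, speed in [0, 1]; returns a timing correction in [-1, 1]."""
        cur_low, cur_high = tri(current, -0.5, 0.0, 0.6), tri(current, 0.4, 1.0, 1.5)
        spd_low, spd_high = tri(speed, -0.5, 0.0, 0.6), tri(speed, 0.4, 1.0, 1.5)
        # Rule base: (firing strength, output level)
        rules = [
            (min(cur_low, spd_low),   0.0),   # light load, slow: no correction
            (min(cur_high, spd_low),  0.5),   # heavy load, slow: advance timing
            (min(cur_low, spd_high), -0.2),   # light load, fast: retard slightly
            (min(cur_high, spd_high), 0.8),   # heavy load, fast: advance strongly
        ]
        total = sum(w for w, _ in rules)
        # Weighted-average defuzzification.
        return sum(w * out for w, out in rules) / total if total else 0.0

    print(fuzzy_correction(current=0.9, speed=0.3))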

9. A method for distributed analysis of verifiable model properties [No. 2, 2016]
Author: Shipov A.A.
Visitors: 9738
Due to the constantly growing complexity of software systems, we need effective tools to check their conformance to specifications, especially for large distributed software systems. However, verification of such systems often runs into the “combinatorial explosion” problem, which causes a sharp growth of time complexity even for a rather small increase in the size of the verified system. Nowadays there are methods to mitigate this problem, such as abstraction, interpretation and “on-the-fly” verification. Nevertheless, in practice the existing methods alone are often not enough. It is logical to carry out the verification process in a distributed way, just as large distributed software systems themselves execute in a distributed way. The article offers and analyzes a method for overcoming the “combinatorial explosion” problem that can be used in addition to the existing methods. The idea of the method consists in a distributed verification algorithm over Büchi automata for linear temporal logic (LTL). This algorithm can increase the efficiency and speed of the verification process by dividing computations among a number of computing nodes. Although the idea of distributed computation is not new, and similar tools are already present in the Spin model checker, the theoretical material of the article is supported by a set of examples showing in practice that the proposed algorithm is more efficient than the one implemented in Spin.
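
The core idea of distributed verification can be sketched as hash-partitioning the state space among computing nodes: each node explores only the states it owns and forwards cross-partition successors to their owners. The toy transition system below is invented for illustration, and the round-robin loop merely simulates the distribution in one process; a real tool would explore the Büchi product automaton.

    from collections import deque

    def successors(state: int) -> list[int]:
        """Toy transition relation standing in for the product automaton."""
        return [(state * 3 + 1) % 1000, (state * 7 + 2) % 1000]

    def distributed_reachability(initial: int, n_nodes: int = 4) -> int:
        owner = lambda s: hash(s) % n_nodes        # static state partitioning
        queues = [deque() for _ in range(n_nodes)]
        visited = [set() for _ in range(n_nodes)]  # each node's local store
        queues[owner(initial)].append(initial)
        while any(queues):                         # round-robin simulation
            for node in range(n_nodes):
                if not queues[node]:
                    continue
                s = queues[node].popleft()
                if s in visited[node]:
                    continue
                visited[node].add(s)
                for t in successors(s):
                    queues[owner(t)].append(t)     # forward to the owner node
        return sum(len(v) for v in visited)

    print("states explored:", distributed_reachability(0))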

10. Methods of automatic ontology construction [No. 2, 2016]
Authors: Platonov A.V., Poleschuk E.A.
Visitors: 11092
The article describes an automatic domain ontology generation process using input text corpora. In particular, it describes processes similar to the Biperpedia and BOEMIE Project systems, among others. The paper covers the basic steps of automatic ontology construction: domain object extraction, concept extraction (concepts being terms that group a set of objects), and the extraction of semantic relations and rules, reviewing algorithms for each step. For domain object extraction, it considers the named entity recognition task and regular expression generation based on a genetic programming approach. The authors propose using a sequential pattern mining approach to extract term sequences for the object identification process. The paper describes the basic steps of the concept extraction task and reviews the concept attribute extraction task. The article also describes a lexico-syntactic pattern approach to extracting domain semantic relations; the authors propose an approach to this task based on association rule mining, similar to frequent pattern mining. The paper includes three methods of evaluating ontology learning: a gold standard method, a human evaluation method and an indirect method using client-application evaluation. It describes the positive and negative aspects of each method and proposes a compromise for estimating the quality of a model.
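
As an illustration of the lexico-syntactic pattern approach mentioned above (in the spirit of Hearst patterns), the following Python sketch extracts candidate is-a pairs with simple regular expressions; the patterns and the sample sentence are illustrative only, not the article's pattern set.

    import re

    PATTERNS = [
        # "X such as A, B and C"  ->  (X, A), (X, B), (X, C)
        re.compile(r"(\w+) such as ((?:\w+, )*\w+(?: and \w+)?)"),
        # "A, B and other X"      ->  (X, A), (X, B)
        re.compile(r"((?:\w+, )*\w+) and other (\w+)"),
    ]

    def extract_isa_pairs(text: str) -> list[tuple[str, str]]:
        pairs = []
        for m in PATTERNS[0].finditer(text):
            concept, items = m.group(1), re.split(r", | and ", m.group(2))
            pairs += [(concept, item) for item in items]
        for m in PATTERNS[1].finditer(text):
            items, concept = re.split(r", | and ", m.group(1)), m.group(2)
            pairs += [(concept, item) for item in items]
        return pairs

    print(extract_isa_pairs("metals such as copper, iron and zinc are common"))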
