ISSN 0236-235X (P)
ISSN 2311-2735 (E)

Journal influence

Higher Attestation Commission (VAK): K1 quartile
Russian Science Citation Index (RSCI)

Journal articles: no. 4, 2012.


1. Model-based verification with error localization and error correction for C designs [no. 4, 2012]
Author: Urmas Repinski
Visitors: 6145
Software design verification checks whether a design conforms to its specification; an executable (animatable) specification makes simulation-based verification possible. To locate and repair errors when verification fails, access to the structure of the debugged design is required, so the design is parsed into a suitable representation, a model. Both the error localization and the error correction algorithms require model simulation, and this article presents different simulation algorithms. The model is usually simulated directly, computing the outputs from the inputs, but this approach is not always suitable, because the functionality of the design's programming language then has to be almost completely re-implemented for simulation. An alternative approach described in this article is to simulate the design model using the functionality of the design's own programming language. This is more practical, since it does not require re-implementing functionality already available in that language, and it also makes it possible to implement a dynamic slicing algorithm for error localization and error correction.
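
The verify-locate-correct loop described in the abstract can be illustrated in miniature. This is a hypothetical toy, not the paper's method: the "design" is a tiny function with a bug in one branch, the "specification" is executable, and correction is simulated by trying candidate mutations of the buggy branch until the specification holds on all tests.

```python
# Executable specification: absolute value.
def spec(x):
    return abs(x)

# The "design" is parameterized by the operator used in its negative
# branch; the original (buggy) design uses '+x' there.
def make_design(neg_branch):
    def design(x):
        return x if x >= 0 else neg_branch(x)
    return design

tests = [-3, -1, 0, 2, 5]
candidates = {"+x": lambda x: +x, "-x": lambda x: -x, "x*x": lambda x: x * x}

def verify(design):
    # Simulation-based verification: run the model on all test inputs
    # and compare against the specification.
    return all(design(t) == spec(t) for t in tests)

buggy = make_design(candidates["+x"])
assert not verify(buggy)  # verification fails, so correction is needed

# Error correction: simulate the model with each candidate mutation and
# keep the first one that satisfies the specification on every test.
fix = next(name for name, f in candidates.items() if verify(make_design(f)))
print(fix)
```

The real algorithm operates on a parsed model of a C design and uses dynamic slicing to narrow the set of candidate locations; here the single mutable branch stands in for that machinery.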

2. The algorithm and software implementation of the hybrid training method of artificial neural networks [no. 4, 2012]
Authors: Belyavsky G.I., Lila V.B., Puchkov E.V.
Visitors: 13687
Training an artificial neural network can be regarded as an optimization task, and the main problem is to choose the most suitable among the variety of optimization methods. The choice in favor of gradient methods rests on the fact that, as a rule, the objective function in network training problems can be expressed as a differentiable function of all weighting coefficients. However, the complicated dependence between the weighting coefficients means that the objective function has local extrema and saddle points, so the use of gradient methods is not always justified. Optimization problems with a multiextremal criterion are solved by random search methods, which include genetic algorithms; genetic algorithms, as a rule, converge slowly. To compare the performance of gradient methods and a genetic algorithm, software with a web interface was created, using approximation of the two-dimensional Rosenbrock function as the training task. The results showed that gradient methods converge quickly only at the beginning of training, while the genetic algorithm converges quickly at the end. A hybrid algorithm based on the sequential application of gradient methods and a genetic algorithm has therefore been proposed.
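
The sequential gradient-then-genetic idea can be sketched on the Rosenbrock function itself. This is a minimal illustration under assumed hyperparameters (learning rate, mutation scale, population size are all invented here), not the paper's implementation:

```python
import random

def rosenbrock(x, y):
    # Classic two-dimensional Rosenbrock function; minimum 0 at (1, 1).
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def gradient_step(x, y, lr=5e-4):
    # Analytic gradient of the Rosenbrock function.
    dx = -2 * (1 - x) - 400 * x * (y - x * x)
    dy = 200 * (y - x * x)
    return x - lr * dx, y - lr * dy

def genetic_refine(pop, generations=200, sigma=0.05):
    # Simple elitist evolution: mutate everyone, keep the best half+half.
    for _ in range(generations):
        children = [(x + random.gauss(0, sigma), y + random.gauss(0, sigma))
                    for x, y in pop]
        pop = sorted(pop + children, key=lambda p: rosenbrock(*p))[:len(pop)]
    return pop

random.seed(0)
# Stage 1: fast initial progress with gradient descent.
x, y = -1.5, 2.0
for _ in range(3000):
    x, y = gradient_step(x, y)
# Stage 2: hand the point over to a genetic algorithm for fine tuning.
pop = genetic_refine([(x + random.gauss(0, 0.1), y + random.gauss(0, 0.1))
                      for _ in range(20)])
best = min(pop, key=lambda p: rosenbrock(*p))
print(best, rosenbrock(*best))
```

The handover mirrors the paper's finding: the gradient stage descends quickly into the Rosenbrock valley, and the population-based stage continues to improve where gradient progress stalls.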

3. A dense stereo matching algorithm based on ground control points and plane labeling [no. 4, 2012]
Authors: Krivovyaz G.R., Ptentsov S.V., Konushin A.S.
Visitors: 11658
In this work a new dense stereo matching algorithm is proposed. The algorithm is based on the recently introduced idea of non-local cost aggregation on a minimum spanning tree that includes all image pixels. The main feature of the proposed algorithm is the switch from disparity space to plane space: the disparity at each pixel is deduced from the equation of a plane assigned to that pixel. A way of estimating the set of planes used to approximate the scene is described; for this purpose, ground control points (GCPs) are used. The GCPs, derived as a result of image matching, are also exploited to regularize the solution. The results of evaluating the algorithm on one of the modern datasets (published in 2012), consisting of 194 street-view image pairs, are provided. A discussion of possible ways of further improving the algorithm concludes the paper.
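
The plane-labeling idea reduces to a simple relation: a plane with coefficients (a, b, c) assigned to a pixel induces the disparity d = a*x + b*y + c at image coordinates (x, y). A minimal sketch, with three hypothetical axis-aligned GCPs chosen so the plane fit is direct (a real implementation would fit planes to arbitrary GCP triples, e.g. by solving a 3x3 linear system):

```python
def disparity_from_plane(plane, x, y):
    # A plane (a, b, c) in image coordinates induces disparity
    # d = a*x + b*y + c at pixel (x, y).
    a, b, c = plane
    return a * x + b * y + c

# Hypothetical GCPs as (x, y, disparity); the second point shares y with
# the first and the third shares x, which makes the fit below trivial.
gcps = [(0, 0, 5.0), (10, 0, 7.0), (0, 10, 6.0)]

def fit_plane(p0, p1, p2):
    # Exact plane through three GCPs in this axis-aligned configuration.
    (x0, y0, d0), (x1, y1, d1), (x2, y2, d2) = p0, p1, p2
    a = (d1 - d0) / (x1 - x0)
    b = (d2 - d0) / (y2 - y0)
    c = d0 - a * x0 - b * y0
    return a, b, c

plane = fit_plane(*gcps)
print(disparity_from_plane(plane, 5, 5))
```

Once each pixel carries a plane label instead of a raw disparity, the cost aggregation optimizes over a small set of candidate planes rather than the full disparity range.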

4. An algorithm for comparing methods of complex quantitative quality evaluation of complex systems [no. 4, 2012]
Author: Yu.M. Lisetskiy
Visitors: 10930
This work considers the application of various methods of complex quantitative quality evaluation of compound systems to compare alternatives and choose the best one. Uncertainty in the choice of a method makes it difficult to obtain an objective assessment of the quality of complex systems, which affects the validity of the decisions taken in their selection. A solution to this problem is offered that removes the uncertainty in the choice of a method and resolves the inconsistency between the results produced by different methods. The essence of the proposed algorithm is to determine a rational method from the possible combination of methods, minimizing the risk of wrong decisions caused by an irrational choice of method from the available set of methods for quantitative quality evaluation. Experimental results on real complex systems are given, confirming the effectiveness of the developed algorithm for comparing the set of methods and introducing metrics and expertise into the comparison, evaluation and selection of complex systems.

5. An efficient application mapping algorithm for multiprocessor systems [no. 4, 2012]
Authors: Kiselev E.A., Aladyshev O.S.
Visitors: 9144
This article describes a new application mapping approach for multiprocessor systems based on the simulated annealing algorithm. The authors propose a model of a multiprocessor system that takes into account the heterogeneity of computing and communication resources, as well as a model of a parallel program based on identifying typical communication operations between the program threads. A parallel implementation of the simulated annealing algorithm is proposed to improve the quality of mapping applications onto the resources of a multiprocessor computer system. The authors also investigated the effect of network contention on application execution time.
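
The core of such a mapper can be sketched as simulated annealing over thread placements. The communication and distance matrices below are invented toy numbers, and this sequential sketch omits the paper's heterogeneity model and parallel implementation:

```python
import math
import random

# Toy instance: comm[i][j] is traffic between threads i and j;
# dist[p][q] is the hop count between processors p and q (a line of 4).
comm = [[0, 4, 1, 0],
        [4, 0, 2, 1],
        [1, 2, 0, 3],
        [0, 1, 3, 0]]
dist = [[0, 1, 2, 3],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [3, 2, 1, 0]]

def cost(mapping):
    # Total communication cost when thread i runs on processor mapping[i].
    n = len(comm)
    return sum(comm[i][j] * dist[mapping[i]][mapping[j]]
               for i in range(n) for j in range(n))

def anneal(mapping, t=10.0, cooling=0.995, steps=5000):
    random.seed(1)
    best = mapping[:]
    for _ in range(steps):
        i, j = random.sample(range(len(mapping)), 2)
        cand = mapping[:]
        cand[i], cand[j] = cand[j], cand[i]        # swap two placements
        delta = cost(cand) - cost(mapping)
        # Accept improvements always, regressions with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            mapping = cand
        if cost(mapping) < cost(best):
            best = mapping[:]
        t *= cooling                                # geometric cooling
    return best

best = anneal([0, 1, 2, 3])
print(best, cost(best))
```

Because the best mapping seen so far is tracked separately, the result is never worse than the initial identity placement.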

6. The analysis of complex-structured text data that characterize processes of state order formation, placement and performance in science and technology [no. 4, 2012]
Authors: Ponomarev S.A., Koretsky M.V., Sytnik D.A., Goryunov I.G.
Visitors: 10666
This article analyzes current pressing problems of information support for administrative state structures responsible for implementing federal and regional program actions in science and technology. The main problems of data accumulation are shown; this data characterizes the processes of state order formation, placement and performance carried out under the programs. To solve these problems, the authors propose a method of semantic content analysis of the original set of state order data in order to establish links between orders based on semantic analysis techniques. A comparative analysis of the basic classification and text data clustering algorithms is given, and a test application of the LSA/LSI (Latent Semantic Analysis/Indexing) algorithm, based on the singular value decomposition of the term-document matrix, to existing reports on state contracts in scientific and technical areas is described, together with their baseline characteristics and the results of identifying case-related groups. The morphological processing of the raw text descriptions, carried out using the morphological analysis library Pymorphy, is described, as is the procedure for excluding stop words and the rarest and most frequent words. The use of the FCM (fuzzy c-means) algorithm is proposed and described, since the LSA/LSI algorithm has no support for overlapping clusters. The expected scientific novelty of the results and the target groups of end users of the prospective software tools are indicated.
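
The front end of such a pipeline (tokenization, stop-word exclusion, document similarity) can be shown in miniature. This toy uses English sentences, a tiny invented stop-word list and plain cosine similarity over bag-of-words vectors; the paper's pipeline additionally applies Pymorphy lemmatization, the SVD step of LSA/LSI (in practice done with a linear algebra library) and fuzzy c-means clustering:

```python
import math
from collections import Counter

STOP = {"the", "of", "and", "for", "in", "on"}  # toy stop-word list

docs = [
    "development of software for data analysis",
    "software development and data processing",
    "synthesis of new chemical compounds",
    "chemical synthesis in the laboratory",
]

def vectorize(text):
    # Bag-of-words vector after stop-word exclusion (no lemmatization here).
    return Counter(w for w in text.split() if w not in STOP)

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

vecs = [vectorize(d) for d in docs]
sim = cosine(vecs[0], vecs[1])   # both documents are about software/data
print(round(sim, 3))
```

Thematically related contract descriptions score high while unrelated ones score zero, which is the signal the LSA/LSI and FCM stages then refine into (possibly overlapping) groups.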

8. Neural networks and regression models in S&P 500 index forecasting [no. 4, 2012]
Authors: Oskolkova M.A., Parshakov P.A.
Visitors: 11886
The article presents a comparative analysis of neural network modeling and regression analysis for forecasting the S&P 500 index. Initially the absolute value of the index is forecast; we then justify the use of stationary data, that is, the return of the S&P 500. The two methods are compared in two stages. First, the methods are compared by the coefficient of determination over periods of three and twelve months, and by the quality of trend prediction; note that model selection and model testing are performed on different time intervals (the so-called in-sample and out-of-sample periods). Since the primary desire of a typical trader is to gain a profit, at the second stage we chose trading criteria such as profit and risk-weighted profit (profit weighted by drawdown). Over the longer interval (12 months) regression shows the best results by fit, but in terms of economic gains the neural networks win; over the shorter period (3 months) the neural network performs better. Thus, neural networks are able to capture stock dynamics owing to their flexibility and ability to find non-linear patterns.
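
The trading criteria mentioned here are easy to compute from an equity curve. The abstract only says profit is "weighted by drawdown", so the profit-to-maximum-drawdown ratio below is an assumed concrete form of that criterion, applied to an invented toy equity curve:

```python
def max_drawdown(equity):
    # Largest peak-to-trough decline of an equity curve.
    peak, dd = equity[0], 0.0
    for v in equity:
        peak = max(peak, v)
        dd = max(dd, peak - v)
    return dd

def risk_weighted_profit(equity):
    # Hypothetical formula: total profit divided by maximum drawdown.
    profit = equity[-1] - equity[0]
    dd = max_drawdown(equity)
    return profit / dd if dd > 0 else float("inf")

equity = [100, 104, 101, 108, 103, 112]   # toy equity curve
print(max_drawdown(equity), risk_weighted_profit(equity))
```

Two strategies with equal profit can then be ranked by how much drawdown each one incurred along the way, which is the sense in which a regression model can win on fit yet lose on economic gains.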

9. Choosing a computational system for scientific problems [no. 4, 2012]
Author: Shabanov B.M.
Visitors: 13328
The article describes the mapping of computer architectures to application programs. This problem is important when a computer system must be chosen for particular programs. A formalization of the choice of a computer system for solving scientific and engineering problems is discussed, and a study of program efficiency on a cluster with multicore processors and GPUs is presented. A synchronous model of a program with exchanges is considered: between cores, between the processor and the accelerator, and between the computational nodes; estimates of data transfer time are provided. The following methods of determining the parameters of the numerical model are analyzed: profiling programs on a computer system, simulating the program on the system, and an evaluation model of the program and the system. Some typical cases of data exchange are considered: exchange with «neighbors» (e.g., between the nodes of a multidimensional mesh) and collective communications (one-to-all, all-to-one). To provide data for solving problems of this kind, a set of benchmarks representing different areas of science was developed at the JSCC RAS, along with a test program that measures the floating-point performance of cores, memory performance, file operation performance and communication performance. Three problems of computer system choice are considered: determining the system components so as to reach maximum performance, minimum system cost, or maximum performance at a fixed price. Specific features of cost minimization are discussed. The described approach was used in selecting architectures for JSCC RAS high-performance systems such as MVS-10BM, MVS-6000IM and MVS-100K.
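
Data transfer time estimates of the kind mentioned here are commonly built from a linear latency-plus-bandwidth model. The sketch below uses that standard model with invented parameters and a deliberately naive sequential gather; the paper's actual estimation formulas are not given in the abstract:

```python
def transfer_time(size_bytes, latency_s, bandwidth_bps):
    # Linear model: startup latency plus message size over link bandwidth.
    return latency_s + size_bytes / bandwidth_bps

def all_to_one_time(n_nodes, size_bytes, latency_s, bandwidth_bps):
    # Naive sequential all-to-one collective: the root receives n-1
    # point-to-point messages one after another.
    return (n_nodes - 1) * transfer_time(size_bytes, latency_s, bandwidth_bps)

# Hypothetical parameters: 1 MiB messages, 2 us latency, 10 GB/s links.
t1 = transfer_time(2**20, 2e-6, 10e9)
t8 = all_to_one_time(8, 2**20, 2e-6, 10e9)
print(t1, t8)
```

Plugging measured latency and bandwidth from a benchmark suite into such a model is what lets one compare candidate architectures for a given exchange pattern before buying the hardware.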

10. A genetic algorithm for the NFA state minimization problem [no. 4, 2012]
Author: Tsyganov A.V.
Visitors: 10659
The state minimization problem for nondeterministic finite automata is a well-known computationally hard combinatorial optimization problem, and many exact and approximate methods have been proposed for it. All known exact algorithms for this problem are exhaustive and often become impractical even for relatively small automata. In the present paper we discuss a new heuristic algorithm for the NFA state minimization problem based on the classical Kameda–Weiner algorithm and a genetic algorithm. The main idea of the proposed method is to replace the exhaustive search for legitimate covers of the RAM (Reduced Automaton Matrix) with a fast but incomplete search for covers by means of a genetic algorithm. An implementation of the proposed method using parallel computing techniques is described, and the results of computational experiments are provided.
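
The genetic search for covers can be sketched as a set-cover heuristic: a bitstring chromosome selects a subset of candidate grids, and fitness heavily penalizes rows of the matrix left uncovered while mildly penalizing the number of grids used. The grids and rows below are an invented toy instance, not an actual RAM derived from an automaton, and the Kameda–Weiner legitimacy check is omitted:

```python
import random

# Toy instance: each candidate grid covers a set of matrix rows.
GRIDS = [{0, 1}, {1, 2}, {2, 3}, {0, 3}, {1, 3}]
ROWS = {0, 1, 2, 3}

def fitness(chrom):
    # Lower is better: 100 per uncovered row, plus one per selected grid.
    covered = set().union(*[GRIDS[i] for i, b in enumerate(chrom) if b])
    return 100 * len(ROWS - covered) + sum(chrom)

def ga(pop_size=20, generations=100):
    random.seed(2)
    pop = [[random.randint(0, 1) for _ in GRIDS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]        # keep the best half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(GRIDS))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # point mutation
                m = random.randrange(len(child))
                child[m] ^= 1
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = ga()
print(best, fitness(best))
```

A fitness below 100 means a full cover was found; in the real algorithm each such cover is then checked for legitimacy and turned into a candidate minimized NFA.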
