Analysis of website security status based on performance metrics
Abstract: The paper is relevant due to the constant evolution of Internet infrastructure, including the conditions for optimal operation of web systems, information sources, e-commerce and various types of services. Over the past twenty years, the amount of information on the World Wide Web (WWW) and the number of users have increased almost two hundredfold. With this growth, the requirements for the technology have become even more critical. However, current research focuses more on the commercial aspect than on the technical one. The paper pays special attention to assessing website performance quality in terms of information technology indicators. It also considers a fuzzy Markov chain to define specific website productivity states.
Authors: D.T. Dim (dim.dike@yahoo.com), Tver State Technical University, Tver, Russia; V.N. Bogatikov, Ph.D. (VNBGTK@mail.ru), Tver State Technical University, Tver, Russia
Keywords: Markov chain, website testing, site analysis, metrics, website
In the design, development and operation of a website as a resource, the main criteria and indicators (so-called metrics) are used to assess its quality and effectiveness [1, 2]. Issues related to cross-platform support, presentation and interface usability are not considered here. The following metrics were selected to assess website performance:
— time to first byte (TTFB, ms);
— domain name server (DNS) lookup time (ms);
— universal resource locator (URL) redirection;
— number of HTTP requests;
— page size (kB);
— connection time (ms).
TTFB. In web applications and websites, there are different sub-categories of latency. TTFB is the time required to receive the first byte from the server after sending an HTTP GET request.
DNS lookup time. Like TTFB, DNS lookup is a sub-category of latency and reflects the time required to look up the IP address of the corresponding domain.
URL redirection. An automatic transition by a website from one page to another after a user's GET request. For example, visiting google.com from a device in the Russian Federation without tunneling redirects to google.ru by default.
Number of HTTP requests. The number of objects required when loading a website. Naturally, a higher number of requests results in longer load times.
Page size. The cumulative page size including images, animations, drawings, style sheets, scripts and HTML code.
Connection time (ms). Another component of latency, which reflects the time required to establish a TCP connection.
Managing these metrics leads to better site performance, since they belong to the server and network environment of websites.
To analyze the performance of websites, it is first necessary to collect data. There are two main types of data collection tools [3].
1. Counters. A small piece of code embedded in webpages and loaded by the browser. The most prominent solutions include OpenStat, Google Analytics, Yandex.Metrica, LiveInternet and others.
2. Log analyzers. Installed on a server, they collect statistical data and compose their own special reports.
Counters [4–8] were primarily used to collect data for the selected metrics, which further characterize different website states.
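For illustration only, the sketch below shows one way the six selected metrics could be collected for a single URL. It is not the measurement procedure used in the paper (the data were gathered with hosted counters such as GtMetrix, Webpage test and Pingdom [4–8]); the target URL, the naive request count and the use of the requests and socket libraries are assumptions made for the example.

```python
# Minimal sketch: collecting the six selected metrics for one URL.
# Illustrative only -- thresholds, URL and counting method are assumptions.
import socket
import time
import requests

def collect_metrics(url, host):
    metrics = {}

    # DNS lookup time (ms): time to resolve the domain to an IP address.
    t0 = time.perf_counter()
    socket.gethostbyname(host)
    metrics["dns_lookup_ms"] = (time.perf_counter() - t0) * 1000

    # TTFB (ms): requests exposes the elapsed time up to delivery of the
    # response headers, used here as a rough stand-in for time to first byte.
    resp = requests.get(url, allow_redirects=True, stream=True, timeout=30)
    metrics["ttfb_ms"] = resp.elapsed.total_seconds() * 1000

    # URL redirection: number of hops before the final page was reached.
    metrics["redirects"] = len(resp.history)

    # Page size (kB): size of the downloaded HTML only; a full measurement
    # would also fetch images, style sheets and scripts referenced by the page.
    body = resp.content
    metrics["page_size_kb"] = len(body) / 1024

    # Connection time (ms): time to establish a bare TCP connection.
    t0 = time.perf_counter()
    with socket.create_connection((host, 80), timeout=10):
        metrics["connection_ms"] = (time.perf_counter() - t0) * 1000

    # Number of HTTP requests: counted naively as the HTML document plus
    # referenced resources; a real counter parses the page or a HAR log.
    metrics["http_requests"] = (1 + body.count(b"<img")
                                  + body.count(b"<script")
                                  + body.count(b"<link"))
    return metrics

print(collect_metrics("https://example.com/", "example.com"))
```

In practice a hosted counter or log analyzer reports the same quantities with far better accuracy, since it measures every resource on the page rather than the HTML document alone.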
Analysis of website states
The analysis of website security states was based on the following assumptions:
- the core content does not change;
- the probability of a transition from one state to another depends solely on the current state, not on previous ones;
- the probability of a transition depends on a change in a parameter or a set of parameters.
The state of the system at any given time is described by specifying its coordinates. Knowing these values at a given time t, we can determine the evolution of the system under the influence of internal and external factors in subsequent time periods. Each website productivity state is denoted by Si, where i = 0..6. The states S0–S6 are determined based on the selected metrics. Transitions between states occur due to changes (improvement or deterioration) in hardware, software, the network, and their interaction with each other and with the environment (as shown in the figure). The most important part of their interaction is how they directly affect website performance. Characteristics of the different website states are given in table 1:
S0 – inoperative state. Characterized by a DNS lookup time exceeding 1 200 ms, page size > 7 168 kB, more than 7 redirects, TTFB exceeding 3 000 ms, more than 180 HTTP requests and a connection time exceeding 5 500 ms. This state is usually accompanied by HTTP error codes (including but not limited to 404 and 502), unavailability of the requested link or page, and expiration of the waiting time.
S1 – first improvement stage. This state corresponds to S0 except for the connection time and DNS lookup time, which are 1 500 ms < TCON < 5 500 ms and 880 ms < TDNS < 1 200 ms respectively.
S2 – second improvement stage. This state corresponds to S1 except for TTFB and the number of HTTP requests, which are 780
S3 – third improvement stage. This state corresponds to S2 except for the DNS lookup time, which is 160
S4 – satisfactory functional stage. This state corresponds to S3 except for the following parameters: DNS lookup time < 380 ms; page size 350
S5 – operational state. This state corresponds to S4 except for the following parameters: page size 180
S6 – optimal working state. This state corresponds to S5 except for the following metrics: DNS lookup time below 200 ms, fewer than 3 redirects, page size < 160 kB, TTFB below 180 ms and connection time < 250 ms.
It should be noted that the majority of existing websites on the WWW are in states S3–S5 [1]. Consequently, the states are characterized as shown in table 2. Table 3 shows the indicators of existing websites.
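Since no numerical transition matrix is given in the text, the sketch below only illustrates the Markov property assumed above: a crisp transition matrix over S0–S6 with made-up probabilities, propagated for several steps to obtain a state distribution. The paper itself considers a fuzzy Markov chain, so the actual model would replace these crisp, invented probabilities with fuzzy ones.

```python
# Sketch of the state model as a (crisp) Markov chain over S0..S6.
# All transition probabilities below are purely illustrative assumptions.
import numpy as np

states = ["S0", "S1", "S2", "S3", "S4", "S5", "S6"]

# P[i][j] = probability of moving from state i to state j after a change in
# hardware, software or network conditions (each row sums to 1).
P = np.array([
    [0.50, 0.50, 0.00, 0.00, 0.00, 0.00, 0.00],
    [0.10, 0.40, 0.50, 0.00, 0.00, 0.00, 0.00],
    [0.00, 0.10, 0.40, 0.50, 0.00, 0.00, 0.00],
    [0.00, 0.00, 0.10, 0.40, 0.50, 0.00, 0.00],
    [0.00, 0.00, 0.00, 0.10, 0.45, 0.45, 0.00],
    [0.00, 0.00, 0.00, 0.00, 0.15, 0.55, 0.30],
    [0.00, 0.00, 0.00, 0.00, 0.00, 0.20, 0.80],
])

# Start in the inoperative state S0 and propagate the distribution:
# pi_{k+1} = pi_k @ P  (the transition depends only on the current state).
pi = np.array([1.0, 0, 0, 0, 0, 0, 0])
for _ in range(20):
    pi = pi @ P

for name, p in zip(states, pi):
    print(f"{name}: {p:.3f}")
```

With such a matrix the long-run distribution concentrates in the upper states, which is consistent with the observation that most existing websites sit in S3–S5.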
Measures for managing website security
The main problem in ensuring the proper functioning of websites lies in implementing a set of measures aimed at maintaining their operability, sustainability and development potential. One of the most important measures is monitoring, which makes it possible not only to keep track of all processes on a website, but also to prevent various safety and efficiency threats in a timely manner. Among the possible criteria to monitor, the following might be identified as key areas [9]:
- DNS lookup time;
- response time;
- scheduled task execution;
- wait times for static files;
- databases and their connections.
Each of these areas contains a specific set of metrics, and their compositions differ significantly from one another.
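As a rough illustration of such monitoring, the sketch below periodically checks two of the listed areas (DNS lookup time and overall response time) against alert thresholds. The thresholds, the check interval and the target URL are assumptions chosen for the example, not values prescribed by the paper.

```python
# Minimal monitoring sketch for two of the key areas named above.
# Thresholds, interval and URL are illustrative assumptions; a production
# monitor would also cover scheduled tasks, static files and databases.
import socket
import time
import requests

DNS_LIMIT_MS = 380        # assumed alert threshold
RESPONSE_LIMIT_MS = 3000  # assumed alert threshold
CHECK_INTERVAL_S = 60

def check(url, host):
    # DNS lookup time.
    t0 = time.perf_counter()
    socket.gethostbyname(host)
    dns_ms = (time.perf_counter() - t0) * 1000

    # Response time up to delivery of the response headers.
    resp = requests.get(url, timeout=30)
    response_ms = resp.elapsed.total_seconds() * 1000

    status = "ALERT" if dns_ms > DNS_LIMIT_MS or response_ms > RESPONSE_LIMIT_MS else "OK"
    print(f"{status}: dns={dns_ms:.0f} ms, response={response_ms:.0f} ms")

if __name__ == "__main__":
    while True:
        check("https://example.com/", "example.com")
        time.sleep(CHECK_INTERVAL_S)
```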
Conclusion
Monitoring and forecasting are the most important parts of managing a complex security process. The technological, technical and economic consequences of information technology development, as well as the efficient use of computational and financial resources, depend significantly on them. To control a process effectively, it is necessary to detect and prevent crisis situations, which in turn ensures effective safety management. This requires unified, identifiable metrics and states of an information system, as well as an accurate definition of the main threats and the subsequent development of measures to eliminate them.
References
1. Souders S. High Performance Web Sites: Essential Knowledge for Front-End Engineers. O'Reilly Media Publ., 2007, 170 p.
2. King A.B. Website Optimization: Speed, Search Engine & Conversion Rate Secrets. O'Reilly Media Publ., 2008, 398 p.
3. Kaushik A. Web Analytics 2.0: The Art of Online Accountability and Science of Customer Centricity. Sybex Publ., 2009 (Russ. ed.: Vilyams Publ., 2014, 528 p.).
4. GtMetrix. Available at: https://gtmetrix.com (accessed May 18, 2017).
5. Webpage test. Available at: http://www.webpagetest.org (accessed May 18, 2017).
6. Ultra-tools. Available at: https://www.ultratools.com/ (accessed May 18, 2017).
7. Pingdom. Available at: https://tools.pingdom.com/ (accessed May 18, 2017).
8. Google Developer Tools in the Chrome browser (accessed May 18, 2017).
9. Croll A., Power S. Complete Web Monitoring. O'Reilly Media Publ., 2009, 672 p.
10. Internet growth statistics. Available at: http://www.internetworldstats.com/emarketing.htm (accessed May 2, 2017).
Permanent link: http://swsys.ru/index.php?id=4362&lang=en&page=article |
The article was published in issue no. 4, 2017 [pp. 654-657].