Published: 19 Mar 2025
Received: 05 Nov 2024
Accepted: 09 Feb 2025
DOI: https://doi.org/10.2478/amns-2025-0398
© 2025 Jun Ma, published by Sciendo
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
The server in the Internet data center where a website is hosted is called the web server. The server is a critical hardware facility for the website; without it, the website cannot function properly [1–2]. Server security and maintenance are the most important part of the entire network, and even a minor fault can paralyze it [3–4]. At present, there are two major categories of threats to servers. The first is malicious network attacks, such as denial of service and viruses, which consume large amounts of server resources, slow normal server operation, and in severe cases paralyze the network [5–7]. The second is malicious intrusion, which leaks server information and allows the intruder to act at will and damage the server without restraint [8–9].
With the development and application of network information technology, it has been put into use across all walks of life, but the accompanying problems have also multiplied, especially concerning the web server. Problems in this area lead to all kinds of security issues, affect the normal operation and confidentiality of the entire computer system, and can even trigger network paralysis [10–13].
The web server is the center of website operation and one of the most important components in the development of the whole website [14–15]. If the server encounters problems during network operation, corresponding computer security problems will follow. This situation directly threatens the security of website operation, causing user information leakage and other information security risks, and is not conducive to the sustainable development of the website. Therefore, in the process of computer security maintenance, attention should be paid to web server security maintenance in order to improve security for users of the network and the website [16–19].
Malicious attacks on web server security and network security have brought serious economic and psychological losses to society and individuals, so improving malicious-attack and vulnerability detection for websites and web servers, along with protection against malicious attacks, is highly necessary; at the same time, building and optimizing network security governance frameworks and strategies is also urgent. Literature [20] examined the more common attacks on web sessions and, combining four axes of existing security-scheme evaluation, proposed five defense guidelines that provide a reference for innovation in network security schemes. Literature [21] envisioned a web intrusion detection system with machine learning as its core logic, which can effectively distinguish and mark normal logs as well as attack logs to improve the security of web servers. Literature [22] studied ReDoS vulnerabilities in websites, uncovered 25 unknown vulnerabilities in popular libraries, and identified ReDoS vulnerabilities affecting at least 339 of the 2846 most popular websites, posing a huge threat to site availability. Literature [23] describes the security issues of the IoT and the lack of research and development on security mechanisms, analyzes IoT privacy, uniformity, and other security requirements, and discusses the nature and behavior of the types of attacks the IoT suffers. Literature [24] conceptualized a strategy for the evaluation and identification of security mechanisms for web server platforms to ensure that they can operate effectively across platforms, and demonstrated the proposed strategy through real-world examples.
Literature [25] attempts to construct a network security governance strategy from the perspective of big data, analyzes China's network security governance measures and the obstacles they face, puts forward targeted suggestions for improvement, and verifies through practical tests that the proposed strategy is feasible and effective. Literature [26] discusses the disruptive changes that computer networks have brought to society and people's lives, conducts an in-depth analysis of the security problems of computer network engineering, and discusses security solutions, making a positive contribution to the progress of computer network engineering. Literature [27] builds a prototype vulnerability-reporting system based on network technology, which realizes the maintenance and analysis of network security by focusing on vulnerability reports and effectively assists system administrators in network vulnerability scanning and tracking.
Computer security is a prominent problem, so research is needed on optimizing the design of computer security systems and on training computer security management personnel and improving their protection capabilities. Literature [28] critically examines the design of computer security systems and systematically reviews related research papers, confirms and demonstrates the prevalence of common deficiencies in malware detection, vulnerability discovery, and binary code analysis, and concludes with recommendations to mitigate and avoid these problems. Literature [29] introduces computer technology and network security knowledge, giving scholars an understanding of both through a concise, professional approach covering computer and Internet framework construction, design principles, and so on. Literature [30] proposes a closed-room approach to learning and training to develop learners' theoretical knowledge and awareness of computer security, including how to reduce the risk of attacks on computers, which helps mitigate the damage caused by hacking. Literature [31] assessed the performance of bioinformatics research and analysis tools under malicious attack, uncovered their vulnerabilities, and proposed a broad framework and set of guidelines to effectively protect the security and privacy of bioinformatics processing tools.
This research develops a web-server-based computer security monitoring system. The system is equipped with an SQL database and uses the RSA public key cryptosystem for encryption and decryption of web server data. The website anti-tampering module uses digital digest algorithms, watermarking techniques, and ISAPI filters to detect, locate, and defend against website tampering. Together with the server monitoring module, these components jointly monitor and maintain the computer security of the web server.
An SQL database's data architecture is basically a three-tier structure, but the terminology used is distinct from that of traditional relational models [32–33]. In SQL, the relational schema is called a "base table", the storage schema is called a "stored file", the sub-schema is called a "view", a tuple is called a "row", and an attribute is called a "column".
An SQL database is a collection of tables, which are defined by one or more schemas.
A table consists of a collection of rows, where a row is a sequence of columns, with each column and row corresponding to a data item.
A table is either a base table or a view. A base table is a table actually stored in the database, while a view is a virtual table defined over one or more base tables or other views.
A base table can span one or more stored files, and a stored file can also hold one or more base tables. Each stored file corresponds to one physical file on external storage.
Users can use SQL statements to perform operations such as queries on views and base tables. From the user's point of view, views and base tables are the same: both are relational tables.
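The base table/view distinction above can be sketched with Python's built-in sqlite3 module (the table and view names here are illustrative, not from the system described in this paper):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Base table: actually stored in the database.
cur.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, url TEXT, hits INTEGER)")
cur.executemany("INSERT INTO pages (url, hits) VALUES (?, ?)",
                [("/index.html", 120), ("/login.html", 45)])

# View: a derived table defined over the base table; it has no storage of its own.
cur.execute("CREATE VIEW popular_pages AS "
            "SELECT url, hits FROM pages WHERE hits > 100")

# From the user's point of view, a view is queried exactly like a base table.
rows = cur.execute("SELECT url FROM popular_pages").fetchall()
print(rows)  # [('/index.html',)]
```

Only the base table occupies stored files; the view is re-evaluated from its definition at query time.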
The user can be an application program or an end user. SQL statements can be embedded in a host-language program, and SQL can also serve as a stand-alone interactive interface for end users.
Data definition: also known as SQL DDL, it defines the logical structure of the database, covering four parts: the definition of the database, base tables, views, and indexes.
Data manipulation: also known as SQL DML, it includes two major types of operations, data queries and data updates; data updates comprise the three operations of insertion, deletion, and modification.
Data control: the control of user access to data, including authorization on base tables and views, the description of integrity rules, and transaction control statements.
Rules for embedded SQL: the rules governing the use of SQL statements in host-language programs.
Cryptosystems can be divided into two main groups according to whether the encryption key can be made public: traditional (symmetric) cryptosystems and public key cryptosystems. The former are generally used to encrypt network data communication, while the latter are used in authentication systems for message confirmation and non-repudiation.
The security of a public key cryptosystem lies in the fact that the plaintext cannot be deduced from the public key and the ciphertext. This is achieved through the mathematical principle of the one-way trapdoor function. A one-way trapdoor function $f$ satisfies: given $x$, $y = f(x)$ is easy to compute; given $y$, it is computationally infeasible to find $x = f^{-1}(y)$; when the trapdoor information is known, however, $f^{-1}(y)$ becomes easy to compute.
The construction of the RSA public key cryptosystem is based on Euler’s theorem, described as follows:
Choose a pair of distinct secret primes $p$ and $q$. Calculate $n = pq$ and $\varphi(n) = (p-1)(q-1)$. Choose an integer $e$ with $1 < e < \varphi(n)$ and $\gcd(e, \varphi(n)) = 1$. Compute the inverse $d \equiv e^{-1} \pmod{\varphi(n)}$. The public key is $(e, n)$ and the secret key is $d$. The encryption algorithm is $c = m^e \bmod n$, and the decryption algorithm is $m = c^d \bmod n$.
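The construction above can be sketched in a few lines of Python; the primes here are deliberately tiny for illustration, whereas real RSA keys use primes of 1024 bits or more:

```python
from math import gcd

# Toy RSA key generation following the construction above.
p, q = 61, 53                # a pair of distinct secret primes
n = p * q                    # modulus n = pq = 3233
phi = (p - 1) * (q - 1)      # Euler's totient: φ(n) = 3120
e = 17                       # public exponent, coprime with φ(n)
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent d ≡ e⁻¹ (mod φ(n)) → 2753

m = 65                       # plaintext encoded as an integer < n
c = pow(m, e, n)             # encryption: c = m^e mod n → 2790
assert pow(c, d, n) == m     # decryption: m = c^d mod n recovers m
print(n, d, c)               # 3233 2753 2790
```

Note that `pow(e, -1, phi)` (modular inverse via three-argument `pow`) requires Python 3.8 or later.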
There are two main ISAPI development interfaces: the ISAPI filter entry functions and the ISAPI filter callback functions, with two entry functions, GetFilterVersion() and HttpFilterProc(). The commonly used structures for filters are HTTP_FILTER_VERSION and HTTP_FILTER_CONTEXT; the HTTP_FILTER_VERSION structure is used by the GetFilterVersion() function to report the filter's version information, its priority, and the notification events for which the filter will be called.
GetFilterVersion() is the first function IIS calls when loading an ISAPI filter, and it must be implemented for the filter to work properly. IIS passes the filter a pointer to an HTTP_FILTER_VERSION structure, which is used to provide filter initialization information; most importantly, the filter's priority and the flag bits for the notification events it handles are passed back to the IIS server through this structure. Then, whenever a notification event registered by the filter occurs, IIS calls the HttpFilterProc() function, through which information is passed and the ISAPI filter is controlled.
Each ISAPI filter is a DLL that exports two entry functions, GetFilterVersion and HttpFilterProc, and may also export a TerminateFilter function; any filter DLL must export the first two so that the server can call it. When the IIS server loads a filter, it creates and initializes an HTTP_FILTER_VERSION structure, then calls the filter's GetFilterVersion function, passing a pointer to that structure as a parameter. The ISAPI filter assigns version and description information to some of the structure's fields. Crucially, the filter uses HTTP_FILTER_VERSION to specify which notification events it wants to intercept and to specify its priority; it should also specify whether it monitors only secure ports, only insecure ports, or all ports. Each HTTP interaction between IIS and the client browser triggers different events on the server. Once the ISAPI filter is registered, each time a registered event occurs, IIS calls the filter's HttpFilterProc entry function. When the filter finishes processing the data, it returns an SF_STATUS status code to IIS, and IIS continues processing the HTTP request and response until all events registered by other ISAPI filters have been processed.
Digital watermarking refers to embedding inconspicuous marks in digital content. The embedded marks are usually invisible or imperceptible, but they can be detected or extracted by certain computational operations [34–35]. The watermark is tightly integrated with and hidden in the source data, becomes an inseparable part of the original data, and can survive operations that do not destroy the usability or commercial value of the original data. Generally speaking, a digital watermarking system comprises two basic components: a watermark embedding system and a watermark detection system. In watermark embedding, the original file and the watermark may each undergo some kind of pre-processing and, together with the key, pass through the embedding algorithm to finally yield the watermarked file. The watermark detection system is shown in Fig. 1; its inputs are the watermarked file and the key, and for blind watermark detection the original file is not required. The watermark extraction algorithm compares the extracted watermark information with the original watermark information to determine whether they are similar, and a conclusion is drawn from the result.

Watermark detection system
The key point in webpage tampering prevention is verifying whether the webpage has been tampered with, i.e., the data integrity of the webpage file. The main function of a digital digest algorithm is to take input data and derive a unique output value according to a certain algorithm; this output value is called the digest value or hash value. In other words, a digital digest algorithm is a function that transforms an input message of arbitrary length into data of a fixed length. Its primary purpose is not to encrypt or decrypt data, but to verify data integrity. The digital digest algorithm can be expressed in the following form:
$h = H(M)$    (1)
where $M$ is the input message of arbitrary length, $H$ is the digest function, and $h$ is the fixed-length digest value.
The purpose of a digital digest algorithm is to produce a digest value for a message, file, or other block of data. It needs to have the following properties: given $M$, the computation of $h = H(M)$ is easy; given $h$, it is computationally infeasible to find an $M$ such that $H(M) = h$ (one-wayness); and it is computationally infeasible to find two different messages $M \neq M'$ such that $H(M) = H(M')$ (collision resistance).
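The integrity-verification use of a digest can be sketched with Python's standard hashlib (the HTML snippets are illustrative stand-ins for webpage files):

```python
import hashlib

def digest(data: bytes) -> str:
    # h = H(M): fixed-length digest of an arbitrary-length message.
    return hashlib.sha256(data).hexdigest()

original = b"<html><body>Welcome</body></html>"
baseline = digest(original)          # digest stored at publication time

# Integrity check: recompute the digest and compare with the stored one.
tampered = b"<html><body>Hacked</body></html>"
print(digest(original) == baseline)  # True  -> page unchanged
print(digest(tampered) == baseline)  # False -> tampering detected
```

Note that the digest verifies integrity only; it performs no encryption of the page content.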
The SQL Server database supports Windows graphical management tools and remote system management, and it has powerful processing capabilities and compatibility, making it possible to build an information-processing database platform open to the public through the SQL language. In addition, SQL Server not only has powerful data recovery features but also provides strong management tools. Therefore, SQL Server is suitable for data storage and querying in this monitoring system. The system uses a SQL Server 2012 R2 database named ITMonitor.

User login data tables. To improve the security of the system, administrators must register in advance and access the system according to their rights. The system uses RBAC (Role-Based Access Control) permission management and has three data tables: a user information table, a role table, and a permission table. When a user logs in, the system joins the user table to the role table through the role number, and the permission number joins the role table to the permission table, so that permissions are finally granted to the user through the role.

Web page and server monitoring data tables. These tables contain the information the system needs to monitor a website or server, including basic information, monitoring settings, and historical data for web pages and servers. They mainly comprise the webpage information table, the server information table, the webpage monitoring settings table, and so on.
Tampering Detection
Web page tamper detection first uses coarse-grained web page watermarking for quick tamper detection. If the web page is tampered with, fine-grained web page watermarking is further used to identify the tampering. Fine-grained watermarking can detect tampering and transmit the location information to the relevant administrators. The administrator has the option to decide whether to perform fine-grained watermark detection based on the current situation.
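The two-stage detection above can be sketched as follows; a content digest serves as a stand-in "watermark" (the paper's actual watermark generation algorithm is not reproduced here, and the page blocks are hypothetical):

```python
import hashlib

def wm(data: bytes) -> str:
    # Stand-in "watermark": a digest of the content.
    return hashlib.md5(data).hexdigest()

def coarse_check(page: bytes, page_wm: str) -> bool:
    # Coarse-grained: one watermark over the whole page -> fast yes/no.
    return wm(page) == page_wm

def fine_locate(blocks, block_wms):
    # Fine-grained: one watermark per block -> indices of tampered blocks.
    return [i for i, (b, w) in enumerate(zip(blocks, block_wms)) if wm(b) != w]

# Watermarks computed when the page was published:
clean = [b"<head>...</head>", b"<p>news</p>", b"<p>prices</p>"]
page_wm = wm(b"".join(clean))
block_wms = [wm(b) for b in clean]

# Page as currently served, with the third block tampered:
blocks = [b"<head>...</head>", b"<p>news</p>", b"<p>hacked</p>"]
page = b"".join(blocks)

if not coarse_check(page, page_wm):                            # quick detection
    print("tampered blocks:", fine_locate(blocks, block_wms))  # [2]
```

The coarse check is cheap enough to run on every scan; the per-block pass is only invoked once tampering is suspected, mirroring the administrator's choice described above.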
Filter anti-tampering
The filter is the logical component of the anti-tampering module that filters tampered web pages on the web server side. It can detect tampering in the web pages requested by users. If the system detects a tampered web page, the user's request is intercepted and the system recovers the web page file before responding with the requested page.
The working process of the filter is shown in Figure 2.
Cyclic scanning anti-tampering
The monitoring server is connected to the Web server through the intranet, and it is physically isolated from the public network to ensure its security. The monitoring server carries out cyclic scanning and detection of the published contents of the Web server, and once it finds that the web pages have been tampered with, it will use the backup files on the monitoring server to recover the tampered files of the Web server in time.

Filter operation process
When the system runs for the first time or a new web page file is released, the watermark generation and embedding methods are used to embed a watermark in the web page file, and the watermarked file is copied to the backup directory, where it is saved for recovering the page when its content is found to have been tampered with or the file has been deleted illegally.
Obtain the information of the web page files to be restored through the web page monitoring module.
According to the web page file information, read the backup file of the web page in the backup directory.
Verify the watermark of the backup file and compare it with the original watermark.
If the two watermarks are identical, the backup file is intact and is used to restore the tampered web page file; if they differ, the backup file itself has been tampered with, the web page cannot be recovered for the time being, and an alarm message is sent to the administrator.
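The recovery steps above can be sketched as follows; a file digest again stands in for the watermark, and the file names are hypothetical:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def wm(path: Path) -> str:
    # Stand-in "watermark": a digest of the file contents.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def restore_page(live: Path, backup: Path, backup_wm: str) -> str:
    """Read the page's backup, verify its watermark, then either
    restore the live page or raise an alarm for the administrator."""
    if wm(backup) != backup_wm:           # backup itself tampered
        return "alarm: backup tampered, cannot recover"
    shutil.copyfile(backup, live)         # restore from the verified backup
    return "recovered"

# Demonstration with temporary files.
d = Path(tempfile.mkdtemp())
backup = d / "index.html.bak"
backup.write_bytes(b"<p>original</p>")
live = d / "index.html"
live.write_bytes(b"<p>hacked</p>")

status = restore_page(live, backup, wm(backup))
print(status, live.read_bytes())  # recovered b'<p>original</p>'
```

Verifying the backup before copying it is the essential step: restoring from a tampered backup would silently re-publish the attacker's content.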
The server monitoring module mainly monitors the running status of the server and includes three submodules: network monitoring, hardware monitoring, and process monitoring. The server monitoring module is shown in Figure 3.

Server monitoring module
Network connectivity detection. The system performs Ping detection on the monitored server according to the preset monitoring period. Ping connectivity reflects the instantaneous state of the network, and the return value of Ping includes the RTT (round-trip time). Average delay detection. From the returned RTT values, the average delay can be calculated, and the system evaluates the quality of the network where the server is located based on this information.
Based on the RTT return values of Ping, the average delay $D$ can be calculated using equation (2), where $t_i$ is the $i$-th round-trip delay and $n$ is the number of probes:
$D = \frac{1}{n}\sum_{i=1}^{n} t_i$    (2)
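Equation (2) is a simple arithmetic mean over the collected RTT samples (the sample values below are made up for illustration):

```python
def average_delay(rtts_ms):
    # D = (1/n) * sum(t_i): mean of the Ping round-trip times, eq. (2).
    return sum(rtts_ms) / len(rtts_ms)

samples = [12.0, 15.0, 12.0, 15.0]   # RTTs in milliseconds from one cycle
print(average_delay(samples))        # 13.5
```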
Data collection for server hardware and processes is realized by a monitoring agent deployed on the monitored server, which collects hardware and process information and transmits it back to the monitoring system. CPU utilization. CPU utilization is the ratio of the time the CPU spends processing tasks to the whole cycle length within one cycle, as shown in equation (3):
$U_{cpu} = \frac{t_{busy}}{t_{cycle}} \times 100\%$    (3)
Memory utilization. Physical memory utilization is the ratio of the number of bytes of physical memory in use to the total number of bytes of physical memory, as shown in equation (4):
$U_{mem} = \frac{B_{used}}{B_{total}} \times 100\%$    (4)
Disk space utilization. The disk usage rate equals the ratio of the used size to the total size, as shown in equation (5):
$U_{disk} = \frac{S_{used}}{S_{total}} \times 100\%$    (5)
When the system/logical partition utilization exceeds 50%~70% it is treated as a general warning, and exceeding 70%~90% is defined as a serious warning. The program obtains disk information and calculates disk utilization every 60 seconds. Processes. Very important business programs may be running on the monitored server, such as the IIS process w3wp.exe and the SQL Server process sqlservr.exe. The system first reads the list of processes to be checked, then uses the GetProcessById method of the System.Diagnostics.Process class to obtain and match the processes running on the system, and restarts a monitored process if it is found not to be running normally. The system supports checking multiple processes; to add more, simply add them to the server process monitoring settings table.
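Equations (3)–(5) share the same used/total form, and the agent's checks can be sketched in Python (the paper's agent uses .NET's System.Diagnostics.Process; the `pgrep`-based process check below is a simplified POSIX stand-in, not the paper's implementation):

```python
import shutil
import subprocess

def utilization(used: float, total: float) -> float:
    # Common form of eqs. (3)-(5): used / total as a percentage.
    return round(used / total * 100, 2)

# Disk space utilization, eq. (5), via the standard library:
du = shutil.disk_usage("/")
disk_pct = utilization(du.used, du.total)
level = ("serious" if disk_pct > 70
         else "general" if disk_pct > 50
         else "ok")
print(disk_pct, level)

def process_running(name: str) -> bool:
    # Simplified process check: returns True if a process with this
    # exact name exists (POSIX-only; shells out to `pgrep`).
    return subprocess.run(["pgrep", "-x", name],
                          capture_output=True).returncode == 0
```

A monitoring loop would call these every 60 seconds, raise the corresponding warning level, and restart any monitored process for which `process_running` returns False.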
In this section, four digital watermarking techniques, namely, text watermarking, audio watermarking, video watermarking, and image watermarking, are selected to conduct a computer website security and reliability comparison test with the fine-grained web page watermarking in this paper.
The test environment is a 666MHz Pentium III PC, the operating system is Windows 10, and the server side consists of four Web servers of different types and platforms.
We test performance under different security failures and compare the advantages of this paper's watermarking technique over the other four techniques in handling different tampering magnitudes. In order to analyze the results more accurately, each experimental system was run
The security and reliability results for computer websites under the digital watermarking techniques are shown in Fig. 4. As the tampering amplitude increases, the reliability and security of the website show a decreasing trend. However, for the fine-grained web page watermarking technique of this paper, the decrease is gentler than for the other four watermarking techniques. In terms of security, when the tampering amplitude goes from 0 to 0.7, the reduction for this paper's method is about 24.30%, while those of the other four methods range from 33.09% to 60.14%. In addition, at a tampering magnitude of 0.7, the reliability of this paper's method is 26.63% to 67.58% higher than that of the comparison methods.

Website safety and reliability results of digital watermarking technology
This section analyzes the tamper detection accuracy of the computer security monitoring system in the same environment as above. To simulate the operation of users on different clients, 100 clients are set up; the administrator configures the sensitive-information monitoring and management table on each client, and each client monitors the same sensitive information. Users operate at will, and the administrator monitors their operations in real time through the server. Afterwards, the administrator checks the displayed operation status and query logs against the users' actual operations to derive the results, and the fit between the system-monitored and actual user tampering amplitudes is plotted to evaluate the monitoring accuracy of the system. The sensitive-information tampering monitoring accuracy of the computer security monitoring system is shown in Figure 5.

Sensitive information tampering monitoring accuracy
As can be seen from the figure, the system is capable of real-time monitoring of user operations and can accurately recognize when sensitive information has not been tampered with. The curve of the system-monitored tampering amplitude fits that of the actual user tampering well, with a relative error of less than 10%, indicating that the system has good monitoring accuracy.
We test the encryption and decryption of web server data files of different sizes with the same key to examine the relationship between file size and the time taken.
A set of files of different sizes (5 KB, 10 KB, 15 KB, 20 KB, 25 KB, 30 KB, 35 KB, 40 KB, 45 KB, 50 KB) is encrypted and decrypted with a 1024-bit key and a 2048-bit key respectively. The relationship between web server data file size and encryption/decryption time is shown in Figure 6.

Web server data file size versus encryption/decryption time
As the figure shows, for files of 50 KB or less the encryption time is almost the same regardless of file size (5 KB vs. 50 KB) and key length (1024-bit vs. 2048-bit), with an overhead in the range of 0.4 s to 0.51 s. Thus, for files of 50 KB or less, the encryption time is low and varies little even with a fairly large (2048-bit) key. For the same key length, however, decryption takes longer than encryption. For a given file size, with a 1024-bit key the decryption time is more than three times the encryption time, and it grows with file size: as the data file increases from 5 KB to 50 KB, the decryption time increases from 1.13 s to 15.12 s. With a 2048-bit key, decryption takes more than three times as long as with a 1024-bit key; for example, for a 50 KB data file, 2048-bit decryption takes 3.29 times as long as 1024-bit decryption.
In enterprise business, the network requirements of different applications at different times and places vary greatly; for example, a huge amount of occupied network bandwidth may be caused by a virus or may reflect a genuine business need.
Before this system was implemented, fault analysis was done by the database method, i.e., data was first saved to a database and then analyzed. This section compares the effectiveness of this paper's approach and the database approach in monitoring web server hardware as the data volume increases from 100,000 to 1,000,000 records. Fig. 7 shows the comparison results.

Comparison of this paper's method and the database analysis method
As can be seen from the figure, the CPU utilization of this paper's method is higher, between 2.73% and 11.6% above that of the database method. However, its memory utilization and disk space usage are lower, and the difference in processing time is especially obvious: as the data volume increases, the processing time of the database analysis grows linearly, taking 7.91 times as long at 1,000,000 records as at 100,000, while that of this paper's method is basically unchanged. Moreover, this paper's method provides real-time feedback on monitored data to managers through early warnings, while the database method relies on after-the-fact analysis.
After adopting this paper's web-server-based computer security monitoring system, there is no need to manually monitor hundreds of servers and thousands of processes and business systems, which reduces the allocation of human resources, provides more accurate and timely service than before, and allows the operational status of these business systems and networks to be tracked at any time. Through the automatic update of the message collection component and the settings of the policy center, it can be ensured that only matters requiring urgent attention reach the user, which reduces manual labor and improves monitoring efficiency.
This study uses a variety of web server security maintenance techniques to design a system database, a website anti-tampering module, and a web server monitoring module, which are integrated into the web-server-based computer security monitoring system described in this paper. Several experiments evaluate its performance in computer security maintenance, with the following results. Under fine-grained web page watermarking, when the page tampering amplitude goes from 0 to 0.7, the security reduction is about 24.30%, while that of the comparison methods is as high as 33.09% to 60.14%. The system can monitor tampering of the web server's sensitive information in real time with good accuracy. It takes between 0.4 s and 0.51 s to encrypt website server data, which meets the needs of computer security maintenance. When handling huge amounts of data, the CPU utilization of this method is higher than that of the database method, but its processing time remains basically unchanged as the data volume increases significantly.
