Journal of Web Engineering
ISSN / EISSN: 1540-9589 / 1540-9589
Published by: River Publishers (10.13052)
Total articles ≈ 151
Latest articles in this journal
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2062
With the development of sensor and communication technologies, connected devices have long been common in industrial applications. Falling costs and the emergence of the Internet of Things (IoT) concept have expanded the application area of small connected devices to the level of end users. This paved the way for IoT technology to offer a wide variety of applications and become part of daily life. Consequently, a poorly protected IoT network is not sustainable and negatively affects not only the devices but also the users of the system. Protection mechanisms that rely on conventional intrusion detection approaches are becoming inadequate: as intruders' level of expertise increases, identifying and preventing new kinds of attacks grows more challenging. Intelligent algorithms capable of learning from the natural flow of data are therefore necessary to counter possible security breaches. Many recent studies proposing models for individual attack types have been successful up to a point. However, most studies aiming to detect multiple attack types cannot detect all of these attacks with a single model. This study proposes an all-in-one intrusion detection mechanism for detecting multiple intrusive behaviors and network attacks. To this end, a custom deep neural network is designed and implemented to classify a number of different types of network attacks in IoT systems with high accuracy and F1-score. As a test-bed for comparable results, an up-to-date and highly imbalanced dataset (CICIDS2017) is used, and the results are compared with the recent literature. While the initial proposal was successful for most classes in the dataset, performance was low on classes with a small number of samples.
To overcome the imbalanced-data problem, we propose a number of augmentation techniques and compare all the results. Experimental results show that the proposed methods yield the highest efficiency among the surveyed literature.
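The paper's own augmentation techniques are detailed in the full text; as a minimal sketch of one common remedy for class imbalance, random oversampling duplicates minority-class samples until every class matches the largest one (the function name and toy data below are illustrative, not from the paper):

```python
import random
from collections import Counter

def random_oversample(samples, labels, seed=0):
    """Duplicate minority-class samples until every class has as many
    samples as the largest class -- one common augmentation for
    imbalanced datasets such as CICIDS2017."""
    rng = random.Random(seed)
    by_class = {}
    for x, y in zip(samples, labels):
        by_class.setdefault(y, []).append(x)
    target = max(len(xs) for xs in by_class.values())
    out_x, out_y = [], []
    for y, xs in by_class.items():
        out_x.extend(xs)
        out_y.extend([y] * len(xs))
        extra = target - len(xs)                    # how many duplicates needed
        out_x.extend(rng.choice(xs) for _ in range(extra))
        out_y.extend([y] * extra)
    return out_x, out_y

X = [[0.1], [0.2], [0.3], [0.9]]
y = ["benign", "benign", "benign", "attack"]
Xb, yb = random_oversample(X, y)
print(Counter(yb))  # both classes now have 3 samples
```

More elaborate schemes (e.g. synthetic-sample generation) interpolate new minority samples instead of duplicating, but the balancing goal is the same.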
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2061
Ontology technology has been investigated in a wide range of areas and is currently being utilized in many fields. In the e-learning context, many studies have used ontology to address problems such as the interoperability of learning objects, modeling and enriching learning resources, and personalizing educational content recommendations. We systematically reviewed research on ontology for e-learning from 2008 to 2020. The review was guided by three research questions: "How is ontology used for knowledge modeling in the context of e-learning?", "What are the design principles, building methods, scale, level of semantic richness, and evaluation of current educational ontologies?", and "What are the various ontology-based applications for e-learning?" We classified current educational ontologies into six types and analyzed them by five measures: design methodology, building routine, scale of ontology, level of semantic richness, and ontology evaluation. Furthermore, we reviewed four types of ontology-based e-learning applications and systems. The observations obtained from this survey can benefit researchers in this area and help to guide future research.
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2066
With the rapid advancement of hardware and internet technologies, we are surrounded by more and more Internet of Things (IoT) devices. Despite the convenience and boosted productivity that these devices have brought to our lives and industries, new security implications have arisen. IoT devices introduce many new attack vectors, and cyber-attacks targeting these systems have increased in recent years. However, security vulnerabilities in numerous devices often go unfixed. This may be because providers were not informed in time, have stopped maintaining the affected models, or simply no longer exist. Even when an official fix for a security issue is eventually released, it usually takes a long time, giving attackers ample opportunity to exploit vulnerabilities; in many cases customers must disconnect vulnerable devices, leading to outages. As the software is usually closed source, it is also unlikely that the community will review and modify the source code themselves and provide updates. In this study, we present ARMPatch, a flexible static binary patching framework for ARM-based IoT devices, with a focus on security fixes. After identifying the unique challenges of performing binary patching on ARM platforms, we provide novel features for replacing, modifying, and adding code in already compiled programs. The viability and usefulness of our solution is then verified through demos and final programs on real devices. Finally, we discuss the current limitations of our approach and future challenges.
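ARMPatch's actual workflow is described in the paper; the core idea behind static binary patching, overwriting instruction bytes at a known file offset while verifying the original bytes first, can be sketched as follows (the `patch_bytes` helper and scratch file are hypothetical, not ARMPatch's API):

```python
import os
import tempfile

def patch_bytes(path, offset, new_bytes, expected=None):
    """Overwrite bytes at a fixed file offset in place. Verifying the
    original bytes first makes a wrong offset fail loudly instead of
    silently corrupting the binary."""
    with open(path, "r+b") as f:
        f.seek(offset)
        if expected is not None:
            found = f.read(len(expected))
            if found != expected:
                raise ValueError(f"offset {offset:#x}: expected "
                                 f"{expected.hex()}, found {found.hex()}")
            f.seek(offset)
        f.write(new_bytes)

# Example: replace a 4-byte A32 instruction with the ARM NOP encoding
# (mov r0, r0 = e1a00000, stored little-endian as 00 00 a0 e1).
fd, tmp = tempfile.mkstemp()
os.write(fd, b"\x04\xe0\x2d\xe5" + b"\x00" * 12)   # fake code bytes (push {lr})
os.close(fd)
patch_bytes(tmp, 0, b"\x00\x00\xa0\xe1", expected=b"\x04\xe0\x2d\xe5")
```

Real frameworks must additionally relocate jump targets and extend code sections when adding code, which is where the ARM-specific challenges the paper addresses arise.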
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2064
Economic growth and information technology have driven the development of the Internet of Things (IoT) industry, which has become an emerging field of research. Several intrusion detection techniques have been introduced, but detecting intrusions and malicious activities remains a challenging task. This paper devises a novel method, the Water Moth Search algorithm (WMSA), for training a Deep Recurrent Neural Network (Deep RNN) to detect malicious network activities. The WMSA algorithm is newly devised by combining Water Wave Optimization (WWO) and Moth Search Optimization (MSO). Pre-processing is employed to remove redundant data, and feature selection is performed using a wrapper approach. The Deep RNN classifier then detects intrusions effectively using the selected features. The proposed WMSA-based Deep RNN showed improved results, with maximal accuracy, specificity, and sensitivity of 0.96, 0.973, and 0.960, respectively.
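The reported accuracy, specificity, and sensitivity follow from the standard binary confusion-matrix definitions, which can be sketched as follows (the function name and toy labels are illustrative, not from the paper):

```python
def detection_metrics(y_true, y_pred, positive=1):
    """Accuracy, sensitivity, and specificity from a binary confusion matrix."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    tn = sum(t != positive and p != positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    accuracy = (tp + tn) / len(pairs)
    sensitivity = tp / (tp + fn)   # fraction of intrusions correctly flagged
    specificity = tn / (tn + fp)   # fraction of normal traffic correctly passed
    return accuracy, sensitivity, specificity

acc, sens, spec = detection_metrics([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
# tp=2, fn=1, tn=1, fp=1 -> accuracy 0.6, sensitivity 2/3, specificity 0.5
```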
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2063
REST web services are lightweight, maintainable, and scalable services that accelerate client application development. Antipatterns in these services are inadequate and counter-productive design solutions that have caused many quality problems in the maintenance and evolution of REST web services. This paper proposes an automated approach to antipattern detection in REST web services using Genetic Programming (GP). Three sets of metrics are considered: generic, REST-specific, and code-level. Twelve types of antipatterns are examined, and the results are compared with a manual rule-based approach. The statistical analysis indicates that the proposed method has average precision and recall scores of 98% (95% CI, 92.8% to 100%) and 82% (95% CI, 79.3% to 84.7%), respectively, and effectively detects REST antipatterns.
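The confidence intervals above are intervals on proportions; the abstract does not state which interval method was used, but a normal-approximation (Wald) interval can be sketched as follows (the sample size `n` below is hypothetical):

```python
import math

def wald_ci(p, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion p
    observed over n trials: p +/- z * sqrt(p(1-p)/n), clipped to [0, 1]."""
    half = z * math.sqrt(p * (1.0 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# e.g. a recall of 0.82 measured over a hypothetical 500 antipattern instances
lo, hi = wald_ci(0.82, 500)
print(f"95% CI: {lo:.3f} to {hi:.3f}")
```

Wilson or exact (Clopper-Pearson) intervals behave better near 0% or 100%, which matters for the precision bound that touches 100%.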
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2067
With the continuous innovation and development of modern computer science, the mobile Internet, and other information technologies, artificial intelligence (AI) is no longer a novelty. It has been widely studied and applied in many fields and is very important for modern society. Its main research fields include deep learning, natural language processing, computer vision, intelligent robotics, automatic programming, and data mining, all of which carry important practical significance and far-reaching influence for industrial production and daily life. The rapid development of AI has changed the daily life of modern people, improved work efficiency, and promoted economic and social development and the progress of information technology. As AI becomes widely used, traditional network information and big data processing technologies struggle to meet its needs; only by closely combining cloud computing with other technologies can AI technology and its related applications develop fully. With the development and improvement of cloud computing technology, more and more users tend to work in the cloud. However, numerous cloud service failures have occurred, causing huge losses for enterprises and individuals. To prevent damage to the interests of enterprises and individuals, cloud service providers strive to deliver high-quality services. This paper studies the application of AI technology to resources in the cloud computing environment, investigates reliability as an indicator, and proposes a cloud service reliability verification method for the infrastructure-as-a-service (IaaS) layer. Experimental research shows that, with the reliability detection method in this paper, users can easily and quickly obtain the reliability of a purchased cloud service and directly assess whether the performance of each server meets the commitments in the cloud service provider's SLA.
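The paper's verification method targets the IaaS layer; as a minimal illustration of comparing observed reliability against an SLA promise, a tenant can compute the success rate of its own health-check probes (the probe interface and SLA value below are assumptions, not from the paper):

```python
def observed_reliability(probe_results):
    """Fraction of successful health-check probes over a measurement
    window -- a simple empirical reliability estimate a tenant can compute
    independently of the provider's own reporting."""
    return sum(probe_results) / len(probe_results)

sla_target = 0.999                       # hypothetical availability promise
probes = [True] * 998 + [False] * 2      # 1000 probes, 2 failed
r = observed_reliability(probes)
print(f"observed={r:.3f}, meets SLA: {r >= sla_target}")
```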
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2065
In a computing environment, higher resolutions generally require more memory bandwidth, which inevitably leads to the consumption of more power. This can become critical for the overall performance of mobile devices and graphics processing units as the amount of memory access and memory bandwidth increases. This paper proposes a lossless compression algorithm that combines multiple differential pulse-code modulation (DPCM) with variable sign-code Golomb-Rice coding to reduce the memory bandwidth requirement. The efficiency of the proposed multiple DPCM is enhanced by selecting the optimal DPCM mode. The experimental results show a compression ratio of 1.99 for high-efficiency video coding (HEVC) image sequences, and the proposed lossless compression hardware can reduce the bus bandwidth requirement.
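The paper's multi-mode DPCM selection and hardware design are beyond the abstract, but the basic pipeline, DPCM residuals mapped to non-negative integers and then Golomb-Rice coded, can be sketched as follows (the Rice parameter `k` and the zig-zag sign mapping are common conventions, not necessarily the paper's):

```python
def rice_encode(value, k):
    """Golomb-Rice code with divisor 2**k: the quotient in unary
    (q ones then a zero), followed by k remainder bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

def dpcm_rice(samples, k=2):
    """Encode each sample as the difference from its predecessor (DPCM),
    zig-zag map signed residuals to non-negative integers, then Rice-code.
    Small residuals -- the common case in images -- yield short codes."""
    bits, prev = [], 0
    for s in samples:
        residual, prev = s - prev, s
        mapped = 2 * residual if residual >= 0 else -2 * residual - 1
        bits.append(rice_encode(mapped, k))
    return "".join(bits)

print(dpcm_rice([10, 11, 11, 9]))  # residuals 10, 1, 0, -2
```

The decoder reverses each step exactly, which is what makes the scheme lossless; the "multiple DPCM" of the paper chooses among several predictors per block rather than always using the previous sample.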
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.2054
Access control is a major factor in enhancing data security in cloud storage systems. However, existing data-sharing and access-control methods suffer from privacy data leakage and key abuse, which remains a major challenge in the research community. Therefore, an effective Blockchain-based access control and data sharing approach is developed for the cloud storage system to increase data security. The proposed approach effectively eliminates the single point of failure in the cloud system and provides further benefits by increasing throughput and reducing cost. The Data User (DU) makes a registration request using an ID and password and forwards it to the Data Owner (DO), which processes the request and authenticates the Data User. The Data Owner's information is embedded in the transactional blockchain using an encrypted master key. The Data Owner performs the data encryption process, and the encrypted files are uploaded to the InterPlanetary File System (IPFS). Based on the encrypted file location and encrypted key, the Data Owner generates the ciphertext metadata, which is embedded in the transactional blockchain. The proposed approach achieved better performance on metrics such as a genuine-user detection rate of 95% and a lower response time of 25 s with a blockchain of size 100.
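The full scheme combines encryption, IPFS storage, and on-chain metadata; the tamper evidence that the transactional blockchain contributes can be sketched with a minimal hash chain (field names and the `ipfs://<cid>` placeholder are illustrative, not the paper's format):

```python
import hashlib
import json

def make_block(prev_hash, payload):
    """Each block commits to the previous block's hash, so any later edit
    to stored access-control metadata breaks the chain and is detectable."""
    body = {"prev": prev_hash, "payload": payload}
    body["hash"] = hashlib.sha256(
        json.dumps({"prev": body["prev"], "payload": body["payload"]},
                   sort_keys=True).encode()).hexdigest()
    return body

def verify(chain):
    """Recompute every hash and check every back-link."""
    for prev, cur in zip(chain, chain[1:]):
        digest = hashlib.sha256(
            json.dumps({"prev": cur["prev"], "payload": cur["payload"]},
                       sort_keys=True).encode()).hexdigest()
        if cur["prev"] != prev["hash"] or cur["hash"] != digest:
            return False
    return True

chain = [make_block("0" * 64, {"owner": "DO-1", "file_loc": "ipfs://<cid>",
                               "enc_key": "<ciphertext>"})]
chain.append(make_block(chain[-1]["hash"], {"user": "DU-7", "grant": "read"}))
print(verify(chain))  # True; editing a later payload makes this False
```

A real transactional blockchain adds consensus and replication on top of this chaining, which is what removes the single point of failure.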
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.20515
With the development of online information sharing, high-tech equipment for the collaborative production management of power enterprises is emerging constantly. It is therefore necessary to design a collaborative production management system for power enterprises based on online information sharing to meet these information-sharing needs. On the hardware side, a B/S (browser/server) structure was built, and the client interface was debugged with Cascading Style Sheets (CSS). On the software side, a Hadoop horizontal architecture technology framework was designed, the physical deployment was carried out, the production management center module was designed, and the production operation chain was monitored and managed to realize the collaborative production management of power enterprises. The experimental results showed that the designed system had high reliability and user-friendliness, with reliability of up to 97.2% and friendliness of up to 99.8%, meeting current demand.
Journal of Web Engineering; https://doi.org/10.13052/jwe1540-9589.20511
A speech signal is a time-varying signal that is greatly affected by the individual speaker and the environment. To improve the end-to-end voiceprint recognition rate, the original speech signal must be preprocessed to some extent. An end-to-end voiceprint recognition algorithm based on a convolutional neural network (CNN) is proposed. In this algorithm, the convolution and down-sampling operations of the CNN are used to preprocess the speech signals for end-to-end voiceprint recognition. One-dimensional and two-dimensional convolution operations were established to extract Mel-frequency cepstral coefficient (MFCC) parameters from the preprocessed signals, and the classical universal background model was used to build the voiceprint recognition model. In this study, the principle of end-to-end voiceprint recognition was first analyzed, and the end-to-end voiceprint recognition process, recognition features, and Res-FD-CNN network structure were studied. Then the CNN recognition model was constructed, the data were preprocessed to form the frequency-domain convolutional layer, and the algorithm was tested.
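Before cepstral features such as MFCCs are extracted, speech is typically pre-emphasized and split into short overlapping frames; a pure-Python sketch of these two preprocessing steps follows (the coefficient and window sizes are common defaults, not the paper's settings):

```python
def preemphasis(x, alpha=0.97):
    """First-order high-pass filter applied before cepstral analysis:
    y[n] = x[n] - alpha * x[n-1], boosting high frequencies."""
    return [x[0]] + [x[i] - alpha * x[i - 1] for i in range(1, len(x))]

def frame(x, size, step):
    """Split the time-varying signal into overlapping analysis windows,
    within which the signal is treated as approximately stationary."""
    return [x[i:i + size] for i in range(0, len(x) - size + 1, step)]

frames = frame(preemphasis([0.0, 1.0, 0.5, -0.2, 0.3, 0.8]), size=4, step=2)
print(len(frames), len(frames[0]))  # 2 frames of 4 samples each
```

In practice the frames would then be windowed, transformed to the frequency domain, and passed through a Mel filter bank to obtain the MFCCs the abstract refers to.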