International Journal on Recent and Innovation Trends in Computing and Communication

Journal Information
EISSN : 2321-8169
Published by: Auricle Technologies Pvt. Ltd. (DOI prefix: 10.17762)
Total articles ≅ 686

Latest articles in this journal

Rakesh Paliwal, Irfan Khan
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 12-24;

Mobile wireless sensor networks have emerged from recent advances in wireless technology. The sensors in such networks are low-cost, mobile devices with short battery lives, which makes energy conservation one of the most significant considerations in network design. These networks have a wide range of applications, including search and rescue operations, health and environmental monitoring, and intelligent traffic management systems. Beyond the challenges posed by sensor-node mobility, routing and dynamic clustering must also be considered. Studies show that cluster models with configurable parameters have a substantial impact on reducing energy consumption and extending network lifetime. The primary goal of this study is therefore to describe and select an intelligent clustering method for mobile wireless sensor networks that uses evolutionary algorithms to extend the network's lifetime and ensure packet delivery accuracy. To group the sensor nodes, a Genetic Algorithm is applied first, followed by Bacterial Conjugation. The simulation results show a significant increase in clustering speed, and the suggested approach takes node speed into account when calibrating the mobile wireless sensor nodes.
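The GA step of a clustering scheme like the one the abstract describes can be sketched as cluster-head election. This is a minimal, illustrative sketch only: the node field, the fitness weights, and the parameter `K` are all assumptions, not the paper's actual formulation.

```python
import random

# Minimal sketch of GA-based cluster-head election for a sensor field.
# Node positions, energies, fitness weights and K are illustrative.
random.seed(0)
NODES = [(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0.2, 1.0))
         for _ in range(30)]          # (x, y, residual energy)
K = 4                                 # cluster heads to elect

def fitness(heads):
    # Lower is better: distance of every node to its nearest head,
    # plus a penalty for electing low-energy heads.
    dist = sum(min(((x - NODES[h][0]) ** 2 + (y - NODES[h][1]) ** 2) ** 0.5
                   for h in heads)
               for x, y, _ in NODES)
    energy_penalty = 50 * sum(1.0 - NODES[h][2] for h in heads)
    return dist + energy_penalty

def random_individual():
    return random.sample(range(len(NODES)), K)

def crossover(a, b):
    # Keep half of parent a, fill the rest from parent b (no duplicates).
    return list(dict.fromkeys(a[:K // 2] + b))[:K]

def mutate(ind):
    ind = ind[:]
    if random.random() < 0.3:
        ind[random.randrange(K)] = random.randrange(len(NODES))
    return ind if len(set(ind)) == K else random_individual()

pop = [random_individual() for _ in range(40)]
for _ in range(100):
    pop.sort(key=fitness)
    elite = pop[:10]                  # elitism keeps the best solutions
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(30)]
best = min(pop, key=fitness)
print("elected cluster heads:", sorted(best))
```

In a real deployment the fitness would also fold in the node speeds mentioned in the abstract; here the energy/distance trade-off stands in for that.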
Amit Agrawal, Geeta Tiwari
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 25-34;

Cancer is spreading rapidly and has become one of the most dangerous diseases in the world; it causes death if not diagnosed before an advanced stage. Small changes or illnesses in the human body can develop into cancer, and the disease is difficult to detect in its early stages. This motivates the design of a computer-aided system that can distinguish between benign (non-cancerous) and malignant (cancerous) mammograms and help doctors increase diagnostic accuracy. The proposed system is simulated in MATLAB: the ART 1.0 algorithm is studied and modified to improve the accuracy of the existing ART 1.0 system, using a cancer data set obtained from the UCI repository. The ART algorithm was chosen because it operates in three phases: recognition, comparison, and search. The winning neuron is obtained by computing the dot product of the input and weight vectors; the neuron with the largest dot product is the winner.
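The winner-selection rule the abstract mentions is easy to make concrete. The sketch below assumes random illustrative weights and a made-up vigilance value; it shows only the recognition step (largest dot product wins) and a simplified comparison step, not the authors' modified ART 1.0.

```python
import numpy as np

# Sketch of the ART 1.0 recognition and comparison phases described above:
# the winner is the neuron whose weight vector has the largest dot product
# with the binary input; weights and the vigilance value are illustrative.
rng = np.random.default_rng(0)
n_neurons, n_features = 3, 8
bottom_up = rng.random((n_neurons, n_features))    # bottom-up weight vectors
pattern = np.array([1, 0, 1, 1, 0, 0, 1, 0])       # binary input pattern

# Recognition phase: dot product of input and each weight vector.
activations = bottom_up @ pattern
winner = int(np.argmax(activations))               # largest dot product wins

# Comparison phase: accept the winner only if the match between its
# binary top-down template and the input exceeds the vigilance rho.
top_down = (bottom_up > 0.5).astype(int)
rho = 0.6
match = (top_down[winner] & pattern).sum() / pattern.sum()
print("winner:", winner, "match:", round(float(match), 2), "accepted:", bool(match >= rho))
```

If the match fails the vigilance test, ART enters its search phase and tries the next-best neuron, which is omitted here for brevity.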
Nemi Chand Jat, Chetan Kumar
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 01-11;

By employing multitemporal data sets, it is possible to quantify temporal effects and discover changes in nature or in the status of an object from observations taken at different points in time. Many different change-detection methods exist, and they fall into two primary categories: supervised and unsupervised. The goal of this study is to identify land-cover changes in a specific area of Kayseri using unsupervised change-detection algorithms and Landsat satellite images from different years obtained through remote sensing. First, the images are co-registered and converted to grayscale, and image differencing is applied after an image-enhancement step. Principal Component Analysis (PCA) is then used to analyse the difference image and determine which locations have changed and which have not: the difference image is divided into three non-overlapping blocks, and the principal components are extracted from the eigenvector space. Finally, Fuzzy C-Means clustering in the principal feature-vector space divides the pixels into two clusters, completing the change-detection procedure. As the world's population has grown, farmland expansion and unplanned land encroachment have intensified, resulting in uncontrolled deforestation around the globe. This project applies the unsupervised K-means clustering algorithm in a cost-effective manner that can be employed by officials, companies, and private groups to help fight illicit deforestation and to analyse satellite databases.
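The core differencing-and-clustering idea can be sketched on synthetic data. This is a deliberate simplification: a plain 1-D 2-means clustering on absolute pixel differences stands in for the PCA + fuzzy C-means / K-means pipeline the study describes, and the images are generated rather than Landsat scenes.

```python
import numpy as np

# Simplified sketch of unsupervised change detection: difference two
# co-registered images and split pixels into changed/unchanged with a
# 1-D 2-means clustering (a stand-in for the PCA + fuzzy C-means
# pipeline described above). The images are synthetic.
rng = np.random.default_rng(1)
img1 = rng.normal(100.0, 5.0, (64, 64))
img2 = img1 + rng.normal(0.0, 2.0, (64, 64))   # registration/sensor noise
img2[20:40, 20:40] += 40.0                     # a genuinely changed region
diff = np.abs(img2 - img1).ravel()

centers = np.array([diff.min(), diff.max()])   # init: low/high difference
for _ in range(20):
    labels = (np.abs(diff - centers[0]) > np.abs(diff - centers[1])).astype(int)
    for k in (0, 1):
        if np.any(labels == k):
            centers[k] = diff[labels == k].mean()

change_map = labels.reshape(64, 64)            # 1 = changed pixel
print("changed pixels detected:", int(change_map.sum()))
```

On this synthetic input the detected region matches the 20x20 block that was altered; on real imagery, the PCA step the authors use helps suppress noise before clustering.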
Ripon Roy, Anil Kalotra
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 11-20;

With the modern changing public-management approach, public institutions are expected to provide more efficient and effective services at a lower cost. In this study, the extent of performance improvement was assessed according to defined performance criteria for transportation services and information technology in public institutions, and the institution's current performance was evaluated. It became evident that the public transport corporation (BRTC) needed an information system to keep its vehicle fleets under control. The effect of the vehicle-management application was then measured against the performance evaluation made above, and it was determined that all reported types of public-transportation expenses were reduced to a minimum.
Nandini Joshi, Bharat Bhushan Jain
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 21-29;

Power quality (PQ) is frequently degraded by non-linear loads. Harmonic currents cause resonance, capacitor overheating, and other performance-degrading effects, and voltage sags are common in low-voltage systems. Equipment such as power-electronic converters injects harmonic currents into the grid while altering the overall response of the equivalent load. Reactive-power demand is well known to lower feeder voltage and increase losses, and harmonic currents distort the voltage waveform, degrading signal quality. At the same time, the number of loads that require a clean sinusoidal voltage to operate correctly is rising, and as electronic devices become more power-sensitive, interest in power-conditioning solutions grows. As a result, if power quality falls below a specified threshold, compensation must be supplied. The Unified Power Quality Conditioner (UPQC) is a Flexible AC Transmission System (FACTS) device that can manage voltage, impedance, and phase angle. To improve the power quality of the power system, a Dynamic Voltage Restorer (DVR), a fuzzy-controlled shunt active power filter, and a UPQC are employed. DVRs are power converters installed in series with sensitive loads to protect against supply disruptions; their short response time and high reliability make them an excellent tool for improving the quality of electrical power. The simulation results were compared to the basic system to demonstrate the efficiency of the suggested system.
Kritika Soni, Mohak ‎, Neha Kaushik, Dhruv Dhote, Dhruv Nigam, K Gopi Krishna
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 01-10;

Objectives: To analyse the previous (original) website and make it more attractive and interesting. Methods: Analyse the old website. A website redesign should not just change the overall look of the site; it should enhance the ways in which it functions. Find out what is working on the current website, build the website design plan, and add strong visual features and elements. Findings: The website feels outdated overall, so the redesign makes it more attractive and adds animated clips. Applications: The study highlighted various issues in redesigning an IBM company website and shows that very small tweaks to the layout and composition can have a dramatic impact on the web design. The first problem is that the visitor is overwhelmed: in graphic-design terms the page lacks clear hierarchies, with images and styling all competing for attention, which makes for a poor user experience. Before examining what the company even does, the redesign therefore focuses on how to simplify the page visually and create very clear hierarchies. There are many ways to approach this, such as illustration, 3D rendering, or custom photography, and the content can be presented in a very interesting way.
Jui-Pin Yang
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 08-13;

Network storage systems are generally composed of clients, storage servers, and metadata servers. In this paper, we propose a novel storage virtualization (NSV) scheme that is capable of alleviating the heavy load on the metadata server, guaranteeing storage quality of service, and dynamically adapting storage resources. The metadata server automatically constructs a dedicated storage cluster according to the various storage quality-of-service requirements. A storage cluster may consist of one to many storage servers, including one master storage server and zero or more slave storage servers; in other words, a network storage system consists of at least one storage cluster. Each client's requests are forwarded to the corresponding master storage server within a specific storage cluster. The master storage server then determines the best storage server to handle the requests based on the conditions of the storage servers, and the requests are redirected to the selected storage server. Finally, the responses are transmitted directly to the client.
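The forwarding flow described above (metadata server → QoS-matched cluster → master → best storage server) can be sketched as follows. Every class name, the QoS labels, and the "lowest load" selection metric are hypothetical stand-ins; the paper does not specify these details.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the NSV request flow described above: the
# metadata server maps a QoS class to a dedicated storage cluster, the
# cluster's master picks the best server by current conditions, and the
# request is redirected there. Names and the load metric are illustrative.

@dataclass
class StorageServer:
    name: str
    load: float = 0.0              # e.g., outstanding requests

@dataclass
class StorageCluster:
    master: StorageServer
    slaves: list = field(default_factory=list)

    def select_server(self):
        # Master chooses the best server (here: lowest load).
        return min([self.master] + self.slaves, key=lambda s: s.load)

class MetadataServer:
    def __init__(self):
        self.clusters = {}         # one dedicated cluster per QoS class

    def register(self, qos, cluster):
        self.clusters[qos] = cluster

    def route(self, qos):
        # Client request goes to the cluster's master, which redirects
        # it to the selected storage server.
        server = self.clusters[qos].select_server()
        server.load += 1
        return server.name

meta = MetadataServer()
meta.register("gold", StorageCluster(StorageServer("m1"),
                                     [StorageServer("s1", 2.0),
                                      StorageServer("s2", 0.5)]))
print(meta.route("gold"))   # → "m1" (currently the least-loaded server)
```

Because the master answers redirects while the chosen server streams the response straight back to the client, the metadata server stays off the data path, which is the load-alleviation property the scheme claims.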
Madhavi S. Avhankar, Janardan A. Pawar, Snehankita Majalekar, Suwarna Kedari
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 10, pp 01-07;

Mobile ad hoc networks have evolved rapidly and are finding numerous applications as self-creating, self-organizing, and self-administering wireless networks. The present paper describes the use and comparison of three routing protocols: AODV, OLSR, and GRP. The parameters used for comparison are throughput and response delay while varying the number of mobile nodes; a random waypoint mobility model governs the movement of the mobile nodes. The simulation study is carried out using OPNET Modeler 14.5. The simulation results show that, as the number of mobile nodes increases, OLSR offers better throughput and lower delay than the AODV and GRP routing protocols.
Smitha Krishnan, B.G Prasanthi
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 9, pp 08-11;

Cloud computing is the most recent paradigm to emerge, promising reliable services delivered to the end-user through next-generation data centres built on virtualized compute and storage technologies. Consumers can access the desired service from a "Cloud" anytime, anywhere in the world, on demand. Computing services need to be highly reliable, scalable, easily accessible, and autonomic to support ubiquitous access, dynamic discovery, and composability; consumers indicate the required service level through Quality of Service (QoS) parameters recorded in Service Level Agreements (SLAs). A suitable model for workload prediction is being developed here: a Genetic Algorithm is combined with a statistical model, and it is expected to give better results, producing a lower error rate and higher prediction accuracy than previous algorithms.
Meltem Eryılmaz, Önder Ertan, Furkan Yalçınkaya, Ekin Kara
International Journal on Recent and Innovation Trends in Computing and Communication, Volume 9, pp 01-07;

The coronavirus pandemic has been going on since late 2019; millions of people have died worldwide, vaccination has recently started in many countries, and new strategies are being sought since countries are still struggling to defeat the virus. This research therefore aims to predict the possible end of the coronavirus pandemic in Turkey using data mining and statistical studies. Data mining is a computer-science discipline that processes large amounts of data to produce new, useful information; it is widely used today to support decision making in companies, and this project could likewise support the decisions of authorities in developing an effective strategy against the ongoing pandemic. The research uses Turkey's coronavirus and vaccination data between 13 January 2021 and 28 May 2021, with RapidMiner and the Random Forest method for data mining. After all the simulations applied and observed during the research, it was clearly seen that the vaccination parameters decreased the new cases, while the stringency index did not affect them. As a conclusion of our research and observations, we think the government should vaccinate as many people as it can in order to relax restrictions for the last time.
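The study ran its Random Forest inside RapidMiner; the same modelling idea can be sketched with scikit-learn. The data below is synthetic and merely shaped like the study's inputs (normalized daily vaccinations and stringency index predicting new cases), with the target constructed to mirror the abstract's finding that vaccination, not stringency, drives the case counts.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hedged sketch of the study's Random Forest setup, on synthetic data:
# features are (daily vaccinations, stringency index), target is new cases.
rng = np.random.default_rng(0)
n = 130                                    # roughly one row per day, 13 Jan-28 May 2021
vaccinations = rng.uniform(0.0, 1.0, n)    # normalized daily vaccinations
stringency = rng.uniform(0.0, 1.0, n)      # normalized stringency index
# Synthetic target: cases fall as vaccination rises; stringency has no
# effect, mirroring the abstract's finding.
new_cases = 50_000 * (1 - vaccinations) + rng.normal(0, 2_000, n)

X = np.column_stack([vaccinations, stringency])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, new_cases)
print("feature importances (vaccination, stringency):",
      np.round(model.feature_importances_, 3))
```

The forest's feature importances recover the built-in structure: the vaccination feature dominates while stringency contributes almost nothing, which is the same qualitative conclusion the study reports from its RapidMiner runs.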