Journal IJCCS (Indonesian Journal of Computing and Cybernetics Systems)

Muhammad Ihsan, Reza Pulungan, A Afiahayati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 95-106; doi:10.22146/ijccs.32142

Abstract: Library service quality is one of the most vital parts of library management, and evaluating it from the users' perspective is important. In this paper, we propose a combination of GDSS-AHP (Group Decision Support System-Analytical Hierarchy Process), LibQual, and IPA (Importance-Performance Analysis) to evaluate library service quality. GDSS-AHP and LibQual together calculate the weight of each evaluation statement and the level of service quality based on users' perception and expectation. IPA then places the value of each evaluation statement in one of its four quadrants to obtain recommended priorities for service improvement. The study was conducted at the Library of the Ministry of Trade of Indonesia, involving four decision makers: a head librarian, a library academic expert, and two library practitioners. Fifty library visitors responded to the service quality questionnaires; based on their responses, the users' satisfaction level is at least "satisfied", at 76.49%. Usability testing of the developed system was also conducted using three observation elements: effectiveness, efficiency, and satisfaction. Performed with five respondents, one admin, and two decision makers, it resulted in an average usability level of 90.03%.
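The AHP weighting step can be illustrated with the geometric-mean approximation of priority weights, one common way to compute AHP priorities from a pairwise comparison matrix; the paper's exact aggregation across its four decision makers may differ from the per-cell geometric mean sketched here:

```python
from math import prod

def ahp_weights(matrix):
    """Approximate AHP priority weights via the geometric-mean method:
    take the geometric mean of each row, then normalize to sum to 1."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

def aggregate_judgments(matrices):
    """Aggregate several decision makers' pairwise comparison matrices
    by taking the geometric mean of each cell (a standard GDSS-AHP step)."""
    k = len(matrices)
    n = len(matrices[0])
    return [[prod(m[i][j] for m in matrices) ** (1.0 / k) for j in range(n)]
            for i in range(n)]
```

For a 3-criterion matrix where criterion 1 dominates, `ahp_weights` returns decreasing weights summing to 1.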
Fajar Ratnawati, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 1-10; doi:10.22146/ijccs.19237

Abstract: Movies have unique characteristics: when someone writes an opinion about a movie, it usually covers not only the story but also the people involved in it. Such opinions are commonly written on social media, primarily Twitter. To determine the tendency of an opinion about a movie, whether it is positive, negative, or neutral, sentiment analysis is required. This study classifies Indonesian-language movie opinions as positive, negative, or neutral and measures the accuracy, precision, recall, and F-measure of the Dynamic Convolutional Neural Network method. The test results show that the Dynamic Convolutional Neural Network provides better accuracy than the Naive Bayes method: accuracy of 80.99%, precision of 81.00%, recall of 81.00%, and F-measure of 79.00%, against 76.21%, 78.00%, 76.00%, and 75.00%, respectively, for Naive Bayes.
Khabib Mustofa, Sunu Pinasthika Fajar
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 63-72; doi:10.22146/ijccs.28121

Abstract: In a software development project, testing can consume up to 35% of the time, effort, or cost. To reduce this, developers can choose automated testing. Automated testing of web applications, especially functional testing, can be done with tools such as Selenium. By default, Selenium runs tests sequentially, without exploiting multithreading, which results in fairly long run times. In this study, a platform was developed that allows Selenium users to run tests using Ruby's multithreading to speed up testing. The results show that Ruby's multithreading can indeed speed up functional testing of various web applications; the gains vary depending on the functionality being tested, the testing approach, and the type of browser used.
Maulida Ayu Fitriani, Aina Musdholifah, Sri Hartati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 53-62; doi:10.22146/ijccs.27871

Abstract: Clustering methods for obtaining optimal information continue to evolve; one development is the Evolutionary Algorithm (EA). Adaptive Unified Differential Evolution (AuDE) is a development of Differential Evolution (DE), one of the EA techniques. AuDE has self-adaptive control of the scale factor (F) and crossover rate (Cr), and a single mutation strategy that represents the standard mutation strategies most commonly used in previous studies. The AuDE clustering method was tested on 4 datasets, using the Silhouette Index and CS Measure as fitness functions measuring the quality of the clustering results. The quality of the AuDE clustering results was then compared against that of the DE method. The results show that the AuDE mutation strategy can expand the cluster-center search produced by DE, so that better clustering quality can be obtained. The quality ratio of AuDE to DE is 1:0.816 using the Silhouette Index and 0.565:1 using the CS Measure. The execution time of AuDE is better but not significantly so, with ratios of 0.99:1 using the Silhouette Index and 0.184:1 using the CS Measure.
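The standard DE operators that AuDE's self-adaptation builds on can be sketched as follows; this is textbook DE/rand/1 mutation with binomial crossover, not AuDE's unified strategy or its self-adaptive F and Cr:

```python
import random

def de_rand_1(pop, i, F):
    """DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3),
    with r1, r2, r3 distinct indices different from the target i."""
    idxs = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = random.sample(idxs, 3)
    return [pop[r1][d] + F * (pop[r2][d] - pop[r3][d])
            for d in range(len(pop[i]))]

def binomial_crossover(target, mutant, Cr):
    """Binomial crossover with crossover rate Cr; index jrand guarantees
    at least one gene is taken from the mutant vector."""
    d = len(target)
    jrand = random.randrange(d)
    return [mutant[j] if (random.random() < Cr or j == jrand) else target[j]
            for j in range(d)]
```

In a clustering context, each vector would encode a set of candidate cluster centers and the fitness function would be the Silhouette Index or CS Measure.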
Cornelis Frederik Junifer Latupapua, Tri Kuntoro Priyambodo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 43-52; doi:10.22146/ijccs.27292

Abstract: LTE networks were created to improve on previous technologies; their advantages are faster data transfer, greater service capacity, reduced operational costs, and integration with existing technologies. This simulation analyzes the performance of FDD-mode streaming video during the handover process using Network Simulator 3 with 3 cells, for different speeds and numbers of users, with delay, packet loss, and throughput as parameters. The test results show that streaming-video performance during handover was not affected by the delay value in any test; the highest delay, 153.43 ms, is still in the good category. The highest packet loss is 54.5% with 60 users at a speed of 100 km/h. The highest throughput is 0.60 Mbps at 40 km/h with 5 users, and the lowest is 0.40 Mbps at 60 km/h with 60 users. The best performance occurred at 40 km/h; conversely, at 70 km/h and 100 km/h performance decreased due to increased packet loss and decreased throughput.
Nelson Rumui, Agus Harjoko, Aina Musdholifah
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 33-42; doi:10.22146/ijccs.26331

Abstract: Stroke is a cerebrovascular disease that occurs because blood flow to the brain is disrupted. Stroke can be examined accurately with a CT scan, but that tool is not always available, so the Siriraj Score can be used instead. Each type of stroke has similar symptoms, so doctors should re-examine similar cases prior to diagnosis. The hypothesis of Case-Based Reasoning (CBR) is that similar problems have similar solutions. This research implements the CBR concept using the Siriraj score, a dense index, and the Jaccard Coefficient to calculate similarity between cases. Testing uses k-fold cross validation with 4 folds and threshold values of 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, and 0.95, with 45 test cases and a case base of 135 cases. The tests showed that a threshold of 0.7 is suitable, giving a sensitivity of 89.88% and an accuracy of 84.44% for CBR with indexing and 87.78% for CBR without indexing. A threshold of 0.65 yielded high sensitivity and accuracy but retrieved many irrelevant cases. Thresholds of 0.75, 0.8, 0.85, 0.9, and 0.95 resulted in sensitivities of 65.48%, 59.52%, 5.95%, 3.57%, and 0%; accuracies of CBR with indexing of 61.67%, 55.56%, 5.56%, 3.33%, and 0%; and accuracies of CBR without indexing of 62.78%, 56.67%, 55.56%, 5.56%, 3.33%, and 0%.
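The Jaccard Coefficient retrieval step can be illustrated as follows; the symptom encoding as sets and the threshold handling are simplified assumptions, not the paper's exact pipeline with the Siriraj score and dense index:

```python
def jaccard(a, b):
    """Jaccard coefficient between two symptom sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def retrieve(case_base, new_symptoms, threshold=0.7):
    """Return (case, similarity) pairs at or above the threshold,
    most similar first; cases below the threshold would go to revision."""
    scored = [(c, jaccard(c["symptoms"], new_symptoms)) for c in case_base]
    return sorted([s for s in scored if s[1] >= threshold], key=lambda s: -s[1])
```

Raising the threshold trades recall (sensitivity) for fewer irrelevant retrievals, which matches the trend reported in the abstract.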
Ferdi Chahyadi, Azhari Sn, Hendra Kurniawan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 21-32; doi:10.22146/ijccs.23056

Abstract: Nurse scheduling in hospitals is a complex problem, and making a schedule takes time. Many constraints and rules have to be considered so that the schedule meets the nurses' preferences, which in turn improves service quality. The variety of factors involved makes the nurse scheduling problem vast and different in every case. This study aims to develop a system for arranging nurse schedules. The resulting schedule is checked against the required constraints; constraint violations are evaluated using Simulated Annealing (SA) combined with the Probabilistic Cooling Scheme (PCS) cooling method. The transition rules use a cost matrix to produce new states more efficiently. The results show that the PCS cooling method combined with cost-matrix transition rules generates new solutions with better objective function values, in less processing time, than the exponential and logarithmic cooling methods. The schedules generated by the application also have better quality than schedules created manually by the head of the ward.
Sri Mulyana, Ilham Sahputra
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 11-20; doi:10.22146/ijccs.22886

Abstract: An accident causes much suffering, especially when it results in serious injury, such as a broken bone that needs intensive treatment. Besides treatment for the injury itself, the patient also needs psychological therapy, suggested by a psychologist, to face the resulting problems. One reasoning method in expert systems is Case-Based Reasoning (CBR). In CBR, a case base consists of various cases, each with conditions or symptoms and the solution (the psychological therapy) given. To find the solution for a new problem, the system searches the case base for the cases with the highest degree of similarity. This research develops a case-based reasoning system to determine psychological therapy for post-accident patients who need intensive treatment. The therapy options include assistance, consultation, psychiatric support, and combinations of these actions. A case study was conducted using medical records of psychological treatment at the Dr. Soeharso hospital in Surakarta. The developed system successfully determined psychological therapy for the patients, with an accuracy of 60% at a threshold of 50% compared to the treatments given by the psychologist. The result was obtained by calculating the degree of similarity between the new case and all cases in the case base.
Chandra Kusuma Dewa, Amanda Lailatul Fadhilah, A Afiahayati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 83-94; doi:10.22146/ijccs.31144

Abstract: The convolutional neural network (CNN) is a state-of-the-art method for object recognition tasks. Specialized for spatial input data, a CNN has convolutional and pooling layers that enable hierarchical feature learning from the input space. For offline handwritten character recognition problems, such as classifying characters in the MNIST database, CNNs show better classification results than other methods. Leveraging these advantages, in this paper we developed software that uses digital image processing methods and a CNN module for offline handwritten Javanese character recognition. The software segments a captured handwritten Javanese character image using contour and Canny edge detection with the OpenCV library; the CNN then classifies the segmented image into 20 classes of Javanese letters. For evaluation, we compared the CNN to a multilayer perceptron (MLP) on classification accuracy and training time. Experiment results show that the CNN's testing accuracy outperforms the MLP's, although the CNN needs more training time.
Restu Maulunida, Achmad Solichin
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 12, pp 73-82; doi:10.22146/ijccs.28707

Abstract: At present, data has been transformed into digital form, and its use is growing very rapidly, driven by the rapid growth of Internet use and the massive development of mobile devices. People tend to store many files and transfer them from one medium to another, and as a storage medium approaches its limit, fewer files can be stored. A compression technique is required to reduce file size. Dictionary coding is one of the lossless compression techniques, and LZW is an algorithm that applies it. In the LZW algorithm, the dictionary is formed using a future-based dictionary, and the encoding process uses a fixed-length code, which allows the encoding to produce sequences that are still quite long. This study modifies the dictionary formation process and uses a variable-length code to optimize the compression ratio. Based on tests with the data used in this study, the average compression ratio of the LZW algorithm is 42.85%, and that of our proposed algorithm is 38.35%. This proves that our proposed modification of dictionary formation has not been able to improve the compression ratio of the LZW algorithm.
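The classic LZW scheme the paper modifies can be sketched in a few lines of Python; this is the textbook fixed-length-code variant, not the authors' modified dictionary or variable-length coding:

```python
def lzw_compress(data: bytes):
    """Classic LZW: dictionary seeded with all single bytes (codes 0-255);
    each new string w+c seen gets the next free code."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

def lzw_decompress(codes):
    """Inverse of lzw_compress, rebuilding the dictionary on the fly."""
    dictionary = {i: bytes([i]) for i in range(256)}
    w = dictionary[codes[0]]
    out = [w]
    for code in codes[1:]:
        # The cSc corner case: the code may refer to the entry being built.
        entry = dictionary[code] if code in dictionary else w + w[:1]
        out.append(entry)
        dictionary[len(dictionary)] = w + entry[:1]
        w = entry
    return b"".join(out)
```

On repetitive input the code stream is shorter than the input, which is the effect the compression ratio in the abstract measures.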
Enny Itje Sela, M Ihsan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.24756

Abstract: Currently, egg quality is determined by direct visual observation of the egg, both the outside (the condition of the eggshell) and the inside (by candling with sunlight or a flashlight). This method requires good accuracy, and because individuals' physical limitations differ, the results are not always accurate. This study examines the use of digital image processing to detect egg quality from eggshell images. Feature extraction uses histogram-based texture features: average intensity, standard deviation, skewness, energy, entropy, and smoothness. The detection method for training and testing is the K-Means clustering algorithm. The resulting application helps the user distinguish good-quality from poor-quality chicken eggs, recognizing good-quality eggs with 90% accuracy and poor-quality eggs with 80% accuracy.
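The six histogram features listed can be computed from a grayscale image as follows; this is a straightforward first-order-statistics sketch, and the paper's exact normalizations may differ:

```python
from math import log2, sqrt

def histogram_features(pixels, levels=256):
    """First-order (histogram-based) texture features of a grayscale image,
    given as a flat list of intensities in [0, levels-1]."""
    n = len(pixels)
    p = [0.0] * levels                      # normalized histogram p(z_i)
    for z in pixels:
        p[z] += 1.0 / n
    mean = sum(i * p[i] for i in range(levels))
    var = sum((i - mean) ** 2 * p[i] for i in range(levels))
    std = sqrt(var)
    skew = sum((i - mean) ** 3 * p[i] for i in range(levels))
    energy = sum(pi * pi for pi in p)
    entropy = -sum(pi * log2(pi) for pi in p if pi > 0)
    var_n = var / (levels - 1) ** 2         # variance normalized to [0, 1]
    smoothness = 1 - 1 / (1 + var_n)        # R = 1 - 1/(1 + sigma^2)
    return {"mean": mean, "std": std, "skewness": skew, "energy": energy,
            "entropy": entropy, "smoothness": smoothness}
```

A perfectly uniform image has zero variance, entropy, and smoothness and energy 1, which is a useful sanity check for the implementation.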
Victor Paskalathis, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.16631

Abstract: Common practice in crowdsourced delivery services is direct delivery: dispatching a direct trip to a driver near the origin location. The total distance can be reduced through multiple pickup and delivery, by increasing the number of requests in a trip. This research implements an exact algorithm to solve the consolidation problem with up to 3 requests per trip. A greedy heuristic constructs the initial route based on the highest savings, and the result is then optimized using Ant Colony Optimization (ACO). Four scenarios are compared: a direct delivery scenario and three multiple pickup and delivery scenarios, namely 2-consolidated delivery, 3-consolidated delivery, and 3-consolidated delivery optimized with ACO. Four parameters are used for evaluation with the Analytical Hierarchy Process (AHP): the number of trips, total distance, total duration, and security concerns. The case study covers the Yogyakarta area for a whole day. The final route optimized with ACO shows that 178 requests can be completed in 94 trips. Compared to direct delivery, consolidation provides savings of up to 20% in distance and 14% in duration. The AHP evaluation shows that the ACO scenario is the best.
Ira Zulfa, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.24716

Abstract: Sentiment analysis is computational research into the opinions, sentiments, and emotions expressed in text. Twitter has become the most popular communication medium among Internet users. Deep learning is a new area of machine learning research that aims to move machine learning closer to its main goal, artificial intelligence, by replacing manual feature engineering with learning; its algorithms focus on non-linear data representations. One deep learning method is the Deep Belief Network (DBN), a stack of several algorithms with feature extraction that optimally utilizes all resources. This study has two aims: first, to classify positive, negative, and neutral sentiments in the test data; second, to determine the accuracy of the classification model built with the DBN method, so that it can be applied to tweet classification and highlight the sentiment class of Indonesian-language training tweets. Based on the experimental results, the best method for this tweet data is the DBN, with an accuracy of 93.31%, compared with the Naive Bayes method at 79.10% and the SVM (Support Vector Machine) method at 92.18%.
Lukman Heryawan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.24761

Abstract: Diabetic retinopathy is a complication of diabetes mellitus that, if not handled properly, can lead to blindness. A necessary step to prevent blindness is early detection, which can be done by finding the initial symptom: microaneurysms. In this research, a system was made to detect diabetic retinopathy using a microaneurysm detection algorithm based on mathematical morphology. The algorithm is divided into three stages: preprocessing, detecting microaneurysm candidates, and postprocessing. The system was built using a Raspberry Pi as the platform. In the tests performed, the system obtained an accuracy of 90%, a sensitivity of 90%, and a specificity of 55% using the diaretdb1 data, while tests using data from e-ophtha gave an accuracy of 70.5%, a sensitivity of 80%, and a specificity of 60%.
Pawit Rianto, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.17416

Abstract: Because there was no digital-image-processing-based system to determine the ripeness of salak pondoh (Salacca zalacca Gaertner Voss.) on the tree, this study attempted to implement one. The system consists of several sub-processes. First, segmentation: the system searches for pixels suspected to belong to salak pondoh by using the color features r, g, b, and gray of each pixel and computing the dissimilarity (Euclidean distance) against the reference feature values. If the dissimilarity is below a threshold, and neighboring pixels from different directions also have dissimilarities below the threshold, the pixel is set as an object pixel; otherwise it is set as a background pixel. Next, noise elimination and pixel filling produce a clean binary segmentation image. Second, classification: from the mean R and V values of all object pixels, the ripeness level of the salak pondoh is determined using backpropagation or k-Nearest Neighbor classification. The test results indicate a system success rate of 92% with the backpropagation classification algorithm and 93% with the k-Nearest Neighbor algorithm.
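The per-pixel dissimilarity test in the segmentation stage can be sketched as follows; this is a minimal illustration, with the neighborhood check omitted and the reference feature vector assumed rather than taken from the paper:

```python
from math import sqrt

def color_distance(pixel, reference):
    """Euclidean distance between feature vectors, e.g. (r, g, b, gray)."""
    return sqrt(sum((p - q) ** 2 for p, q in zip(pixel, reference)))

def is_object_pixel(pixel, reference, threshold):
    """A pixel is tentatively classed as fruit if its dissimilarity to the
    reference feature values is under the threshold; the paper additionally
    requires neighboring pixels to pass the same test."""
    return color_distance(pixel, reference) < threshold
```

Pixels failing the test become background, producing the binary image that noise elimination and filling then clean up.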
Muhammad Fahrurrozi, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.18360

Abstract: The semantic web is a technology that allows us to build a knowledge base, or ontology, so that the information on a web page can be understood by computers. One piece of software for building ontology-based semantic web applications is Protégé. Protégé allows developers to develop an ontology with description-logic expressions, and provides plugins such as DL-Query and SPARQL-Query to display information involving class, property, and individual expressions in the ontology. The problem is that the DL-Query plugin, despite being equipped with reasoning, can only process rules involving class expressions and object properties, while the SPARQL-Query plugin can process queries involving classes, properties, and individuals but has no reasoning abilities. This research produced a new plugin that uses SPARQL-DL with natural-language input, which Protégé does not provide, to view the results of combined expressions contained in the ontology. It allows developers to view ontology information in an easier-to-understand form without having to think about the complicated structure of SPARQL queries.
Made Arya Budhi, Retantyo Wardoyo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.22773

Abstract: Determining the best employee at the Lombok Garden hotel is intended to stimulate the performance of its employees; improved employee performance has a direct effect on the quality of hotel services. Employee performance appraisals are conducted by six assessors, namely the heads of each department, and consist of several criteria. Assessment is difficult to do manually, considering that each appraiser has their own preferences. To solve this problem, a group decision support system (GDSS) is needed for determining the best employee at the Lombok Garden hotel. The group decision support system developed in this study uses TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) and Borda for group decision making. TOPSIS is used for the decision making of each appraiser, while Borda is used to combine the results of each assessor's decision into a final result. The system produces a ranking of the final value of each employee, and the highest value is used as the recommendation for the best employee at Lombok Garden.
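As a rough illustration of the per-assessor TOPSIS step (the Borda aggregation and the hotel's actual criteria are not shown), closeness coefficients can be computed like this:

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """TOPSIS: rank alternatives by closeness to the ideal solution.
    matrix[i][j] is the score of alternative i on criterion j;
    benefit[j] is True if larger is better on criterion j.
    Returns closeness coefficients C_i in [0, 1] (higher is better)."""
    m, n = len(matrix), len(matrix[0])
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]                 # weighted normalized matrix
    ideal = [max(v[i][j] for i in range(m)) if benefit[j]
             else min(v[i][j] for i in range(m)) for j in range(n)]
    anti = [min(v[i][j] for i in range(m)) if benefit[j]
            else max(v[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_plus = sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n)))
        d_minus = sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

Each assessor's score vector would then be converted to a ranking and fed to the Borda aggregation.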
Ari Kusuma Wardana, Sri Hartati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.24214

Abstract: The genetic algorithm is a well-known algorithm, often used in many sectors, usually to search for solutions to complex problems. Pencak silat match scheduling is complex and takes a lot of time to do by hand. The objective of this research is to implement a genetic algorithm that solves the pencak silat match scheduling problem while satisfying all hard constraints and minimizing soft constraints. In this research the genetic algorithm solves the match scheduling problems of Pimda 02 Tak Suci Bantul. The population produced by the genetic algorithm represents the alternative solutions offered; the best chromosome in a population represents a match scheduling solution, a sequence of match rounds (partai) based on the rules of pencak silat match scheduling. The best fitness value achieved in each generation of this research is 1. Larger numbers of chromosomes and generations yield better solutions and better fitness values. This research is expected to help pencak silat match committees make schedules for pencak silat championships.
Pradita Eko Prasetyo Utomo, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.16643

Abstract: Motor vehicle theft is the most common crime in Indonesia. The significant growth of motorcycles each year is accompanied by increasing motorcycle theft each year, so a system is needed that can forecast both. This research proposes forecasting models of vulnerability to the crime of motorcycle theft using the ARIMA forecasting method, forecasting not only the theft variable but also population, vehicles, and unemployment. The study also classifies the level of vulnerability to motorcycle theft using the CART decision tree method based on the ARIMA forecasts. Time series forecasting with ARIMA is performed for each variable, producing best-fit ARIMA models that vary with the data pattern of each variable. Classification with the CART method achieves an accuracy of 92% for the city of Yogyakarta and 85% for DIY. Based on these results, ARIMA forecasting and CART classification can be used to determine the level of vulnerability to the crime of motorcycle theft.
Eka Wahyudi, Sri Hartati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.15523

Abstract: Case-Based Reasoning (CBR) is a computer system that reasons from old knowledge to solve new problems, by looking for the old case closest to the new case. This research attempts to establish a CBR system for diagnosing heart disease. The diagnosis process is done by entering a new case containing symptoms into the system; the similarity between cases is then calculated using nearest neighbor similarity, Minkowski distance similarity, and Euclidean distance similarity. The case taken is the one with the highest similarity value. If a case does not lead to a diagnosis, or the similarity is below a threshold of 0.80, the case is revised by experts; successfully revised cases are stored to add to the system's knowledge. The method with the best diagnostic accuracy will be used in building the CBR system for heart disease diagnosis. Test results using medical-record data validated by an expert indicate that the system recognizes heart diseases correctly with the nearest neighbor, Minkowski distance, and Euclidean distance similarity methods, each at 100%, with accuracies of 86.21% for nearest neighbor, 100% for Minkowski, and 94.83% for Euclidean.
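The compared similarity measures can be sketched around the Minkowski distance, of which the Euclidean distance is the r = 2 case; the 1/(1 + d) similarity mapping, the feature encoding, and the case-base layout here are illustrative assumptions, not the paper's exact formulas:

```python
def minkowski_distance(x, y, r=3):
    """Minkowski distance of order r; r=1 is Manhattan, r=2 is Euclidean."""
    return sum(abs(a - b) ** r for a, b in zip(x, y)) ** (1.0 / r)

def similarity(x, y, r=2):
    """Map a distance into a similarity in (0, 1]; 1 means identical cases.
    (One common choice; the paper's normalization may differ.)"""
    return 1.0 / (1.0 + minkowski_distance(x, y, r))

def diagnose(case_base, new_case, threshold=0.80, r=2):
    """Return the most similar stored case with its similarity, or None if
    it falls below the revision threshold (0.80 in the abstract)."""
    best = max(case_base, key=lambda c: similarity(c["features"], new_case, r))
    s = similarity(best["features"], new_case, r)
    return (best, s) if s >= threshold else None
```

A retrieval below the threshold is exactly the situation the abstract routes to expert revision.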
Dharma Pratama, Seng Hansun
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.15558

Abstract: Food is one of the basic needs of human beings. The need for food always increases along with the number of people, which is why so many restaurants appear; with so many restaurants, confusion can arise when choosing one to eat at. Therefore, an application that gives restaurant recommendations is built in this research. The recommendations are calculated using the Slope One algorithm, and the restaurant database is gathered from the Google Places API. The Slope One algorithm makes a recommendation by summing a restaurant's ratings with its average rating difference to other restaurants. The application was also tested on users with a J.R. Lewis questionnaire covering application usefulness, information quality, and user interface quality. The results show that users find the application useful for giving proper restaurant recommendations, and that both the information quality and the user interface quality are good.
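The Slope One prediction described above (a user's known ratings plus average item-to-item rating differences) can be sketched as follows; the `ratings` layout is a hypothetical example, not the application's actual data model:

```python
from collections import defaultdict

def slope_one_deviations(ratings):
    """ratings: {user: {item: rating}}. Returns the average rating
    deviation dev[i][j] between each item pair and its support count."""
    freq = defaultdict(lambda: defaultdict(int))
    dev = defaultdict(lambda: defaultdict(float))
    for user_ratings in ratings.values():
        for i, ri in user_ratings.items():
            for j, rj in user_ratings.items():
                if i == j:
                    continue
                freq[i][j] += 1
                dev[i][j] += ri - rj
    for i in dev:
        for j in dev[i]:
            dev[i][j] /= freq[i][j]
    return dev, freq

def predict(ratings, dev, freq, user, item):
    """Weighted Slope One prediction of `user`'s rating for `item`:
    each known rating r_j shifted by dev[item][j], weighted by support."""
    num = den = 0.0
    for j, rj in ratings[user].items():
        if j != item and j in dev.get(item, {}):
            num += (dev[item][j] + rj) * freq[item][j]
            den += freq[item][j]
    return num / den if den else None
```

With one co-rater who rated restaurant J half a point above restaurant I, a user who gave I a 2.0 is predicted 2.5 for J.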
Tonny Suhendra, Tri Kuntoro Priyambodo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.15743

Abstract: With the development of technology and the complexity of dynamic environments, the use of algorithms in path planning has become important. The problems path planning must solve are, first, a safe path (collision-free); second, the distance traveled, i.e. the path length from the robot's start position to the current target position; and third, the travel time, i.e. the time the robot needs to reach its destination. This research uses the ACO algorithm and the A-star algorithm to determine the influence of obstacles (a simple environment) and of different target motion patterns (linear and sinusoidal) on each algorithm's ability to find the shortest path. The test results show that in a simple environment, where the target and obstacles are static, the A-star algorithm is better than the ACO algorithm in both travel time and travel distance: with no obstacles, the difference in distance traveled is 0.57%, and with obstacles it is 9%. In a complex environment, where the target and obstacles move dynamically in a certain pattern, in all three environmental conditions tested the ACO algorithm is better than A-star, finding a path with the optimal, i.e. shortest, distance.
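A minimal A-star sketch on a static 4-connected grid, the simple-environment case the comparison starts from (the dynamic-target setting and the ACO side are not shown; the grid encoding is an assumption):

```python
import heapq

def a_star(grid, start, goal):
    """A-star on a 4-connected grid; grid[y][x] == 1 marks an obstacle.
    Manhattan distance is the admissible heuristic for unit-cost moves.
    Returns the list of (x, y) cells from start to goal, or None."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start, [start])]   # (f, g, position, path)
    best_g = {}
    while open_heap:
        f, g, pos, path = heapq.heappop(open_heap)
        if pos == goal:
            return path
        if pos in best_g and best_g[pos] <= g:
            continue                              # already expanded cheaper
        best_g[pos] = g
        x, y = pos
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                nxt = (nx, ny)
                heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
    return None
```

On a static grid the first path popped at the goal is shortest, which is why A-star wins the static comparisons in the abstract.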
Siti Khomsah, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.15927

Abstract: The success rate of empowering poor families can be classified by characteristic patterns extracted from a database of poor-family empowerment data. The purpose of this research is to build a classification model to predict the level of success of poor families who will receive empowerment assistance. The classification model is built with WARM, which combines two methods: HITS and WIT-tree. HITS is used to obtain the attribute weights from the database; these weights are then used as the attribute weights in the WIT-tree method, which generates association rules satisfying a minimum weighted support and a minimum weighted confidence. The data used were 831 samples of poor families divided into two classes: poor families at the "developing" level and poor families at the "underdeveloped" level. The classification model achieves an accuracy of 86.45% when attributes are weighted with HITS and 66.13% when attribute weights are defined by the user, showing that the attribute weights obtained from HITS are better than those specified by the user.
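The HITS weighting step can be illustrated on a bipartite record-attribute graph; this is a standard HITS power iteration, and treating database records as hubs and attributes as authorities is an assumption about the paper's setup:

```python
from math import sqrt

def hits_weights(transactions, iters=50):
    """HITS on the bipartite graph linking each transaction (record) to the
    attributes it contains: transactions act as hubs, attributes as
    authorities; the authority scores serve as attribute weights."""
    items = sorted({a for t in transactions for a in t})
    auth = {a: 1.0 for a in items}
    hub = [1.0] * len(transactions)
    for _ in range(iters):
        auth = {a: sum(hub[i] for i, t in enumerate(transactions) if a in t)
                for a in items}
        norm = sqrt(sum(v * v for v in auth.values()))
        auth = {a: v / norm for a, v in auth.items()}
        hub = [sum(auth[a] for a in t) for t in transactions]
        norm = sqrt(sum(v * v for v in hub))
        hub = [v / norm for v in hub]
    return auth
```

Attributes that co-occur in many records earn higher authority, so they weigh more in the weighted support and confidence of the mined rules.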
Bambang Hermanto, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.15946

Abstract: In an effort to improve the quality of customer service, in particular the feasibility assessment of prospective debtors, given the increasing number of new applicants for motor vehicle purchase loans, the company needs a decision-making tool that can easily and quickly estimate whether a debtor will be able to pay off a loan. This study discusses the C4.5 decision tree algorithm, which generates a decision tree by learning from a dataset of motorcycle financing debtors. The decision tree is then interpreted as decision rules that can be understood and used as a reference when processing debtor data to determine the feasibility of new prospective debtors. The feasibility value refers to the target parameter, credit status: if the credit status is "paid off", the prospective debtor is estimated to be able to repay the loan, while if it is "repossessed", the prospective debtor is estimated to be unable to pay. System testing was done by comparing the test data against the learning data in three scenarios, with over 70% of the data valid in all scenarios. Moreover, generating the tree and the rules is fairly quick, taking no more than 15 minutes for each test scenario.
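The split criterion underlying the generated C4.5 tree can be sketched as follows; this is the textbook gain-ratio computation, and the attribute and class names are hypothetical, not the dataset's actual fields:

```python
from math import log2
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label multiset."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gain_ratio(rows, attr, label="class"):
    """C4.5 split criterion: information gain of splitting on `attr`,
    normalized by the split information (entropy of the partition sizes)."""
    n = len(rows)
    base = entropy([r[label] for r in rows])
    parts = {}
    for r in rows:
        parts.setdefault(r[attr], []).append(r[label])
    gain = base - sum(len(p) / n * entropy(p) for p in parts.values())
    split_info = -sum((len(p) / n) * log2(len(p) / n) for p in parts.values())
    return gain / split_info if split_info else 0.0
```

C4.5 picks the attribute with the highest gain ratio at each node, recursing until leaves are (nearly) pure.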
Dirja Nur Ilham, Sri Mulyana
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.16595

Abstract:Selecting the right On-the-Job Training (PKL) placement for students is very important, because it can maximize the abilities and talents of each student and thus produce graduates who are ready to compete in the world of work. The most common problems in PKL selection are the mismatch between student competences and company needs, and the students' own preferences regarding the PKL place. To overcome these problems, a computer system in the form of a group decision support system (GDSS) is required that can help South Aceh Polytechnic select the right companies for its students. In this study, a group decision support system is developed using AHP (Analytical Hierarchy Process) and Borda for group decision making. The AHP method is used to determine the weights of the criteria and sub-criteria of each alternative PKL company, and to rank the alternative companies for each student for each decision maker. The Borda method is used to merge the rankings obtained from each decision maker into a final ranking that determines the recommended PKL placement for each student. The output of the group decision support system is a ranking of the students' criteria values over the alternative PKL companies, and the company with the highest score serves as the recommended PKL placement for the Computer Engineering Department of South Aceh Polytechnic.
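The Borda aggregation step above can be sketched in a few lines. This is a generic Borda count, not the paper's code; the alternative names are hypothetical.

```python
def borda(rankings):
    """Merge several rankings (each ordered best-to-worst) with the Borda
    count: an alternative earns n-1 points for first place, n-2 for
    second, and so on; the final order is by total points."""
    n = len(rankings[0])
    scores = {}
    for ranking in rankings:
        for pos, alt in enumerate(ranking):
            scores[alt] = scores.get(alt, 0) + (n - 1 - pos)
    return sorted(scores, key=scores.get, reverse=True)

# Three decision makers rank three hypothetical PKL companies.
rankings = [["A", "B", "C"], ["B", "A", "C"], ["A", "C", "B"]]
order = borda(rankings)
```

In the system described, each input ranking would come from one decision maker's AHP result.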
Mugenzi Thierry, Tri Kuntoro Priyambodo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.17167

Abstract:E-Government basically comprises the use of electronic communications technologies, such as the internet, to enhance and advance citizens' access to public services. In most developing countries, including Burundi, citizens face many difficulties in accessing public services. One of the identified problems is the poor quality of service in managing citizens' complaints. This study proposes an SMS- and web-based e-Government model as a solution. A case study of a complaint management system at the District of Gihosha is used as a reference to show that an SMS- and web-based e-Government model can enhance access to public services. The objective of this study is the development of an SMS- and web-based system that can improve the processing and management of citizens' complaints at the District of Gihosha. The system has been developed using PHP as the front end, Apache as the web server, MySQL as the database, and Gammu as the SMS gateway. The results obtained after testing the system show that all the functionalities of the developed system work properly. Thus, the developed SMS- and web-based complaint management system is considered to be effective.
Ria Astriratma, Retantyo Wardoyo, Aina Musdholifah
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.17342

Abstract:Through the State Civil Apparatus Law, the Government attempts to reduce nepotism by creating an open competition system among civil servants in the process of filling positions. The Regional Civil Service Agency (BKD) of Tarakan already has a personnel database; a decision support system that combines this database with a scoring model to find the candidate profiles that fit vacant positions is needed to support more objective assessment. The application of the profile matching method in this decision support system is expected to help the selection process for structural officers in the Government of Tarakan comply with the abilities required by each position. From this research we conclude that changes in candidate profile values, and in the number of sub-criteria used to categorize the positions, can affect the closeness of candidates to a vacant position. In addition, when profile matching is used for cases where the highest value is the best value, the ideal value used must be the maximum value, in order to avoid scores that exceed the ideal.
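The core of profile matching is a gap computation between a candidate's scores and a position's ideal profile, with each gap mapped to a weight. The sketch below uses a gap-to-score table that is a common convention in profile-matching tutorials, not taken from the paper; the profiles are invented.

```python
# Hedged sketch of the profile-matching gap step. GAP_SCORE is an assumed
# conventional mapping: gap 0 is best, larger |gap| scores lower.
GAP_SCORE = {0: 5.0, 1: 4.5, -1: 4.0, 2: 3.5, -2: 3.0, 3: 2.5, -3: 2.0}

def profile_match(candidate, ideal):
    """Average gap score of a candidate against the ideal profile,
    one sub-criterion per position in the lists."""
    scores = [GAP_SCORE[c - i] for c, i in zip(candidate, ideal)]
    return sum(scores) / len(scores)

# Candidate scores vs. ideal (maximum) values for three sub-criteria.
score = profile_match([4, 5, 3], [5, 5, 4])
```

Using the maximum as the ideal, as the abstract recommends, keeps every gap at or below zero, so no candidate can "exceed" the ideal.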
Fuzy Yustika Manik, Kana Saputra Saragih
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.17838

Abstract:A post-harvest issue for star fruit produced on a large or industrial scale is sorting. Currently, star fruit is classified by analyzing rind color visually with the human eye. This method is neither effective nor efficient. This research aims to classify the sweetness level of star fruit by using image processing techniques. The feature extraction uses the Red, Green, and Blue (RGB) values to obtain the color characteristics of the image. The feature extraction results are then used to classify the star fruit with the Naïve Bayes method. The star fruit image data comprise 120 images, consisting of 90 training images and 30 testing images. The results show a classification accuracy of 80% using RGB feature extraction. RGB color feature extraction alone therefore cannot be relied on entirely as the image feature for star fruit.
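The RGB-plus-Naïve-Bayes pipeline can be sketched with a Gaussian Naïve Bayes classifier over per-image mean RGB vectors. This is a generic illustration, not the paper's model; the training values and class names are invented.

```python
import math

def fit(samples):
    """samples: {class: [(r, g, b), ...]}. Returns per-class, per-channel
    (mean, variance) statistics for Gaussian Naive Bayes."""
    model = {}
    for cls, vecs in samples.items():
        stats = []
        for dim in zip(*vecs):
            mu = sum(dim) / len(dim)
            var = sum((x - mu) ** 2 for x in dim) / len(dim) or 1e-6
            stats.append((mu, var))
        model[cls] = stats
    return model

def predict(model, vec):
    """Pick the class with the highest Gaussian log-likelihood (uniform prior)."""
    def loglik(stats):
        return sum(-0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
                   for x, (mu, var) in zip(vec, stats))
    return max(model, key=lambda c: loglik(model[c]))

# Invented mean-RGB training vectors for two hypothetical sweetness classes.
model = fit({"sweet": [(200, 180, 60), (210, 190, 70)],
             "sour":  [(90, 160, 80), (100, 170, 90)]})
label = predict(model, (205, 185, 65))
```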
Adri Priadana, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 11; doi:10.22146/ijccs.17526

Abstract:There is still a lot of juvenile delinquency in the community, especially in urban areas, in the modern era. Juvenile delinquency may take the form of fights, illegal street racing, gambling, and graffiti on walls without permission. Wall vandalism is usually done on the walls of office buildings and on public or private property. The result of a vandalized wall can be seen from the change between the initial image and the image after motion has occurred. This study develops an image change detection system for video to detect acts of graffiti on walls via a Closed-Circuit Television (CCTV) camera, simulated using a webcam. Motion is detected with the Accumulative Difference Images (ADI) method, and image change is detected with the Illumination Invariant Change Detection method, coupled with image cropping, which compares a reference image (the image before any movement) with the image after movement has occurred. The detection system was tested with different time-of-day variations, i.e., in the morning, at noon, in the afternoon, and in the evening. The proposed method for image change detection in video gives results with an accuracy rate of 92.86%.
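The ADI step can be sketched as follows: each pixel whose difference from the reference frame exceeds a threshold accumulates a count over successive frames, so persistent motion regions build up large values. This is a textbook illustration, not the paper's implementation; frames are nested lists standing in for grayscale images.

```python
def adi(reference, frames, threshold=10):
    """Accumulative Difference Image: count, per pixel, how many frames
    differ from the reference by more than the threshold."""
    h, w = len(reference), len(reference[0])
    acc = [[0] * w for _ in range(h)]
    for frame in frames:
        for y in range(h):
            for x in range(w):
                if abs(frame[y][x] - reference[y][x]) > threshold:
                    acc[y][x] += 1
    return acc

# Toy 2x2 frames: only the top-right pixel changes across both frames.
reference = [[0, 0], [0, 0]]
frames = [[[0, 50], [0, 0]], [[0, 60], [0, 0]]]
motion = adi(reference, frames)
```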
Fadhila Tangguh Admojo, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.11186

Abstract:Mountain climbing path information is widely available on the internet. However, getting information that suits a climber's needs takes time to browse and compare all the available information, and the diversity of the search-result content can actually confuse climbers. This research aims to provide a solution to this problem by developing an information retrieval system for mountain climbing paths using a semantic technology (ontology) based approach. The system is developed using two knowledge bases (ontologies): the Bahasa ontology, representing linguistic knowledge, and the Mountaineering ontology, representing mountaineering knowledge. The system is designed to process and understand input in natural language form. The understanding of natural language is based on syntactic and semantic analysis using the rules of Indonesian grammar. The results of the research show that the system is able to understand natural language input and is capable of detecting input that does not comply with the rules of Indonesian grammar, both syntactically and semantically. The system is also able to use a thesaurus of words in the search process. Quantitative test results show that the system is able to understand 69% of inputs taken at random from the respondents.
Yuning Widiarti Darsono, Adianto Adianto, Mirna Apriani
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15491

Abstract:The need for effective and efficient monitoring, control, and evaluation of water quality in the regional waters of Surabaya has become a demand, driven by population growth, climate change and variability, and the current era of urbanization. The traditional method, collecting water samples and then testing and analyzing the water in a laboratory, is relatively expensive and lacks the ability to capture real-time data and deliver analysis and information quickly for decision making. On the other hand, the rapid spread of mobile phones in developing countries has increased the use of mobile data management applications, and the variety of mobile applications has also grown in recent years, because mobile phones are cheap, easy to use, and able to transmit multiple types of information, including images and GPS data, remotely. In this paper, the authors describe a data communication system for water quality resources based on the UDP protocol. This system, called ubiquitous mobile sensing, consists of an Arduino microcontroller, water quality sensors, and Android smartphones. It has the ability to detect temperature, dissolved oxygen (DO), pH, and electrical conductivity (EC) in real time. With this monitoring system, the resulting data are expected to be more accurate, faster, and cheaper to obtain.
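The UDP transmission described above can be sketched as a sensor node sending one JSON datagram of readings to a collector. The field names, payload format, and loopback demo are assumptions for illustration, not the paper's protocol.

```python
import json
import socket

def send_reading(sock, addr, temperature, do, ph, ec):
    """Encode one set of water-quality readings as JSON and send it as a
    single UDP datagram (field names are assumed, not from the paper)."""
    payload = json.dumps({"temp": temperature, "do": do, "ph": ph, "ec": ec})
    sock.sendto(payload.encode("utf-8"), addr)

# Loopback demo: a receiver socket plays the role of the collector.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))            # let the OS pick a free port
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_reading(send, recv.getsockname(), 27.5, 6.8, 7.1, 450)
data, _ = recv.recvfrom(4096)
reading = json.loads(data)
```

UDP's fire-and-forget datagrams fit this use case: an occasional lost reading matters less than keeping the sensor loop lightweight.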
Kabul Kurniawan, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15525

Abstract:Implementing web services with the point-to-point method becomes impractical when the number of services grows rapidly and the services become more complex. On the other hand, differences among web service implementation standards are also a serious problem when integration and communication among web services are needed. One way to provide integration among management information systems is the implementation of an Enterprise Service Bus (ESB). An ESB is an infrastructure that can solve the complexity problem of n-to-n integration, and the ESB concept strongly supports the implementation of Service Oriented Architecture (SOA). This research discusses the implementation of an ESB in the government domain, specifically in Sleman, a sub-district of the Daerah Istimewa Yogyakarta local government, Indonesia. ESB functionality such as routing and transformation is applied to solve the Government-to-Public (G2P) integration problem in the license-checking process at Badan Penanaman Modal dan Izin Terpadu (BMPT), Sleman Sub-District. The results show that the ESB can be used as the mediator of information integration in the license-checking process. In addition, the performance testing results show that the average execution time (AET) for service operations reaches 0.085 seconds; the system handles an average of about 11.99 executions and produces about 15,746.86 bytes (equivalent to 15.38 KB) per second. These results indicate that the execution time is small enough that the services will not interfere with the transaction process.
Mulyanto Mulyanto, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15528

Abstract:As an important piece of IT infrastructure, a website is a system that requires high reliability and availability. A website meets the criteria of a highly available system because it must provide services to clients in real time, handle large amounts of data, and not lose data during transactions. A highly available system must be able to run continuously while guaranteeing consistency of data requests. This study designed a website with high availability. The approach was to build a network cluster with failover and replicated block device functions. Failover was built to provide service availability, while the replicated block device provides data consistency during a service failure. With the failover cluster and replicated block device approaches, the cluster is able to handle service failures of the web server and database server on the website. The result of this study was that the website's services could keep running well when any member node of the cluster failed. The system was able to provide 99.999% ("five nines") availability for database server services and 99.98% ("three nines") for web server services.
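The "nines" figures above follow from the standard availability formula, uptime divided by total time. The sketch below also converts an availability percentage into the downtime it permits per year; the uptime numbers are illustrative.

```python
def availability(uptime_s, downtime_s):
    """Availability as a percentage of total observed time."""
    return 100 * uptime_s / (uptime_s + downtime_s)

def downtime_per_year(avail_pct):
    """Seconds of downtime a given availability percentage permits
    in one (non-leap) year."""
    return (1 - avail_pct / 100) * 365 * 24 * 3600

# Illustrative: 99,999 s up, 1 s down over the observation window.
a = availability(99999, 1)
dpy = downtime_per_year(99.999)   # "five nines" allows ~5 minutes/year
```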
Nola Ritha, Retantyo Wardoyo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15532

Abstract:Rainfall prediction can be used for various purposes, and accuracy in predicting is important in many ways. This research predicts rainfall using daily rainfall data from 2013-2014 at the rainfall station in Putussibau, West Kalimantan, with four parameters: mean temperature, average humidity, wind speed, and mean sea level pressure. The research determines the performance of a Neural Fuzzy Inference System trained with the Levenberg-Marquardt algorithm (NFIS-LM) for rainfall prediction. Fuzzy logic is used to handle the linguistic variables in the rainfall rules, while the neural network contributes the ability to adapt and learn, since recognizing patterns in the input data requires training before prediction; the Levenberg-Marquardt algorithm is used for training because of its effectiveness and convergence acceleration. The results show that, among the five NFIS-LM models developed with various membership functions as input, the model with twelve membership functions and four inputs (mean temperature, average humidity, wind speed, and mean sea level pressure) gives the best rainfall prediction, with a Mean Square Error (MSE) of 0.0262050. When compared with an NN-Backpropagation model, however, the NFIS-LM model shows lower accuracy: the NN-Backpropagation model generates an MSE of 0.0167990.
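The MSE figures used to compare the two models above come from the standard formula, sketched here with invented prediction values.

```python
def mse(actual, predicted):
    """Mean Square Error: average squared difference between observed
    and predicted values (lower is better)."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

# Illustrative rainfall values (not the paper's data).
err = mse([1.0, 2.0, 3.0], [1.0, 2.5, 2.0])
```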
Medi Taruk, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15529

Abstract:Transmission Control Protocol (TCP) is a protocol that works at the transport layer of the OSI model. TCP was originally designed mainly for wired networks; however, to meet the needs of very fast network technology driven by user demand, further development is needed for the use of TCP on wireless devices. One wireless network implementation is based on Worldwide Interoperability for Microwave Access (WiMAX), a network model that offers various advantages, particularly in terms of access speed. In this study, NS-2 is used to observe the throughput of the TCP variants tested, namely TCP-Tahoe, TCP-Reno, TCP-Vegas, and TCP-SACK, over a WiMAX network model, under several observation scenarios. The first scenario observes the throughput of each TCP variant when only that variant is active in the network. The second observes the throughput of all TCP variants running at the same time with equivalent QoS, but with the possibility of minor congestion because the link capacity is only just sufficient. The third observes throughput under multiple congestion. The WiMAX network has the scheduling services UGS, rtPS, and ertPS using the UDP protocol, and nrtPS and BE using the TCP protocol. Using the network simulator (NS-2), a performance comparison of TCP-based services on the WiMAX network is obtained with the QoS parameters throughput, packet loss, fairness, and time delay.
Made Leo Radhitya, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.15949

Abstract:One of the dangers that occur at the beach is the rip current, a significant danger for beachgoers. This research serves geographical information about rip current occurrence risk by using a decision tree built with the C4.5 algorithm; the output of the decision tree is the rip current occurrence risk. The case study in this research is the beaches located on Rote Island, Rote Ndao, Nusa Tenggara Timur. The system evaluation results show an average accuracy of 0.84, an average precision of 0.61, an average recall of 0.68, and an average F-measure of 0.59, on a scale from 0 to 1.
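The four evaluation figures above are standard confusion-matrix metrics. The sketch below computes them from invented counts for one class, to make the definitions concrete.

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, precision, recall, f1

# Invented counts: 8 true positives, 2 false positives, 2 false
# negatives, 8 true negatives.
acc, prec, rec, f1 = metrics(8, 2, 2, 8)
```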
Kharis Syaban, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.16628

Abstract:Compared with other classification methods, such as cellular and molecular biological methods, using images of leaves has become the first choice in plant classification. Leaves can be characterized by shape, color, and texture; leaf color can vary depending on the season and geographical location, and the same plant species can also have different leaf shapes. In this study, the morphological features of leaves are used to identify varieties of pepper plants. The methods used for feature extraction are moment invariants and basic geometric features. For recognition based on the extracted features, a neural network with the backpropagation learning algorithm is used. From the neural network training, the best accuracy in classifying pepper varieties, with a minimum error of 0.001, is obtained with a learning rate of 0.1, a momentum of 0.7, and 15 neurons in the hidden layer for each feature variant. Cross-validation testing with the k-fold technique, with k=4, yields a classification accuracy in the range of 80.75%±0.09%.
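One example of the moment-invariant features mentioned above is Hu's first invariant, eta20 + eta02, which is unchanged by translation and scale. The sketch below computes it from a binary shape mask given as a nested list; the toy mask is invented and is not a real leaf image.

```python
def hu1(mask):
    """Hu's first moment invariant of a binary mask: the sum of the
    normalized second-order central moments eta20 + eta02."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    m00 = len(pts)                      # zeroth moment = area
    xbar = sum(x for x, _ in pts) / m00
    ybar = sum(y for _, y in pts) / m00
    mu20 = sum((x - xbar) ** 2 for x, _ in pts)
    mu02 = sum((y - ybar) ** 2 for _, y in pts)
    norm = m00 ** 2                     # mu_pq / m00^(1 + (p+q)/2), p+q = 2
    return (mu20 + mu02) / norm

# A 2x2 square, and the same square shifted by one pixel: the
# invariant should be identical for both.
square = [[1, 1], [1, 1]]
shifted = [[0, 0, 0], [0, 1, 1], [0, 1, 1]]
h1, h2 = hu1(square), hu1(shifted)
```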
Devid Haryalesmana Wahid, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.16625

Abstract:The public's enthusiastic attention to celebrities' official Twitter accounts has given rise to a trend of using Twitter for impression management. Mining the public's reactions on social media is a strategic effort to obtain feedback, but it is not easy to do: users need a long time to read thousands of tweets and sort them by sentiment, so automatic extractive sentiment summarization is needed. Previous research generally did not include the sentiment information contained in a tweet as a weight in sentence ranking, so the resulting summaries still consisted of the general topics the public was discussing. This research combines the SentiStrength, Hybrid TF-IDF, and Cosine Similarity methods to automatically extract summaries of the public's positive and negative sentiment toward celebrity topics on Twitter, with the artist Agnes Monica as a case study. SentiStrength is used to obtain sentiment strength scores and to classify tweets into positive, negative, and neutral classes. Tweets with positive and negative sentiment are summarized by ranking them with Hybrid TF-IDF combined with the sentiment strength score, and then removing similar tweets using Cosine Similarity. Test results show that the combination of SentiStrength, Hybrid TF-IDF, and Cosine Similarity produces sentiment summaries with better accuracy than Hybrid TF-IDF alone, achieving an average accuracy of 60% and an F-measure of 62%. This is due to the addition of sentiment strength as a summarization weight.
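The Cosine Similarity redundancy-removal step can be sketched as follows: walk the ranked tweets in order and keep a tweet only if it is not too similar to one already kept. The threshold and whitespace tokenization are simplifications, not the paper's settings.

```python
import math

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors (dicts)."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dedupe(ranked_tweets, threshold=0.7):
    """Keep tweets in rank order, dropping any whose similarity to an
    already-kept tweet reaches the threshold."""
    kept, vecs = [], []
    for tweet in ranked_tweets:
        vec = {}
        for tok in tweet.lower().split():
            vec[tok] = vec.get(tok, 0) + 1
        if all(cosine(vec, v) < threshold for v in vecs):
            kept.append(tweet)
            vecs.append(vec)
    return kept

# Invented ranked tweets: the second is a near-duplicate of the first.
tweets = ["great concert last night",
          "great concert last night really",
          "terrible traffic today"]
kept = dedupe(tweets)
```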
Azizah Fatmawati, Azhari Sn, Nisa Rna
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.17521

Abstract:The Sacks Sentence Completion Test (SSCT) is a projective test for revealing personality dynamics that can show a person's character in interpersonal relationships and in interpreting the environment. Normally, this kind of test is conducted by psychologists, who have the important and complicated task of interpreting respondents' test answers. However, with the advancing development of intelligent agent based systems, a task that was previously complicated for psychologists is becoming easier: they can delegate to intelligent agent software the interpretation of respondents' answers in order to decide the test result. In this research we developed intelligent agent software using the Java Agent Development Framework (JADE) with a BDI agent architecture and the Prometheus method. The application development focuses on how to use a summarization method to generate an answer model from the SSCT, so that psychologists can obtain interpretation results immediately. Results show that the average percentage of answers successfully identified by the model reaches 59.00%, while its average accuracy is 95.13%. Moreover, by using four agents that communicate with each other, psychologists can obtain the test result.
Doni Setyawan, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.17485

Rachmat Wahid Saleh Insani, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 93-102; doi:10.22146/ijccs.11192

Abstract:Information and Communication Technology (ICT) systems are a vital part of society. These systems keep growing into large and complex systems and are massively encroaching on daily life via the Internet and all kinds of embedded systems; communication protocols are one example of ICT systems, used by the entire community of Internet users. The OLSR protocol is a wireless network communication protocol that is proactive, table-driven, and based on the link-state algorithm. The EE-OLSR protocol is a variant of OLSR that is able to prolong network lifetime without loss of performance. Protocol verification is generally done by simulation and testing; however, these processes are unable to verify that a protocol contains no subtle errors or design flaws. Model checking is an algorithmic method, run fully automatically, to verify a system. UPPAAL is a model checker tool for modeling, simulating, and verifying a system modeled as Timed Automata, and UPPAAL CORA is a model checker tool for verifying a protocol modeled in Linearly Priced Timed Automata, checking whether the protocol satisfies the energy-efficiency property formulated in a formal specification language with Weighted Computation Tree Logic syntax. Applying the model checking technique to the protocol yields proof that the EE-OLSR protocol satisfies the energy-efficiency property only when packet transmission traffic occurs. Keywords— modeling, verification, EE-OLSR, UPPAAL CORA.
I Nym Saputra Wahyu Wijaya, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 81-92; doi:10.22146/ijccs.11191

Abstract:In order to decrease handoff latency and increase the success rate of the conventional HHO scheme, a handover scheme for the WiMAX IEEE 802.16e standard protocol has been developed by adding a mobility pattern. The advantage of the handover scheme with a mobility pattern is that it can reduce handoff latency by up to 50%; its weakness is that errors in determining the target base station (TBS) often occur, and simulation cannot show the cause of these errors, so formal verification is performed on the hard handover model with mobility pattern. In this research, the system behaviour is modeled with a continuous-time Markov chain (CTMC), focused on approximating the influence of the mobility pattern on the handoff latency of the WiMAX hard handover mechanism. The Markov chain model of the handover system is built in the following steps: represent the state space, number all transitions, and generate the transition rate matrix (infinitesimal generator). The model is then expressed in the PRISM tool for formal verification. The probabilistic model checking in this research uses both quantitative and qualitative properties. Formal verification of the properties related to handover in the WiMAX network shows that 70% of mobile stations that scan with the mobility pattern successfully perform handover; 24% of them fall back to conventional scanning as a result of errors in determining the target base station, so the resulting handoff latency is larger than in a system that uses only the conventional scanning method. Keywords— WiMAX, handover, mobility pattern, CTMC, PRISM, handoff latency
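The generator-matrix construction step named above can be sketched generically: each transition rate goes off-diagonal, and each diagonal entry is minus the state's total exit rate, so every row sums to zero. The states and rates below are toy values, not the handover model itself.

```python
def generator(n, transitions):
    """Build a CTMC infinitesimal generator Q from (src, dst, rate)
    triples: Q[i][j] is the rate i -> j, and Q[i][i] = -sum of row i's
    exit rates, so each row sums to zero."""
    Q = [[0.0] * n for _ in range(n)]
    for src, dst, rate in transitions:
        Q[src][dst] += rate
        Q[src][src] -= rate
    return Q

# Toy 3-state chain with invented rates.
Q = generator(3, [(0, 1, 2.0), (1, 2, 1.5), (2, 0, 0.5)])
```

Tools such as PRISM build essentially this matrix from the model description before checking properties against it.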
Ahmad Ashari, Alimuddin Alimuddin
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 11-22; doi:10.22146/ijccs.11185

Abstract:The application of a web-based academic information system (siakad) in a college is essential to improve academic services. Such a deployment faces many obstacles, especially in handling a high number of accesses, which causes overload; moreover, a hardware or software failure makes the siakad inaccessible. The solution to this problem is the use of multiple servers with the load distributed across them. A method is needed to distribute the load evenly across the servers, namely load balancing with the round-robin algorithm, so that the siakad achieves high scalability; to handle server failure, fault tolerance is needed so that siakad availability is high. This research builds load balancing and fault tolerance using the Linux Virtual Server software and additional programs such as ipvsadm and heartbeat, which can increase the scalability and availability of the siakad. The results show that load balancing can reduce response time by up to 5.7%, increase throughput by 37% (1.6 times), maximize resource utilization (a 1.6-fold increase), and avoid overload. High availability is obtained from the servers' ability to fail over, i.e., to move to another server in the event of a failure. Thus, implementing load balancing and fault tolerance can improve siakad service performance and avoid failures. Keywords—load balancing, fault tolerance, overload, academic information systems
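The round-robin dispatch used by the load balancer can be sketched in a few lines: requests are handed to the real servers in strict rotation. This is an illustration of the algorithm only, not of ipvsadm; the server names are placeholders.

```python
import itertools

def round_robin(servers):
    """Return a picker that yields servers in strict rotation."""
    pool = itertools.cycle(servers)
    return lambda: next(pool)

# Six consecutive requests across three placeholder real servers.
pick = round_robin(["web1", "web2", "web3"])
order = [pick() for _ in range(6)]
```

Round robin assumes roughly equal server capacity; weighted variants exist for heterogeneous clusters.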
Teguh Susyanto, Khabib Mustofa
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 103-114; doi:10.22146/ijccs.12734

Abstract:Current job searching services still have many weaknesses and often fail to provide relevant job information that matches the needs of job seekers. This is because the search method applied by search engines still relies on syntax-based matching, and there is no integration among job information providers, making it difficult for job seekers to obtain the desired information. To overcome these weaknesses, a prototype of a job vacancy search application involving web services as job information providers is proposed. This study aims to design a job vacancy search application based on the job seeker's personalization by combining multi-agent and semantic web service approaches. The prototype is built on multi-agent technology capable of selecting and invoking job-provider web services and automatically matching job vacancies against the job seeker's profile. Web service selection combines a service matching algorithm with Simple Additive Weighting, while a semantic matching algorithm computes the similarity between a job vacancy and the job seeker's profile. Testing with respondents indicates that the prototype is able to recommend job vacancies that fit the job seeker's profile. Keywords— job offer, semantic web service, multi agent, semantic matching
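The service-selection step combines a semantic matching score with Simple Additive Weighting (SAW). SAW itself is standard: normalize each benefit criterion by its column maximum, then rank alternatives by the weighted sum. A minimal sketch; the candidate services, criteria, and weights below are illustrative, not taken from the paper:

```python
def saw_rank(alternatives, weights):
    """Rank alternatives by Simple Additive Weighting.

    alternatives: dict name -> list of benefit-criterion scores.
    weights: criterion weights summing to 1.
    """
    cols = list(zip(*alternatives.values()))
    maxima = [max(col) for col in cols]  # benefit criteria: normalize by column max
    scores = {
        name: sum(w * (v / m) for w, v, m in zip(weights, vals, maxima))
        for name, vals in alternatives.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# hypothetical web-service candidates scored on (similarity, availability, speed)
services = {"svcA": [0.9, 0.8, 0.7], "svcB": [0.6, 0.9, 0.9]}
ranking = saw_rank(services, [0.5, 0.3, 0.2])
```

The highest-scoring service is the one the agent would invoke; here the similarity criterion dominates because it carries half the weight.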
Aina Musdholifah, Mujiono Mujiono
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 1-10; doi:10.22146/ijccs.11184

Abstract:The basis of bureaucratic reform is the reform of human resources management, and one of its supporting factors is the development of an employee database. Supporting human resources management requires, among other things, a data warehouse and business intelligence tools. A data warehouse is a reliable, integrated data storage concept that supports all data analysis needs. In this study, a data warehouse is developed using a data-driven approach with source data from SIMPEG, SAPK, and electronic attendance records. The warehouse is designed using the nine-step methodology and documented in Unified Modeling Language (UML) notation. Extract-transform-load (ETL) is performed with Pentaho Data Integration by applying transformation maps. To further support human resources management, a web-based online analytical processing (OLAP) system is built to ease information retrieval. The study produces a BI application development framework with a Model-View-Controller (MVC) architecture, in which OLAP operations are built on a dynamic query generator, PivotTable, and HighChart to present information about PNS, CPNS, retirement, Kenpa, and attendance. Keywords— data warehouse, human resources management, Model-View-Controller (MVC), dynamic query
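The OLAP layer described above rests on a dynamic query generator: the user picks dimensions and a measure, and the system assembles a GROUP BY roll-up over the fact table. A minimal sketch with an in-memory SQLite table standing in for a warehouse fact table; the table and column names are hypothetical, and real code must whitelist the identifiers before interpolating them:

```python
import sqlite3

def rollup_query(fact_table, dimensions, measure):
    """Generate a GROUP BY roll-up from user-chosen dimensions and a measure.

    Identifiers are interpolated directly for brevity; production code must
    validate them against a whitelist to avoid SQL injection.
    """
    dims = ", ".join(dimensions)
    return f"SELECT {dims}, SUM({measure}) FROM {fact_table} GROUP BY {dims}"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attendance (unit TEXT, year INT, hours REAL)")
conn.executemany("INSERT INTO attendance VALUES (?, ?, ?)",
                 [("HR", 2015, 160.0), ("HR", 2015, 150.0), ("IT", 2015, 170.0)])
sql = rollup_query("attendance", ["unit", "year"], "hours")
rows = conn.execute(sql).fetchall()  # aggregated hours per (unit, year)
```

The same generator serves any dimension combination the PivotTable requests, which is what makes the front end independent of the cube's shape.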
Annisa Mauliani, Sri Hartati, Aina Musdholifah
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 71-80; doi:10.22146/ijccs.11190

Abstract:This study adds a time aspect, namely the transaction date, to association rule mining; the resulting form is known as temporal association rule mining. The data used are sales transactions from the Batik Diyan store in Solo, and the Apriori algorithm is used to form the temporal association rules. The results show that the Apriori algorithm with temporal association rules can be used to mine additional information: because each rule is annotated with the events during which it holds, the store's management can use the results to estimate stock increases for particular events. With parameter values minsup 10%, mintempsup 5, and minconf 50%, the Idul Fitri 2013 and Idul Fitri 2014 periods produce different temporal association rules: the 2013 period yields no rules, while the 2014 period yields rules with a largest support value of 14%, namely {BLBP} → {HPCK}. Keywords— sales, temporal association rules, Apriori, data mining
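The temporal support count (the mintempsup threshold above) restricts ordinary Apriori support counting to transactions whose date falls inside an event window. A minimal sketch of that counting step; the baskets and the event window below are illustrative, not the store's data:

```python
from datetime import date

def temporal_support(transactions, itemset, start, end):
    """Count transactions dated within [start, end] containing every item."""
    items = set(itemset)
    return sum(1 for d, basket in transactions
               if start <= d <= end and items <= set(basket))

sales = [
    (date(2014, 7, 20), ["BLBP", "HPCK"]),
    (date(2014, 7, 25), ["BLBP", "HPCK"]),
    (date(2014, 7, 26), ["BLBP"]),
    (date(2013, 8, 1), ["HPCK"]),
]
# support of {BLBP, HPCK} inside a hypothetical Idul Fitri 2014 window
sup = temporal_support(sales, {"BLBP", "HPCK"}, date(2014, 7, 15), date(2014, 7, 31))
```

Apriori then proceeds as usual over only the windowed transactions, so the same itemset can pass the threshold in one event and fail in another, exactly as the 2013 vs. 2014 result shows.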
Dian Hafidh Zulfikar, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10, pp 35-46; doi:10.22146/ijccs.11187

Abstract:In DCT-domain steganography, message embedding is generally performed on quantized DCT coefficients with values other than 0; this relates to the distribution of pixel diversity in the image. Applying point operation image enhancement (POIE) to the cover image, in the form of histogram equalization, contrast stretching, brightening, and gamma correction, is closely tied to the image histogram. The test parameters used are the number of message bits that can be accommodated, the PSNR and MSE values, and the quantized DCT coefficient values. The tests lead to several conclusions: the message capacity of sequential DCT steganography is larger than that of F5 DCT steganography, both before and after applying POIE; the stego image quality of F5 DCT steganography is better than that of sequential DCT steganography, both before and after applying POIE; and neither F5 nor sequential DCT steganography is robust against manipulation of the stego image. Keywords— steganography, DCT domain, cover image, message capacity, image quality, point operation image enhancement, sequential DCT steganography, F5 DCT steganography
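Sequential DCT embedding, the higher-capacity method compared above, writes message bits into the least-significant bits of nonzero quantized coefficients, skipping zeros so the sparse coefficient structure is preserved. A minimal sketch on a flat coefficient list (the real method walks 8x8 blocks; the values here are illustrative):

```python
def embed_bits(coeffs, bits):
    """Embed message bits in the LSBs of nonzero quantized DCT coefficients."""
    out, it = [], iter(bits)
    for c in coeffs:
        if c == 0:
            out.append(c)            # zeros carry no payload
            continue
        b = next(it, None)
        if b is None:
            out.append(c)            # message exhausted; copy through
        elif c > 0:
            out.append((c & ~1) | b)
        else:
            out.append(-((-c & ~1) | b))  # embed in magnitude, keep sign
    return out

stego = embed_bits([4, 0, -3, 7, 0, 2], [1, 0, 1, 1])
```

Note that embedding a 0 bit into a coefficient of ±1 would zero it out; this "shrinkage" is the problem the F5 variant is designed to avoid, at the cost of the lower capacity reported above.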
I Gede Winaya, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.11188

Abstract:MongoDB is a database that uses a document-oriented data storage model. In practice, migrating from a relational database to a NoSQL database such as MongoDB is not an easy matter, especially when the data are extremely complex. Based on migration documentation published by several global companies, the process of migrating from an RDBMS to MongoDB can take quite a long time, and one of its most time-consuming steps is transforming the relational database schema into a document-oriented data model in MongoDB. This research develops a system that transforms a relational database schema into a document-oriented data model in MongoDB. The transformation uses the structure of, and the relationships between, the tables in the schema as the main parameters of the modeling algorithm. During document modeling, the models must be adjusted to MongoDB's document specification so that they can be implemented in MongoDB. The resulting document models can be single documents, embedded documents, referenced documents, or combinations of these, depending on the type, rules, and cardinality of the relationships between tables in the relational database schema.
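The embedding case of such a transformation can be sketched as a simple mapping rule: an exclusive 1:N relationship folds the child rows into an array inside the parent document. A minimal sketch, assuming rows are already loaded as dicts; the field and collection names are hypothetical, not the paper's:

```python
def embed_one_to_many(parents, children, fk, child_field):
    """Embed each child row into its parent document (exclusive 1:N case).

    parents/children: lists of row dicts; fk: the child's foreign-key column.
    Shared children would instead be kept as referenced documents.
    """
    docs = {p["_id"]: dict(p, **{child_field: []}) for p in parents}
    for row in children:
        child = {k: v for k, v in row.items() if k != fk}  # drop the FK column
        docs[row[fk]][child_field].append(child)
    return list(docs.values())

orders = [{"_id": 1, "customer": "Ani"}]
items = [{"order_id": 1, "sku": "A1"}, {"order_id": 1, "sku": "B2"}]
documents = embed_one_to_many(orders, items, "order_id", "items")
```

Each resulting dict is ready to insert into a MongoDB collection; the referenced-document case would keep the child rows in their own collection and store only their ids in the parent.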
Murhaban Murhaban, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 10; doi:10.22146/ijccs.11189

Abstract:Handover methods are used to keep a connection stable: channel traffic is transferred automatically so that a mobile station (MS) can keep communicating without the connection being cut off. The main success factor in handover is quality of service, which provides differentiated service levels for arranging and prioritizing traffic in the network, for example for voice over IP (VoIP), i.e., voice communication over an internet network. This research analyzes the quality of service achieved in an IEEE 802.16e WiMAX network using the hard handover and soft handover methods with a VoIP application on a mobile station. The tests of both methods yield jitter of 0.001 ms to 0.31 ms and delay of 10.5 ms to 39 ms, showing that the effect of jitter and delay on handover with a VoIP application stays within the permitted tolerance. In contrast, the resulting throughput of 85 bit/s to 550 bit/s is very low, indicating that throughput is not sensitive to handover with a VoIP application.
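Jitter figures like those above are commonly produced with the RFC 3550 smoothed estimator, J += (|D| − J)/16, where D is the change in one-way transit time between consecutive packets. A minimal sketch of that computation; the timestamps are illustrative, not measurements from the paper:

```python
def rfc3550_jitter(send_times, recv_times):
    """Smoothed interarrival jitter estimate per RFC 3550 (seconds in, seconds out)."""
    j = 0.0
    transits = [r - s for s, r in zip(send_times, recv_times)]
    for prev, cur in zip(transits, transits[1:]):
        j += (abs(cur - prev) - j) / 16.0  # exponential smoothing, gain 1/16
    return j

# three packets sent 20 ms apart with slightly varying one-way delay
jitter = rfc3550_jitter([0.0, 0.02, 0.04], [0.010, 0.032, 0.051])
```

The 1/16 gain damps single outliers, which is why a few delayed packets during a handover barely move the reported jitter.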
Agus Halid, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 133-144; doi:10.22146/ijccs.7542

Abstract:The Stream Control Transmission Protocol (SCTP) is a protocol similar to the Transmission Control Protocol (TCP) and the User Datagram Protocol (UDP). SCTP is reliable and connectionless, and supports multistreaming and multihoming in data transmission. This research models SCTP in the OPNET simulator, which can accelerate the work of researchers in the networking field. SCTP is built in the simulator by modifying TCP. The modeling starts by building network scenarios and determining the bandwidth of the paths the data packets will traverse. The window size in the congestion control is modified from 1 MMS up to 10 MMS to observe the effect of window size on packet loss, delay, and throughput. The measurements show that the highest throughput occurs in the second scenario, as shown in Table 6.4, with a throughput of 433,566.0244 bit/s. Using the window size in congestion control is intended to prevent flooding the endpoint with data, which can lead to packet loss. Keywords— congestion control, throughput, delay, packet loss, window size, multihoming, SCTP
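The window sizes being varied feed SCTP's TCP-like congestion control (RFC 4960): below ssthresh the congestion window grows by up to one MTU per acknowledgement (slow start), above it by roughly one MTU per round trip. A minimal sketch of that growth with illustrative numbers; the exact accounting in the paper's OPNET model may differ:

```python
def cwnd_after_acks(initial_mtu_multiple, mtu, ssthresh, acks):
    """Track RFC 4960-style cwnd growth over a sequence of acked byte counts."""
    cwnd = initial_mtu_multiple * mtu
    acked_since_increase = 0
    for bytes_acked in acks:
        if cwnd <= ssthresh:
            cwnd += min(bytes_acked, mtu)        # slow start: exponential growth
        else:
            acked_since_increase += bytes_acked  # congestion avoidance
            if acked_since_increase >= cwnd:     # ~one full window acked (one RTT)
                cwnd += mtu
                acked_since_increase = 0
    return cwnd

# an initial window of 2 MTU, in the spirit of the paper's modified scenarios
final = cwnd_after_acks(2, 1500, 6000, [1500] * 6)
```

A larger initial window reaches the bottleneck rate sooner, which is the mechanism behind the throughput differences across the paper's scenarios.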
Yoga Dwitya Pramudita, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9; doi:10.22146/ijccs.7546

Abstract:Instant messaging services provide a variety of communication features, such as online and offline text messaging. One standard protocol supporting this service is XMPP (Extensible Messaging and Presence Protocol). XMPP communication streams consist of XML document fragments, making them vulnerable to passive attacks that monitor the content of communication packets. Encrypted communication is one solution to this weakness; this research offers another: using a covert channel to send hidden messages. This research builds a browser-based XMPP client application that performs XMPP communication and also provides a covert channel. To run XMPP communication inside a browser-based application, the WebSocket protocol is used. The WebSocket header, specifically the masking-key field, is exploited to carry the covert channel messages sent while an XMPP communication session takes place. Test results show that the client application can produce covert channel communication with a data width of 3 bytes per packet. The client is also able to maintain covert channel communication over links with a packet loss probability below 10%. Keywords— WebSocket, XMPP, masking-key, covert channel, browser-based client application
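The masking-key exploit relies on RFC 6455: a client must XOR every frame payload with a 4-byte masking key of its own choosing, so an arbitrary key can smuggle data without altering the visible payload. A minimal sketch of a 3-bytes-per-packet channel; the layout (3 covert bytes plus one random filler byte) is an assumption, not necessarily the paper's:

```python
import os

def covert_mask(covert3):
    """Build a 4-byte masking key carrying 3 covert bytes plus a random filler."""
    assert len(covert3) == 3
    return covert3 + os.urandom(1)

def mask_payload(payload, key):
    """Standard RFC 6455 masking: XOR the payload with the repeating 4-byte key."""
    return bytes(b ^ key[i % 4] for i, b in enumerate(payload))

key = covert_mask(b"hi!")
frame_payload = mask_payload(b"<message/>", key)   # what goes on the wire
recovered = key[:3]                                # receiver reads the covert bytes
plaintext = mask_payload(frame_payload, key)       # unmasking is the same XOR
```

Since the overt XMPP stanza is unchanged after unmasking, a middlebox sees a perfectly normal masked frame; only an endpoint that knows to read the key recovers the hidden bytes.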
La Surimi, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 121-132; doi:10.22146/ijccs.7541

Abstract:VoIP is a real-time application whose quality strongly depends on delay and jitter, requirements that are difficult to meet with a reliable, congestion-controlled protocol such as TCP. On the other hand, using UDP, which has no congestion control, makes network congestion very likely. Using SCTP as an alternative protocol has also not been able to compensate for the weaknesses of TCP and UDP; several studies indicate that SCTP's congestion control mechanism needs repairs or modifications. Studies of the ECN and AQM mechanisms show that both can reduce delay and jitter. This study tests the quality of VoIP over SCTP with ECN and AVQ in the NS2 network simulator. The simulations, carried out with independent replications, show that SCTP with ECN and AVQ yields significantly better VoIP MOS values than plain SCTP under non-ideal network conditions (high latency/low bandwidth and low latency/low bandwidth). This study also compares the MOS of VoIP calls over SCTP with ECN and AVQ against the MOS of VoIP calls over TCP and UDP: SCTP with ECN and AVQ outperforms TCP but cannot yet surpass UDP. Keywords— VoIP, SCTP, ECN, AQM, AVQ
Dwi Agus Diartono, Yohanes Suhari, Aji Supriyanto
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 145-156; doi:10.22146/ijccs.7543

Abstract:A problem often faced by SMEs (UMKM) is the limited volume and reach of the marketing and sales of their products. Competition among similar products can also occur between local products or with products from outside, because marketing and sales are still done conventionally and individually. This study implements a model E-commerce system for SME products in a regency by empowering participatory cyber clusters to perform web linking, developed using Search Engine Optimization (SEO) and a Content Management System (CMS). The goal is for the developed website to easily occupy top rankings on search engine result pages and to keep its content and ranking up to date. The benefit of this research is improved marketing and sales of SME products up to the global market, with a web address that is easy to find because it frequently appears at the top of search results on engines such as Google. The output of this research is a CMS-based website for SME products, optimized with a model of internal and external links so that it consistently appears in the top range of search results. The research method is action research with a structured waterfall model for systems development; the web application itself is developed with a prototype model, according to its users' needs. Keywords— cyber cluster, SEO, CMS, SME
Erma Susanti, Khabib Mustofa
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 111-120; doi:10.22146/ijccs.7540

Abstract:Information extraction is a field of natural language processing that converts unstructured text into structured information. Much of the information on the Internet is transmitted in unstructured form via websites, creating a need for technology to analyze text and discover relevant knowledge as structured information; an example of unstructured information is the main content of a web page. Various approaches to information extraction, both manual and automatic, have been developed by many researchers, but their performance still needs improvement in terms of extraction accuracy and speed. This research proposes an information extraction approach that combines bootstrapping with Ontology-Based Information Extraction (OBIE). The bootstrapping approach, which starts from a small seed of labelled data, is used to minimize human intervention in the extraction process, while the ontology guides the extraction of classes, properties, and instances, providing semantic content for the semantic web. Combining both approaches is expected to increase the speed of the extraction process and the accuracy of the extraction results. The information extraction system is applied to a case study using the "LonelyPlanet" dataset. Keywords— information extraction, ontology, bootstrapping, Ontology-Based Information Extraction, OBIE, performance
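Bootstrapping of the kind described starts from a few labelled seeds, induces surface patterns from their contexts, and reapplies those patterns to harvest new candidate instances, which OBIE then types against the ontology's classes. A minimal single-round sketch; the corpus, seed, and fixed context width are illustrative:

```python
import re

def induce_patterns(corpus, seeds, width=12):
    """Turn each seed occurrence into a (left context, right context) pattern."""
    patterns = set()
    for text in corpus:
        for seed in seeds:
            for m in re.finditer(re.escape(seed), text):
                left = text[max(0, m.start() - width):m.start()]
                right = text[m.end():m.end() + width]
                patterns.add((left, right))
    return patterns

def extract(corpus, patterns):
    """Apply the context patterns to find new candidate instances."""
    found = set()
    for left, right in patterns:
        rx = re.escape(left) + r"(\w[\w ]*?)" + re.escape(right)
        for text in corpus:
            found.update(m.group(1) for m in re.finditer(rx, text))
    return found

corpus = ["visit the city of Ubud today", "visit the city of Sanur today"]
candidates = extract(corpus, induce_patterns(corpus, {"Ubud"}))
```

In a full system the harvested candidates would be filtered against the ontology and fed back as new seeds for the next round, which is what keeps human labelling minimal.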
Hence Beedwel Lumentut, Sri Hartati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 197-206; doi:10.22146/ijccs.7548

Abstract:The potential of freshwater aquaculture keeps increasing, partly because capture fisheries are approaching overfishing. Freshwater aquaculture offers several alternative fish with high economic value, namely carp, mozambique tilapia, Nile tilapia, gouramy, catfish, and pangasius, each with different cultivation characteristics. The parameters that influence freshwater cultivation include water suitability factors such as temperature, brightness, dissolved oxygen (DO), and acidity (pH), while the profitability of an aquaculture choice can be assessed from financial factors: NPV (Net Present Value), ROI (Return on Investment), BCR (Benefit Cost Ratio), PBP (Payback Period), and BEP (Break Even Point). The decision-making method used is the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), a decision model that can give preferences to fish farmers because the chosen alternative not only has the shortest distance to the positive ideal solution but also the longest distance from the negative ideal solution. The results of this study show that a decision support system that takes the water environment parameters and financial factors into account can help fish farmers determine which type of freshwater aquaculture to run. Keywords— freshwater fish, financial analysis, TOPSIS, DSS
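TOPSIS ranks alternatives exactly as described: vector-normalize the decision matrix, weight it, take the per-criterion best and worst values as the positive and negative ideal solutions, and score each alternative by d⁻/(d⁺ + d⁻). A minimal sketch; the species, criteria, scores, and weights are hypothetical, and all criteria are treated as benefit criteria:

```python
from math import sqrt

def topsis(matrix, weights):
    """Rank alternatives (rows) on benefit criteria by TOPSIS closeness."""
    cols = list(zip(*matrix.values()))
    norms = [sqrt(sum(v * v for v in col)) for col in cols]   # vector normalization
    weighted = {a: [w * v / n for w, v, n in zip(weights, row, norms)]
                for a, row in matrix.items()}
    ideal = [max(col) for col in zip(*weighted.values())]     # positive ideal
    nadir = [min(col) for col in zip(*weighted.values())]     # negative ideal
    def dist(row, ref):
        return sqrt(sum((v - r) ** 2 for v, r in zip(row, ref)))
    scores = {a: dist(row, nadir) / (dist(row, ideal) + dist(row, nadir))
              for a, row in weighted.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# hypothetical alternatives scored on (water suitability, NPV, ROI)
ranking = topsis({"Catfish": [8, 7, 9], "Tilapia": [6, 9, 7], "Carp": [5, 5, 5]},
                 [0.4, 0.3, 0.3])
```

Cost criteria such as payback period would be handled by taking the column minimum as the ideal instead of the maximum.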
Wayan Gede Suka Parwita, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 167-176; doi:10.22146/ijccs.7545

Abstract:Recommendation systems are commonly built using item rating data and user identity data. Item rating data are rarely available in a newly constructed system, while supplying identity data to a recommendation system raises concerns about identity misuse. A hybrid recommendation system that combines frequent itemset mining with keyword comparison can produce recommendations without user identity data or item rating data. Frequent itemset mining is performed with the FP-Growth algorithm, while keyword comparison computes the similarity between documents using the cosine similarity approach. The hybrid system applies three thresholds: minimum similarity, minimum support, and the number of recommendations. With the test data used, the precision, recall, F-measure, and MAP values are influenced by the chosen threshold values. Keywords— Hybrid recommendation system, frequent itemset, cosine similarity.
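The keyword-comparison step can be illustrated with term-frequency cosine similarity. A minimal Python sketch; the paper's exact term weighting is not specified here, so raw token counts are assumed:

```python
from collections import Counter
import math

def cosine_similarity(doc_a, doc_b):
    """Cosine similarity between two token lists via term-frequency vectors."""
    va, vb = Counter(doc_a), Counter(doc_b)
    dot = sum(va[t] * vb[t] for t in set(va) | set(vb))
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0
```

Documents whose similarity exceeds the minimum-similarity threshold would then be candidates for recommendation.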
Izmy Alwiah Musdar, Azhari Sn
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 157-166; doi:10.22146/ijccs.7544

Abstract:Many methods have been developed to solve clustering problems, among them methods from the swarm intelligence field such as Particle Swarm Optimization (PSO). Rapid Centroid Estimation (RCE) is a PSO-based clustering method. Like other PSO clustering variants, RCE does not depend on the initial cluster centers, and it has a much faster computation time than the earlier Particle Swarm Clustering (PSC) and modified Particle Swarm Clustering (mPSC) methods. However, RCE has a higher standard deviation of clustering-scheme quality than PSC and mPSC, which increases the variance of the clustering results. This happens because the equilibrium state, the condition in which the particle positions no longer change, is not yet proper when the stopping criterion is reached. This study proposes RCE-Kmeans, a method that applies K-means after RCE reaches its equilibrium state in order to update the particle positions produced by RCE. The results show that, over ten datasets, RCE-Kmeans yields better clustering-scheme quality than K-means on 7 datasets and better than RCE on 8 datasets. Applying K-means to the RCE method also reduces its standard deviation. Keywords—Data Clustering, Particle Swarm, K-means, Rapid Centroid Estimation.
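The core idea, refining the settled particle positions with Lloyd's K-means iterations, can be sketched in one dimension. This is an illustrative toy starting from given centroids, not the paper's implementation:

```python
def kmeans_refine(points, centroids, iters=20):
    """Refine initial centroids (e.g. RCE equilibrium positions) with
    Lloyd's K-means in 1-D: assign, then recompute means."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)), key=lambda j: abs(p - centroids[j]))
            clusters[nearest].append(p)
        # empty clusters keep their previous centroid
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids
```

Starting the refinement from equilibrium positions rather than random points is what lets the hybrid reduce the variance of the final clustering.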
Jani Kusanti, Sri Hartati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 187-196; doi:10.22146/ijccs.7547

Abstract:The Adaptive Neuro-Fuzzy Inference System (ANFIS) method is used to identify a neurological disorder of the head, known in medical terms as ischemic stroke, from head CT scan images, with the aim of locating the region affected by the stroke. The identification steps include extracting the head CT scan image using a histogram. The intensity of the histogram image is then enhanced using the Otsu threshold, so that pixels valued 1 correspond to the object and pixels valued 0 correspond to the background. The results are used for image clustering with fuzzy c-means (FCM). The clustering output is a series of cluster centers, which is used to build a fuzzy inference system (FIS) of the Takagi-Sugeno-Kang type. ANFIS is then used to optimize the determination of the location of the ischemic-stroke blockage, with a recursive least squares estimator (RLSE) used for learning. The RMSE obtained in the training process is 0.0432053, while the testing process yields an accuracy of 98.66%. Keywords—ischemic stroke, global threshold, Sugeno fuzzy inference system, ANFIS, RMSE
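The Otsu step picks the gray level that maximizes the between-class variance of the histogram. A plain-Python sketch over an 8-bit pixel list, illustrative only:

```python
def otsu_threshold(pixels):
    """Otsu's method: return the gray level maximizing between-class
    variance; pixels below it become background (0), the rest object (1)."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = sum(hist[:t])          # background weight
        w1 = total - w0             # foreground weight
        if w0 == 0 or w1 == 0:
            continue
        mu0 = sum(i * hist[i] for i in range(t)) / w0
        mu1 = sum(i * hist[i] for i in range(t, 256)) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a bimodal CT intensity histogram, the returned threshold separates tissue from background before the FCM clustering stage.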
Agus Harjoko, Herman Herman
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 207-218; doi:10.22146/ijccs.7549

Abstract:Weeds are plants that harm crops by inhibiting the growth of cultivated plants. The first step in weed control is identifying the weed species present in the cultivated field. The fastest and easiest way to identify plants, including weeds, is through their leaves. This research proposes weed species recognition from leaf images by extracting shape and texture features. The moment invariant method is used for the shape features, while the lacunarity method, a fractal-based measure, is used for texture. A neural network with the backpropagation learning algorithm is used to recognize the extracted features. The tests achieve a highest recognition accuracy of 97.22%, obtained before noise removal on the Canny edge-detection images, using two moment invariant features and one lacunarity feature (box size 4 x 4 or 16 x 16). Using 3 neurons in the hidden layer of the artificial neural network (ANN) gives a faster training time than using 1 or 2 hidden-layer neurons. Keywords—weed, leaf, moment invariant, lacunarity, artificial neural network
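For the shape features, the first Hu moment invariant shows how moment invariants stay unchanged when a shape is translated. A sketch for binary images; the paper's exact pair of moments is not reproduced here:

```python
def hu_phi1(img):
    """First Hu moment invariant, phi1 = eta20 + eta02, of a binary image
    (list of rows of 0/1); invariant to translation, and to scale via the
    m00**2 normalization."""
    m00 = sum(v for row in img for v in row)
    xbar = sum(x * v for row in img for x, v in enumerate(row)) / m00
    ybar = sum(y * v for y, row in enumerate(img) for v in row) / m00
    mu20 = sum(v * (x - xbar) ** 2 for row in img for x, v in enumerate(row))
    mu02 = sum(v * (y - ybar) ** 2 for y, row in enumerate(img) for v in row)
    return (mu20 + mu02) / m00 ** 2
```

Because the value depends only on the shape and not its position, the same leaf photographed anywhere in the frame yields the same feature.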
Decky Hendarsyah, Retantyo Wardoyo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 5; doi:10.22146/ijccs.1997

Abstract:SMS has become a necessity for cellular phone users to communicate with other people, but users often do not realize that sent messages can be intercepted or altered by an unwanted party. Securing SMS messages therefore requires cryptography. Given the limited resources on a cellular phone, symmetric cryptography is suitable for meeting the security needs of an SMS message. In symmetric cryptography a single symmetric key is used for both encryption and decryption, so securely exchanging that key over a public channel requires a key-exchange protocol. This research implements RC4 symmetric cryptography to encrypt and decrypt messages, with the Diffie-Hellman protocol for key exchange. The Diffie-Hellman protocol is modified so that the computation of the public key and the symmetric key includes the cellular phone number as authentication. RC4 is modified by combining the key with the cellular phone number as authentication and by key randomization, with further modifications to the pseudorandom byte generator, encryption, and decryption of the RC4 algorithm. The system is built with the Java programming language on the Micro Edition (J2ME) platform, based on MIDP 2.0 and CLDC 1.0. The research finds that with the cellular phone number used for authentication, key generation, encryption, and decryption, the system maintains the confidentiality, data integrity, authentication, and non-repudiation of the message. Keywords— Diffie-Hellman, Key exchange, RC4, SMS Secure, Symmetric Cryptography.
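The paper modifies both primitives to mix the phone number into the keys; the sketch below shows only the textbook Diffie-Hellman exchange and RC4 cipher it builds on, with deliberately toy parameters (real deployments use far larger primes):

```python
import secrets

def rc4(key: bytes, data: bytes) -> bytes:
    """Textbook RC4: key-scheduling (KSA), then XOR with the PRGA keystream."""
    S = list(range(256))
    j = 0
    for i in range(256):                          # KSA
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out, i, j = bytearray(), 0, 0
    for byte in data:                             # PRGA
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

# Textbook Diffie-Hellman over a 32-bit prime (toy size, for illustration).
P, G = 4294967291, 5
a = secrets.randbelow(P - 2) + 1                  # Alice's private key
b = secrets.randbelow(P - 2) + 1                  # Bob's private key
A, B = pow(G, a, P), pow(G, b, P)                 # exchanged public keys
shared = pow(B, a, P)                             # equals pow(A, b, P)
key = shared.to_bytes(4, "big")

cipher = rc4(key, b"short message service")
assert rc4(key, cipher) == b"short message service"  # RC4 is its own inverse
```

Because RC4 XORs the plaintext with a keystream, applying it twice with the same key recovers the original message, which is why one routine serves for both encryption and decryption.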
Ika Oktavia Suzanti, Reza Pulungan
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 13-22; doi:10.22146/ijccs.6636

Abstract:A Mobile Ad-hoc Network (MANET) is a group of wireless mobile nodes connected to each other without fixed infrastructure, so the topology can change at any time. MANET routing protocols follow two models: reactive routing protocols, which build the routing table only when needed, and proactive routing protocols, which maintain the routing table periodically. The general properties that an ad-hoc network protocol must satisfy are route discovery, packet delivery, and loop freedom. AODV is a reactive MANET protocol with a time standard for how long a route remains usable (route validity), so the route discovery and packet delivery properties must be satisfied within that time. Protocol verification is done by modeling the protocol specification with mathematical techniques, tools, and languages. In this research the protocols are modeled using timed automata, a modeling language for systems whose processes depend on time, and verification is performed automatically with the UPPAAL model checker. The verified protocols are AODV Break Avoidance by Ali Khosrozadeh et al. and AODV Reliable Delivery by Liu-Jian and Fang-Min. The verification results prove that AODV Break Avoidance satisfies the route discovery property and AODV Reliable Delivery satisfies the packet delivery property within their specified times. Keywords—Protocol Verification, Timed Automata, AODV, UPPAAL
Effan Najwaini, Ahmad Ashari
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 89-100; doi:10.22146/ijccs.6643

Abstract:In VoIP (Voice over IP) communication, voice quality is affected by many factors, one of which is server quality. Choosing a suitable PC or server platform, in terms of both price and performance, is the main issue in building a VoIP network; poor server performance degrades the voice quality or even prevents users from connecting. This research tests the performance of a Linksys WRT54GL wireless access point used as a VoIP server. The tests determine how many VoIP calls the access point can serve as a VoIP server and how much time the server needs to process each SIP signal and RTP packet. Based on the test results, the VoIP server on the wireless access point serves VoIP communication well for a small number of calls, so it is feasible for small-scale use. Using the Native Bridging method for media handling on the server increases the number of calls that can be served by 3 to 7 times compared with other methods. Keywords—VoIP, Asterisk, Access Point, WRT54GL, OpenWRT, Performance
Freska Rolansa, Azhari S.N.
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 101-110; doi:10.22146/ijccs.6644

Abstract:A building is a place where a number of people carry out various activities at certain times. When a building fire occurs, everyone inside must evacuate to avoid the danger of fire. Agent-based modeling is one way to describe real-life fire conditions and to reduce the cost and danger posed by building fires. This model uses a multi-agent approach consisting of employee agents, fire agents, and exit-door agents that interact and communicate with each other. The characteristics and behavior of each agent during a building fire are simulated in NetLogo using a fire-spread scenario, a rescue scenario, and an evaluation process for the evacuation support facilities. Tests were carried out on the existing condition of the building and on design changes to the evacuation support facilities, such as door-placement and door-widening scenarios. Each test scenario was run 5 times with the same parameters, and the average number of survivors and the number of victims caught by the fire were computed as the evaluation of the evacuation support facilities. The fire-spread and rescue scenarios were also tested to observe the characteristics and behavior of the fire and employee agents during the evacuation process. Keywords: Evacuation, fire, agent, NetLogo, scenario, evaluation.
Christian Dwi Suhendra, Retantyo Wardoyo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9, pp 77-88; doi:10.22146/ijccs.6642

Abstract:The weaknesses of the backpropagation neural network are its very slow convergence and the local minimum problem, which often traps the artificial neural network (ANN) in a local minimum. A good combination of architecture, initial weights, and initial biases largely determines the network's ability to learn and to overcome these weaknesses. This study develops a method to determine that combination. So far, it has been found by trial and error, trying hidden-layer configurations, initial weights, and initial biases one by one. The initial weights and biases are used as parameters in the fitness computation: each individual is scored by its sum of squared errors (SSE), and the individual with the smallest SSE is the best. The best combination of architecture, initial weights, and initial biases is then used as the parameters for backpropagation training. The result of this study is an alternative solution to the problem of determining learning parameters in backpropagation. The results show that the genetic algorithm can provide a solution for backpropagation learning, gives better accuracy, and reduces training time compared with manually determined parameters. Keywords: artificial neural network, genetic algorithm, backpropagation, SSE, local minimum.
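The search the paper describes, scoring candidate weight/bias sets by SSE and evolving the population, can be sketched on a toy one-neuron model. The crossover and mutation operators below are illustrative assumptions, not the paper's:

```python
import random

def sse(ind, data):
    """Fitness: sum of squared errors of a one-neuron linear model y = w*x + b."""
    w, b = ind
    return sum((y - (w * x + b)) ** 2 for x, y in data)

def ga_search(data, pop_size=20, gens=50, seed=1):
    """Evolve (weight, bias) pairs; the smallest SSE marks the best individual."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-3, 3), rng.uniform(-3, 3)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: sse(ind, data))
        elite = pop[: pop_size // 2]              # selection by SSE
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            children.append(((p1[0] + p2[0]) / 2 + rng.gauss(0, 0.1),  # crossover
                             (p1[1] + p2[1]) / 2 + rng.gauss(0, 0.1)))  # + mutation
        pop = elite + children
    return min(pop, key=lambda ind: sse(ind, data))
```

The winning individual would then seed the actual backpropagation training instead of random initialization.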
I Putu Adi Pratama, Agus Harjoko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 9; doi:10.22146/ijccs.6641

Abstract:K-means is one of the most popular clustering algorithms, largely because it is easy and simple to implement. However, its clustering results are very sensitive to the choice of initial centroids, and K-means often gets stuck in a locally optimal solution, so better results frequently require several trial runs. Another cause of this behavior is that the new center point in each iteration is computed as the mean of the data in the corresponding cluster, so K-means only searches for new centroid candidates around the initial centers. To overcome this, applying a method capable of global search can help K-means find better cluster centers. Invasive Weed Optimization (IWO) is a global search algorithm inspired by the colonization process of weeds. This study proposes IWOKM, a method that hybridizes K-means with the Invasive Weed Optimization algorithm. The performance of IWOKM was tested on the Iris flower dataset and compared with K-means. The results show that IWOKM produces better cluster centers than K-means. Keywords—K-means, IWO, IWOKM, cluster analysis
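IWO's mechanics, fitter weeds spawning more seeds with a dispersion that shrinks over the iterations, can be sketched on a 1-D objective. This toy omits the paper's K-means hybridization details:

```python
import random

def iwo_minimize(f, lo, hi, pop_max=20, iters=40, seed=2):
    """Invasive Weed Optimization sketch (1-D): better weeds produce more
    seeds, and the seed dispersion sigma shrinks as iterations proceed."""
    rng = random.Random(seed)
    weeds = [rng.uniform(lo, hi) for _ in range(5)]
    for it in range(iters):
        sigma = 0.1 * (hi - lo) * (1 - it / iters) ** 2
        fits = [f(w) for w in weeds]
        worst, best = max(fits), min(fits)
        offspring = list(weeds)
        for w, fit in zip(weeds, fits):
            # better (lower) fitness earns more seeds: between 1 and 4 here
            n = 1 + round(3 * (worst - fit) / (worst - best + 1e-12))
            for _ in range(n):
                offspring.append(min(hi, max(lo, w + rng.gauss(0, sigma))))
        offspring.sort(key=f)
        weeds = offspring[:pop_max]   # competitive exclusion keeps the fittest
    return weeds[0]
```

In the hybrid, positions found by this global search would serve as the initial centroids that vanilla K-means is so sensitive to.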
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 8; doi:10.22146/ijccs.3489

Abstract:IJCCS (Indonesian Journal of Computing and Cybernetics Systems). Contents:
Sistem Pendukung Keputusan Penentuan Pemenang Tender Pekerjaan Konstruksi dengan Metode Fuzzy AHP, pp. 1-12, by Peggi Sri Astuti, Retantyo Wardoyo
Penerapan Metode Support Vector Machine pada Sistem Deteksi Intrusi secara Real-time, pp. 13-24, by Agustinus Jacobus, Edi Winarko
Optimasi Bobot Jaringan Syaraf Tiruan Menggunakan Particle Swarm Optimization, pp. 25-36, by Harry Ganda Nugraha, Azhari SN
Sistem Evaluasi Kelayakan Mahasiswa Magang Menggunakan Elman Recurrent Neural Network, pp. 37-48, by Agus Aan Jiwa Permana, Widodo Prijodiprodjo
Peramalan KLB Campak Menggunakan Gabungan Metode JST Backpropagation dan CART, pp. 49-58, by Sulistyowati, Edi Winarko
Klasifikasi Massa pada Citra Mammogram Berdasarkan Gray Level Cooccurence Matric (GLCM), pp. 59-68, by Refta Listia, Agus Harjoko
Perbandingan Mother Wavelet dalam Proses Denoising pada Suara, pp. 69-80, by Rahmat Ramadhan, Agfianto Eko Putra
Penyembunyian Data pada File Video Menggunakan Metode LSB dan DCT, pp. 81-90, by Mahmuddin Yunus, Agus Harjoko
Analisis Sentimen Twitter untuk Teks Berbahasa Indonesia dengan Maximum Entropy dan Support Vector Machine, pp. 91-100, by Noviah Dwi Putranti, Edi Winarko
Pengelompokan Berita Indonesia Berdasarkan Histogram Kata Menggunakan Self-Organizing Map, pp. 101-110, by Ambarwati, Edi Winarko
Peggi Sri Astuti, Retantyo Wardoyo
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 8, pp 1-12; doi:10.22146/ijccs.3490

Abstract:Decision-making to determine the winner of a (non-complex) construction tender for a lecture building of the Faculty of Economics, Udayana University (UNUD) is still carried out manually (with Microsoft Excel and Word) by the tender committee at the UNUD Rectorate Equipment Section. To assist and accelerate this decision, especially when several or many bidders meet all evaluation criteria and share the same lowest corrected bid price below the owner's estimate (HPS), this study builds a DSS (Decision Support System) using the Fuzzy AHP method. The Fuzzy AHP version used is Chang's (1992) model because its steps are simple and easy to apply in this study. The results show that the DSS ranks bidders 1, 2, and 3 identically to the existing manual system in the Rectorate Equipment Section; rankings 4, 5, and 6, which the DSS also produces, do not appear in the manual system because those bidders did not meet the qualification criteria (passing the tender requires meeting all evaluation criteria). It is therefore concluded that the DSS produces valid information. Keywords— decision support system, fuzzy AHP, tender, construction project
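Chang's extent analysis operates on triangular fuzzy numbers; as a crisp point of reference (not Chang's method itself), classical AHP priorities can be computed from a pairwise-comparison matrix by normalized row geometric means:

```python
import math

def ahp_weights(pairwise):
    """Crisp AHP priority weights via normalized row geometric means of a
    pairwise-comparison matrix (a simplification; Chang's fuzzy extent
    analysis works on triangular fuzzy numbers instead)."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

For a perfectly consistent matrix the weights reproduce the stated preference ratios exactly, e.g. a criterion judged twice as important receives twice the weight.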
Agustinus Jacobus, Edi Winarko
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 8, pp 13-24; doi:10.22146/ijccs.3491

Abstract:An intrusion detection system is a system that detects attacks or intrusions in a network or computer system, generally by comparing network traffic patterns with known attack patterns or by finding abnormal patterns in network traffic. The growth of internet activity increases the number of data packets that must be analyzed to build attack or normal patterns, raising the possibility that the system cannot detect attacks that use new techniques; a system is therefore needed that can build patterns or models automatically. This research aims to build an intrusion detection system that can create a model automatically and detect intrusions in a real-time environment, using the support vector machine method, a data-mining method, to classify network traffic audit data into 3 classes: normal, probe, and DoS. The audit data are produced by preprocessing network packet captures generated by Tshark. Based on the test results, the system helps the system administrator build a model or pattern automatically with high accuracy, a high attack detection rate, and a low false positive rate. The system can also run in a real-time environment. Keywords— intrusion detection, classification, preprocessing, support vector machine
Nur Rokhman, Iqnatius Dimas Nugroho
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 7, pp 209-220; doi:10.22146/ijccs.3361

Abstract: A smartphone is generally equipped with a Global Positioning System (GPS) receiver. Besides knowing their own location, smartphone users usually also want to know about their surroundings. Foursquare is a social network that provides location-based services; its check-in feature marks the user's location. This research develops an application for devices running the Android operating system that can find public facilities around the user by using location-based service technology. The application uses data from Foursquare. The test results show that the data filter and the auto check-in system work properly, so that duplication of data in Foursquare can be minimized. Keywords— Location Based Services (LBS), Android, Foursquare, Public Facility
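The duplicate-filtering idea mentioned in this abstract can be sketched as follows: treat two venue records as duplicates when their normalized names match and they lie within a small radius of each other. This is an assumption-laden illustration, not the paper's actual filter or the Foursquare API; the venue records and the 50 m threshold are invented.

```python
# Illustrative sketch of venue de-duplication: same normalized name
# within a small radius counts as a duplicate. The venue records and
# the 50 m radius are invented placeholders, not Foursquare data.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filter_duplicates(venues, radius_m=50.0):
    """Keep only the first venue of each (name, nearby-location) cluster."""
    kept = []
    for v in venues:
        name = v["name"].strip().lower()
        is_dup = any(
            name == k["name"].strip().lower()
            and haversine_m(v["lat"], v["lon"], k["lat"], k["lon"]) <= radius_m
            for k in kept
        )
        if not is_dup:
            kept.append(v)
    return kept

venues = [
    {"name": "Halte Bus UGM", "lat": -7.7713, "lon": 110.3775},
    {"name": "halte bus ugm", "lat": -7.7714, "lon": 110.3776},  # ~16 m away: duplicate
    {"name": "Halte Bus UGM", "lat": -7.7800, "lon": 110.3900},  # ~1.5 km away: distinct
]
print(len(filter_duplicates(venues)))
```

A filter of this kind, applied before an auto check-in, is one plausible way to keep near-identical venue entries from accumulating.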
Sri Hartati, Sri Nurdiati
IJCCS (Indonesian Journal of Computing and Cybernetics Systems), Volume 5; doi:10.22146/ijccs.1996

Abstract: In recent years, the occurrence of protein shortage among children under 5 years old in many poor areas has increased dramatically. Since this condition can cause serious problems for children, such as delays in growth and development, as well as disfigurement, disability, and dependency, early diagnosis of protein shortage is vital. Many applications have been developed for disease detection, such as an expert system for diagnosing diabetes and artificial neural network (ANN) applications for diagnosing breast cancer, acidosis, and lung cancer. This paper focuses on the development of a protein shortage disease diagnosis application using the Backpropagation Neural Network (BPNN) technique. It covers two classes of protein shortage, including Heavy Protein Deficiency. A BPNN model is then constructed based on analysis of the training and testing results from the developed application. The model has been successfully tested on a new data set, showing that the BPNN can accurately diagnose heavy protein deficiency at an early stage. Keywords— Artificial Neural Network, Backpropagation Neural Network, Protein Deficiency.
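The diagnostic approach in this abstract, a small backpropagation-trained network separating deficiency classes, can be sketched with scikit-learn's `MLPClassifier`. The anthropometric features, the values, and the network size below are invented placeholders, not the paper's clinical data or architecture.

```python
# Illustrative sketch: a small neural network (gradients computed by
# backpropagation) flags heavy protein deficiency from toy features.
# Feature choice and values are invented, not the paper's data:
# [weight-for-age z-score, mid-upper-arm circumference in cm]
from sklearn.neural_network import MLPClassifier

X = [
    [-3.5, 10.5], [-3.2, 11.0], [-3.8, 10.0],  # heavy deficiency (1)
    [-0.5, 14.5], [0.2, 15.0], [-0.8, 14.0],   # not deficient (0)
]
y = [1, 1, 1, 0, 0, 0]

# lbfgs is a reasonable solver for a data set this small
net = MLPClassifier(hidden_layer_sizes=(4,), solver="lbfgs",
                    max_iter=2000, random_state=0)
net.fit(X, y)

print(net.predict([[-3.4, 10.3], [0.0, 14.8]]))
```

The real application would train on clinical measurements and evaluate on a held-out test set, as the abstract describes.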