Scholarly Publication
Browsing Scholarly Publication by Issue Date
- Item: A novel approach to outliers removal in a noisy numeric dataset for efficient Mining (Published by Faculty of Computing and Information Systems, University of Ilorin, 2016-02-16) Ajiboye, A. R., Adewole, K. S., Babatunde, R. S. and Oladipo, I. D. (2016): Data pre-processing is a key task in the data mining process. The task generally consumes the largest portion of the total data engineering effort while unveiling useful patterns from datasets. Basically, data mining is about fitting descriptive or predictive models to data. However, the presence of outliers sometimes reduces the reliability of the models created. It is, therefore, essential to have raw data properly pre-processed before exploring it for mining. In this paper, an algorithm that detects and removes outliers in a numeric dataset is proposed. In order to establish the effectiveness of the proposed algorithm, the clean data obtained through the implementation of the proposed approach is used to create a prediction model. Similarly, the clean data obtained through the use of one of the existing techniques is also used to create a prediction model. Each of the models created is simulated using a set of untrained data and the error associated with each model is measured. The resulting outputs from the two approaches reveal that the prediction model created using the output from the proposed algorithm has an error of 0.38, while the prediction model created using the cleaned data from the clustering method gives an error of 0.61. Comparison of the errors associated with the models created using the two approaches shows that the proposed algorithm is suitable for cleaning numeric datasets. The results of the experiment also reveal that the proposed approach is efficient and can be used as an alternative to other existing cleaning methods.
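The abstract above does not spell out the detection rule itself, so the following is only an illustrative sketch of one common numeric outlier filter, the interquartile-range (IQR) test; the function name, the fence factor `k`, and the sample data are all assumptions, not the paper's algorithm.

```python
# Illustrative IQR outlier filter: values outside [Q1 - k*IQR, Q3 + k*IQR]
# are treated as outliers and removed from the dataset.
def remove_outliers_iqr(values, k=1.5):
    """Return the values that fall within [Q1 - k*IQR, Q3 + k*IQR]."""
    ordered = sorted(values)
    n = len(ordered)
    # Simple index-based quartile estimates (adequate for a sketch).
    q1 = ordered[n // 4]
    q3 = ordered[(3 * n) // 4]
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if low <= v <= high]

data = [10, 12, 11, 13, 12, 11, 300, 12, 10, -250]
clean = remove_outliers_iqr(data)
```

The extreme readings 300 and -250 fall outside the fences and are dropped, while the clustered values survive, mimicking the "clean data" stage that precedes model building in the paper.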
- Item: Gender Recognition using Local Binary Pattern and Naïve Bayes Classifier (African Journal Online. Published by Nigeria Computer Society Journal. Journal of Computer Science and its Applications, 2016-10-24) Babatunde, R. S., Abdulsalam, S. O., Yusuff, S. R., and Babatunde, A. N. (2016): The human face provides important visual information for gender perception. The ability to recognize a particular gender is very important for the purpose of differentiation. Automatic gender classification has many important applications, for example, intelligent user interfaces, surveillance, identity authentication, access control and human-computer interaction, amongst others. Gender recognition is a fundamental task for human beings, as many social functions critically depend on correct gender perception. Consequently, real-world applications require gender classification on real-life faces, which is much more challenging due to significant appearance variations in unconstrained scenarios. In this study, Local Binary Pattern was used to detect the occurrence of a face in a given image by reading the texture changes within the regions of the image, while a Naive Bayes Classifier was used for the gender classification. From the results obtained, the gender correlation was 100% and the highest accuracy obtained was 99%. The system can be employed in scenarios where real-time gender recognition is required.
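The texture-reading step described above can be sketched with the basic 3×3 Local Binary Pattern operator: each pixel's eight neighbours are thresholded against the centre value and read off as an 8-bit code. This is a generic illustration, not the authors' implementation; the neighbour ordering and the tiny test image are assumptions.

```python
# Basic 3x3 LBP: threshold the 8 neighbours against the centre pixel
# and pack the comparison results into an 8-bit code.
def lbp_code(image, y, x):
    """Compute the 8-bit LBP code for the pixel at (y, x)."""
    center = image[y][x]
    # Clockwise neighbour offsets starting at the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if image[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
code = lbp_code(img, 1, 1)
```

A histogram of such codes over image regions is what typically feeds the downstream classifier (here, Naive Bayes).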
- Item: Image Encryption System based on Length Three Moduli Set (Foundation of Computer Science (FCS), NY, USA, 2018-05-01) Damilare Peter Oyinloye: Digital images have found usage in almost all our everyday applications. These images sometimes contain confidential and intelligible information which needs to be protected when stored in memory or transmitted over networks (Intranet or Internet). Many techniques have been proposed to deal with these security issues in the past. This paper proposes a simple scrambling algorithm to encrypt and decrypt grey-level images based on random number generation and the Residue Number System (forward and reverse conversion). The image is first encrypted by changing the position of each pixel in the original image without changing its grey-level value. The original image is read row by row, pixel by pixel, and each pixel takes a new position in the scrambled image. The new position is chosen based on output from a random number generator. A key matrix is generated during the encryption process, saving the position of each pixel in the encrypted/scrambled image. The encryption layer then transforms the scrambled image into moduli images, which automatically adds an extra security layer to the data. The encrypted moduli images are decrypted by decoding the moduli images and converting them back to a single scrambled image (reverse conversion), and the scrambled image back to the plain, original image using the saved key matrix. This scheme achieves an enhanced image encryption process and a more efficient decryption process without loss of any inherent information in the recovered plain image.
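A minimal sketch of the scrambling and key idea described above, on a single row of grey-level pixels: each pixel is moved to a randomly chosen new position, and the permutation itself is kept as the key so decryption can undo it. The seed, pixel values and flat one-dimensional layout are assumptions, and the RNS moduli-image layer is omitted.

```python
import random

# Scramble pixels into random positions; keep the permutation as the key.
def scramble(pixels, seed=42):
    positions = list(range(len(pixels)))
    random.Random(seed).shuffle(positions)   # random new position per pixel
    scrambled = [0] * len(pixels)
    for src, dst in enumerate(positions):
        scrambled[dst] = pixels[src]
    return scrambled, positions              # 'positions' acts as the key

def unscramble(scrambled, key):
    original = [0] * len(scrambled)
    for src, dst in enumerate(key):
        original[src] = scrambled[dst]
    return original

image = [52, 55, 61, 66, 70, 61, 64, 73]     # one row of grey-level pixels
cipher, key = scramble(image)
restored = unscramble(cipher, key)
```

Note that the grey values themselves are untouched (only positions change), matching the abstract's description of the scrambling stage.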
- Item: An Improved Image Scrambling Algorithm Using {2^n - 1, 2^n, 2^n + 1} (University of the West of Scotland, School of Computing, 2018-10-01) Damilare Peter Oyinloye; Kazeem Alagbe Gbolagade: Traditional iris segmentation methods and strategies regularly involve an exhaustive search of a large parameter space, which is sensitive to noise, time-consuming and not secure enough. To address these challenges, this paper proposes a technique to secure the iris template using steganography. The experimental analysis was carried out in the matrix laboratory (MATLAB R2015a) environment. The segmented iris region was normalized to decrease the dimensional inconsistencies between iris regions with the aid of the Hough transform (HT). The features of the iris were encoded by convolving the normalized iris region with 1D Log-Gabor filters in order to generate a bit-wise biometric template. Then, least significant bit (LSB) embedding was used to secure the iris template. The Hamming distance was chosen as the matching metric, which gives a measure of how many bits disagree between iris templates. The system operated with a very good training time and a high level of conditional testing, signifying high optimization, with a recognition accuracy of 92% and an error of 1.7%. The proposed system is reliable, secure and efficient, with the computational complexity significantly reduced. The proposed technique provides an efficient approach for securing the iris template.
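The least-significant-bit (LSB) embedding step can be illustrated as follows. Here a short bit string stands in for the encoded iris template and the cover pixel values are invented; the paper's actual cover images and template lengths are not given in the abstract.

```python
# LSB steganography sketch: hide one bit per cover pixel by overwriting
# the pixel's lowest bit, then read the bits back out.
def embed_lsb(pixels, bits):
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit     # overwrite the lowest bit
    return stego

def extract_lsb(pixels, n_bits):
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 121, 122, 123, 124, 125, 126, 127]
template_bits = [1, 0, 1, 1, 0, 0, 1, 0]     # stand-in for iris template bits
stego = embed_lsb(cover, template_bits)
recovered = extract_lsb(stego, len(template_bits))
```

Because only the lowest bit of each pixel changes, the stego image differs from the cover by at most one grey level per pixel, which is why LSB embedding is visually imperceptible.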
- Item: Adaptive Neuro Fuzzy Inference System (ANFIS) Based Detection of Cervical Cancer (Published by LAUTECH, Ogbomoso, Oyo State, 2018-11-22) Babatunde, R. S. and Muhammad-Thani S. (2018): Cancer occurs when cells in an area of the body grow abnormally. Cervical cancer, which emerges from the cervix of a woman, is known to be the second most deadly cancer affecting women worldwide. The normal cervix has two main types of cells: squamous cells, which protect the outside of the cervix, and glandular cells. Cervical cancer is caused by abnormal changes in either of these cell types in the cervix. Pap smear screening is the most successful attempt of medical science and practice for the early detection of cervical cancer. However, the manual analysis of the cervical cells (pap smear) is time-consuming, laborious and error-prone. In this research, detection of cervical cancer was done using the Adaptive Neuro-Fuzzy Inference System (ANFIS), a hybrid intelligent system which combines the fuzzy logic qualitative approach with adaptive neural network capabilities. The system was modelled to predict the probability of cervical cancer as either high or low based on the input risk factors of cervical cancer contained in the dataset used. The efficiency of the ANFIS in predicting cervical cancer was tested using a dataset of 250 patients collected from Leah Foundation, a non-governmental organization in Nigeria. The dataset contains eight risk factors of cervical cancer, which serve as the input variables. The output is the risk level each patient is prone to: a patient with a low level (represented as 1) is classified as having little or no probability of cervical cancer, while a patient with a high level (represented as 2) is classified as having a high probability of cervical cancer.
The results show that the system can be adopted by medical practitioners as well as for individual use.
- Item: Performance Analysis of Gray Code Number System in Image Security (2019-10-03) Babatunde, A. N., Jimoh, E. R., Oshodi, O. & Alabi, O. A.: The encryption of digital images has become essential since they are vulnerable to interception while being transmitted or stored. A new image encryption algorithm to address the security challenges of traditional image encryption algorithms is presented in this research. The proposed scheme transforms the pixel information of an original image by taking the pixel location into consideration, such that two neighboring pixels are processed via two separate algorithms. The proposed scheme utilizes the Gray code number system. The experimental results and comparison show that the encrypted images were different from the original images. Also, histogram analysis revealed that the plain images and their decrypted images have the same pixel histogram distributions, which means that there is a high correlation between the original and decrypted images. The scheme also offers strong resistance to statistical attacks. Keywords: Gray Code, Number System, Spatial domain, Image encryption.
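The Gray code number system the scheme relies on has the property that adjacent integers differ in exactly one bit. The standard binary-to-Gray conversion is sketched below; how the paper applies it to individual pixels is not detailed in the abstract, so this is only the underlying number-system building block.

```python
# Standard binary <-> Gray code conversion.
def to_gray(n):
    """Binary to Gray: XOR the number with itself shifted right by one."""
    return n ^ (n >> 1)

def from_gray(g):
    """Gray to binary: fold the bits back down with repeated XORs."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

codes = [to_gray(i) for i in range(8)]   # 3-bit Gray code sequence
```

For 3-bit values the sequence is 0, 1, 3, 2, 6, 7, 5, 4; each consecutive pair differs in a single bit, which is what makes the representation useful for diffusing small pixel-value changes.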
- Item: Proof of Behaviour (PoBh): An Enhanced Proof of Stake Blockchain Consensus Protocol (International Journal of Advanced Studies in Computers, Science and Engineering, 2020-03-31) Damilare Peter Oyinloye: Alternative protocols such as proof of stake (PoS) emerged after the drawbacks of the proof of work (PoW) consensus protocol had been analyzed by researchers. Bitcoin, which is powered by proof of work, consumes almost as much energy yearly as Ireland, among other drawbacks. PoS became the protocol of the moment because it reduces the enormous energy consumption of PoW, along with other enhancements. PoS, however, is not without its own shortcomings with respect to performance, accountability and security. This work proposes a Proof of Behaviour (PoBh) consensus protocol, an enhanced PoS algorithm with much better performance, enhanced security and accountability.
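As background to the item above, the core of proof of stake is stake-weighted validator selection, sketched below. The validator names, stakes and seed are invented, and the paper's behaviour-scoring extension (PoBh) is not specified in the abstract, so it is not reproduced here.

```python
import random

# Stake-weighted selection: a validator is chosen with probability
# proportional to its stake (roulette-wheel sampling).
def pick_validator(stakes, rng):
    """Choose a validator with probability proportional to its stake."""
    total = sum(stakes.values())
    ticket = rng.uniform(0, total)
    running = 0.0
    for validator, stake in stakes.items():
        running += stake
        if ticket <= running:
            return validator
    return validator  # fallback for floating-point edge cases

stakes = {"alice": 50, "bob": 30, "carol": 20}
rng = random.Random(7)
picks = [pick_validator(stakes, rng) for _ in range(1000)]
share_alice = picks.count("alice") / len(picks)
```

Over many rounds, a validator holding half the total stake wins roughly half the blocks; a behaviour-based protocol would adjust these weights by more than raw stake alone.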
- Item: A New Hash Function Based on Chaotic Maps and Deterministic Finite State Automata (IEEE, 2020-06-16) Moatsum Alawida; Je Sen Teh; Damilare Peter Oyinloye; Musheer Ahmad; Rami S. Alkhawaldeh: In this paper, a new chaos-based hash function is proposed based on a recently proposed structure known as the deterministic chaotic finite state automata (DCFSA). Out of its various configurations, we select the forward and parameter permutation variant, DCFSA-FWP, due to its desirable chaotic properties. These properties are analogous to hash function requirements such as diffusion, confusion and collision resistance. The proposed hash function consists of six machine states and three simple chaotic maps. This particular structure of DCFSA can process larger message blocks (leading to higher hashing rates) and optimizes its randomness. The proposed hash function is analyzed in terms of various security aspects and compared with other recently proposed chaos-based hash functions to demonstrate its efficiency and reliability. Results indicate that the proposed hash function has desirable statistical characteristics, elevated randomness, optimal diffusion and confusion properties as well as flexibility.
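The chaotic-map building block such hash functions rely on can be illustrated with the logistic map x → r·x·(1 − x), which at r = 4 is fully chaotic: tiny input changes diverge rapidly, giving the diffusion property the paper measures. This is purely an illustration of chaotic sensitivity, not the paper's DCFSA construction.

```python
# Iterate the logistic map and compare two orbits that start almost
# identically; chaotic sensitivity makes them diverge completely.
def logistic_orbit(x0, steps, r=4.0):
    xs = []
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

a = logistic_orbit(0.300000, 60)
b = logistic_orbit(0.300001, 60)   # one-in-a-million perturbation
tail_divergence = max(abs(x - y) for x, y in zip(a[30:], b[30:]))
```

After a few dozen iterations the two orbits are fully decorrelated, which is exactly the avalanche behaviour a hash function wants from single-bit input changes.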
- Item: Implementation of Yorùbá Unicode generation for an indigenous keyboard (Published by Kwara State University, Malete, 2020-07-19) Jumoke Falilat Ajao, Ronke Seyi Babatunde, Suliyat Oyindamola Asapetu, Shakirat Ronke Yusuff: Characters are generally represented on standard keyboards by means of coding schemes such as ASCII and Unicode, but some characters of the Yorùbá language were not captured because of the accent and diacritic signs inherent in the language. Coding representation for Yorùbá characters has become a key problem in circuit implementations of keyboards for typing Yorùbá documents correctly. The standard keyboard layouts do not have a simple key combination for all the characters. To represent Yorùbá characters in human-computer interaction, this paper proposes the use of Unicode for the deployment of a Yorùbá keyboard for effective and efficient typing of documents in the Yorùbá language. The approach used for the development of the Yorùbá keyboard uses both binary and hexadecimal representations of the coding standards to codify Yorùbá characters with their diacritic signs and the under-dot. This approach introduces complex Yorùbá character coding representations using a single code point standard. The generated Unicode points were transformed into HTML entities, which were in turn converted to their character equivalents for the Yorùbá keyboard. The new Unicode was compared with the results of the two standard encoding schemes using the bit relationships of upper- and lower-case characters to ascertain conformity of the new Unicode with the standard encoding schemes. This provided some insight into the efficient representation of the Yorùbá character scheme. The representation is expected to quickly suggest solutions to the design of Yorùbá keyboards and signage.
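The Unicode-to-HTML-entity step described above can be illustrated with two dot-below Yorùbá characters, "ẹ" (U+1EB9) and "ọ" (U+1ECD). The exact code points the keyboard emits for tone-marked combinations are not given in the abstract, so only the generic code-point-to-entity conversion is shown.

```python
# Render a character as a numeric HTML entity from its Unicode code point.
def to_html_entity(ch):
    return "&#{};".format(ord(ch))

chars = ["\u1eb9", "\u1ecd"]          # ẹ, ọ (precomposed dot-below letters)
entities = [to_html_entity(c) for c in chars]
```

U+1EB9 is decimal 7865 and U+1ECD is decimal 7885, so the entities come out as `&#7865;` and `&#7885;`, the form a browser turns back into the visible characters.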
- Item: Comparative Analysis of Lempel-Ziv-Welch and Embedded Zero Tree Compression Algorithms on Kidney Images (Published by Faculty of Computing and Information Science, University of Ilorin. Special Issue on Computing and Communication Technologies, 2020-07-24) Babatunde Ronke Seyi: Image compression refers to the reduction of digital image data to store or transmit images with a lower rate of distortion. It is the reduction of the quantity of graphical data required to represent a digital image. Medical images play an important role in providing detailed information about patient injuries, fractures and other critical issues related to different diseases and discomforts. Transmission of a reduced size of image data helps reduce computational burden as well as storage cost. Several compression algorithms have been adopted by numerous studies on different types of images, among which the embedded zero tree (EZT) and Lempel-Ziv-Welch (LZW) compression algorithms are the most commonly used. The choice of a suitable compression algorithm for specific images has been an open issue, which this research seeks to address. In this research, the compression capability of the embedded zero tree (EZT) and Lempel-Ziv-Welch (LZW) compression algorithms on kidney images was compared. The performance of these algorithms was evaluated using peak signal-to-noise ratio (PSNR) and compression ratio (CR). Effective compression implies that the resultant image stream will be smaller than the original image size, without loss of vital content. Empirical results obtained show that LZW, with a PSNR of 97.28 and a CR of 2.85, outperforms EZT, with a PSNR of 89.17 and a CR of 1.82, when the same sets of images were subjected to the two compression algorithms. The results from this research suggest the usability of LZW for compression tasks involving kidney images. Similarly, lossless compression is recommended because loss of vital information content in medical images could lead to wrong diagnosis.
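A compact sketch of the LZW dictionary-building loop the comparison uses, applied to a byte string rather than image data; the kidney-image pipeline and the PSNR/CR evaluation from the paper are not reproduced here.

```python
# LZW compression sketch: grow a phrase dictionary on the fly and emit
# dictionary codes, so repeated patterns shrink to single codes.
def lzw_compress(data):
    """Return a list of dictionary codes for the input bytes."""
    table = {bytes([i]): i for i in range(256)}
    next_code = 256
    phrase = b""
    out = []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in table:
            phrase = candidate          # keep extending the known phrase
        else:
            out.append(table[phrase])   # emit code for the longest match
            table[candidate] = next_code
            next_code += 1
            phrase = bytes([byte])
    if phrase:
        out.append(table[phrase])
    return out

codes = lzw_compress(b"ABABABABABAB")
```

The 12-byte repetitive input compresses to 6 codes, showing why LZW is lossless: the decoder can rebuild the identical dictionary and recover every byte exactly, which is the property the paper recommends for medical images.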
- Item: Improve Efficiency in Management and Administration of Immunization System through a Centralized Database System (University of Oradea, 2020-10-01) OLANIYI O. Timothy; YUSUFF R. Shakirat; BALOGUN F. Bukola: Immunization is a process by which a person is made resistant to life-threatening infectious diseases through the administration of vaccines. The immunization history contained in an individual's hand card holds vital information that is required by several users for decision making. The current manual method of keeping immunization information is prone to several challenges, such as local domiciliation of immunization history, loss of or damage to immunization hand cards, a poor data collation process and unavailability of real-time information. The proposed centralized system will allow individuals to access their immunization history anywhere and at any time. The system will ensure residents of the State can commence immunization in one health facility and continue vaccination in other facilities within the State. The system is implemented using PHP as the programming interface and MySQL as the backend. The performance of the system shows that it reduces the burden of carrying a paper hand card as well as helping health care centers, government officials and interested partners speed up report generation for decision making.
- Item: Credit Risk Prediction in Commercial Bank Using Chi-Square with SVM-RBF (Springer Nature Switzerland, 2020-11-24) Kayode Omotosho Alabi, Sulaiman Olaniyi Abdulsalam, Roseline Oluwaseun Ogundokun, and Micheal Olaolu Arowolo: Financial credit risk management has been a foremost concern with a lot of challenges, especially for banks seeking to reduce their principal loss. In this study, machine learning, a promising technique for credit scoring, is used for analyzing risks in banks. It has become critical to extract beneficial knowledge from a great number of complex datasets. A machine learning approach using Chi-Square feature selection with an SVM-RBF classifier was analyzed on Taiwan bank credit data. The model provides important information with enhanced accuracy that will help in predicting loan status. The experiment achieves 93% accuracy compared to the state-of-the-art. Keywords: Credit risk · Chi-square · SVM · Bank · Machine learning
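The chi-square feature-scoring step that precedes the SVM-RBF classifier can be sketched as follows: features whose distribution differs most across classes score highest and are kept. The toy 2×2 contingency data is an assumption, not the Taiwan credit dataset, and the SVM stage is omitted.

```python
# Chi-square statistic for a binary feature vs. a binary class label,
# computed from a 2x2 contingency table of observed counts.
def chi_square(observed):
    """Chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    (a, b), (c, d) = observed
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(observed):
        for j, obs in enumerate(obs_row):
            expected = row[i] * col[j] / n   # counts expected under independence
            stat += (obs - expected) ** 2 / expected
    return stat

informative = chi_square([[40, 10], [10, 40]])   # feature tracks the class
uninformative = chi_square([[25, 25], [25, 25]]) # feature independent of class
```

A feature independent of the class scores 0; ranking features by this statistic and keeping the top scorers is the selection step before training the classifier.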
- Item: Predicting Student Academic Performance Using Artificial Neural Network (LASU Journal of Research and Review in Science. 8(1): 1-7. Published by Lagos State University, 2021-03-23) Olaniyan, O. M., Oloyede, A. O., Aremu, I. A., Babatunde, R. S., and Adeyemi, B. M. (2021): Introduction: Predicting student academic performance plays an important role in academics. Classifying students using conventional techniques cannot give the desired level of accuracy, while doing so with soft computing techniques may prove beneficial. Aim: This study aims to accurately predict and identify student academic performance in educational institutions using an Artificial Neural Network. Materials and Methods: An artificial neural network was employed to compute the performance procedure using the MATLAB simulation tool. The performance of the neural network was evaluated by accuracy and Mean Square Error (MSE). This tool has a simple interface and can be used by an educator for classifying students and distinguishing students with low achievements or at-risk students who are likely to have low performance. Results: Findings revealed that the neural network has the highest prediction accuracy (98%), followed by the decision tree (91%). The support vector machine and k-nearest neighbor had the same accuracy (83%), while naive Bayes gave the lowest prediction accuracy (76%). Conclusion: This work has helped to analyze the capabilities of an Artificial Neural Network in the accurate prediction of students' academic performance using regression and a feed-forward neural network (FFNN) as evaluation metrics.
- Item: A scheme for Resilient Routing in Residue Number System based Software Defined Networks (KWASU PRESS AND PUBLISHING, 2021-04-01) Oke Afeez Adeshina, Akinbowale Nathaniel Babatunde, Oloyede Abdulkarim Ayopo, Kazeem Alagbe Gbolagade: In recent times, advances in internet technology and the rise of network traffic, owing to the relentless creation of rigorous and mission-critical applications, have drawn serious attention to resilient routing in Software Defined Networks (SDN). This evolution has made it possible for SDN to be used extensively to respond to the requirements of modern networking. The Residue Number System (RNS) has been used to minimize latency in routing tables for the OpenFlow protocol. However, the majority of network elements remain vulnerable to failures, especially transmission devices and links. It has been shown that proactive approaches to link recovery in RNS-based SDN can select an alternate path during link failures; however, the challenge of fast failure recovery still exists with large data path labels for both primary and alternate routes. Furthermore, the problem of early backup path failure before primary path failure still remains a challenge. This paper presents an approach intended to improve SDN routing and reduce the computational overhead in RNS-based SDN. The proposed scheme utilizes a shortest path re-route algorithm and a reactive mechanism to respond to link failure when it occurs. The proposed scheme, which will be implemented as a prototype using the Mininet emulator and the Floodlight controller, is intended to effectively route packets, especially when there are link failures, in RNS-based SDN. Keywords: Residue Number System (RNS); Software Defined Network (SDN); resilience; OpenFlow; traffic engineering; controller
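The reactive recovery idea above can be sketched in miniature: when a link on the primary path fails, the controller recomputes a shortest path over the surviving links. Breadth-first search over unit-cost links stands in for the paper's shortest-path re-route algorithm, and the four-switch topology is an invented example, not an RNS/SDN implementation.

```python
from collections import deque

# BFS shortest path over an undirected, unit-cost switch topology.
def shortest_path(links, src, dst):
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nbr in links.get(node, []):
            if nbr not in prev:
                prev[nbr] = node
                queue.append(nbr)
    return None   # destination unreachable

topology = {"s1": ["s2", "s3"], "s2": ["s1", "s4"],
            "s3": ["s1", "s4"], "s4": ["s2", "s3"]}
primary = shortest_path(topology, "s1", "s4")             # routes via s2
topology["s1"].remove("s2"); topology["s2"].remove("s1")  # link s1-s2 fails
backup = shortest_path(topology, "s1", "s4")              # reroute via s3
```

The recomputation only happens after the failure is detected, which is the defining trade-off of a reactive (rather than proactive) recovery mechanism.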
- Item: English to Yoruba short message service Speech and Text Translator for Android Phones (Springer, 2021-05-20) Akinbowale Nathaniel Babatunde, Christiana Oluwakemi Abikoye, Abdulkarim Ayopo Oloyede, Roseline Oluwaseun Ogundokun, Afeez Adeshina Oke, Hafsat Omolola Olawuyi: Machine language translation (MLT) has received substantial research attention in Europe and Asia, but work on African languages, especially the Yoruba language, is rare. There is, however, a communication barrier between people who speak only Yoruba and foreign visitors, and in places where the language is not spoken or understood. There is a need to focus on the challenges of English-to-Yoruba machine translation to bridge this communication gap. This study therefore implemented the rule-based machine translation technique. The method involved formulating 20 computational rules for the translation process. An offline Android-based English-to-Yoruba short message service application was developed using the Android Studio IDE and some of the available inbuilt Android plugins, alongside a locally produced bilingual dictionary. This paper also compared translations done by the developed mobile app against human translation, to explore why machine translation systems still make errors in interpreting natural human language. A questionnaire on the translations was distributed to one hundred individuals; eighty-one participants gave their feedback. The responses were subjected to statistical analysis using SPSS. Findings showed that human translation performed best in terms of precision and fluency, which are influenced by the quality and quantity of the computational rules formulated. Keywords: Machine translation language, Language translator, Speech recognition, Android, Yoruba language, Natural Language Processing.
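A toy sketch of the dictionary-lookup core of rule-based machine translation: tokenize, look each word up in a bilingual dictionary, and fall back to the original word when no entry exists. The three dictionary entries are common Yoruba nouns used for illustration; the app's 20 computational rules handle far more (tone, reordering, agreement) and are not reproduced here.

```python
# Word-for-word rule-based translation with a fallback for unknown words.
def translate(sentence, dictionary):
    tokens = sentence.lower().split()
    return " ".join(dictionary.get(tok, tok) for tok in tokens)

en_yo = {"water": "omi", "house": "ilé", "child": "ọmọ"}
out = translate("The child drank water", en_yo)
```

Words missing from the dictionary pass through untranslated, which is one simple, visible reason rule-based systems lag human translators on fluency, as the study's questionnaire results reflect.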
- Item: Improving the Accuracy of Static Source Code Based Software Change Impact Analysis Through Hybrid Techniques: A Review (Penerbit UMP, 2021-05-21) Yusuf S. R.; Bajeh A. O.; Aro T. O.; Adewole K. S.: Change is an inevitable phenomenon of life. This inevitability of change in the real world has made software change an indispensable characteristic of software systems and a fundamental task of software maintenance and evolution. Changes to software may arise as a result of feature enhancement requests, bug fixes and technological advancements, amongst others. The continuous evolution of software systems can greatly affect the quality and reliability of such systems if proper mechanisms to manage them are not adequately provided. Consequently, software Change Impact Analysis (CIA) has been identified as an approach to help address this problem. CIA is an essential activity for comprehending and identifying the impacts of potential software changes as a way of preventing the system from entering an erroneous state. A good CIA technique is one that helps to reduce maintenance costs. A hybrid CIA technique is a blend of multiple CIA techniques. A number of hybrid CIA techniques have been proposed in the literature; however, no study has reviewed hybrid CIA techniques holistically. This paper tries to fill this gap by presenting a summary of the methods and techniques so far adopted in code-based hybrid CIA techniques, with a view to suggesting possible future directions. Literature including journal articles, conference proceedings and workshop papers published between 2009 and 2019 related to the topic was reviewed. The following themes were employed in the analysis of the review, based on their mention in most of the reviewed literature: size and type of subject software systems; level of granularity; CIA techniques and methods; and evaluation metrics. The results of the review reveal that a combination of a minimum of two CIA techniques is sufficient to gain improved performance. Likewise, hybrid CIA techniques have consistently shown significant improvement in performance over baseline techniques. However, a comparison of existing hybrid CIA techniques in terms of performance is yet to be carried out. In addition, findings from the paper isolated Latent Semantic Indexing (LSI) as the main method utilized for analyzing textual source code data, despite advancement in the field of Information Retrieval (IR). The paper further highlights areas for future research, including a performance evaluation of existing hybrid CIA techniques. To achieve this, it is proposed to have a universal benchmark source code dataset of different programming languages, sizes and scopes. Furthermore, it is necessary to try out other categories of IR models, such as the Latent Dirichlet Allocation (LDA) topic model and those based on deep learning techniques like doc2vec. It would also be a good way forward if other possible CIA combinations could be implemented, particularly in the aspect of utilizing the syntactic and semantic information inherent in source code to achieve a holistic source code CIA.
- Item: Blockchain consensus: An overview of alternative protocols (MDPI, 2021-07-27) Oyinloye, Damilare Peter; Teh, Je Sen; Jamil, Norziana; Alawida, Moatsum: Blockchain networks are based on cryptographic notions that include asymmetric-key encryption, hash functions and consensus protocols. Despite their popularity, mainstream protocols such as Proof of Work or Proof of Stake still have drawbacks. Efforts to enhance these protocols led to the birth of alternative consensus protocols catering to specific areas, such as medicine or transportation. These protocols remain relatively unknown despite having unique merits worth investigating. Although past reviews have been published on popular blockchain consensus protocols, they do not include most of these lesser-known protocols. Highlighting these alternative consensus protocols contributes toward the advancement of the state of the art, as they have design features that may be useful to academics, blockchain practitioners and researchers. In this paper, we bridge this gap by providing an overview of alternative consensus protocols proposed within the past three years. We evaluate their overall performance based on metrics such as throughput, scalability, security, energy consumption and finality. In our review, we examine the trade-offs that these consensus protocols have made in their attempts to optimize scalability and performance. To the best of our knowledge, this is the first paper that focuses on these alternative protocols, highlighting their unique features that can be used to develop future consensus protocols.
- Item: Enhanced Image Security using Residue Number System and New Arnold Transform (Department of Computer Engineering, Universitas Diponegoro, 2021-10-31) Akinbowale Nathaniel Babatunde, Oke Afeez Adeshina, Abdulkareem Ayopo Oloyede, Bello Aisha Oiza: This paper aims to improve the image scrambling and encryption effect of the traditional two-dimensional discrete Arnold transform by introducing a Residue Number System (RNS) with three moduli and the New Arnold Transform. The study focuses on improving the classical discrete Arnold transform with quasi-affine properties and applying it to image scrambling and encryption research. The design of the method is specific to the three-moduli set {2^n, 2^(n+1)+1, 2^(n+1)-1}. This moduli set contains balanced moduli, leading to effective execution of the residue-to-binary converter. The study employs an arithmetic residue-to-binary converter and an improved Arnold transformation algorithm. The encryption process uses MATLAB to accept a digital image input and subsequently convert the image into an RNS representation. The images are connected as a group, and the resulting image is encrypted using the Arnold transformation algorithm. At decryption, the encrypted image is used as input to the anti-Arnold (reverse Arnold) transformation algorithm, which recovers the original RNS representation (original pixel values); the RNS representation is then converted back to its binary form. Security analysis tests, such as histogram analysis, key space, key sensitivity and correlation coefficient analysis, were performed on the encrypted image. Results show that the hybrid system can use the improved Arnold transform algorithm with better security and no constraint on image width and size. Keywords - Cryptosystem, Forward conversion, Residue Number System, Reverse Conversion
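The forward and reverse conversion steps can be sketched for the paper's moduli set {2^n, 2^(n+1)+1, 2^(n+1)-1} with n = 3, i.e. {8, 17, 15}; a generic Chinese Remainder Theorem reconstruction stands in for the paper's optimized reverse converter, and the pixel value is an invented example.

```python
# RNS forward conversion: a value becomes its residues modulo each modulus.
def forward(x, moduli):
    return [x % m for m in moduli]

# Reverse conversion via the Chinese Remainder Theorem
# (requires the moduli to be pairwise coprime, as {8, 17, 15} are).
def reverse(residues, moduli):
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(..., -1, m) = modular inverse
    return x % M

n = 3
moduli = [2 ** n, 2 ** (n + 1) + 1, 2 ** (n + 1) - 1]   # {8, 17, 15}
pixel = 200
residues = forward(pixel, moduli)
restored = reverse(residues, moduli)
```

The dynamic range is 8 × 17 × 15 = 2040, comfortably covering 8-bit pixel values, so every grey level maps to a unique residue triple and back. (The three-argument `pow` modular inverse requires Python 3.8+.)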
- Item: Application of Residue Number Systems in Enhancing the Transmission of Secured Videos (National Institute for Research and Development in Informatics – ICI Bucharest, 2021-12-01) Akinbowale Nathaniel BABATUNDE and Abdulkarim Ayopo OLOYEDE: Full video encryption has been established as the most suitable technique to guarantee the security of digital video data during transmission, but it cannot be widely adopted because of its large disk space requirement and long processing time. Therefore, this paper aims at reducing the computational space complexity using the Residue Number System (RNS). The MPEG IV algorithm and encryption of pixel values using the traditional RNS moduli set were employed for the encryption, while a modified reverse converter was used for decryption. Java was used for the implementation, and an evaluation was performed to compare the disk sizes of the cipher and deciphered videos. The scheme outperformed the existing evaluated cryptosystems in terms of cipher size, and the decrypted videos increased by 49% when compared with the original video. The space complexity problem in full video encryption was adequately reduced using RNS. It is recommended that RNS be used for securing video transmission and storage. Keywords: Forward converter, Reverse converter, Moduli set, Residual frame, Moving Pictures Experts Group IV (MPEG IV), Video encryption, Residue Number System (RNS)
- Item: Fraud detection in customers' electricity consumption in Nigeria using machine learning approach (KWASU PRESS AND PUBLISHING, 2021-12-06) Sulaiman Olaniyi Abdulsalam, Micheal Olaolu Arowolo, Ronke Babatunde, Musa Isiaka, Shakirat Oluwatosin Haroon Sulyman: Electricity theft is estimated to have cost Nigeria billions of Naira over the years. Electric utilities use data analytics to discover unusual consumption patterns and possible fraud in order to prevent electricity theft. This work uses data analysis to detect electricity theft, as well as a measure that uses this threat model to compare and evaluate anomaly detectors. This study employs machine learning algorithms to categorize fraud in customers' electricity use, as data mining techniques have helped multiple companies and sectors improve their technologies. Support Vector Machine (SVM) and C4.5 Decision Tree classification algorithms were used to detect fraud from consumer electricity use data. The accuracies of SVM and C4.5 were 63.4% and 65.9%, respectively. In comparison, the Map-Reduced-ANOVA with SVM attained an accuracy of 77.5%. Keywords: Fraud detection; Machine learning; classification; support vector machine; decision tree