Detecting Cell Outage by Applying a Density-Based Anomaly Detection Algorithm Using Machine Learning Technique: The Case of the Ethio Telecom UMTS Network


This thesis develops a model to detect cell outage on real Ethio Telecom network data using density-based anomaly detection algorithms through the application of machine learning techniques. Cell outage is the total or partial loss of radio coverage in a given area, and the process of identifying it is called cell outage detection. The study followed the Cross-Industry Standard Process for Data Mining (CRISP-DM) methodology, a six-stage open standard process model that describes common approaches to data mining and machine learning. The data covered both normal and problematic network states, obtained from the Ethio Telecom UMTS network in Addis Ababa. The proposed detection framework used network performance data of the neighbor cells, namely incoming handover (inHO), which is the process of transferring an ongoing call from one cell to another, and traffic data originated at the base station and terminated at the mobile device, to capture the normal network state and to detect the outage of the target cell automatically within a pre-set time interval in the UMTS network environment.

To profile normal network operation, the study used two density-based anomaly detection algorithms, namely the K-Nearest Neighbor (K-NN) and Local Outlier Factor (LOF) algorithms, of which one was selected based on its performance during training. The models were validated with K-fold cross validation, and the optimal model was chosen by parameter selection over different neighbor counts K (K = 1, 2, 3, …, 30). Receiver Operating Characteristic (ROC) curves were used to compare the two algorithms. Based on the results, the K-NN anomaly detector (K-NNAD) performed better than the LOF anomaly detector (LOFAD) and was therefore selected as the detector in the profiling stage. The system model was then tested on real problematic network state data, and classic data mining metrics were obtained. The testing results confirmed that the K-NNAD method performs better at detecting outage cells within the proposed framework.
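The following is a minimal sketch of the kind of pipeline the abstract describes, not the thesis implementation itself: it profiles normal operation with a K-NN anomaly score and with LOF, sweeps the neighbor count K from 1 to 30, and compares the two detectors by ROC AUC. It assumes scikit-learn, and the synthetic inHO/traffic KPI vectors and outage labels are illustrative placeholders, not the Ethio Telecom dataset.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors, LocalOutlierFactor
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical neighbor-cell KPI vectors per time interval: [inHO, traffic].
normal = rng.normal(loc=[100.0, 50.0], scale=[10.0, 5.0], size=(500, 2))
outage = rng.normal(loc=[20.0, 5.0], scale=[5.0, 2.0], size=(25, 2))

X_train = normal[:400]                        # profile of normal operation
X_test = np.vstack([normal[400:], outage])    # unseen normal + outage intervals
y_test = np.r_[np.zeros(100), np.ones(25)]    # 1 = outage interval

def knn_anomaly_scores(X_train, X_test, k):
    """K-NN anomaly score: distance to the k-th nearest training point."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    dist, _ = nn.kneighbors(X_test)
    return dist[:, -1]

def lof_anomaly_scores(X_train, X_test, k):
    """LOF anomaly score for unseen points (novelty mode)."""
    lof = LocalOutlierFactor(n_neighbors=k, novelty=True).fit(X_train)
    return -lof.score_samples(X_test)         # negate: higher = more anomalous

# Sweep K = 1..30 as in the thesis and compare the two detectors by ROC AUC.
for k in range(1, 31):
    auc_knn = roc_auc_score(y_test, knn_anomaly_scores(X_train, X_test, k))
    auc_lof = roc_auc_score(y_test, lof_anomaly_scores(X_train, X_test, k))
    print(f"K={k:2d}  K-NNAD AUC={auc_knn:.3f}  LOFAD AUC={auc_lof:.3f}")
```

Comparing by ROC AUC, as the thesis does, avoids committing to a single anomaly-score threshold: the K-NN score measures distance to the training profile globally, while LOF normalizes by the density of each point's local neighborhood, and the ROC curve summarizes each detector's trade-off between detection and false-alarm rates across all thresholds.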
