Anomaly Detection Based on Hierarchical Clustering of Mobile Robot Data

Overview


1. Introduction

This report presents an approach for detecting anomalies in a mobile robot's current and vibration data. The main idea is to examine all of the data, separate it into two clusters, normal and anomaly, and then use the clustering results to compute a merged anomaly score for each data sample. For this purpose, both the current and the vibration data are clustered with a hierarchical clustering algorithm. Before clustering, several preprocessing steps are applied: windowing, feature extraction, dynamic time warping, and min-max normalization.

You can access our paper here.

2. Data of Interest

The mobile robot's sensors produce two different types of data: current and vibration. Both are produced at the same frequency, but they have different characteristics. The current data is a single numeric value, whereas the vibration data is a time series. So each current data packet carries a single value, while each vibration data packet carries many values.

Figures: a current data sample and a vibration data sample.

3. Proposed Method

Two different methods are proposed to detect anomalies in the data. They share the windowing step but differ in the remaining preprocessing steps: feature extraction, normalization, and dynamic time warping. After preprocessing, the data is split into two clusters, normal and anomaly, using hierarchical clustering, and an anomaly score is produced for each data sample as a result. The results of the two methods are then combined: the merged anomaly score for each data sample is the mean of the two individual scores. Both methods are applied separately to the current data and to the vibration data. The proposed method is shown below.

In the rest of this report, method 1 refers to the method that uses feature extraction and method 2 refers to the method that uses DTW. Remember that both methods also share common steps.

3.1 Preprocessing Steps

A. Windowing
In this step, the data is split into equally sized subsets called windows. Feature extraction requires time-series input, and windowing converts the raw samples into short time series. In this project, the window size is 3. This step is performed for both methods. A sample output of the windowing process is shown below.
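A minimal sketch of the windowing step in Python (the function name is illustrative, and non-overlapping windows are an assumption since the report does not state whether windows overlap):

```python
import numpy as np

def make_windows(values, window_size=3):
    """Split a 1-D sequence of sensor readings into consecutive,
    non-overlapping windows of `window_size` samples.
    Trailing samples that do not fill a whole window are dropped."""
    values = np.asarray(values, dtype=float)
    n_windows = len(values) // window_size
    return values[:n_windows * window_size].reshape(n_windows, window_size)

# Example: 8 current readings -> 2 windows of size 3 (last 2 samples dropped)
current = [0.41, 0.39, 0.44, 0.47, 0.95, 0.91, 0.40, 0.42]
print(make_windows(current, window_size=3))
```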

B. Feature Extraction
Features are extracted separately for each window. Nine different features are used, as listed below:
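The nine features themselves are listed only in the figure, so the exact set in the sketch below (mean, standard deviation, min, max, median, RMS, peak-to-peak, skewness, kurtosis) is an illustrative assumption, not taken from the report:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def extract_features(window):
    """Summary statistics for one window. The concrete set of nine
    features here is an assumption for illustration only."""
    w = np.asarray(window, dtype=float)
    return np.array([
        w.mean(), w.std(), w.min(), w.max(), np.median(w),
        np.sqrt(np.mean(w ** 2)),   # RMS
        w.max() - w.min(),          # peak-to-peak
        skew(w), kurtosis(w),
    ])

print(extract_features([0.41, 0.39, 0.44]))   # one window of size 3
```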

C. Dynamic Time Warping
In method 2, DTW is used to measure similarity instead of the Euclidean distance. Since the windowing step has already converted the data into short time series, DTW can be applied directly to the windows.
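A small sketch of the classic dynamic-programming DTW distance between two windows (absolute difference as the local cost is an assumption; the report does not specify the cost function):

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic-programming DTW between two 1-D sequences,
    using the absolute difference as the local cost."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Distance between two windows; computing this for every pair of windows
# yields a distance matrix that hierarchical clustering can consume.
print(dtw_distance([0.41, 0.39, 0.44], [0.47, 0.95, 0.91]))
```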

Figures: feature extraction and dynamic time warping.

D. Min-Max Normalization
Min-max normalization is one of the most common ways to normalize data. For every feature, the minimum value of that feature is transformed into 0, the maximum value into 1, and every other value into a decimal between 0 and 1. Min-max normalization is applied to the features extracted from the windows. This step is performed only for method 1.
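A minimal sketch of column-wise min-max scaling over the extracted feature matrix (mapping constant columns to 0 is an assumption to avoid division by zero):

```python
import numpy as np

def min_max_normalize(X):
    """Scale every feature column of X into [0, 1]; constant columns map to 0."""
    X = np.asarray(X, dtype=float)
    col_min, col_max = X.min(axis=0), X.max(axis=0)
    span = np.where(col_max > col_min, col_max - col_min, 1.0)
    return (X - col_min) / span

# Rows are windows, columns are extracted features.
features = np.array([[0.41, 0.02, 0.39],
                     [0.91, 0.03, 0.88],
                     [0.44, 0.02, 0.40]])
print(min_max_normalize(features))
```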

3.2 Hierarchical Clustering

Hierarchical clustering comes in two flavors, agglomerative and divisive; this work uses the agglomerative approach. At this point the preprocessing steps of method 1 and method 2 are complete and the windows are ready for clustering. The windows are fed into the hierarchical clustering algorithm, which assigns each window to a cluster. These cluster assignments are then used to calculate the anomaly score for the whole data set. This step is performed for both methods, and a dendrogram representing the clustering result is produced.
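A sketch of this step with SciPy, using small dummy inputs in place of the real normalized feature matrix and DTW distance matrix; the linkage criteria (`ward`, `average`) are assumptions, since the report does not specify them:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, fcluster, dendrogram
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)

# Method 1: agglomerative clustering on the normalized window features
# (dummy 20 windows x 9 features matrix stands in for the real one).
normalized_features = rng.random((20, 9))
Z1 = linkage(normalized_features, method="ward")    # Euclidean distance
labels1 = fcluster(Z1, t=2, criterion="maxclust")   # two clusters: normal vs. anomaly

# Method 2: agglomerative clustering on a precomputed DTW distance matrix.
dtw_matrix = rng.random((20, 20))
dtw_matrix = (dtw_matrix + dtw_matrix.T) / 2
np.fill_diagonal(dtw_matrix, 0.0)
Z2 = linkage(squareform(dtw_matrix, checks=False), method="average")
labels2 = fcluster(Z2, t=2, criterion="maxclust")

# Dendrogram representing the method-1 clustering result
dendrogram(Z1)
plt.show()
```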

3.3 Find Anomaly Score

The anomaly score is calculated separately from the hierarchical clustering results of method 1 and method 2. The clustering algorithm assigns each window to a cluster. Using these assignments, the anomaly score of a cluster C is calculated as follows (#All Windows: the total number of windows; #C Windows: the number of windows belonging to cluster C):

C_anomaly = (#All Windows - #C Windows) / #All Windows

After the anomaly score has been calculated for each method, the merged anomaly score is generated as the mean of the two:

C_merged_anomaly = (C_anomaly_method1 + C_anomaly_method2) / 2

The higher the anomaly score, the more likely the sample is an anomaly.
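A small sketch of how the per-window scores and the merged score can be computed from the cluster labels (the function name `anomaly_scores` and the example labels are illustrative):

```python
import numpy as np

def anomaly_scores(labels):
    """Per-window score: C_anomaly = (#All Windows - #C Windows) / #All Windows."""
    labels = np.asarray(labels)
    n_total = len(labels)
    sizes = {c: int(np.sum(labels == c)) for c in np.unique(labels)}
    return np.array([(n_total - sizes[c]) / n_total for c in labels])

# Cluster labels produced by method 1 and method 2 for the same windows
labels_method1 = [1, 1, 1, 2, 1, 1]
labels_method2 = [1, 1, 2, 2, 1, 1]
merged = (anomaly_scores(labels_method1) + anomaly_scores(labels_method2)) / 2
print(merged)   # higher values -> more likely to be an anomaly
```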

4. Experiments

The anomaly score is shown at the top right of each figure. Different clusters are shown in different colors.

Current Data Results

Figures: feature-extraction clustering anomaly score, DTW clustering anomaly score, and merged anomaly score.

Vibration Data Results

Figures: feature-extraction clustering anomaly score, DTW clustering anomaly score, and merged anomaly score.
