Auto HMM: Automatic Discrete and Continuous HMM including Model Selection

Overview


Description

Citation

Features

Instruction

License


Description

Python package that automatically performs model selection for discrete and continuous unsupervised HMMs.
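
In practice, model selection here means fitting a candidate HMM for every combination of hidden-state count and mixture count up to user-supplied maxima, scoring each fit with AIC/BIC, and keeping the best model. The sketch below illustrates the idea with hmmlearn's GMMHMM (the estimator Auto_HMM builds on); the grid bounds and the diagonal-covariance parameter count are illustrative, not the package's exact code.

import numpy as np
from hmmlearn.hmm import GMMHMM

def select_by_bic(X, lengths, max_states=3, max_mixtures=2, n_iter=100):
    # Fit a GMMHMM for every (n_states, n_mix) pair and keep the lowest BIC.
    best_bic, best_model = np.inf, None
    n_feat = X.shape[1]
    for n_states in range(2, max_states + 1):
        for n_mix in range(1, max_mixtures + 1):
            model = GMMHMM(n_components=n_states, n_mix=n_mix,
                           covariance_type='diag', n_iter=n_iter).fit(X, lengths)
            # Free parameters of a diagonal-covariance GMMHMM:
            # transitions + mixture weights + means + variances.
            n_params = (n_states * (n_states - 1)
                        + n_states * (n_mix - 1)
                        + 2 * n_states * n_mix * n_feat)
            bic = -2 * model.score(X, lengths) + n_params * np.log(X.shape[0])
            if bic < best_bic:
                best_bic, best_model = bic, model
    return best_model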


Citation

If you find this package useful, or if you use it in your research or work, please consider citing it as follows:

@article{tadayon2020comparative,
  title={Comparative analysis of the hidden markov model and lstm: A simulative approach},
  author={Tadayon, Manie and Pottie, Greg},
  journal={arXiv preprint arXiv:2008.03825},
  year={2020}
}

Instruction

For more information, please go over the two examples (the HMM_testing and DHMM_testing files). A minimal usage sketch follows.
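
This sketch mirrors the call pattern from the HMM_testing example (also reproduced in a usage report in the comments below); the parameter values and the CSV path are illustrative, not prescribed defaults.

import time
import pandas as pd
from Hidden_Markov_Model import *
from Hidden_Markov_Model.HMM import *

start = time.time()
Train_ratio = 0.8    # fraction of the sequences used for training
Cov_Type = 'diag'    # covariance structure of the Gaussian mixtures
Max_state = 3        # largest number of hidden states to try
Max_mixture = 2      # largest number of mixture components per state
Iter = 1000          # maximum EM iterations
Feat = 1             # number of features per observation
N = 100              # number of sequences in the data
T = 50               # length of each sequence
flag = 1

Data = pd.read_csv('path/to/data.csv')   # hypothetical path
model = Supervised_HMM(Train_ratio, Cov_Type, Max_state, Max_mixture,
                       Iter, Feat, N, T, Data, flag)
model.Best_States()
print('Total time in seconds:', time.time() - start)

Note that Train_ratio, N, and T must be consistent with the shape of the data; see the note on the lengths error in the comments below.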


License

This software is released under the MIT license.

Comments
  • Computer vision project

    We are doing a project to detect ADHD from a video, and we extract a feature vector from each frame. Now we want to extract a fixed-length feature vector for the whole video, and we think we can use a Markov model for that purpose. Could you tell us whether your code can be helpful in our case, as we aren't sure how to use it for training and testing?

    opened by Aya-Mahmoud-99 0
  • utilizing Auto_HMM in finance

    Hi, I'm trying to use Auto_HMM to find the upward, downward, and stable states of a stock's close-price time series. The model is supposed to learn to distinguish which period has an upward trend, which has a downward trend, and which has no specific trend (is rather monotonic). Below you can see the Adj Close price data for the S&P 500 index between '2010-07-01' and '2011-11-01'. Three periods are marked, each with a line of a different color. I want to feed this data to the HMM model and get the states according to the periods. Here is what I have tried so far:

    import pandas as pd
    from pandas_datareader import data as web
    import numpy as np
    
    x = web.get_data_yahoo('^GSPC' , start = '2010-07-01' , end = '2011-11-01')['Adj Close'].rename('close')
    x[250:].to_csv("the path")
    
    # ------------------------------------------------------------------------------------------------
    # then, I used your code:
    from Hidden_Markov_Model import *
    from Hidden_Markov_Model.HMM import *
    import time
    
    Start=time.time()
    Train_ratio=0.1
    Cov_Type='diag'
    Max_state=3
    Max_mixture=2
    Iter=1000
    Feat=1
    
    # I have 140 records, so I guess N should be equal to 140.
    N=140
    T=50
    flag=1
    Path="the path"
    Data=pd.read_csv(Path)
    Exam_HMM=Supervised_HMM(Train_ratio,Cov_Type,Max_state,Max_mixture,Iter,Feat,N,T,Data,flag)
    Exam_HMM.Best_States()
    END=time.time()
    print('Total Time Takes in seconds',END-Start)
    

    But I got this error:

    One mixture component is over 1
    ---------------------------------------------------------------------------
    ValueError                                Traceback (most recent call last)
    ~\AppData\Local\Temp/ipykernel_7616/86438458.py in <module>
         17 Data=pd.read_csv(Path)
         18 Exam_HMM=Supervised_HMM(Train_ratio,Cov_Type,Max_state,Max_mixture,Iter,Feat,N,T,Data,flag)
    ---> 19 Exam_HMM.Best_States()
         20 END=time.time()
         21 print('Total Time Takes in seconds',END-Start)
    
    c:\Users\Shayan\Desktop\Hidden_Markov_Model\HMM.py in Best_States(self)
        146         self.Len=[self.T for ii in range(0,self.Data_train.shape[0])]   # Lengths must be list
        147         self.Train_Data = np.array(self.Data_train).reshape((-1,1)) # Convert to numpy array with one column
    --> 148         self.AIC_BIC()
        149         self.Best_BIC()
        150 
    
    c:\Users\Shayan\Desktop\Hidden_Markov_Model\HMM.py in AIC_BIC(self)
        176             # self.num_params=self.Max_state*(self.Max_state-1)+ self.Max_state*(ii-1)+(ii*self.Max_state)*self.Feat+((self.Feat**2+self.Feat)/2)*ii*self.Max_state  # Full Covariance
        177             self.num_params = self.Max_state*(self.Max_state-1)+ self.Max_state*(ii-1)+(ii*self.Max_state)*self.Feat+(self.Max_state*ii*self.Feat)  # Diagonal
    --> 178             Model=GMMHMM(n_components=self.Max_state,n_mix=ii,covariance_type=self.Cov_Type,params='stmcw', init_params='stmcw',tol=pow(10,-5),n_iter=self.Iter).fit(self.Train_Data,self.Len)
        179             AIC.append(-2 * Model.score(self.Train_Data) + 2 * self.num_params)
        180             BIC.append(-2 * Model.score(self.Train_Data) +  self.num_params * np.log(self.Train_Data.shape[0]))
    
    ~\Anaconda3\envs\Python3.10\lib\site-packages\hmmlearn\base.py in fit(self, X, lengths)
        504             stats = self._initialize_sufficient_statistics()
        505             curr_log_prob = 0
    --> 506             for sub_X in _utils.split_X_lengths(X, lengths):
        507                 lattice, log_prob, posteriors, fwdlattice, bwdlattice = \
        508                         impl(sub_X)
    
    ~\Anaconda3\envs\Python3.10\lib\site-packages\hmmlearn\_utils.py in split_X_lengths(X, lengths)
         14         n_samples = len(X)
         15         if cs[-1] > n_samples:
    ---> 16             raise ValueError(
         17                 f"more than {n_samples} samples in lengths array {lengths}")
         18         elif cs[-1] != n_samples:
    
    ValueError: more than 14 samples in lengths array [50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50, 50]
    

    Can you please help me? How can I achieve my goal?


    The expected output in my case:

    [0,0,0,0,0,0,0...,0,1,1,1,1,1,1,1,1,...,1,2,2,2,2,2,2,2...,2]
    

    I expect the HMM to find those three states (downward trend, monotonic period, upward trend).

    opened by shayandavoodii 4
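
A note on the ValueError above: hmmlearn requires the entries of the lengths argument passed to fit to sum to the number of rows in X. Here Train_ratio=0.1 leaves only 14 of the 140 records in the training split, while [50]*14 claims 700 samples, hence the mismatch. The sketch below illustrates the contract with hmmlearn directly; splitting the 140-point series into 14 sequences of length 10 is a hypothetical shaping, not Auto_HMM's required input format.

import numpy as np
from hmmlearn.hmm import GMMHMM

series = np.random.randn(140)      # stand-in for the 140 price records
N, T = 14, 10                      # N sequences of length T, with N * T == 140
X = series.reshape(-1, 1)          # hmmlearn expects one column of samples
lengths = [T] * N                  # sums to 140 == X.shape[0]

model = GMMHMM(n_components=3, n_mix=1, covariance_type='diag',
               n_iter=100).fit(X, lengths)
states = model.predict(X, lengths) # per-sample hidden-state labels
print(states[:20])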
  • How to use this dataset?

    Hi Manitadayon, I am glad to have found your code; thanks for sharing it. I am new to machine learning. I want to write code for this paper and use this dataset, but I don't know how to do that. Can you help me? I don't know exactly what the states are in this paper, how to give the dataset to your code, or how to write code for this paper.

    opened by moryekram 6