Transformer_Huffman_coding
Huffman coding implemented end to end with a Transformer model.
2022/2/19 Release Notes
1: Created a new branch
2: Split the previous main.py into three files (see the module sketch after this list):
   main.py  -----> drives sequence training
   model.py -----> builds the Transformer model
   utils.py -----> provides the training-data generation function
3: Removed the function in the original file that automatically generated the Huffman code for a given sequence
4: The imported Huffman package is now used to generate a constant (fixed) codebook (see the codebook sketch after this list)
5: Formatted the code with Prettier
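A minimal sketch of how the three files from item 2 could fit together. The names `gen_batch` and `TransformerModel`, and the data format (symbol sequences in, Huffman bit strings out), are assumptions for illustration, not the repository's actual interfaces.

```python
import random
import torch.nn as nn

# --- utils.py (assumed role): generate training pairs under a fixed codebook ---
def gen_batch(codebook, batch_size=8, seq_len=16):
    symbols = list(codebook)
    src = [[random.choice(symbols) for _ in range(seq_len)] for _ in range(batch_size)]
    tgt = ["".join(codebook[s] for s in seq) for seq in src]  # Huffman bit strings
    return src, tgt

# --- model.py (assumed role): build the Transformer ---
class TransformerModel(nn.Module):
    def __init__(self, vocab_size, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(d_model, nhead, num_layers, num_layers,
                                          batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        # src/tgt are LongTensors of token ids; converting gen_batch output
        # to tensors is omitted in this sketch
        return self.out(self.transformer(self.embed(src), self.embed(tgt)))

# --- main.py (assumed role): build the model, call gen_batch, run the training loop ---
```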
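The notes do not name the Huffman package used in item 4, so the sketch below builds a constant codebook with only the standard-library heapq; the `build_codebook` helper and the example symbol frequencies are assumptions.

```python
import heapq
from collections import Counter

def build_codebook(frequencies):
    """Return a dict symbol -> bit string from a {symbol: count} mapping (assumed helper)."""
    if len(frequencies) == 1:  # single-symbol edge case: give it a one-bit code
        return {sym: "0" for sym in frequencies}
    # Heap entries: (weight, tie_breaker, {symbol: code_so_far})
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(frequencies.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, lo = heapq.heappop(heap)
        w2, _, hi = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in lo.items()}       # lighter subtree gets prefix 0
        merged.update({s: "1" + c for s, c in hi.items()})  # heavier subtree gets prefix 1
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

# Example: a constant codebook from assumed symbol frequencies
codebook = build_codebook(Counter({"a": 5, "b": 2, "c": 1, "d": 1}))
print(codebook)  # a prefix-free code, e.g. {'a': '1', 'b': '00', 'c': '010', 'd': '011'}
```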
Issues awaiting resolution:
1: Add code to generate attention maps (see the sketch after this list)
2: Remove the prints used to check intermediate results from the code
3: Translate the Chinese comments in the code into English
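For open issue 1, one possible way to produce an attention map, assuming the model is built on torch.nn attention layers; the standalone layer, dummy input, and output filename here are illustrative stand-ins, not the repository's code.

```python
import torch
import torch.nn as nn
import matplotlib.pyplot as plt

# Assumed setup: a single self-attention layer standing in for one layer of the model
d_model, nhead, seq_len = 64, 4, 16
attn = nn.MultiheadAttention(d_model, nhead, batch_first=True)
x = torch.randn(1, seq_len, d_model)  # dummy input; replace with real embeddings

# need_weights=True also returns the attention matrix (averaged over heads by default)
_, weights = attn(x, x, x, need_weights=True)  # weights: (1, seq_len, seq_len)

plt.imshow(weights[0].detach().numpy(), cmap="viridis")
plt.xlabel("key position")
plt.ylabel("query position")
plt.colorbar()
plt.savefig("attention_map.png")
```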