Biweekly Report

PART 1: THE PROCESS OF ASR
PART 2: NEURAL NETWORK
PART 3: EXPERIMENT

Part 1: The Process of ASR

Process of ASR (speech → text):
- Speech (sample vector) → MFCC/Fbank → acoustic features (a matrix, frame by frame)
- Text → word → phone → triphone → state
- GMM-HMM: input = acoustic features, output = state-level labels (PDF ids)
- (TRAIN) DNN: input = acoustic features, output = PDF posteriors
- DECODE: DNN output → TEXT
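The "frame by frame" feature matrix can be sketched in a few lines of numpy. This is only an illustration of the framing step, not the front end actually used in the experiments: the 25 ms window, 10 ms hop, and log power spectrum (rather than full MFCC/Fbank) are assumed typical values.

```python
import numpy as np

def frame_signal(signal, sample_rate=16000, win_ms=25, hop_ms=10):
    """Split a waveform into overlapping frames, one row per frame."""
    win = int(sample_rate * win_ms / 1000)   # 400 samples per frame
    hop = int(sample_rate * hop_ms / 1000)   # 160-sample frame shift
    n_frames = 1 + max(0, (len(signal) - win) // hop)
    return np.stack([signal[i * hop : i * hop + win] for i in range(n_frames)])

def log_power_features(frames):
    """Log power spectrum per frame -> acoustic feature matrix."""
    spec = np.abs(np.fft.rfft(frames * np.hamming(frames.shape[1]), axis=1))
    return np.log(spec ** 2 + 1e-10)

signal = np.random.randn(16000)              # 1 second of fake audio
feats = log_power_features(frame_signal(signal))
print(feats.shape)                            # (frames, fft_bins) -> (98, 201)
```

Each row of `feats` is one 10 ms step of the utterance; the whole matrix is what the GMM-HMM and DNN stages consume.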
PART 2: NEURAL NETWORK

BP (Back Propagation)
- A traditional network cannot handle linearly non-separable problems directly, but this is easy for a BP network.
- Architecture: input → hidden → output
- Shortcoming: vanishing gradients ("gradient disappearance")

RBM (Restricted Boltzmann Machine)
- Define an energy function E(v, h) between the visible layer v and the hidden layer h.
- From E(v, h), derive the joint distribution p(v, h) and the conditionals p(v|h) and p(h|v).
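Written out, the standard RBM formulas behind the symbols on this slide are (a and b are the visible and hidden bias vectors, W the weight matrix, sigma the logistic sigmoid):

```latex
E(v,h) = -a^\top v - b^\top h - v^\top W h
\qquad
p(v,h) = \frac{e^{-E(v,h)}}{Z},
\quad Z = \sum_{v,h} e^{-E(v,h)}
```

```latex
p(h_j = 1 \mid v) = \sigma\!\Big(b_j + \sum_i W_{ij}\, v_i\Big),
\qquad
p(v_i = 1 \mid h) = \sigma\!\Big(a_i + \sum_j W_{ij}\, h_j\Big)
```

Because there are no visible-visible or hidden-hidden connections, both conditionals factorize over units, which is what makes sampling in an RBM cheap.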
DBN (Deep Belief Network)
- A stack of RBMs (layers 1..N).
- Training a DBN:
  1. Unsupervised learning: RBMs initialize the weights layer by layer (pre-training).
  2. Supervised learning: BP fine-tunes the weights.
- Thanks to pre-training, a DBN alleviates the vanishing-gradient problem.
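Step 1 (pre-training) can be sketched as greedy layer-wise CD-1 training, where each RBM's hidden activations become the next RBM's input. A minimal numpy illustration with made-up layer sizes and learning rate; the supervised BP fine-tuning of step 2 is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, lr=0.1, epochs=5):
    """Train one RBM with one step of contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    a = np.zeros(n_visible)                          # visible bias
    b = np.zeros(n_hidden)                           # hidden bias
    for _ in range(epochs):
        v0 = data
        ph0 = sigmoid(v0 @ W + b)                    # p(h|v0)
        h0 = (rng.random(ph0.shape) < ph0) * 1.0     # sample hidden units
        v1 = sigmoid(h0 @ W.T + a)                   # reconstruction p(v|h)
        ph1 = sigmoid(v1 @ W + b)
        W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)
        a += lr * (v0 - v1).mean(axis=0)
        b += lr * (ph0 - ph1).mean(axis=0)
    return W, b

# Greedy layer-wise pre-training: each RBM feeds the next.
data = (rng.random((100, 20)) > 0.5) * 1.0           # fake binary data
layers = []
x = data
for n_hidden in (16, 8):
    W, b = train_rbm(x, n_hidden)
    layers.append((W, b))
    x = sigmoid(x @ W + b)                           # input for next layer
print([W.shape for W, _ in layers])                  # [(20, 16), (16, 8)]
```

The resulting `layers` weights would then initialize a feed-forward network that BP fine-tunes with labels, which is why the gradients no longer have to propagate useful signal from a random starting point.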
RNN and LSTM
- RNNs are good at dealing with sequence data, but vanishing gradients along the time dimension are a shortcoming.
- LSTM addresses this.

CNN and TDNN
1. Sparse connectivity
2. Shared weights
3. Convolution + sub-sampling
- TDNN

Summarize NN
- The main task of every kind of neural network is to find a mapping F(x) → y.
- Actually, the goal is F(x) = y.
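To make the LSTM remedy for time-dimension vanishing gradients concrete, here is a single LSTM cell step sketched in numpy with illustrative sizes. The key point is the additive update of the cell state c, which gives gradients a path through time that is gated rather than repeatedly squashed.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, bias):
    """One LSTM time step: gates i/f/o plus candidate update g."""
    z = x @ W + h @ U + bias                      # all four gates at once
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)  # input/forget/output gates
    g = np.tanh(g)                                # candidate cell update
    c_new = f * c + i * g                         # additive cell-state path
    h_new = o * np.tanh(c_new)
    return h_new, c_new

n_in, n_hid = 3, 4                                # illustrative sizes
W = 0.1 * rng.standard_normal((n_in, 4 * n_hid))
U = 0.1 * rng.standard_normal((n_hid, 4 * n_hid))
bias = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.standard_normal((5, n_in)):          # run over a length-5 sequence
    h, c = lstm_step(x, h, c, W, U, bias)
print(h.shape, c.shape)                           # (4,) (4,)
```

When the forget gate f stays near 1, `c_new = f * c + i * g` passes the old cell state through almost unchanged, so the gradient along the time dimension does not vanish the way it does through a plain RNN's tanh recurrence.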
Part 3: Experiment

There are FOUR experiments:
1. The process of ASR
2. TDNN
3. LSTM
4. Simplifying LSTM (revising the config file and implementing it myself)

Process of ASR

LSTM
- Config, and the network picture generated from the config.
- %WER 78.66 [ 63827 / 81139, 472 ins, 15198 del, 48157 sub ] exp/nnet3/lstm_S_ld5/decode_test_word/wer_9_0.0

Simplifying LSTM
- LSTM: %WER 24.41 [ 19803 / 81139, 224 ins, 1144 del, 18435 sub ] exp/nnet3/lstm_1_ld5/decode_test_word/wer_9_0.0

Summarize
- Shell scripting
- Reading code
- How to handle program bugs
- How to read a config
- How to create a network by revising the config
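The %WER lines follow the standard Kaldi convention: word error rate = (insertions + deletions + substitutions) / number of reference words. A quick arithmetic check of the two decode results above:

```python
def wer(ins, dels, subs, ref_words):
    """Word error rate in percent: total edit errors over reference length."""
    return 100.0 * (ins + dels + subs) / ref_words

# Numbers taken from the two %WER lines above:
print(round(wer(472, 15198, 48157, 81139), 2))  # 78.66
print(round(wer(224, 1144, 18435, 81139), 2))   # 24.41
```

Both reported numerators also check out: 472 + 15198 + 48157 = 63827 and 224 + 1144 + 18435 = 19803 errors against the same 81139 reference words.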