
Keras framework: a CNN + LSTM + Attention deep learning model

Posted by Tango on May 10 at 02:10 · Reply (1)

Questions are welcome — leave a comment below or email me at jiaohaibin@ruc.edu.cn.
Data supplied by JQData (local quantitative financial data).
Experiment 2:
Use the open, close, high, low, volume, and money of the previous 5 time steps to predict the close price at the current time step,
i.e. [None, 5, 6] => [None, 1]   # None is the batch_size
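As a toy illustration of this windowing (the names and the dummy array here are mine, not from the notebook; the real construction from the scaled data appears in a later cell):

import numpy as np

seq_len = 5
data = np.arange(8 * 6, dtype=float).reshape(8, 6)   # 8 time steps x 6 features

# each sample: 5 consecutive rows of all 6 features -> target column of the next row
# (column 0 stands for the target; the notebook later moves 'close' into column 0)
X = np.array([data[i:i + seq_len, :] for i in range(data.shape[0] - seq_len)])
y = np.array([data[i + seq_len, 0] for i in range(data.shape[0] - seq_len)])
print(X.shape, y.shape)   # (3, 5, 6) (3,)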

This post extends the Experiment 2 model by adding an Attention mechanism.

First, a short introduction to attention.
The essence of attention:
it is simply a weighted sum.
The typical setting attention deals with is this:
you have k d-dimensional feature vectors h_i (i = 1, 2, ..., k), and you want to combine the information in these k vectors into a single vector h' (usually also d-dimensional).
Solutions:
1. The simplest, crudest approach is to take the element-wise average of the k vectors as h', which is clearly not ideal.
2. A more reasonable approach is a weighted average, h' = Σ_{i=1}^{k} α_i · h_i, where the α_i are the weights. What attention does is compute these weights α_i in a sensible way; a toy sketch follows below.
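As a toy numpy sketch of that weighted sum (illustrative only; `w` stands in for whatever parameters an attention module would learn, and this is not the exact attention layer used in the model below):

import numpy as np

def attention_pool(H, w):
    # H: k x d feature vectors, w: d-dimensional scoring vector
    scores = H @ w                                   # e_i = h_i . w, one score per vector
    alpha = np.exp(scores) / np.exp(scores).sum()    # softmax -> weights alpha_i, summing to 1
    return alpha @ H                                 # h' = sum_i alpha_i * h_i (still d-dimensional)

H = np.random.randn(5, 6)    # k = 5 vectors of dimension d = 6
w = np.random.randn(6)
print(attention_pool(H, w).shape)    # (6,)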

Attention mechanisms have been studied extensively as neural processes in neuroscience and computational neuroscience. Visual attention is a particularly well-studied case: many animals focus on specific parts of their visual input in order to compute an appropriate response. This principle matters a great deal for neural computation, because we need to select the most relevant information rather than use all of the available information, much of which is irrelevant to computing the neural response. The analogous idea of focusing on specific parts of the input, i.e. the attention mechanism, has been applied in deep learning to speech recognition, translation, reasoning, and visual recognition.

Model architecture
[Attention.jpg: model architecture diagram]
Experimental result, measured by the error: MSE Test loss 0.0005358342003804944
[Attention.png: result plot]
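Since the architecture diagram is only referenced by filename, here is a compact restatement of the pipeline that the cells below build step by step (same layers and sizes as the notebook; wrapping it in a function is my addition):

from keras.layers import (Input, Conv1D, MaxPooling1D, Dropout, LSTM,
                          Bidirectional, Dense, Multiply)
from keras.models import Model

def build_model(time_steps=5, input_dim=6, lstm_units=64):
    inputs = Input(shape=(time_steps, input_dim))
    x = Conv1D(filters=64, kernel_size=1, activation='relu')(inputs)   # local feature extraction
    x = MaxPooling1D(pool_size=5)(x)
    x = Dropout(0.2)(x)
    lstm_out = Bidirectional(LSTM(lstm_units, activation='relu'))(x)   # (None, 2 * lstm_units)
    attention_probs = Dense(2 * lstm_units, activation='sigmoid',
                            name='attention_vec')(lstm_out)            # one weight per feature
    attention_mul = Multiply()([lstm_out, attention_probs])            # re-weight BiLSTM features
    output = Dense(1, activation='sigmoid')(attention_mul)             # predicted (scaled) close
    return Model(inputs=inputs, outputs=output)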

import pandas as pd
import time, datetime

df_data_5minute = pd.read_csv('黄金主力5分钟数据.csv')
df_data_5minute.head()
df_data_5minute.head()



   Unnamed: 0            open    close   high    low    volume  money
0  2016-01-04 09:05:00   226.70  226.65  226.85  226.45  5890.0  1.335146e+09
1  2016-01-04 09:10:00   226.75  226.50  226.75  226.40  2562.0  5.804133e+08
2  2016-01-04 09:15:00   226.45  226.45  226.60  226.40  1638.0  3.709666e+08
3  2016-01-04 09:20:00   226.45  226.25  226.50  226.20  3162.0  7.157891e+08
4  2016-01-04 09:25:00   226.25  226.25  226.30  226.20  1684.0  3.809907e+08
df_data_5minute



       Unnamed: 0            open    close   high    low    volume  money
0      2016-01-04 09:05:00   226.70  226.65  226.85  226.45  5890.0  1.335146e+09
1      2016-01-04 09:10:00   226.75  226.50  226.75  226.40  2562.0  5.804133e+08
2      2016-01-04 09:15:00   226.45  226.45  226.60  226.40  1638.0  3.709666e+08
3      2016-01-04 09:20:00   226.45  226.25  226.50  226.20  3162.0  7.157891e+08
4      2016-01-04 09:25:00   226.25  226.25  226.30  226.20  1684.0  3.809907e+08
...    ...                   ...     ...     ...     ...     ...     ...
53305  2017-12-29 14:40:00   277.80  277.80  277.85  277.75   606.0  1.683349e+08
53306  2017-12-29 14:45:00   277.80  277.85  277.85  277.80   560.0  1.555840e+08
53307  2017-12-29 14:50:00   277.85  277.85  277.90  277.80   802.0  2.228271e+08
53308  2017-12-29 14:55:00   277.85  277.75  277.90  277.75  1236.0  3.433855e+08
53309  2017-12-29 15:00:00   277.80  277.80  277.90  277.70  1790.0  4.972797e+08

53310 rows × 7 columns

df_data_5minute.drop('Unnamed: 0', axis=1, inplace=True)
df_data_5minute



       open    close   high    low    volume  money
0      226.70  226.65  226.85  226.45  5890.0  1.335146e+09
1      226.75  226.50  226.75  226.40  2562.0  5.804133e+08
2      226.45  226.45  226.60  226.40  1638.0  3.709666e+08
3      226.45  226.25  226.50  226.20  3162.0  7.157891e+08
4      226.25  226.25  226.30  226.20  1684.0  3.809907e+08
...    ...     ...     ...     ...     ...     ...
53305  277.80  277.80  277.85  277.75   606.0  1.683349e+08
53306  277.80  277.85  277.85  277.80   560.0  1.555840e+08
53307  277.85  277.85  277.90  277.80   802.0  2.228271e+08
53308  277.85  277.75  277.90  277.75  1236.0  3.433855e+08
53309  277.80  277.80  277.90  277.70  1790.0  4.972797e+08

53310 rows × 6 columns

df = df_data_5minute
close = df['close']
df.drop(labels=['close'], axis=1, inplace=True)
df.insert(0, 'close', close)
df



       close   open    high    low    volume  money
0      226.65  226.70  226.85  226.45  5890.0  1.335146e+09
1      226.50  226.75  226.75  226.40  2562.0  5.804133e+08
2      226.45  226.45  226.60  226.40  1638.0  3.709666e+08
3      226.25  226.45  226.50  226.20  3162.0  7.157891e+08
4      226.25  226.25  226.30  226.20  1684.0  3.809907e+08
...    ...     ...     ...     ...     ...     ...
53305  277.80  277.80  277.85  277.75   606.0  1.683349e+08
53306  277.85  277.80  277.85  277.80   560.0  1.555840e+08
53307  277.85  277.85  277.90  277.80   802.0  2.228271e+08
53308  277.75  277.85  277.90  277.75  1236.0  3.433855e+08
53309  277.80  277.80  277.90  277.70  1790.0  4.972797e+08

53310 rows × 6 columns

data_train = df.iloc[:int(df.shape[0] * 0.7), :]
data_test = df.iloc[int(df.shape[0] * 0.7):, :]
print(data_train.shape, data_test.shape)
(37317, 6) (15993, 6)
# -*- coding: utf-8 -*-
import pandas as pd
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
%matplotlib inline
from sklearn.preprocessing import MinMaxScaler
import time

scaler = MinMaxScaler(feature_range=(-1, 1))
scaler.fit(data_train)
/Users/jiaohaibin/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
MinMaxScaler(copy=True, feature_range=(-1, 1))
data_train = scaler.transform(data_train)
data_test = scaler.transform(data_test)
data_train
array([[-0.98877193, -0.98736842, -0.98459384, -0.99297259, -0.82504604,
        -0.85978547],
       [-0.99298246, -0.98596491, -0.98739496, -0.99437807, -0.92389948,
        -0.93904608],
       [-0.99438596, -0.99438596, -0.99159664, -0.99437807, -0.95134557,
        -0.96104178],
       ...,
       [ 0.61263158,  0.61824561,  0.61484594,  0.61349262, -0.90916652,
        -0.90885626],
       [ 0.61684211,  0.61403509,  0.61204482,  0.61630358, -0.94754352,
        -0.94737162],
       [ 0.6154386 ,  0.6154386 ,  0.61064426,  0.61349262, -0.94445435,
        -0.9442865 ]])
from keras.layers import Input, Dense, LSTM
from keras.models import Model
from keras.layers import *
from keras.models import *
from keras.optimizers import Adam

output_dim = 1
batch_size = 256   # number of samples per training batch
epochs = 60        # train for 60 epochs
seq_len = 5        # use the previous 5 time steps
hidden_size = 128
TIME_STEPS = 5
INPUT_DIM = 6
lstm_units = 64

# sliding windows: 5 consecutive rows of all features -> next row's close (column 0)
X_train = np.array([data_train[i: i + seq_len, :] for i in range(data_train.shape[0] - seq_len)])
y_train = np.array([data_train[i + seq_len, 0] for i in range(data_train.shape[0] - seq_len)])
X_test = np.array([data_test[i: i + seq_len, :] for i in range(data_test.shape[0] - seq_len)])
y_test = np.array([data_test[i + seq_len, 0] for i in range(data_test.shape[0] - seq_len)])
print(X_train.shape, y_train.shape, X_test.shape, y_test.shape)
(37312, 5, 6) (37312,) (15988, 5, 6) (15988,)
Using TensorFlow backend.
inputs = Input(shape=(TIME_STEPS, INPUT_DIM))
# drop1 = Dropout(0.3)(inputs)
x = Conv1D(filters=64, kernel_size=1, activation='relu')(inputs)   # padding='same'
# x = Conv1D(filters=128, kernel_size=5, activation='relu')(x)
x = MaxPooling1D(pool_size=5)(x)
x = Dropout(0.2)(x)
print(x.shape)
WARNING:tensorflow:From /Users/jiaohaibin/anaconda3/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py:497: calling conv1d (from tensorflow.python.ops.nn_ops) with data_format=NHWC is deprecated and will be removed in a future version.
Instructions for updating:
`NHWC` for data_format is deprecated, use `NWC` instead
(?, 1, 64)
lstm_out = Bidirectional(LSTM(lstm_units, activation='relu'), name='bilstm')(x)
# lstm_out = LSTM(lstm_units, activation='relu')(x)
print(lstm_out.shape)
(?, 128)
from keras import backend as K
from keras.engine.topology import Layer
import numpy as np
from keras import initializers

# Attention layer (soft attention over an RNN's time steps) -- defined here but NOT used in this notebook
class AttLayer(Layer):
    def __init__(self, **kwargs):
        self.init = initializers.get('normal')
        # self.input_spec = [InputSpec(ndim=3)]
        super(AttLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        assert len(input_shape) == 3          # expects (batch, time_steps, features)
        # self.W = self.init((input_shape[-1], 1))
        self.W = self.init((input_shape[-1],))
        # self.input_spec = [InputSpec(shape=input_shape)]
        self.trainable_weights = [self.W]
        super(AttLayer, self).build(input_shape)  # be sure you call this somewhere!

    def call(self, x, mask=None):
        eij = K.tanh(K.dot(x, self.W))                        # one score per time step
        ai = K.exp(eij)
        weights = ai / K.expand_dims(K.sum(ai, axis=1), 1)    # normalise into attention weights
        weighted_input = x * K.expand_dims(weights, -1)       # weight each time step's features
        return K.sum(weighted_input, axis=1)                  # weighted sum over time

    def get_output_shape_for(self, input_shape):
        return (input_shape[0], input_shape[-1])
'''
l_att = AttLayer()(lstm_out)
output = Dense(1, activation='sigmoid')(l_att)
print(output.shape)
'''
"\nl_att = AttLayer()(lstm_out)\noutput = Dense(1, activation='sigmoid')(l_att)\nprint(output.shape)"
from keras.layers import Input, Dense, merge
from keras import layers

# ATTENTION PART STARTS HERE
attention_probs = Dense(128, activation='sigmoid', name='attention_vec')(lstm_out)
# attention_mul = layers.merge([lstm_out, attention_probs], mode='concat', concat_axis=1)
attention_mul = Multiply()([lstm_out, attention_probs])
# attention_mul = merge([lstm_out, attention_probs], output_shape=32, name='attention_mul', mode='mul')

output = Dense(1, activation='sigmoid')(attention_mul)
# output = Dense(10, activation='sigmoid')(drop2)
model = Model(inputs=inputs, outputs=output)
print(model.summary())
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            (None, 5, 6)         0                                            
__________________________________________________________________________________________________
conv1d_1 (Conv1D)               (None, 5, 64)        448         input_1[0][0]                    
__________________________________________________________________________________________________
max_pooling1d_1 (MaxPooling1D)  (None, 1, 64)        0           conv1d_1[0][0]                   
__________________________________________________________________________________________________
dropout_1 (Dropout)             (None, 1, 64)        0           max_pooling1d_1[0][0]            
__________________________________________________________________________________________________
bilstm (Bidirectional)          (None, 128)          66048       dropout_1[0][0]                  
__________________________________________________________________________________________________
attention_vec (Dense)           (None, 128)          16512       bilstm[0][0]                     
__________________________________________________________________________________________________
multiply_1 (Multiply)           (None, 128)          0           bilstm[0][0]                     
                                                                 attention_vec[0][0]              
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, 1)            129         multiply_1[0][0]                 
==================================================================================================
Total params: 83,137
Trainable params: 83,137
Non-trainable params: 0
__________________________________________________________________________________________________
None
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(X_train, y_train, epochs=epochs, batch_size=batch_size, shuffle=False)
y_pred = model.predict(X_test)
print('MSE Train loss:', model.evaluate(X_train, y_train, batch_size=batch_size))
print('MSE Test loss:', model.evaluate(X_test, y_test, batch_size=batch_size))
plt.plot(y_test, label='test')
plt.plot(y_pred, label='pred')
plt.legend()
plt.show()
Epoch 1/60
37312/37312 [==============================] - 3s 92us/step - loss: 0.1865
Epoch 2/60
37312/37312 [==============================] - 2s 46us/step - loss: 0.0514
Epoch 3/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0442
Epoch 4/60
37312/37312 [==============================] - 2s 44us/step - loss: 0.0439
Epoch 5/60
37312/37312 [==============================] - 1s 39us/step - loss: 0.0436
Epoch 6/60
37312/37312 [==============================] - 1s 35us/step - loss: 0.0432
Epoch 7/60
37312/37312 [==============================] - 1s 35us/step - loss: 0.0429
Epoch 8/60
37312/37312 [==============================] - 1s 39us/step - loss: 0.0426
Epoch 9/60
37312/37312 [==============================] - 1s 37us/step - loss: 0.0424
Epoch 10/60
37312/37312 [==============================] - 1s 34us/step - loss: 0.0422
Epoch 11/60
37312/37312 [==============================] - 1s 36us/step - loss: 0.0420
Epoch 12/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0419
Epoch 13/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0418
Epoch 14/60
37312/37312 [==============================] - 1s 38us/step - loss: 0.0417
Epoch 15/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0417
Epoch 16/60
37312/37312 [==============================] - 2s 45us/step - loss: 0.0416
Epoch 17/60
37312/37312 [==============================] - 1s 39us/step - loss: 0.0416
Epoch 18/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0416
Epoch 19/60
37312/37312 [==============================] - 2s 44us/step - loss: 0.0416
Epoch 20/60
37312/37312 [==============================] - 1s 35us/step - loss: 0.0416
Epoch 21/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0416
Epoch 22/60
37312/37312 [==============================] - 2s 41us/step - loss: 0.0416
Epoch 23/60
37312/37312 [==============================] - 1s 37us/step - loss: 0.0415
Epoch 24/60
37312/37312 [==============================] - 1s 35us/step - loss: 0.0416
Epoch 25/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0415
Epoch 26/60
37312/37312 [==============================] - 2s 41us/step - loss: 0.0415
Epoch 27/60
37312/37312 [==============================] - 2s 46us/step - loss: 0.0415
Epoch 28/60
37312/37312 [==============================] - 2s 47us/step - loss: 0.0415
Epoch 29/60
37312/37312 [==============================] - 2s 43us/step - loss: 0.0414
Epoch 30/60
37312/37312 [==============================] - 1s 39us/step - loss: 0.0414
Epoch 31/60
37312/37312 [==============================] - 2s 41us/step - loss: 0.0414
Epoch 32/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0414
Epoch 33/60
37312/37312 [==============================] - 1s 37us/step - loss: 0.0414
Epoch 34/60
37312/37312 [==============================] - 2s 44us/step - loss: 0.0414
Epoch 35/60
37312/37312 [==============================] - 2s 49us/step - loss: 0.0413
Epoch 36/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0413
Epoch 37/60
37312/37312 [==============================] - 1s 35us/step - loss: 0.0413
Epoch 38/60
37312/37312 [==============================] - 2s 48us/step - loss: 0.0413
Epoch 39/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0412
Epoch 40/60
37312/37312 [==============================] - 1s 38us/step - loss: 0.0413
Epoch 41/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0412
Epoch 42/60
37312/37312 [==============================] - 2s 41us/step - loss: 0.0412
Epoch 43/60
37312/37312 [==============================] - 1s 36us/step - loss: 0.0412
Epoch 44/60
37312/37312 [==============================] - 1s 40us/step - loss: 0.0412
Epoch 45/60
37312/37312 [==============================] - 2s 43us/step - loss: 0.0412
Epoch 46/60
37312/37312 [==============================] - 1s 37us/step - loss: 0.0412
Epoch 47/60
37312/37312 [==============================] - 1s 38us/step - loss: 0.0412
Epoch 48/60
37312/37312 [==============================] - 2s 43us/step - loss: 0.0412
Epoch 49/60
37312/37312 [==============================] - 1s 39us/step - loss: 0.0411
Epoch 50/60
37312/37312 [==============================] - 1s 37us/step - loss: 0.0411
Epoch 51/60
37312/37312 [==============================] - 2s 42us/step - loss: 0.0411
Epoch 52/60
37312/37312 [==============================] - 2s 43us/step - loss: 0.0411
Epoch 53/60
37312/37312 [==============================] - 1s 38us/step - loss: 0.0411
Epoch 54/60
37312/37312 [==============================] - 2s 47us/step - loss: 0.0410
Epoch 55/60
37312/37312 [==============================] - 2s 50us/step - loss: 0.0410
Epoch 56/60
37312/37312 [==============================] - 2s 43us/step - loss: 0.0410
Epoch 57/60
37312/37312 [==============================] - 2s 48us/step - loss: 0.0410
Epoch 58/60
37312/37312 [==============================] - 2s 50us/step - loss: 0.0410
Epoch 59/60
37312/37312 [==============================] - 2s 40us/step - loss: 0.0410
Epoch 60/60
37312/37312 [==============================] - 2s 45us/step - loss: 0.0410
37312/37312 [==============================] - 1s 20us/step
MSE Train loss: 0.04148709757006819
15988/15988 [==============================] - 0s 15us/step
MSE Test loss: 0.0005358342003804944
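Note that the losses above are measured in the scaler's (-1, 1) space, not in price units. If you want the error back in yuan per gram, something along these lines should work (a hedged sketch assuming `scaler`, `y_pred`, and `y_test` are still in scope, that close is column 0 of the frame the scaler was fit on, as in the cells above, and using the fitted MinMaxScaler's `data_min_`/`data_max_` attributes):

import numpy as np

# invert x_scaled = 2 * (x - min) / (max - min) - 1 for the close column (index 0)
close_min, close_max = scaler.data_min_[0], scaler.data_max_[0]
y_pred_price = (y_pred.ravel() + 1) / 2 * (close_max - close_min) + close_min
y_test_price = (y_test + 1) / 2 * (close_max - close_min) + close_min
print('RMSE in price units:', np.sqrt(np.mean((y_pred_price - y_test_price) ** 2)))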
 
