Example of combining a CNN with an LSTM for classification in Keras

Author: FQ_G  Posted: 2023-10-12 22:07:07

I won't waste words; let's go straight to the code~


import keras
from keras.models import Model
from keras.layers import (Input, Reshape, ZeroPadding2D, Conv2D, MaxPooling2D,
                          GlobalMaxPooling2D, LSTM, Dense, Dropout, LeakyReLU, concatenate)


def get_model():
    # CNN branch + LSTM branch on the same (40, 80) input, merged for 6-way classification
    n_classes = 6
    inp = Input(shape=(40, 80))
    # Add a channels axis so the 2D convolutions see a (40, 80, 1) "image"
    reshape = Reshape((40, 80, 1))(inp)

    # Conv block 1
    conv1 = Conv2D(32, (3, 3), padding='same', kernel_initializer='glorot_uniform')(reshape)
    l1 = LeakyReLU(alpha=0.33)(conv1)
    conv2 = ZeroPadding2D(padding=(1, 1))(l1)
    conv2 = Conv2D(32, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv2)
    l2 = LeakyReLU(alpha=0.33)(conv2)
    m2 = MaxPooling2D((3, 3), strides=(3, 3))(l2)
    d2 = Dropout(0.25)(m2)

    # Conv block 2
    conv3 = ZeroPadding2D(padding=(1, 1))(d2)
    conv3 = Conv2D(64, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv3)
    l3 = LeakyReLU(alpha=0.33)(conv3)
    conv4 = ZeroPadding2D(padding=(1, 1))(l3)
    conv4 = Conv2D(64, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv4)
    l4 = LeakyReLU(alpha=0.33)(conv4)
    m4 = MaxPooling2D((3, 3), strides=(3, 3))(l4)
    d4 = Dropout(0.25)(m4)

    # Conv block 3
    conv5 = ZeroPadding2D(padding=(1, 1))(d4)
    conv5 = Conv2D(128, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv5)
    l5 = LeakyReLU(alpha=0.33)(conv5)
    conv6 = ZeroPadding2D(padding=(1, 1))(l5)
    conv6 = Conv2D(128, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv6)
    l6 = LeakyReLU(alpha=0.33)(conv6)
    m6 = MaxPooling2D((3, 3), strides=(3, 3))(l6)
    d6 = Dropout(0.25)(m6)

    # Conv block 4
    conv7 = ZeroPadding2D(padding=(1, 1))(d6)
    conv7 = Conv2D(256, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv7)
    l7 = LeakyReLU(alpha=0.33)(conv7)
    conv8 = ZeroPadding2D(padding=(1, 1))(l7)
    conv8 = Conv2D(256, (3, 3), padding='same', kernel_initializer='glorot_uniform')(conv8)
    l8 = LeakyReLU(alpha=0.33)(conv8)

    # CNN branch output: a 256-dimensional feature vector
    g = GlobalMaxPooling2D()(l8)

    # LSTM branch: reads the raw (40, 80) sequence directly from the same input
    lstm1 = LSTM(256, activation='tanh', return_sequences=False)(inp)
    dl1 = Dropout(0.3)(lstm1)
    den1 = Dense(200, activation='relu')(dl1)
    dl2 = Dropout(0.3)(den1)

    # Merge the two branches, then the classifier head
    g2 = concatenate([g, dl2], axis=1)
    d10 = Dense(1024)(g2)
    l10 = LeakyReLU(alpha=0.33)(d10)
    l10 = Dropout(0.5)(l10)
    l11 = Dense(n_classes, activation='softmax')(l10)

    model = Model(inputs=inp, outputs=l11)
    model.summary()

    # Compile the model
    adam = keras.optimizers.Adam(learning_rate=0.0005, beta_1=0.95, beta_2=0.999, epsilon=1e-08)
    # sgd = keras.optimizers.SGD(learning_rate=0.001, decay=1e-06, momentum=0.9, nesterov=False)
    # reduce_lr = ReduceLROnPlateau(monitor='loss', factor=0.1, patience=2, verbose=1, min_lr=1e-08, mode='min')
    model.compile(loss='categorical_crossentropy', optimizer=adam, metrics=['accuracy'])

    return model
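For completeness, here is a minimal, hypothetical usage sketch (the dummy data, sample count, and training settings are illustrative, not from the original post); it only demonstrates the expected per-sample input shape of (40, 80) and one-hot labels over the 6 classes:

import numpy as np
import keras

# Hypothetical dummy data, just to show the expected shapes
x_train = np.random.rand(100, 40, 80).astype('float32')                  # 100 samples of shape (40, 80)
y_train = keras.utils.to_categorical(np.random.randint(0, 6, 100), 6)    # one-hot labels, 6 classes

model = get_model()
model.fit(x_train, y_train, epochs=5, batch_size=16, validation_split=0.1)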

Supplementary note: how to combine different models in Keras (using a CNN and an LSTM as the example)

You may run into situations where several models have to be stitched together, e.g. a CNN and an LSTM. Under the Keras framework I normally open with a single line:

model = Sequential()

then call model.add, model.add, ... all the way down to

model.compile(loss=["mae"], optimizer='adam', metrics=['mape'])
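In sketch form, that single-path Sequential workflow looks roughly like this (the layer sizes here are illustrative, not from the original post):

from keras.models import Sequential
from keras.layers import LSTM, Dense

model = Sequential()
model.add(LSTM(32, input_shape=(10, 8)))   # illustrative recurrent layer
model.add(Dense(1))                        # illustrative output layer
model.compile(loss='mae', optimizer='adam', metrics=['mape'])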

Now you suddenly have to join models together; what then?

The example code below joins a CNN and an LSTM in series: the CNN first extracts features through convolution and pooling, and those features are then fed into the LSTM to produce the final output.


import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
import keras
from keras.models import Model
from keras.layers import (Input, Reshape, Dense, Dropout, Activation,
                          Conv2D, MaxPooling2D, LSTM)


def design_model():
    # design network: CNN feature extractor followed by an LSTM
    inp = Input(shape=(11, 5))
    # Add a channels axis so the 2D convolutions see an (11, 5, 1) "image"
    reshape = Reshape((11, 5, 1))(inp)
    conv1 = Conv2D(32, (3, 3), padding='same', kernel_initializer='glorot_uniform')(reshape)
    l1 = Activation('relu')(conv1)
    conv2 = Conv2D(64, (3, 3), padding='same')(l1)
    l2 = Activation('relu')(conv2)
    m2 = MaxPooling2D(pool_size=(2, 2), padding='valid')(l2)   # -> (5, 2, 64)
    # Flatten the 5x2 feature map into a length-10 sequence of 64-dimensional vectors for the LSTM
    reshape1 = Reshape((10, 64))(m2)
    lstm1 = LSTM(30, activation='tanh', return_sequences=False)(reshape1)
    dl1 = Dropout(0.3)(lstm1)
    # den1 = Dense(100, activation="relu")(dl1)
    den2 = Dense(1, activation="relu")(dl1)
    model = Model(inputs=inp, outputs=den2)
    model.summary()  # print a summary of the model
    adam = keras.optimizers.Adam(learning_rate=0.001, beta_1=0.95, beta_2=0.999, epsilon=1e-08)
    model.compile(loss='mae', optimizer=adam, metrics=['mape'])
    return model


# train_x, train_y, test_x, test_y, epochs and batch_size are assumed to be prepared by the caller
model = design_model()
history = model.fit(train_x, train_y, epochs=epochs, batch_size=batch_size,
                    validation_data=(test_x, test_y), verbose=2, shuffle=True)
# save the trained model
model.save('model_trained.h5')

In the example code above, the CNN and LSTM are connected in series: the CNN output is the LSTM input, a single path from start to finish.

If you want a parallel layout instead, where the branches split and are later merged back together, you can use the concatenate function to join the output of the CNN branch with the output of the LSTM branch, then attach further layers afterwards to complete the model graph, for example:

g2=concatenate([g,dl2],axis=1)
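A minimal sketch of such a parallel layout might look like this (the input shape, layer sizes, and class count are illustrative, not taken from a specific dataset): both branches read the same input, and concatenate merges them before the output layer.

from keras.models import Model
from keras.layers import (Input, Reshape, Conv2D, GlobalMaxPooling2D,
                          LSTM, Dense, Dropout, concatenate)

inp = Input(shape=(40, 80))

# CNN branch
x = Reshape((40, 80, 1))(inp)
x = Conv2D(32, (3, 3), padding='same', activation='relu')(x)
g = GlobalMaxPooling2D()(x)        # CNN feature vector

# LSTM branch
y = LSTM(64)(inp)
dl2 = Dropout(0.3)(y)              # LSTM feature vector

# Merge the two branches, then add the output layer
g2 = concatenate([g, dl2], axis=1)
out = Dense(6, activation='softmax')(g2)

model = Model(inputs=inp, outputs=out)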

To sum up:

This is Keras's functional API, the alternative to Sequential for building models, and it is more flexible. The key point is that at the end, model = Model(inputs=inp, outputs=den2) fixes the input and output of the whole model.

Source: https://blog.csdn.net/qq_33266320/article/details/80845619

Tags: Keras, CNN, LSTM, classification