from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

model = Sequential()
# Add the Dense layer WITHOUT an activation argument, so the layer
# outputs its raw linear values
model.add(Dense(90))
# Then apply LeakyReLU explicitly as its own layer
model.add(LeakyReLU(alpha=0.05))
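
For context, here is a minimal, self-contained sketch showing the same pattern inside a complete model. The input width (10 features), the output layer, and the random training data are illustrative assumptions, not part of the original snippet:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

# Hypothetical tiny regression model; input/output sizes are assumptions
model = Sequential()
model.add(Dense(90, input_shape=(10,)))  # no activation here
model.add(LeakyReLU(alpha=0.05))         # LeakyReLU as a separate layer
model.add(Dense(1))

model.compile(optimizer="adam", loss="mse")
model.summary()

# Smoke test on random data (illustrative only)
x = np.random.rand(32, 10)
y = np.random.rand(32, 1)
model.fit(x, y, epochs=1, verbose=0)

Adding LeakyReLU as a separate layer, rather than passing it via the activation argument, is the standard way to use parameterized activations like this in Keras.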