Update 第二章 Transformer架构.md

This commit is contained in:
Logan Zou
2025-06-24 10:54:02 +08:00
committed by GitHub
parent 71f8d48290
commit edb73c7aeb


@@ -540,7 +540,7 @@ class DecoderLayer(nn.Module):
        norm_x = self.attention_norm_2(x)
        h = x + self.attention.forward(norm_x, enc_out, enc_out)
        # Pass through the feed-forward network
-        out = h + self.feed_forward.forward(self.fnn_norm(h))
+        out = h + self.feed_forward.forward(self.ffn_norm(h))
        return out
 ```
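
The fix above renames `self.fnn_norm` to `self.ffn_norm` in `DecoderLayer.forward`, so the attribute used in `forward` matches the one created in `__init__` (a mismatched name would raise `AttributeError` at runtime). A minimal sketch of the corrected pre-norm residual sub-block, with assumed dimensions and a hypothetical class name (`FeedForwardBlock`), not the chapter's full `DecoderLayer`:

```python
import torch
import torch.nn as nn

class FeedForwardBlock(nn.Module):
    """Pre-norm residual FFN sub-block, as in the fixed DecoderLayer line."""

    def __init__(self, dim: int = 16, hidden: int = 64):
        super().__init__()
        # Attribute name must match the one referenced in forward: ffn_norm
        self.ffn_norm = nn.LayerNorm(dim)
        self.feed_forward = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Pre-norm order: normalize first, then add the FFN output residually,
        # i.e. out = h + FFN(LayerNorm(h))
        return h + self.feed_forward(self.ffn_norm(h))

x = torch.randn(2, 5, 16)          # (batch, seq_len, dim)
out = FeedForwardBlock()(x)
print(out.shape)                   # same shape as the input
```

The residual connection requires the output shape to equal the input shape, which is why the second `Linear` projects back from `hidden` to `dim`.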