Add use_distributed_sampler=False in Trainer (#756)

If you have defined your own sampler, you must set use_distributed_sampler to False.
huangxu1991
2024-07-19 10:33:24 +08:00
committed by GitHub
parent 0a3a1e4505
commit 4f8e1660af


@@ -134,6 +134,7 @@ def main(args):
logger=logger,
num_sanity_val_steps=0,
callbacks=[ckpt_callback],
use_distributed_sampler=False, # A very simple change, but it fixes inconsistent training step counts when using a custom bucket_sampler!
)
model: Text2SemanticLightningModule = Text2SemanticLightningModule(
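The reason this flag matters: a bucket sampler typically handles rank-based sharding itself, so if the Trainer wraps it in a DistributedSampler again, each rank can end up with a different number of batches per epoch. The following is a minimal, dependency-free sketch (the `BucketSampler` class and its signature are hypothetical, for illustration only — not the sampler used in this repository) showing how a custom sampler can shard whole batches across ranks so every rank iterates the same number of steps:

```python
# Hypothetical sketch: a bucket sampler that shards batches across ranks itself.
# If Lightning re-wrapped it in a DistributedSampler (use_distributed_sampler=True,
# the default), the per-rank step counts could diverge.

class BucketSampler:
    """Groups indices by sample length, then assigns whole batches round-robin to ranks."""

    def __init__(self, lengths, batch_size, rank=0, world_size=1):
        # Sort indices by length so each batch contains similarly sized samples.
        order = sorted(range(len(lengths)), key=lambda i: lengths[i])
        batches = [order[i:i + batch_size] for i in range(0, len(order), batch_size)]
        # Drop the tail so every rank sees exactly the same number of batches.
        usable = len(batches) - len(batches) % world_size
        self.batches = batches[rank:usable:world_size]

    def __iter__(self):
        return iter(self.batches)

    def __len__(self):
        return len(self.batches)


lengths = [5, 3, 9, 1, 7, 2, 8, 4]
r0 = BucketSampler(lengths, batch_size=2, rank=0, world_size=2)
r1 = BucketSampler(lengths, batch_size=2, rank=1, world_size=2)
print(len(r0), len(r1))  # -> 2 2: both ranks run the same number of steps
```

Because the sampler already partitions data by rank, passing `use_distributed_sampler=False` tells the Trainer to use it as-is instead of replacing or wrapping it.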