torch.distributed.fsdp

| API Name | Supported | Limitations and Notes |
| --- | --- | --- |
| torch.distributed.fsdp.FullyShardedDataParallel |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.apply |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.clip_grad_norm_ |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.forward |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.named_buffers |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.named_parameters |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.no_sync |  |  |
| torch.distributed.fsdp.FullyShardedDataParallel.register_comm_hook |  |  |
| torch.distributed.fsdp.BackwardPrefetch |  |  |
| torch.distributed.fsdp.ShardingStrategy |  |  |
| torch.distributed.fsdp.MixedPrecision |  |  |
| torch.distributed.fsdp.CPUOffload |  |  |
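
As a minimal sketch of how the configuration classes in this table fit together, the snippet below constructs the standard FSDP policy objects from a stock PyTorch install; the specific dtype and offload choices are illustrative assumptions, and wrapping a model with `FullyShardedDataParallel` additionally requires an initialized process group (`torch.distributed.init_process_group`).

```python
import torch
from torch.distributed.fsdp import (
    BackwardPrefetch,
    CPUOffload,
    MixedPrecision,
    ShardingStrategy,
)

# Shard parameters, gradients, and optimizer state across all ranks.
strategy = ShardingStrategy.FULL_SHARD

# Compute in fp16 but accumulate gradient reductions in fp32
# (illustrative choice, not a recommendation for any backend).
mp_policy = MixedPrecision(
    param_dtype=torch.float16,
    reduce_dtype=torch.float32,
    buffer_dtype=torch.float16,
)

# Keep sharded parameters on CPU between uses.
offload = CPUOffload(offload_params=True)

# Prefetch the next all-gather before the current gradient computation.
prefetch = BackwardPrefetch.BACKWARD_PRE

# These objects would then be passed to FullyShardedDataParallel, e.g.:
# fsdp_model = FullyShardedDataParallel(
#     model,
#     sharding_strategy=strategy,
#     mixed_precision=mp_policy,
#     cpu_offload=offload,
#     backward_prefetch=prefetch,
# )
```

Whether each of these options is usable on a given backend is governed by the support column of the table above.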