Hugging Face Releases New PyTorch Library "Accelerate": Multi-GPU, TPU, and Mixed-Precision Training
Report from 机器之心 (Machine Heart), April 22, 2021
Most high-level PyTorch libraries support distributed and mixed-precision training, but the abstractions they introduce often force users to learn a new API just to customize the training loop. Many PyTorch users want full control over their training loops without having to write and maintain the boilerplate that distributed training requires. Hugging Face's newly released library, Accelerate, solves exactly this problem. The diff below shows the handful of changes needed to make a standard PyTorch training loop run on any distributed setup:
  import torch
  import torch.nn.functional as F
  from datasets import load_dataset

+ from accelerate import Accelerator
+ accelerator = Accelerator()
- device = 'cpu'
+ device = accelerator.device

  model = torch.nn.Transformer().to(device)
  optimizer = torch.optim.Adam(model.parameters())

  dataset = load_dataset('my_dataset')
  data = torch.utils.data.DataLoader(dataset, shuffle=True)

+ # prepare() wraps the model, optimizer and dataloader for whatever
+ # hardware the script is launched on (CPU, GPUs, TPU, fp16, ...)
+ model, optimizer, data = accelerator.prepare(model, optimizer, data)

  model.train()
  for epoch in range(10):
      for source, targets in data:
          source = source.to(device)
          targets = targets.to(device)

          optimizer.zero_grad()
          output = model(source)
          loss = F.cross_entropy(output, targets)

+         accelerator.backward(loss)
-         loss.backward()

          optimizer.step()
Accelerate can also take over device placement entirely: drop the manual .to(device) calls and prepare() will put the model and each batch on the right device. The same script then becomes:

  import torch
  import torch.nn.functional as F
  from datasets import load_dataset

+ from accelerate import Accelerator
+ accelerator = Accelerator()
- device = 'cpu'

+ model = torch.nn.Transformer()
- model = torch.nn.Transformer().to(device)
  optimizer = torch.optim.Adam(model.parameters())

  dataset = load_dataset('my_dataset')
  data = torch.utils.data.DataLoader(dataset, shuffle=True)

+ # prepare() now also moves the model and every batch to the right device
+ model, optimizer, data = accelerator.prepare(model, optimizer, data)

  model.train()
  for epoch in range(10):
      for source, targets in data:
-         source = source.to(device)
-         targets = targets.to(device)

          optimizer.zero_grad()
          output = model(source)
          loss = F.cross_entropy(output, targets)

+         accelerator.backward(loss)
-         loss.backward()

          optimizer.step()
To run the script on any distributed setup, Accelerate ships a command-line launcher. A short interactive questionnaire (accelerate config) generates a configuration file describing your hardware; accelerate launch then starts the script with the right number of processes:

accelerate config
accelerate launch my_script.py --args_to_my_script
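The saved config can also be overridden on the command line. As a sketch (flag spellings may vary across Accelerate versions), launching one process per GPU on a two-GPU machine might look like:

accelerate launch --multi_gpu --num_processes 2 my_script.py --args_to_my_script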
Stripped to its essentials, integrating Accelerate comes down to three lines of code:

accelerator = Accelerator()
model, optimizer, data = accelerator.prepare(model, optimizer, data)
accelerator.backward(loss)
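The mixed-precision training promised in the title is switched on from the same Accelerator object. A minimal sketch, assuming a version of Accelerate whose constructor accepts mixed_precision (the initial 2021 release spelled this fp16=True instead):

import torch
import torch.nn.functional as F
from accelerate import Accelerator

# Assumption: newer Accelerate versions take mixed_precision="fp16";
# the initial release used Accelerator(fp16=True) for the same effect.
accelerator = Accelerator(mixed_precision="fp16")

model = torch.nn.Linear(32, 2)
optimizer = torch.optim.Adam(model.parameters())
model, optimizer = accelerator.prepare(model, optimizer)

source = torch.randn(8, 32).to(accelerator.device)
targets = torch.randint(0, 2, (8,)).to(accelerator.device)

optimizer.zero_grad()
loss = F.cross_entropy(model(source), targets)
accelerator.backward(loss)  # handles loss scaling under fp16
optimizer.step()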
Accelerate supports the following setups:

CPU
single GPU
single-node multi-GPU
multi-node multi-GPU
TPU
FP16 with native AMP (apex support is on the roadmap)
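For TPUs (and multi-GPU machines) driven from a notebook, Accelerate also exposes notebook_launcher. A minimal sketch; training_loop here is a stand-in for your own training function:

from accelerate import Accelerator, notebook_launcher

def training_loop():
    # each spawned process builds its own Accelerator
    accelerator = Accelerator()
    print(f"process {accelerator.process_index} running on {accelerator.device}")
    # ... build model/optimizer/dataloader, call accelerator.prepare(), train ...

# one process per TPU core (use num_processes=2 on a 2-GPU machine)
notebook_launcher(training_loop, num_processes=8)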