Leandro von Werra on Twitter: "Finally upgraded my MacBook to an M1 (Max). I was curious if I could use the GPU to train a Transformer model. TL;DR: It works and training…
Wandb sweeps running on Kaggle GPU or Colab GPU are much slower than on my local CPU - W&B Help - W&B Community
logdet on GPU is slower than on CPU and uses a lot of CPU-time · Issue #32048 · pytorch/pytorch · GitHub
Fine-tuning GPT-J 6B on Google Colab or Equivalent Desktop or Server GPU | by Mike Ohanu | Better Programming