What's the relationship between the number of GPUs and the batch size (global batch size)? #13314
Yesterday, I saw the code 🔗
In TensorFlow, we need to calculate the global batch size ourselves in multi-GPU settings. Does the same concept also apply to the Trainer's multi-GPU mode? If we set up
Replies: 1 comment 1 reply
Hi @HuangChiEn, in your case shown in the sample code, the actual batch size will be 128 (32 * 4). It behaves differently depending on which strategy is used. You can read more about it
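To illustrate the arithmetic behind the reply, here is a minimal sketch (not the snippet from the original discussion, which is not shown here). It assumes a data-parallel strategy such as DDP, where each device runs its own copy of the DataLoader with the per-device batch size, so the effective (global) batch size is the product of the two:

```python
def global_batch_size(per_device_batch_size: int, num_devices: int) -> int:
    """Effective batch size under a data-parallel strategy (e.g. DDP),
    where every device draws its own batch of `per_device_batch_size`."""
    return per_device_batch_size * num_devices

# The numbers from the reply above: batch_size=32 on 4 GPUs.
print(global_batch_size(32, 4))  # -> 128
```

Note that other strategies split a single batch across devices instead of replicating the loader, in which case the global batch size stays at the per-loader value; that is why the reply says the behavior depends on the strategy.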