PyTorch warning
-
[PyTorch Warning] W accumulate_grad.h:170 Warning: grad and param do not obey the gradient layout contract.

Programming Error/PyTorch · 2021. 7. 13. 16:41
While training a model with the PyTorch framework, I ran into the following warning:

```
[W accumulate_grad.h:170] Warning: grad and param do not obey the gradient layout contract. This is not an error, but may impair performance.
grad.sizes() = [64, 768, 1, 1], strides() = [768, 1, 1, 1]
param.sizes() = [64, 768, 1, 1], strides() = [768, 1, 768, 768] (function operator())
```

Since it is only a warning and not an error, I was tempted to just move on, but the message says it "may impair performance", so ..
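The strides in the message hint at what the "gradient layout contract" is about: for a weight of shape [64, 768, 1, 1], strides [768, 1, 768, 768] are the channels_last memory format, while strides [768, 1, 1, 1] are the default contiguous layout. So here the parameter and the gradient being accumulated into it live in different memory layouts. As a minimal sketch (the helper names are mine, and this is not necessarily the fix this post arrives at), one can locate such parameters after a backward pass and, as one common workaround, store them in the contiguous layout so freshly created gradients match:

```python
import torch

def report_layout_mismatches(model: torch.nn.Module) -> None:
    # Diagnostic sketch: list parameters whose gradient strides differ from
    # the parameter strides (the condition the warning complains about).
    for name, p in model.named_parameters():
        if p.grad is not None and p.grad.stride() != p.stride():
            print(f"{name}: param strides {tuple(p.stride())} "
                  f"vs grad strides {tuple(p.grad.stride())}")

def make_params_contiguous(model: torch.nn.Module) -> None:
    # One common workaround (an assumption, not confirmed by this post):
    # rewrite every parameter (and any existing grad) in the default
    # contiguous layout so new grads produced by autograd share its strides.
    for p in model.parameters():
        p.data = p.data.contiguous()
        if p.grad is not None:
            p.grad = p.grad.contiguous()

# Hypothetical usage after loss.backward():
#   report_layout_mismatches(model)
#   make_params_contiguous(model)
```

The opposite direction also works: if channels_last is intentional (for example, for convolution throughput), converting the whole model and its inputs consistently with `.to(memory_format=torch.channels_last)` keeps parameter and gradient layouts in agreement.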