The Wayback Machine - https://web.archive.org/web/20220124015627/https://github.com/d2l-ai/d2l-en/issues/2019

[TensorFlow] Plots appear smoother, inconsistent with MXNet/PyTorch #2019

Open
AnirudhDagar opened this issue Jan 24, 2022 · 0 comments

Comments

AnirudhDagar (Member) commented Jan 24, 2022

Although the training results themselves look correct and are consistent across all frameworks, there is a small consistency issue: the TensorFlow training loss/accuracy plots look as if they sample fewer points. The curves appear straighter, smoother, and less wiggly than their PyTorch or MXNet counterparts.

This can be seen clearly in the plots of Chapter 6 (CNN, LeNet), Chapter 7 (Modern CNNs), and Chapter 11 (Optimization, LR Scheduler). The cause is that the TrainCallback class logs metrics once per epoch instead of once per batch.
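The granularity difference can be sketched in plain Python (the class and variable names below are hypothetical, not the actual d2l TrainCallback): a per-epoch logger records one point per epoch, while a per-batch logger records a point after every batch, producing a denser and therefore wigglier curve.

```python
# Hypothetical sketch contrasting per-epoch vs per-batch metric logging,
# which is what determines how smooth the plotted curve looks.

class PerEpochLogger:
    """Records one (x, loss) point per epoch -> sparser, smoother curve."""
    def __init__(self):
        self.points = []

    def on_epoch_end(self, epoch, loss):
        self.points.append((epoch + 1, loss))

class PerBatchLogger:
    """Records a point after every batch -> denser, wigglier curve."""
    def __init__(self, batches_per_epoch):
        self.batches_per_epoch = batches_per_epoch
        self.points = []

    def on_batch_end(self, epoch, batch, loss):
        # Fractional epoch on the x-axis, so both curves share a scale.
        x = epoch + (batch + 1) / self.batches_per_epoch
        self.points.append((x, loss))

# Toy "training" loop: the same losses are fed to both loggers.
batches_per_epoch, epochs = 4, 3
per_epoch, per_batch = PerEpochLogger(), PerBatchLogger(batches_per_epoch)
for e in range(epochs):
    for b in range(batches_per_epoch):
        loss = 1.0 / (e * batches_per_epoch + b + 1)  # fake decreasing loss
        per_batch.on_batch_end(e, b, loss)
    per_epoch.on_epoch_end(e, loss)

print(len(per_epoch.points), len(per_batch.points))  # 3 12
```

With Keras, per-batch logging would mean recording metrics in the callback's `on_train_batch_end` hook rather than `on_epoch_end`, which is the direction a fix for this issue would likely take.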

This inconsistency plagues many of the TensorFlow plots across the book. PRs are welcome; this is a good issue for beginners who would like to contribute to d2l. Feel free to ask any questions.

cc @astonzhang @terrytangyuan
