
Refactor TensorFlow 2 code to hybrid functions #418

Open

khatchad wants to merge 1 commit into base: master
Conversation

@khatchad commented Feb 5, 2024

Convert several eager-execution functions to hybrid execution (a sketch of the kind of change applied appears after the results below). We have some preliminary evidence that this improves the run-time performance of the models:

| Test | Python version | TF version | Before accuracy | After accuracy | Before loss | After loss | Before elapsed time (s) | After elapsed time (s) | Speedup |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| neural_network | 3.10.0 | 2.9.3 | 0.9624 | 0.9635 | | | 9.333428266 | 3.812376191 | 2.448191836 |
| autoencoder | 3.10.0 | 2.9.3 | | | 0.006999 | 0.007014 | 110.4210886 | 34.1057281 | 3.23761124 |
| logistic_regression | 3.10.0 | 2.9.3 | 0.8286328125 | 0.8316015625 | 0.918056736 | 0.9068725395 | 1.415692188 | 0.7934420485 | 1.784241446 |
| bidirectional_rnn | 3.10.0 | 2.9.3 | 0.85625 | 0.821875 | 0.5128818989 | 0.58627882 | 28.0902812 | 5.041457747 | 5.571856913 |
| convolutional_network | 3.10.0 | 2.9.3 | 0.9867734375 | 0.9869921875 | 1.48369342 | 1.483417908 | 31.07854785 | 17.71073562 | 1.754785827 |
| dcgan | 3.10.0 | 2.9.3 | | | 1.208782502 | 0.04901289759 | 78.12116778 | 36.0855548 | 2.164887535 |
| dynamic_rnn | 3.10.0 | 2.9.3 | 0.8580357143 | 0.8657738095 | 0.3000548454 | 0.285470572 | 48.15241052 | 8.490720483 | 5.671180745 |
| recurrent_network | 3.10.0 | 2.9.3 | 0.9375 | 0.93125 | 0.1873067699 | 0.2336050078 | 42.15870964 | 7.818362872 | 5.39226822 |
| build_custom_layers | 3.10.0 | 2.9.3 | 0.907109375 | 0.919921875 | 3.339067001 | 3.328515396 | 1.387739662 | 0.843345543 | 1.645517277 |
| save_restore_model | 3.10.0 | 2.9.3 | 0.8957291667 | 0.8922395833 | 107.3751221 | 110.7431885 | 4.18468971 | 1.885201852 | 2.219756842 |
| tensorboard_example | 3.10.0 | 2.9.3 | 0.872734375 | 0.8712239583 | 110.2372933 | 112.0809294 | 8.789215875 | 4.572643171 | 1.922130275 |

For dcgan, we believe that the difference in loss is due to a TF bug that is still present in 2.15.0. This test can be reverted if desired.
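Here, "hybrid execution" means wrapping the per-step computation in `tf.function` so it is traced into a graph, while the surrounding script keeps running eagerly. A minimal sketch of that transformation, assuming a simple classification training step (the model, loss, and variable names are illustrative and not taken from the examples in this PR):

```python
# Sketch only: convert an eager training step to hybrid execution with tf.function.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10),
])
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

# Before: a plain Python function executed eagerly, op by op.
# After: the same function decorated with @tf.function, so TensorFlow traces it
# into a graph on the first call and replays the compiled graph afterwards.
@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        loss = loss_fn(y, logits)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```

On the first call with a given input signature, TensorFlow traces `train_step` into a graph; later calls reuse the traced graph instead of re-executing the Python code, which is where the elapsed-time improvements above come from.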
