Benchmarking Python Tools for Automatic Differentiation

Andrei Turkin, Aung Thu

Abstract


In this paper, we compare several Python tools for automatic differentiation. To assess differences in performance and precision, we use the problem of finding the optimal geometrical structure of a cluster of identical atoms. First, we compare the performance of gradient computation for the objective function and show that PyADOL-C and PyCppAD handle large clusters much faster than the other tools. Second, we assess the precision of these two tools by computing the difference between the gradient norms obtained at the optimal configuration. We conclude that PyCppAD has the best performance, while its precision is almost the same as that of the second-best performing tool, PyADOL-C.
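The cluster-optimization references below (Posypkin; Wales and Doye; Northby) suggest that the objective function is a pairwise interatomic potential of the Lennard-Jones type. As an illustrative sketch only, not the authors' benchmark code, the fragment below shows how such an energy function can be taped with PyADOL-C and its gradient evaluated; the lj_energy helper and the 13-atom random starting point are assumptions made for this example, and the tracing calls (trace_on, independent, dependent, trace_off, gradient) follow the usage documented for pyadolc and may differ between versions.

import numpy as np
import adolc  # PyADOL-C (www.github.com/b45ch1/pyadolc/)

def lj_energy(x):
    # Lennard-Jones energy of a cluster of identical atoms;
    # x is a flat array holding the 3N Cartesian coordinates.
    n = len(x) // 3
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            dx = x[3 * i] - x[3 * j]
            dy = x[3 * i + 1] - x[3 * j + 1]
            dz = x[3 * i + 2] - x[3 * j + 2]
            r2 = dx * dx + dy * dy + dz * dz
            r6 = r2 * r2 * r2
            energy = energy + 4.0 * (1.0 / (r6 * r6) - 1.0 / r6)
    return energy

# Record the function once on tape 0 (assumed 13-atom cluster, random start).
x0 = np.random.uniform(-1.0, 1.0, 3 * 13)
adolc.trace_on(0)
ax = adolc.adouble(x0)
adolc.independent(ax)
ay = lj_energy(ax)
adolc.dependent(ay)
adolc.trace_off()

# Reuse the tape to evaluate the gradient and its norm at any point,
# e.g. at a configuration returned by the optimizer.
grad = adolc.gradient(0, x0)
print(np.linalg.norm(grad))

Writing the energy with plain Python arithmetic keeps it traceable by operator-overloading tools such as PyADOL-C and PyCppAD, and the recorded tape can be reused for every subsequent gradient evaluation, which is the kind of repeated computation the performance comparison measures.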



References


Y. G. Evtushenko. Optimization and fast automatic differentiation. Dorodnicyn Computing Center of Russian Academy of Sciences, 2013. [Online]. Available: http://www.ccas.ru/personal/evtush/p/198.pdf

A. Griewank and A. Walther. Evaluating Derivatives: Principles and Techniques of Algorithmic Differentiation. Society for Industrial and Applied Mathematics, 2nd edition, November 2008.

A. Griewank. A mathematical view of automatic differentiation. Acta Numerica, 12:321–398, 2003.

A. G. Baydin and B. A. Pearlmutter. Automatic differentiation of algorithms for machine learning. arXiv preprint arXiv:1404.7456, 2014. Available: http://arxiv.org/pdf/1404.7456.pdf

M. J. Weinstein and A. V. Rao. A source transformation via operator overloading method for the automatic differentiation of mathematical functions in MATLAB. ACM Transactions on Mathematical Software, 2014.

C. H. Bischof and H. M. Bucker. Computing derivatives of computer programs. Modern Methods and Algorithms of Quantum Chemistry: Proceedings, Second Edition, NIC Series, 3:315–327, 2000.

S. F. Walter. PyADOL-C: a Python module to differentiate complex algorithms written in Python. Available: www.github.com/b45ch1/pyadolc/

B. M. Bell and S. F. Walter. PyCppAD: Python algorithmic differentiation using CppAD. Available: http://www.seanet.com/~bradbell/pycppad/pycppad.htm

J. Andersson. A General-Purpose Software Framework for Dynamic Optimization. PhD thesis, KU Leuven, October 2013.

B. Stadie, Z. Xie, P. Moritz, J. Schulman, J. Ho. Computational graph toolkit: a library for evaluation and differentiation of functions of multidimensional arrays.

F. Bastien, P. Lamblin, R. Pascanu, J. Bergstra, I. J. Goodfellow, A. Bergeron, N. Bouchard, and Y. Bengio. Theano: new features and speed improvements. Deep Learning and Unsupervised Feature Learning NIPS 2012 Workshop, 2012.

J. Bergstra, O. Breuleux, F. Bastien, P. Lamblin, R. Pascanu, G. Desjardins, J. Turian, D. Warde-Farley, and Y. Bengio. Theano: A CPU and GPU math expression compiler. In Proceedings of the Python for Scientific Computing Conference (SciPy), June 2010. Oral presentation.

A. D. Lee. AD: a Python package for first- and second-order automatic differentiation. Available: http://pythonhosted.org/ad

A. Walther and A. Griewank. Getting started with ADOL-C. In Combinatorial scientific computing, pages 181–202, 2009.

B. M. Bell. CppAD: a package for differentiation of C++ algorithms. Available: http://www.coin-or.org/CppAD

D. Abrahams and R. W. Grosse-Kunstleve. Building hybrid systems with Boost.Python. C/C++ Users Journal, 21(7):29–36, 2003.

J. Andersson, J. Akesson, and M. Diehl. Recent Advances in Algorithmic Differentiation, chapter CasADi: A Symbolic Package for Automatic Differentiation and Optimal Control, pages 297–307. Springer Berlin Heidelberg, Berlin, Heidelberg, 2012.

D. M. Beazley. Automated scientific software scripting with SWIG. Future Generation Computer Systems, 19(5):599–609, 2003.

M. A. Posypkin. Searching for minimum energy molecular cluster: Methods and distributed software infrastructure for numerical solution of the problem. Vestnik of Lobachevsky University of Nizhni Novgorod, (1):210–219, 2010.

D. J. Wales and J. P. K. Doye. Global optimization by basin-hopping and the lowest energy structures of Lennard-Jones clusters containing up to 110 atoms. The Journal of Physical Chemistry A, 101(28):5111–5116, 1997.

D. C. Liu and J. Nocedal. On the limited memory BFGS method for large scale optimization. Mathematical Programming, 45(1-3):503–528, 1989.

J. A. Northby. Structure and binding of Lennard-Jones clusters: 13 ≤ n ≤ 147. The Journal of Chemical Physics, 87(10):6166–6177, 1987.

X. Shao, H. Jiang, and W. Cai. Parallel random tunneling algorithm for structural optimization of Lennard-Jones clusters up to n = 330. Journal of Chemical Information and Computer Sciences, 44(1):193–199, 2004.

Y. Xiang, L. Cheng, W. Cai, and X. Shao. Structural distribution of Lennard-Jones clusters containing 562 to 1000 atoms. The Journal of Physical Chemistry A, 108(44):9516–9520, 2004.

X. Shao, Y. Xiang, and W. Cai. Structural transition from icosahedra to decahedra of large Lennard-Jones clusters. The Journal of Physical Chemistry A, 109(23):5193–5197, 2005.



