Automatic differentiation (AD) is a technique in computer algebra for evaluating the derivative of a function defined by source code; it is an alternative to symbolic and numerical differentiation. AD exploits the fact that any algorithm can be decomposed into differentiable elementary operations, which can then be differentiated by repeated application of the chain rule. Clad is a Clang plugin based on source-code transformation: given the C++ source of a mathematical function, it automatically generates C++ code that computes the function's derivatives. It supports both forward-mode and reverse-mode AD. Our talk covers two features of Clad. The first is Jacobian matrix computation using the forward and reverse modes; Jacobian matrices have applications in fields such as machine learning and computational physics. The second is error estimation for floating-point computations: it allows monitoring the estimated relative error of a computation involving a function differentiated by Clad. Because Clad works by source-code transformation, it can produce the error-estimation code alongside the AD code.
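
As a minimal sketch of how Clad is driven (using its documented `clad::differentiate` and `clad::gradient` entry points; exact signatures may vary between Clad releases), forward mode yields the derivative with respect to one chosen parameter, while reverse mode produces the full gradient in a single generated function:

```cpp
// Compile with the Clad plugin loaded into Clang, e.g.:
//   clang++ -fplugin=clad.so -I<clad>/include demo.cpp
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

double f(double x, double y) { return x * x + y * y * y; }

int main() {
  // Forward mode: derivative of f with respect to the parameter "x".
  auto df_dx = clad::differentiate(f, "x");
  printf("df/dx at (3, 4) = %g\n", df_dx.execute(3.0, 4.0)); // 2*x = 6

  // Reverse mode: the whole gradient of f from one generated function.
  auto grad_f = clad::gradient(f);
  double dx = 0.0, dy = 0.0;
  grad_f.execute(3.0, 4.0, &dx, &dy);
  printf("grad f at (3, 4) = (%g, %g)\n", dx, dy); // (6, 48)
  return 0;
}
```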
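For a vector-valued function, the same mechanism yields the Jacobian. The sketch below follows the array-based `clad::jacobian` interface from Clad's demos (newer releases use a `clad::matrix` output instead), so the exact `execute` signature should be treated as an assumption:

```cpp
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

// A map from R^2 to R^3, so the Jacobian is a 3x2 matrix.
void g(double a, double b, double output[]) {
  output[0] = a * a;
  output[1] = a * b;
  output[2] = b * b;
}

int main() {
  auto g_jac = clad::jacobian(g);
  double output[3] = {0};
  double jac[6] = {0}; // row-major: d(output[i])/da, d(output[i])/db
  g_jac.execute(3.0, 4.0, output, jac);
  for (int i = 0; i < 3; ++i)
    printf("row %d: %g %g\n", i, jac[2 * i], jac[2 * i + 1]);
  return 0;
}
```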
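The error-estimation feature reuses the reverse-mode machinery: alongside each adjoint, the generated code accumulates an estimate of the rounding error contributed at every assignment. A hedged sketch, assuming the `clad::estimate_error` interface shown in the Clad demos (the adjoint pointers and the by-reference `final_error` argument are part of that assumed signature):

```cpp
#include "clad/Differentiator/Differentiator.h"
#include <cstdio>

double func(double x, double y) {
  double t = x * y;
  return t + x;
}

int main() {
  // Generates an augmented reverse-mode function that also
  // accumulates a floating-point error estimate.
  auto err_func = clad::estimate_error(func);
  double dx = 0.0, dy = 0.0, final_error = 0.0;
  err_func.execute(3.0, 4.0, &dx, &dy, final_error);
  printf("estimated floating-point error: %g\n", final_error);
  return 0;
}
```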