DiffSharp 0.5.6
See the version list below for details.
dotnet add package DiffSharp --version 0.5.6
NuGet\Install-Package DiffSharp -Version 0.5.6
<PackageReference Include="DiffSharp" Version="0.5.6" />
paket add DiffSharp --version 0.5.6
#r "nuget: DiffSharp, 0.5.6"
// Install DiffSharp as a Cake Addin
#addin nuget:?package=DiffSharp&version=0.5.6

// Install DiffSharp as a Cake Tool
#tool nuget:?package=DiffSharp&version=0.5.6
DiffSharp (Diff#) is an automatic differentiation (AD) library implemented in the F# language.
AD allows exact and efficient calculation of derivatives by systematically applying the chain rule of calculus at the elementary operator level. AD differs from numerical differentiation, which is prone to truncation and round-off errors, and from symbolic differentiation, which is exact but inefficient for run-time calculations and can only handle closed-form mathematical expressions.
Using the DiffSharp library, derivative calculations (gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products) can be incorporated with minimal change into existing algorithms. Please see the API Overview page for a list of available operations.
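As a taste of the API, here is a minimal sketch of a derivative and a gradient computed with forward-mode AD. The module path DiffSharp.AD.Forward and the float-based call signatures follow the documentation of the 0.5.x–0.6.x releases; treat the exact names as assumptions and check the API Overview page for the version you install.

```fsharp
open DiffSharp.AD.Forward

// Exact derivative of f(x) = sin(sqrt x) at x = 2,
// obtained by propagating dual numbers through f.
let d = diff (fun x -> sin (sqrt x)) 2.

// Gradient of g(x) = sin(x0 * x1) at the point (2, 3).
let g = grad (fun x -> sin (x.[0] * x.[1])) [| 2.; 3. |]
```

Unlike numerical differentiation, no step size is chosen and no truncation error is incurred; the dual-number arithmetic applies the chain rule exactly, to machine precision.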
The library is under active development by Atılım Güneş Baydin and Barak A. Pearlmutter mainly for research applications in machine learning, as part of their work at the Brain and Computation Lab, Hamilton Institute, National University of Ireland Maynooth.
Product | Compatible frameworks
---|---
.NET Framework | net
Dependencies
- FSharp.Quotations.Evaluator (>= 1.0.4)
NuGet packages (1)
Showing the 1 NuGet package that depends on DiffSharp:
Package | Downloads
---|---
Hype: a proof-of-concept deep learning library, where you can perform optimization on compositional machine learning systems of many components, even when such components themselves internally perform optimization. This is enabled by nested automatic differentiation (AD), which gives access to the automatic exact derivative of any floating-point value in your code with respect to any other. Underlying computations are run by a BLAS/LAPACK backend (OpenBLAS by default). |
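The nested AD that Hype relies on can be sketched in DiffSharp itself: a derivative taken inside another derivative, where the inner function closes over the outer variable. Note that Hype targets the later 0.7.x line; the module DiffSharp.AD.Float64, the D type, and the nestable diff signature below are assumptions drawn from that line's documentation, not from the 0.5.6 release shown on this page.

```fsharp
open DiffSharp.AD.Float64

// d/dx [ x * (d/dy of x*y at y = 2) ] evaluated at x = 3.
// The inner derivative is x, so the outer function is x * x,
// whose derivative at 3 is 6. Nested AD must keep the inner
// and outer perturbations distinct to return this exactly.
let nested =
    diff (fun x -> x * diff (fun y -> x * y) (D 2.)) (D 3.)
```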
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated
---|---|---
0.8.4-beta | 1,847 | 8/24/2019
0.8.3-beta | 611 | 7/4/2019
0.8.2-beta | 575 | 6/25/2019
0.8.1-beta | 568 | 6/20/2019
0.8.0-beta | 589 | 6/11/2019
0.7.7 | 5,043 | 12/25/2015
0.7.6 | 1,514 | 12/15/2015
0.7.5 | 1,605 | 12/6/2015
0.7.4 | 1,543 | 10/13/2015
0.7.3 | 1,589 | 10/6/2015
0.7.2 | 1,638 | 10/4/2015
0.7.1 | 1,459 | 10/4/2015
0.7.0 | 1,362 | 9/29/2015
0.6.3 | 1,842 | 7/18/2015
0.6.2 | 1,236 | 6/6/2015
0.6.1 | 1,278 | 6/2/2015
0.6.0 | 1,474 | 4/26/2015
0.5.10 | 1,285 | 3/27/2015
0.5.9 | 1,501 | 2/26/2015
0.5.8 | 1,663 | 2/23/2015
0.5.7 | 1,436 | 2/17/2015
0.5.6 | 1,452 | 2/13/2015
0.5.5 | 1,452 | 12/15/2014
0.5.4 | 1,511 | 11/23/2014
0.5.3 | 2,156 | 11/7/2014
0.5.2 | 1,980 | 11/4/2014
0.5.1 | 1,215 | 10/27/2014
0.5.0 | 1,285 | 10/2/2014
Please visit https://github.com/gbaydin/DiffSharp/releases for the latest release notes.