
automatic-differentiation

Here are 169 public repositories matching this topic...

gorgonia
pennylane
bob-carpenter commented Feb 21, 2020

Description

In the constrain_XXX functions that Stan uses to map unconstrained parameters to constrained parameters, the value being constrained is required to have the same type as the log density target being incremented. These functions should allow the two types to vary independently: the target will always be at least double, whereas the variable being constrained might be int when used in t
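
This is, in effect, a request for the value type and the accumulator type to be separate template parameters. A minimal sketch of the idea in Julia (not Stan's actual C++ API; lb_constrain and the Ref-based target below are purely illustrative):

# Illustrative only: the constrained value's type T and the log-density target's
# type S vary independently, so an Int value can be constrained against a Float64 target.
function lb_constrain(x::T, lb, target::Base.RefValue{S}) where {T<:Real,S<:Real}
    y = lb + exp(x)    # map unconstrained x to a value above lb
    target[] += x      # the Jacobian of y = lb + exp(x) is exp(x), so its log is x
    return y
end

target = Ref(0.0)                  # the target is "at least double"
y = lb_constrain(2, 0.0, target)   # the variable being constrained is an Int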

ibell commented Sep 7, 2019

I know I have opened rather a lot of issues in the last 24 hours (and have a few more to go), but I just wanted to comment that I think you have some of the nicest documentation I have ever read for any C++ project. It's well put together and builds on itself very nicely so it is easy(ish) to see how to build up from simple hello world examples to something a bit more involved.

tribbloid commented Oct 27, 2019

I'm curious whether your vision includes making this a feature-complete NN training framework?

What is the master plan? Integrating with Torch/TF/MXNet, or building a hardware-level compilation framework from scratch?

Also, what is the standard & code of conduct for contributions from the community?

I'm totally convinced of its capability and believe it can fit into the missing link between horiz

mjlosch commented Apr 24, 2020

With the increasing number of processors being used in many simulations, especially with exch2-grids, nPx quickly goes beyond 999, and the formatted write to msgBuf in ini_procs.F starting here:
https://github.com/MITgcm/MITgcm/blob/07e785229e35cf2d8247b74b6d9d95d2c3adb417/eesupp/src/ini_procs.F#L269
leads to "***" and error messages that clutter the output, making it annoying to search.

papamarkou commented Mar 25, 2017

The following works for gradient!():

using DiffBase, ReverseDiff

f(x) = sum(sin, x)+prod(tan, x)*sum(sqrt, x);

x = rand(4);

result = DiffBase.GradientResult(x);

rcfg = ReverseDiff.GradientConfig(x);

ReverseDiff.gradient!(result, f, x, rcfg);

DiffBase.value(result)

DiffBase.gradient(result)

However, the Hessian analogue of the above fails:

using DiffBase
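
The snippet above is cut off, but presumably it mirrored the gradient version. A hedged reconstruction of the Hessian analogue being attempted (the call pattern the issue reports as failing), assuming the DiffBase/ReverseDiff API of the time:

using DiffBase, ReverseDiff

f(x) = sum(sin, x) + prod(tan, x) * sum(sqrt, x);

x = rand(4);

result = DiffBase.HessianResult(x);

hcfg = ReverseDiff.HessianConfig(result, x);

ReverseDiff.hessian!(result, f, x, hcfg);

DiffBase.value(result)

DiffBase.gradient(result)

DiffBase.hessian(result)
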
willtebbutt commented Jan 18, 2020

Lots has changed since the docs were first written. #152 addresses a number of things, but there are a few more things that we might want to consider:

  • changing all references to autodiff / automatic differentiation to AD / algorithmic differentiation, with a terminology box in the docs somewhere, explaining what we're on about.
  • In the "On writing good rrule and frule" bit, we should consi
