public interface Optimizer<V> extends Component<Tensor<V>>, Optimization<V>

Type Parameters:
V - The value type parameter of the tensors processed by this optimizer.
`Optimizer`s are tensor components which implement the `Optimization` (functional) interface, applying various optimization algorithms to the gradients of tensors.
```java
Optimizer o = new SGD(0.01); // 0.01 learning rate
Tensor<Float> w = Tensor.of(0f);
w.set(o);
```
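To make concrete what the SGD optimizer above does to a weight, here is a minimal, self-contained sketch of the stochastic gradient descent update rule in plain Java. It is independent of the library; `sgdStep` is a hypothetical helper, not part of the API:

```java
public class SgdSketch {

    // Textbook SGD update: w <- w - lr * gradient.
    // A stand-in for what SGD(0.01) applies to each gradient.
    static float sgdStep(float weight, float gradient, float learningRate) {
        return weight - learningRate * gradient;
    }

    public static void main(String[] args) {
        float w = 0f;                 // initial weight, as in Tensor.of(0f)
        float g = 2f;                 // some gradient from backpropagation
        w = sgdStep(w, g, 0.01f);     // one optimization step
        System.out.println(w);        // prints -0.02
    }
}
```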
If you want to create `Optimizer` implementations in a functional style, take a look at the following example:
```java
Optimizer.of( t -> {
    Tensor<?> gradient = t.getGradient();
    // ... apply algorithm ...
})
```
Or consider using the factory method below to process gradients directly:
```java
Optimizer.ofGradient( gradient -> {
    // ... apply algorithm ...
})
```
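To illustrate the functional style with a concrete algorithm, here is a small sketch of a gradient-clipping rule as a plain Java lambda, analogous to what a lambda passed to `ofGradient` might compute per gradient value. The `GradientClipSketch` class and its `clip` method are illustrative stand-ins, not part of the library:

```java
import java.util.function.DoubleUnaryOperator;

public class GradientClipSketch {

    // Clips a gradient value into the range [-1, 1] before it is applied,
    // a common way to guard against exploding gradients.
    static double clip(double gradient) {
        DoubleUnaryOperator rule = g -> Math.max(-1.0, Math.min(1.0, g));
        return rule.applyAsDouble(gradient);
    }

    public static void main(String[] args) {
        System.out.println(clip(3.7));   // prints 1.0
        System.out.println(clip(-0.25)); // prints -0.25
    }
}
```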
Nested classes/interfaces inherited from interface `Component`: `Component.IsBeing`, `Component.OwnerChangeRequest<O>`
Field Summary:

| Modifier and Type | Field and Description |
|---|---|
| `static AdaGradFactory` | `AdaGrad` |
| `static ADAMFactory` | `ADAM` |
| `static MomentumFactory` | `Momentum` |
| `static RMSPropFactory` | `RMSProp` |
| `static SGDFactory` | `SGD` |
Method Summary:

| Modifier and Type | Method and Description |
|---|---|
| `static <T> Optimizer<T>` | `of(Optimization<T> o)` |
| `static <T> Optimizer<T>` | `ofGradient(Optimization<T> o)` |

Methods inherited from interface `Optimization`: `optimize`
Field Detail:

static final ADAMFactory ADAM
static final AdaGradFactory AdaGrad
static final MomentumFactory Momentum
static final RMSPropFactory RMSProp
static final SGDFactory SGD
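The factories above correspond to well-known gradient-based algorithms. As one example, here is a textbook-style sketch of the classic momentum update; the library's `MomentumFactory` may parameterize this differently, and `MomentumSketch`, `momentumStep`, and the parameter names are assumptions for illustration only:

```java
public class MomentumSketch {

    // Accumulated velocity, carried across steps.
    static double velocity = 0.0;

    // Classic momentum update (textbook formulation):
    //   v <- m * v + g
    //   w <- w - lr * v
    static double momentumStep(double w, double g, double lr, double m) {
        velocity = m * velocity + g;
        return w - lr * velocity;
    }

    public static void main(String[] args) {
        double w = 1.0;
        // First step: velocity = 0.9 * 0 + 0.5 = 0.5, so w = 1.0 - 0.1 * 0.5
        w = momentumStep(w, 0.5, 0.1, 0.9);
        System.out.println(w);
    }
}
```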
Method Detail:

static <T> Optimizer<T> of(Optimization<T> o)

Type Parameters:
T - The value type parameter of the tensors processed by this optimizer.

Parameters:
o - The `Optimization` lambda which receives a tensor for optimization.

Returns:
An `Optimizer` which will process any passed tensor directly (see `ofGradient(Optimization)`, which processes gradients).

static <T> Optimizer<T> ofGradient(Optimization<T> o)

Type Parameters:
T - The value type parameter of the tensors processed by this optimizer.

Parameters:
o - The `Optimization` lambda which receives the gradient of a tensor for optimization.

Returns:
An `Optimizer` which will process the gradient of any passed tensor (see `of(Optimization)`, which processes tensors directly).