Repartitioner interface: controls how data should be repartitioned before training.
A hook invoked on the workers during training.
TrainingMaster<R extends TrainingResult, W extends TrainingWorker<R>>: controls how distributed training is executed in practice. In principle, many different approaches can be used for distributed training (synchronous vs. asynchronous, parameter averaging vs. gradient averaging, etc.).
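To make the master/worker split concrete, here is a minimal sketch of synchronous parameter averaging, one of the approaches mentioned above. All names and signatures (`ParameterAveragingSketch`, `trainOnShard`, `average`) are illustrative assumptions, not the actual DL4J Spark API:

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch of the master/worker contract for synchronous
// parameter averaging; names are illustrative, not the real API.
public class ParameterAveragingSketch {

    // Each "worker" trains on its own data shard and returns updated parameters.
    static double[] trainOnShard(double[] params, double[] shard) {
        double[] updated = params.clone();
        for (double x : shard) {
            // Toy "gradient step": nudge each parameter toward the data value.
            for (int i = 0; i < updated.length; i++) {
                updated[i] += 0.1 * (x - updated[i]);
            }
        }
        return updated;
    }

    // The "master" combines per-worker results by averaging the parameters.
    static double[] average(List<double[]> results) {
        double[] avg = new double[results.get(0).length];
        for (double[] r : results) {
            for (int i = 0; i < avg.length; i++) {
                avg[i] += r[i] / results.size();
            }
        }
        return avg;
    }

    public static void main(String[] args) {
        double[] params = {0.0, 0.0};
        List<double[]> perWorker = Arrays.asList(
                trainOnShard(params, new double[]{1.0, 1.0}),
                trainOnShard(params, new double[]{3.0, 3.0}));
        System.out.println(Arrays.toString(average(perWorker)));
    }
}
```

In a real cluster the `trainOnShard` calls would run concurrently on Spark executors, and only the averaging step would be coordinated by the master.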
TrainingResult: a class used by each TrainingWorker to return the results of its training to the TrainingMaster.
TrainingWorker<R extends TrainingResult>: a small serializable class that can be passed (in serialized form) to each Spark executor, where it actually conducts the training.
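The requirement above is essentially Java serialization: the worker object must survive a serialize/deserialize round trip so Spark can ship it to executors. The sketch below demonstrates that round trip with a hypothetical worker class; `SimpleWorker`, its `batchSize` field, and `roundTrip` are assumptions for illustration, not the DL4J API:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Illustrative sketch (not the DL4J API): a worker must be Serializable
// so it can be shipped, in serialized form, to each executor.
public class SerializableWorkerSketch {

    // Hypothetical minimal worker carrying one common setting.
    static class SimpleWorker implements Serializable {
        private static final long serialVersionUID = 1L;
        final int batchSize;

        SimpleWorker(int batchSize) {
            this.batchSize = batchSize;
        }

        String train() {
            return "trained with batchSize=" + batchSize;
        }
    }

    // Round-trip through Java serialization, as Spark would when sending
    // the worker from the driver to an executor.
    static SimpleWorker roundTrip(SimpleWorker w) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
                oos.writeObject(w);
            }
            try (ObjectInputStream ois =
                         new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
                return (SimpleWorker) ois.readObject();
            }
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        SimpleWorker copy = roundTrip(new SimpleWorker(32));
        System.out.println(copy.train()); // the deserialized copy still works
    }
}
```

Keeping the worker small matters because it is re-serialized and sent over the network for each job.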
A simple configuration object (common settings for workers)
The approach to use when training from an RDD of training data.
Enumeration used to specify the repartitioning behaviour when preparing training data.
RepartitionStrategy: the different strategies for repartitioning the training data, when repartitioning is required.
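To illustrate how a repartition setting and a repartition strategy might interact, here is a small self-contained sketch. The enum names and the `shouldRepartition` helper are hypothetical, chosen for this example rather than taken from the library:

```java
// Hypothetical sketch (not the library's enums): modelling when to
// repartition and how to carry it out.
public class RepartitionSketch {

    // When to repartition the data before training.
    enum Repartition { NEVER, ALWAYS, NUM_PARTITIONS_WORKERS_DIFFERS }

    // How to carry out the repartitioning once it is required.
    enum RepartitionStrategy { SPARK_DEFAULT, BALANCED }

    // Decide whether repartitioning is needed for the given setting.
    static boolean shouldRepartition(Repartition r, int numPartitions, int numWorkers) {
        switch (r) {
            case ALWAYS:
                return true;
            case NEVER:
                return false;
            default:
                // Repartition only if partition count and worker count differ.
                return numPartitions != numWorkers;
        }
    }

    public static void main(String[] args) {
        // 8 partitions but 4 workers: repartitioning would be triggered here.
        System.out.println(
                shouldRepartition(Repartition.NUM_PARTITIONS_WORKERS_DIFFERS, 8, 4));
    }
}
```

The decision of *whether* to repartition (the enum above) is deliberately separate from *how* to do it (the strategy), since a balanced repartition costs more than deferring to Spark's default behaviour.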
Copyright © 2020. All rights reserved.