To use our training resources efficiently, we need to know how much parallelism to apply at each stage of training. There are several levels of parallelism:
- The number of workers used to train a single model
- The number of models trained in parallel for cross-validation
- The number of cross-validations run in parallel
This ticket is primarily concerned with the first level, although information about how throughput scales as more models are trained in parallel would also be useful.
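As a sketch of how these levels nest, the following uses Python thread pools as a stand-in for real distributed workers; the worker counts and function names are hypothetical, not part of any existing code:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical knobs for the three levels of parallelism described above.
WORKERS_PER_MODEL = 2   # level 1: workers training a single model
MODELS_PER_CV = 3       # level 2: models (folds) trained in parallel per CV run
PARALLEL_CV_RUNS = 2    # level 3: cross-validations run in parallel

def train_model(fold):
    # Stand-in for training one model; real code would shard the work
    # across WORKERS_PER_MODEL workers here.
    return fold * fold

def run_cv(cv_id):
    # Train all folds of one cross-validation in parallel (level 2).
    with ThreadPoolExecutor(max_workers=MODELS_PER_CV) as pool:
        return list(pool.map(train_model, range(MODELS_PER_CV)))

# Run several cross-validations in parallel (level 3).
with ThreadPoolExecutor(max_workers=PARALLEL_CV_RUNS) as pool:
    results = list(pool.map(run_cv, range(PARALLEL_CV_RUNS)))

# Peak worker demand is roughly the product of all three levels:
# WORKERS_PER_MODEL * MODELS_PER_CV * PARALLEL_CV_RUNS
```

The point of the sketch is the multiplicative resource cost: whatever values we pick, peak demand is the product of the three levels, so tuning the first level in isolation is not enough.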