Package | Description |
---|---|
org.apache.ignite.ml.composition.boosting.convergence | Contains implementations of convergence-checking algorithms for gradient boosting. |
org.apache.ignite.ml.composition.boosting.convergence.mean | Contains a convergence checker based on the mean absolute error over the dataset. |
org.apache.ignite.ml.composition.boosting.convergence.median | Contains a convergence checker based on the median of medians of errors over the dataset. |
org.apache.ignite.ml.composition.boosting.convergence.simple | Contains a stub implementation of convergence checking. |
org.apache.ignite.ml.dataset | Base package for machine learning dataset classes. |
org.apache.ignite.ml.dataset.impl.bootstrapping | Base package for the bootstrapped implementation of machine learning datasets. |
org.apache.ignite.ml.dataset.primitive.builder.context | Contains partition context builders. |
org.apache.ignite.ml.knn | Contains main APIs for kNN algorithms. |
org.apache.ignite.ml.knn.classification | Contains main APIs for kNN classification algorithms. |
org.apache.ignite.ml.knn.regression | Contains helper classes for kNN regression algorithms. |
org.apache.ignite.ml.knn.utils | Contains utility functionality for kNN algorithms. |
org.apache.ignite.ml.nn | Contains neural networks and related classes. |
org.apache.ignite.ml.recommendation.util | Contains utility classes used in the recommendation system framework. |
org.apache.ignite.ml.tree | Root package for decision trees. |
org.apache.ignite.ml.tree.leaf | Root package for decision tree leaf builders. |
org.apache.ignite.ml.tree.randomforest | Contains random forest implementation classes. |
org.apache.ignite.ml.tree.randomforest.data.impurity | Contains implementations of impurity computers based on histograms. |
org.apache.ignite.ml.tree.randomforest.data.statistics | Contains implementations of statistics computers for random forest. |

Modifier and Type | Method and Description |
---|---|
abstract Double | ConvergenceChecker.computeMeanErrorOnDataset(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition mdl): Computes the error for the given model on the learning dataset. |
boolean | ConvergenceChecker.isConverged(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition currMdl): Checks convergence on the dataset. |

Modifier and Type | Method and Description |
---|---|
Double | MeanAbsValueConvergenceChecker.computeMeanErrorOnDataset(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition mdl): Computes the error for the given model on the learning dataset. |

Modifier and Type | Method and Description |
---|---|
Double | MedianOfMedianConvergenceChecker.computeMeanErrorOnDataset(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition mdl): Computes the error for the given model on the learning dataset. |

Modifier and Type | Method and Description |
---|---|
Double | ConvergenceCheckerStub.computeMeanErrorOnDataset(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition mdl): Computes the error for the given model on the learning dataset. |
boolean | ConvergenceCheckerStub.isConverged(Dataset<EmptyContext,? extends FeatureMatrixWithLabelsOnHeapData> dataset, ModelsComposition currMdl): Checks convergence on the dataset. |

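In practice these checkers are supplied to a gradient boosting trainer through their factories rather than called directly. Below is a minimal sketch in the style of the Ignite gradient boosting examples; the cache name, data layout, and hyperparameter values are illustrative assumptions.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.composition.ModelsComposition;
import org.apache.ignite.ml.composition.boosting.GDBTrainer;
import org.apache.ignite.ml.composition.boosting.convergence.mean.MeanAbsValueConvergenceCheckerFactory;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DoubleArrayVectorizer;
import org.apache.ignite.ml.tree.boosting.GDBBinaryClassifierOnTreesTrainer;

public class GdbConvergenceSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache, assumed to be pre-filled with rows of [label, f1, f2, ...].
            IgniteCache<Integer, double[]> trainingSet = ignite.cache("GDB_TRAINING_SET");

            // Stop boosting once the mean absolute error on the dataset drops below 0.001.
            // MedianOfMedianConvergenceCheckerFactory or ConvergenceCheckerStubFactory
            // can be plugged in the same way.
            GDBTrainer trainer = new GDBBinaryClassifierOnTreesTrainer(1.0, 500, 4, 0.001)
                .withCheckConvergenceStgyFactory(new MeanAbsValueConvergenceCheckerFactory(0.001));

            ModelsComposition mdl = trainer.fit(
                ignite,
                trainingSet,
                new DoubleArrayVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.FIRST));
        }
    }
}
```
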
Modifier and Type | Method and Description |
---|---|
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleDataset(DatasetBuilder<K,V> datasetBuilder, LearningEnvironmentBuilder envBuilder, Preprocessor<K,V> featureExtractor): Creates a new instance of distributed SimpleDataset using the specified featureExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleDataset(Ignite ignite, IgniteCache<K,V> upstreamCache, LearningEnvironmentBuilder envBuilder, Preprocessor<K,V> featureExtractor): Creates a new instance of distributed SimpleDataset using the specified featureExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleDataset(Ignite ignite, IgniteCache<K,V> upstreamCache, Preprocessor<K,V> featureExtractor): Creates a new instance of distributed SimpleDataset using the specified featureExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleDataset(Map<K,V> upstreamMap, int partitions, LearningEnvironmentBuilder envBuilder, Preprocessor<K,V> featureExtractor): Creates a new instance of local SimpleDataset using the specified featureExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleLabeledDataset(DatasetBuilder<K,V> datasetBuilder, LearningEnvironmentBuilder envBuilder, Preprocessor<K,V> vectorizer): Creates a new instance of distributed SimpleLabeledDataset using the specified featureExtractor and lbExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleLabeledDataset(Ignite ignite, LearningEnvironmentBuilder envBuilder, IgniteCache<K,V> upstreamCache, Preprocessor<K,V> vectorizer): Creates a new instance of distributed SimpleLabeledDataset using the specified featureExtractor and lbExtractor. |
static <K,V,CO extends Serializable> | DatasetFactory.createSimpleLabeledDataset(Map<K,V> upstreamMap, LearningEnvironmentBuilder envBuilder, int partitions, Preprocessor<K,V> vectorizer): Creates a new instance of local SimpleLabeledDataset using the specified featureExtractor and lbExtractor. |

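As a usage sketch, the distributed variant can be created directly over an Ignite cache. The cache name and value type below are hypothetical, DummyVectorizer is assumed as the feature extractor because the cached values are already vectors, and the mean()/std() helpers are the per-column statistics exposed by SimpleDataset in the Ignite dataset examples.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.dataset.DatasetFactory;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DummyVectorizer;
import org.apache.ignite.ml.dataset.primitive.SimpleDataset;
import org.apache.ignite.ml.math.primitives.vector.Vector;

public class SimpleDatasetSketch {
    public static void main(String[] args) throws Exception {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache of feature vectors keyed by row id, assumed to be pre-filled.
            IgniteCache<Integer, Vector> features = ignite.cache("FEATURES");

            // Builds a distributed SimpleDataset; DummyVectorizer passes the cached vectors through as-is.
            try (SimpleDataset<?> dataset =
                     DatasetFactory.createSimpleDataset(ignite, features, new DummyVectorizer<Integer>())) {
                // Column-wise statistics computed across the dataset partitions.
                double[] mean = dataset.mean();
                double[] std = dataset.std();
            }
        }
    }
}
```
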
Modifier and Type | Method and Description |
---|---|
BootstrappedDatasetPartition | BootstrappedDatasetBuilder.build(LearningEnvironment env, Iterator<UpstreamEntry<K,V>> upstreamData, long upstreamDataSize, EmptyContext ctx): Builds new partition data from partition upstream data and the partition context. |

Modifier and Type | Method and Description |
---|---|
EmptyContext | EmptyContextBuilder.build(LearningEnvironment env, Iterator<UpstreamEntry<K,V>> upstreamData, long upstreamDataSize): Builds a new partition context from upstream data. |

Modifier and Type | Method and Description |
---|---|
SpatialIndex<Double> | KNNPartitionDataBuilder.build(LearningEnvironment env, Iterator<UpstreamEntry<K,V>> upstreamData, long upstreamDataSize, EmptyContext ctx): Builds new partition data from partition upstream data and the partition context. |

Modifier and Type | Method and Description |
---|---|
protected abstract M | KNNTrainer.convertDatasetIntoModel(Dataset<EmptyContext,SpatialIndex<Double>> dataset): Converts the given dataset into a kNN model (classification or regression, depending on the implementation). |

Constructor and Description |
---|
KNNModel(Dataset<EmptyContext,SpatialIndex<L>> dataset, DistanceMeasure distanceMeasure, int k, boolean weighted): Constructs a new instance of kNN model. |

Modifier and Type | Method and Description |
---|---|
protected KNNClassificationModel | KNNClassificationTrainer.convertDatasetIntoModel(Dataset<EmptyContext,SpatialIndex<Double>> dataset): Converts the given dataset into a kNN model (classification or regression, depending on the implementation). |

Modifier and Type | Method and Description |
---|---|
protected KNNRegressionModel | KNNRegressionTrainer.convertDatasetIntoModel(Dataset<EmptyContext,SpatialIndex<Double>> dataset): Converts the given dataset into a kNN model (classification or regression, depending on the implementation). |

Modifier and Type | Method and Description |
---|---|
static <K,V,C extends Serializable> | KNNUtils.buildDataset(LearningEnvironmentBuilder envBuilder, DatasetBuilder<K,V> datasetBuilder, Preprocessor<K,V> vectorizer): Builds the dataset. |

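Taken together, a typical flow is to configure a KNNClassificationTrainer and fit it over a cache. The sketch below assumes a hypothetical pre-filled cache of labeled double[] rows and the withK/withDistanceMeasure/withWeighted setters inherited from KNNTrainer.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DoubleArrayVectorizer;
import org.apache.ignite.ml.knn.classification.KNNClassificationModel;
import org.apache.ignite.ml.knn.classification.KNNClassificationTrainer;
import org.apache.ignite.ml.math.distances.EuclideanDistance;
import org.apache.ignite.ml.math.primitives.vector.VectorUtils;

public class KnnClassificationSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache of rows [label, f1, f2], assumed to be pre-filled.
            IgniteCache<Integer, double[]> trainingSet = ignite.cache("KNN_TRAINING_SET");

            KNNClassificationTrainer trainer = new KNNClassificationTrainer()
                .withK(3)                                     // number of neighbors
                .withDistanceMeasure(new EuclideanDistance()) // distance used by the spatial index
                .withWeighted(false);                         // simple (unweighted) voting

            KNNClassificationModel mdl = trainer.fit(
                ignite,
                trainingSet,
                new DoubleArrayVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.FIRST));

            // Predict the class of a single observation.
            Double clazz = mdl.predict(VectorUtils.of(5.1, 3.5));
        }
    }
}
```
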
Modifier and Type | Method and Description |
---|---|
IgniteFunction<Dataset<EmptyContext,SimpleLabeledDatasetData>,MLPArchitecture> | MLPTrainer.getArchSupplier(): Gets the multilayer perceptron architecture supplier that defines layers and activators. |

Modifier and Type | Method and Description |
---|---|
MLPTrainer<P> | MLPTrainer.withArchSupplier(IgniteFunction<Dataset<EmptyContext,SimpleLabeledDatasetData>,MLPArchitecture> archSupplier): Sets the multilayer perceptron architecture supplier that defines layers and activators. |

Constructor and Description |
---|
MLPTrainer(IgniteFunction<Dataset<EmptyContext,SimpleLabeledDatasetData>,MLPArchitecture> archSupplier, IgniteFunction<Vector,IgniteDifferentiableVectorToDoubleFunction> loss, UpdatesStrategy<? super MultilayerPerceptron,P> updatesStgy, int maxIterations, int batchSize, int locIterations, long seed): Constructs a new instance of multilayer perceptron trainer. |

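The sketch below wires the constructor above together in the style of the Ignite MLPTrainerExample. The architecture, loss function, update strategy, and hyperparameter values are illustrative assumptions, and the architecture supplier simply ignores its dataset argument.

```java
import org.apache.ignite.ml.nn.Activators;
import org.apache.ignite.ml.nn.MLPTrainer;
import org.apache.ignite.ml.nn.UpdatesStrategy;
import org.apache.ignite.ml.nn.architecture.MLPArchitecture;
import org.apache.ignite.ml.optimization.LossFunctions;
import org.apache.ignite.ml.optimization.updatecalculators.SimpleGDParameterUpdate;
import org.apache.ignite.ml.optimization.updatecalculators.SimpleGDUpdateCalculator;

public class MlpTrainerSketch {
    public static void main(String[] args) {
        // 2 inputs -> hidden layer of 10 ReLU neurons -> 1 sigmoid output.
        MLPArchitecture arch = new MLPArchitecture(2)
            .withAddedLayer(10, true, Activators.RELU)
            .withAddedLayer(1, false, Activators.SIGMOID);

        MLPTrainer<SimpleGDParameterUpdate> trainer = new MLPTrainer<>(
            dataset -> arch,                       // architecture supplier (ignores the dataset here)
            LossFunctions.MSE,                     // loss function
            new UpdatesStrategy<>(
                new SimpleGDUpdateCalculator(0.1), // plain gradient descent, learning rate 0.1
                SimpleGDParameterUpdate.SUM_LOCAL,
                SimpleGDParameterUpdate.AVG),
            3000,                                  // max global iterations
            4,                                     // batch size
            50,                                    // local iterations per node
            123L);                                 // seed
    }
}
```
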
Modifier and Type | Method and Description |
---|---|
RecommendationDatasetData<O,S> | RecommendationDatasetDataBuilder.build(LearningEnvironment env, Iterator<UpstreamEntry<K,Z>> upstreamData, long upstreamDataSize, EmptyContext ctx): Builds new partition data from partition upstream data and the partition context. |
RecommendationDatasetData<Serializable,Serializable> | RecommendationBinaryDatasetDataBuilder.build(LearningEnvironment env, Iterator<UpstreamEntry<Object,BinaryObject>> upstreamData, long upstreamDataSize, EmptyContext ctx): Builds new partition data from partition upstream data and the partition context. |

Modifier and Type | Method and Description |
---|---|
<K,V> DecisionTreeNode | DecisionTree.fit(Dataset<EmptyContext,DecisionTreeData> dataset) |
protected abstract ImpurityMeasureCalculator<T> | DecisionTree.getImpurityMeasureCalculator(Dataset<EmptyContext,DecisionTreeData> dataset): Returns the impurity measure calculator. |
protected ImpurityMeasureCalculator<GiniImpurityMeasure> | DecisionTreeClassificationTrainer.getImpurityMeasureCalculator(Dataset<EmptyContext,DecisionTreeData> dataset): Returns the impurity measure calculator. |
protected ImpurityMeasureCalculator<MSEImpurityMeasure> | DecisionTreeRegressionTrainer.getImpurityMeasureCalculator(Dataset<EmptyContext,DecisionTreeData> dataset): Returns the impurity measure calculator. |

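As a usage sketch, the concrete trainers are configured with a maximum depth and a minimum impurity decrease; the cache name, data layout, and parameter values below are hypothetical.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DoubleArrayVectorizer;
import org.apache.ignite.ml.math.primitives.vector.VectorUtils;
import org.apache.ignite.ml.tree.DecisionTreeClassificationTrainer;
import org.apache.ignite.ml.tree.DecisionTreeNode;

public class DecisionTreeSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache of rows [label, f1, f2], assumed to be pre-filled.
            IgniteCache<Integer, double[]> trainingSet = ignite.cache("DT_TRAINING_SET");

            // The classification trainer uses the Gini impurity calculator internally
            // (see DecisionTreeClassificationTrainer.getImpurityMeasureCalculator above).
            DecisionTreeClassificationTrainer trainer =
                new DecisionTreeClassificationTrainer(4, 0); // max depth 4, min impurity decrease 0

            DecisionTreeNode mdl = trainer.fit(
                ignite,
                trainingSet,
                new DoubleArrayVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.FIRST));

            Double prediction = mdl.predict(VectorUtils.of(0.5, 1.5));
        }
    }
}
```
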
Modifier and Type | Method and Description |
---|---|
DecisionTreeLeafNode | MeanDecisionTreeLeafBuilder.createLeafNode(Dataset<EmptyContext,DecisionTreeData> dataset, TreeFilter pred): Creates a new leaf node for the given dataset and node predicate. |
DecisionTreeLeafNode | MostCommonDecisionTreeLeafBuilder.createLeafNode(Dataset<EmptyContext,DecisionTreeData> dataset, TreeFilter pred): Creates a new leaf node for the given dataset and node predicate. |
DecisionTreeLeafNode | DecisionTreeLeafBuilder.createLeafNode(Dataset<EmptyContext,DecisionTreeData> dataset, TreeFilter pred): Creates a new leaf node for the given dataset and node predicate. |

Modifier and Type | Method and Description |
---|---|
protected boolean | RandomForestClassifierTrainer.init(Dataset<EmptyContext,BootstrappedDatasetPartition> dataset): Aggregates all unique labels from the dataset and assigns an integer id to each label. |
protected boolean | RandomForestTrainer.init(Dataset<EmptyContext,BootstrappedDatasetPartition> dataset): Init step before learning. |

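As a sketch of how these trainers are driven, a RandomForestClassifierTrainer is constructed from feature metadata and then fit over a cache, following the style of the Ignite random forest example. The feature names, cache, and hyperparameter values here are hypothetical.

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.ml.composition.ModelsComposition;
import org.apache.ignite.ml.dataset.feature.FeatureMeta;
import org.apache.ignite.ml.dataset.feature.extractor.Vectorizer;
import org.apache.ignite.ml.dataset.feature.extractor.impl.DoubleArrayVectorizer;
import org.apache.ignite.ml.tree.randomforest.RandomForestClassifierTrainer;

public class RandomForestSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache of rows [label, f0, ..., f3], assumed to be pre-filled.
            IgniteCache<Integer, double[]> trainingSet = ignite.cache("RF_TRAINING_SET");

            // One FeatureMeta per (non-categorical) feature column.
            List<FeatureMeta> meta = IntStream.range(0, 4)
                .mapToObj(i -> new FeatureMeta("f" + i, i, false))
                .collect(Collectors.toList());

            RandomForestClassifierTrainer trainer = new RandomForestClassifierTrainer(meta)
                .withAmountOfTrees(100)
                .withMaxDepth(4)
                .withSeed(0);

            ModelsComposition mdl = trainer.fit(
                ignite,
                trainingSet,
                new DoubleArrayVectorizer<Integer>().labeled(Vectorizer.LabelCoordinate.FIRST));
        }
    }
}
```
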
Modifier and Type | Method and Description |
---|---|
Map<NodeId,ImpurityHistogramsComputer.NodeImpurityHistograms<S>> | ImpurityHistogramsComputer.aggregateImpurityStatistics(ArrayList<TreeRoot> roots, Map<Integer,BucketMeta> histMeta, Map<NodeId,TreeNode> nodesToLearn, Dataset<EmptyContext,BootstrappedDatasetPartition> dataset): Computes histograms for each feature. |

Modifier and Type | Method and Description |
---|---|
List<NormalDistributionStatistics> | NormalDistributionStatisticsComputer.computeStatistics(List<FeatureMeta> meta, Dataset<EmptyContext,BootstrappedDatasetPartition> dataset): Computes normal distribution statistics for the features in the dataset. |
void | LeafValuesComputer.setValuesForLeaves(ArrayList<TreeRoot> roots, Dataset<EmptyContext,BootstrappedDatasetPartition> dataset): Takes a list of all built trees and, in one map-reduce step, collects statistics for evaluating leaf values for each tree and sets the values for the leaves. |