
[ML] adds new change_point pipeline aggregation (#83428)

Adds a new `change_point` sibling pipeline aggregation.

This aggregation detects a change point in the values produced by a multi-bucket aggregation.

Example:
```
POST kibana_sample_data_flights/_search
{
  "size": 0,
  "aggs": {
    "histo": {
      "date_histogram": {
        "field": "timestamp",
        "fixed_interval": "3h"
      },
      "aggs": {
        "ticket_price": {
          "max": {
            "field": "AvgTicketPrice"
          }
        }
      }
    },
    "changes": {
      "change_point": {
        "buckets_path": "histo>ticket_price"
      }
    }
  }
}
```

Response
```
{
  /*<snip>*/ 
  "aggregations" : {
    "histo" : {
      "buckets" : [ /*<snip>*/ ]
    },
    "changes" : {
      "bucket" : {
        "key" : "2022-01-28T23:00:00.000Z",
        "doc_count" : 48,
        "ticket_price" : {
          "value" : 1187.61083984375
        }
      },
      "type" : {
        "distribution_change" : {
          "p_value" : 0.023753965139433175,
          "change_point" : 40
        }
      }
    }
  }
}
```
Benjamin Trent, 3 years ago
parent
revision
cf151b53fe
19 changed files with 2,077 additions and 8 deletions
  1. docs/changelog/83428.yaml (+5 −0)
  2. docs/reference/aggregations/pipeline.asciidoc (+2 −0)
  3. docs/reference/aggregations/pipeline/change-point-aggregation.asciidoc (+99 −0)
  4. x-pack/plugin/ml/qa/ml-with-security/build.gradle (+4 −0)
  5. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/MachineLearning.java (+5 −1)
  6. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/MlAggsHelper.java (+64 −7)
  7. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregationBuilder.java (+90 −0)
  8. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregator.java (+392 −0)
  9. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointBucket.java (+87 −0)
  10. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointNamedContentProvider.java (+27 −0)
  11. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangeType.java (+342 −0)
  12. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/InternalChangePointAggregation.java (+89 −0)
  13. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/KDE.java (+156 −0)
  14. x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/LeastSquaresOnlineRegression.java (+152 −0)
  15. x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregationBuilderTests.java (+28 −0)
  16. x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregatorTests.java (+203 −0)
  17. x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/InternalChangePointAggregationTests.java (+52 −0)
  18. x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/LeastSquaresOnlineRegressionTests.java (+67 −0)
  19. x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/ml/change_point_agg.yml (+213 −0)

+ 5 - 0
docs/changelog/83428.yaml

@@ -0,0 +1,5 @@
+pr: 83428
+summary: Adds new `change_point` pipeline aggregation
+area: Machine Learning
+type: feature
+issues: []

+ 2 - 0
docs/reference/aggregations/pipeline.asciidoc

@@ -283,6 +283,8 @@ include::pipeline/bucket-selector-aggregation.asciidoc[]
 
 include::pipeline/bucket-sort-aggregation.asciidoc[]
 
+include::pipeline/change-point-aggregation.asciidoc[]
+
 include::pipeline/cumulative-cardinality-aggregation.asciidoc[]
 
 include::pipeline/cumulative-sum-aggregation.asciidoc[]

+ 99 - 0
docs/reference/aggregations/pipeline/change-point-aggregation.asciidoc

@@ -0,0 +1,99 @@
+[role="xpack"]
+[[search-aggregations-change-point-aggregation]]
+=== Change point aggregation
+++++
+<titleabbrev>Change point</titleabbrev>
+++++
+
+experimental::[]
+
+A sibling pipeline aggregation that detects dips, spikes, and change points in a metric. Given a distribution of values
+provided by the sibling multi-bucket aggregation, this aggregation indicates the bucket of any spike or dip
+and/or the bucket at which the largest change in the distribution of values occurs, if they are statistically significant.
+
+
+
+[[change-point-agg-syntax]]
+==== Parameters
+
+`buckets_path`::
+(Required, string)
+Path to the buckets that contain one set of values in which to detect a change point. There must be at least 22 bucketed
+values. Using fewer than 1,000 values is preferred.
+For syntax, see <<buckets-path-syntax>>.
+
+==== Syntax
+
+A `change_point` aggregation looks like this in isolation:
+
+[source,js]
+--------------------------------------------------
+{
+  "change_point": {
+    "buckets_path": "date_histogram>_count" <1>
+  }
+}
+--------------------------------------------------
+// NOTCONSOLE
+<1> The buckets containing the values to test against.
+
+[[change-point-agg-response]]
+==== Response body
+
+`bucket`::
+(Optional, object)
+Values of the bucket at the discovered change point. Not returned if no change point is found.
+All the aggregations in the bucket are returned as well.
++
+.Properties of bucket
+[%collapsible%open]
+====
+`key`:::
+(value)
+The key of the bucket matched. Could be string or numeric.
+
+`doc_count`:::
+(number)
+The document count of the bucket.
+====
+
+`type`::
+(object)
+The found change point type and its related values. Possible types:
++
+--
+* `dip`: a significant dip occurs at this change point
+* `distribution_change`: the overall distribution of the values has changed significantly
+* `non_stationary`: there is no change point, but the values are not from a stationary distribution
+* `spike`: a significant spike occurs at this point
+* `stationary`: no change point found
+* `step_change`: the change indicates a statistically significant step up or down in value distribution
+* `trend_change`: there is an overall trend change occurring at this point
+--
+
+==== Response example
+[source,js]
+--------------------------------------------------
+    "changes" : {
+      "bucket" : {
+        "key" : "2022-01-28T23:00:00.000Z", <1>
+        "doc_count" : 48, <2>
+        "ticket_price" : { <3>
+          "value" : 1187.61083984375
+        }
+      },
+      "type" : { <4>
+        "distribution_change" : {
+          "p_value" : 0.023753965139433175, <5>
+          "change_point" : 40 <6>
+        }
+      }
+    }
+--------------------------------------------------
+// NOTCONSOLE
+<1> The bucket key that is the change point.
+<2> The number of documents in that bucket.
+<3> Aggregated values in the bucket.
+<4> Type of change found.
+<5> The `p_value` indicates how extreme the change is; lower values indicate greater change.
+<6> The specific bucket where the change occurs (indexing starts at `0`).

+ 4 - 0
x-pack/plugin/ml/qa/ml-with-security/build.gradle

@@ -39,6 +39,10 @@ tasks.named("yamlRestTest").configure {
     'ml/categorization_agg/Test categorization agg simple',
     'ml/categorization_agg/Test categorization aggregation against unsupported field',
     'ml/categorization_agg/Test categorization aggregation with poor settings',
+    'ml/change_point_agg/Test change_point agg simple',
+    'ml/change_point_agg/Test change_point with missing buckets_path',
+    'ml/change_point_agg/Test change_point with bad buckets_path',
+    'ml/change_point_agg/Test change_point with too few buckets',
     'ml/custom_all_field/Test querying custom all field',
     'ml/datafeeds_crud/Test delete datafeed with missing id',
     'ml/datafeeds_crud/Test put datafeed referring to missing job_id',

+ 5 - 1
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/MachineLearning.java

@@ -273,6 +273,8 @@ import org.elasticsearch.xpack.ml.action.TransportValidateDetectorAction;
 import org.elasticsearch.xpack.ml.action.TransportValidateJobConfigAction;
 import org.elasticsearch.xpack.ml.aggs.categorization.CategorizeTextAggregationBuilder;
 import org.elasticsearch.xpack.ml.aggs.categorization.InternalCategorizationAggregation;
+import org.elasticsearch.xpack.ml.aggs.changepoint.ChangePointAggregationBuilder;
+import org.elasticsearch.xpack.ml.aggs.changepoint.ChangePointNamedContentProvider;
 import org.elasticsearch.xpack.ml.aggs.correlation.BucketCorrelationAggregationBuilder;
 import org.elasticsearch.xpack.ml.aggs.correlation.CorrelationNamedContentProvider;
 import org.elasticsearch.xpack.ml.aggs.heuristic.PValueScore;
@@ -1389,7 +1391,8 @@ public class MachineLearning extends Plugin
         return Arrays.asList(
             InferencePipelineAggregationBuilder.buildSpec(modelLoadingService, getLicenseState(), settings),
             BucketCorrelationAggregationBuilder.buildSpec(),
-            BucketCountKSTestAggregationBuilder.buildSpec()
+            BucketCountKSTestAggregationBuilder.buildSpec(),
+            ChangePointAggregationBuilder.buildSpec()
         );
     }
 
@@ -1514,6 +1517,7 @@ public class MachineLearning extends Plugin
         namedWriteables.addAll(new MlInferenceNamedXContentProvider().getNamedWriteables());
         namedWriteables.addAll(MlAutoscalingNamedWritableProvider.getNamedWriteables());
         namedWriteables.addAll(new CorrelationNamedContentProvider().getNamedWriteables());
+        namedWriteables.addAll(new ChangePointNamedContentProvider().getNamedWriteables());
         return namedWriteables;
     }
 

+ 64 - 7
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/MlAggsHelper.java

@@ -36,6 +36,25 @@ public final class MlAggsHelper {
      * @return The double values and doc_counts extracted from the path if the bucket path exists and the value is a valid number
      */
     public static Optional<DoubleBucketValues> extractDoubleBucketedValues(String bucketPath, Aggregations aggregations) {
+        return extractDoubleBucketedValues(bucketPath, aggregations, BucketHelpers.GapPolicy.INSERT_ZEROS, false);
+    }
+
+    /**
+     * This extracts the bucket values as doubles.
+     *
+     * If the gap policy is skippable, the true bucket index (as stored in the aggregation) is returned as well.
+     * @param bucketPath The bucket path from which to extract values
+     * @param aggregations The aggregations
+     * @param gapPolicy the desired gap policy
+     * @param excludeLastBucket should the last bucket be excluded? This is useful when excluding potentially partial buckets
+     * @return The double values, doc_counts, and bucket index positions extracted from the path if the bucket path exists
+     */
+    public static Optional<DoubleBucketValues> extractDoubleBucketedValues(
+        String bucketPath,
+        Aggregations aggregations,
+        BucketHelpers.GapPolicy gapPolicy,
+        boolean excludeLastBucket
+    ) {
         List<String> parsedPath = AggregationPath.parse(bucketPath).getPathElementsAsStringList();
         for (Aggregation aggregation : aggregations) {
             if (aggregation.getName().equals(parsedPath.get(0))) {
@@ -44,14 +63,19 @@ public final class MlAggsHelper {
                 List<? extends InternalMultiBucketAggregation.InternalBucket> buckets = multiBucketsAgg.getBuckets();
                 List<Double> values = new ArrayList<>(buckets.size());
                 List<Long> docCounts = new ArrayList<>(buckets.size());
+                List<Integer> bucketIndexes = new ArrayList<>(buckets.size());
+                int bucketCount = 0;
+                int totalBuckets = buckets.size();
                 for (InternalMultiBucketAggregation.InternalBucket bucket : buckets) {
-                    Double bucketValue = BucketHelpers.resolveBucketValue(
-                        multiBucketsAgg,
-                        bucket,
-                        sublistedPath,
-                        BucketHelpers.GapPolicy.INSERT_ZEROS
-                    );
+                    Double bucketValue = BucketHelpers.resolveBucketValue(multiBucketsAgg, bucket, sublistedPath, gapPolicy);
+                    if (excludeLastBucket && bucketCount >= totalBuckets - 1) {
+                        continue;
+                    }
                     if (bucketValue == null || Double.isNaN(bucketValue)) {
+                        if (gapPolicy.isSkippable) {
+                            bucketCount++;
+                            continue;
+                        }
                         throw new AggregationExecutionException(
                             "missing or invalid bucket value found for path ["
                                 + bucketPath
@@ -60,13 +84,15 @@ public final class MlAggsHelper {
                                 + "]"
                         );
                     }
+                    bucketIndexes.add(bucketCount++);
                     values.add(bucketValue);
                     docCounts.add(bucket.getDocCount());
                 }
                 return Optional.of(
                     new DoubleBucketValues(
                         docCounts.stream().mapToLong(Long::longValue).toArray(),
-                        values.stream().mapToDouble(Double::doubleValue).toArray()
+                        values.stream().mapToDouble(Double::doubleValue).toArray(),
+                        bucketCount == bucketIndexes.size() ? new int[0] : bucketIndexes.stream().mapToInt(Integer::intValue).toArray()
                     )
                 );
             }
@@ -74,16 +100,40 @@ public final class MlAggsHelper {
         return Optional.empty();
     }
 
+    public static Optional<InternalMultiBucketAggregation.InternalBucket> extractBucket(
+        String bucketPath,
+        Aggregations aggregations,
+        int bucket
+    ) {
+        List<String> parsedPath = AggregationPath.parse(bucketPath).getPathElementsAsStringList();
+        for (Aggregation aggregation : aggregations) {
+            if (aggregation.getName().equals(parsedPath.get(0))) {
+                InternalMultiBucketAggregation<?, ?> multiBucketsAgg = (InternalMultiBucketAggregation<?, ?>) aggregation;
+                List<? extends InternalMultiBucketAggregation.InternalBucket> buckets = multiBucketsAgg.getBuckets();
+                if (bucket < buckets.size() && bucket >= 0) {
+                    return Optional.of(buckets.get(bucket));
+                }
+            }
+        }
+        return Optional.empty();
+    }
+
     /**
      * Utility class for holding an unboxed double value and the document count for a bucket
      */
     public static class DoubleBucketValues {
         private final long[] docCounts;
         private final double[] values;
+        private final int[] buckets;
 
         public DoubleBucketValues(long[] docCounts, double[] values) {
+            this(docCounts, values, new int[0]);
+        }
+
+        public DoubleBucketValues(long[] docCounts, double[] values, int[] buckets) {
             this.docCounts = docCounts;
             this.values = values;
+            this.buckets = buckets;
         }
 
         public long[] getDocCounts() {
@@ -93,6 +143,13 @@ public final class MlAggsHelper {
         public double[] getValues() {
             return values;
         }
+
+        public int getBucketIndex(int bucketPos) {
+            if (buckets.length == 0) {
+                return bucketPos;
+            }
+            return buckets[bucketPos];
+        }
     }
 
 }

+ 90 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregationBuilder.java

@@ -0,0 +1,90 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.Version;
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.plugins.SearchPlugin;
+import org.elasticsearch.search.aggregations.pipeline.BucketHelpers;
+import org.elasticsearch.search.aggregations.pipeline.BucketMetricsPipelineAggregationBuilder;
+import org.elasticsearch.search.aggregations.pipeline.PipelineAggregator;
+import org.elasticsearch.xcontent.ConstructingObjectParser;
+import org.elasticsearch.xcontent.ObjectParser;
+import org.elasticsearch.xcontent.ParseField;
+import org.elasticsearch.xcontent.XContentBuilder;
+
+import java.io.IOException;
+import java.util.Locale;
+import java.util.Map;
+
+import static org.elasticsearch.search.aggregations.pipeline.PipelineAggregator.Parser.GAP_POLICY;
+
+public class ChangePointAggregationBuilder extends BucketMetricsPipelineAggregationBuilder<ChangePointAggregationBuilder> {
+
+    public static final ParseField NAME = new ParseField("change_point");
+    @SuppressWarnings("unchecked")
+    public static final ConstructingObjectParser<ChangePointAggregationBuilder, String> PARSER = new ConstructingObjectParser<>(
+        NAME.getPreferredName(),
+        false,
+        (args, context) -> new ChangePointAggregationBuilder(context, (String) args[0])
+    );
+
+    static {
+        PARSER.declareString(ConstructingObjectParser.constructorArg(), BUCKETS_PATH_FIELD);
+        PARSER.declareField(
+            ConstructingObjectParser.optionalConstructorArg(),
+            p -> BucketHelpers.GapPolicy.parse(p.text().toLowerCase(Locale.ROOT), p.getTokenLocation()),
+            GAP_POLICY,
+            ObjectParser.ValueType.STRING
+        );
+    }
+
+    public ChangePointAggregationBuilder(String name, String bucketsPath) {
+        super(name, NAME.getPreferredName(), new String[] { bucketsPath });
+    }
+
+    public ChangePointAggregationBuilder(StreamInput in) throws IOException {
+        super(in, NAME.getPreferredName());
+    }
+
+    public static SearchPlugin.PipelineAggregationSpec buildSpec() {
+        return new SearchPlugin.PipelineAggregationSpec(NAME, ChangePointAggregationBuilder::new, ChangePointAggregationBuilder.PARSER)
+            .addResultReader(InternalChangePointAggregation::new);
+    }
+
+    @Override
+    public String getWriteableName() {
+        return NAME.getPreferredName();
+    }
+
+    @Override
+    public Version getMinimalSupportedVersion() {
+        return Version.V_8_2_0;
+    }
+
+    @Override
+    protected void innerWriteTo(StreamOutput out) throws IOException {}
+
+    @Override
+    protected PipelineAggregator createInternal(Map<String, Object> metadata) {
+        return new ChangePointAggregator(name, bucketsPaths[0], metadata);
+    }
+
+    @Override
+    protected boolean overrideBucketsPath() {
+        return true;
+    }
+
+    @Override
+    protected XContentBuilder doXContentBody(XContentBuilder builder, Params params) throws IOException {
+        builder.field(BUCKETS_PATH_FIELD.getPreferredName(), bucketsPaths[0]);
+        return builder;
+    }
+
+}

+ 392 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregator.java

@@ -0,0 +1,392 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.apache.commons.math3.special.Beta;
+import org.apache.commons.math3.stat.inference.KolmogorovSmirnovTest;
+import org.apache.commons.math3.stat.regression.SimpleRegression;
+import org.elasticsearch.core.Tuple;
+import org.elasticsearch.search.aggregations.AggregationExecutionException;
+import org.elasticsearch.search.aggregations.AggregationReduceContext;
+import org.elasticsearch.search.aggregations.Aggregations;
+import org.elasticsearch.search.aggregations.InternalAggregation;
+import org.elasticsearch.search.aggregations.InternalAggregations;
+import org.elasticsearch.search.aggregations.pipeline.BucketHelpers;
+import org.elasticsearch.search.aggregations.pipeline.SiblingPipelineAggregator;
+import org.elasticsearch.xpack.ml.aggs.MlAggsHelper;
+
+import java.util.Arrays;
+import java.util.HashSet;
+import java.util.Map;
+import java.util.Set;
+import java.util.function.IntToDoubleFunction;
+import java.util.stream.IntStream;
+
+import static org.elasticsearch.xpack.ml.aggs.MlAggsHelper.extractBucket;
+import static org.elasticsearch.xpack.ml.aggs.MlAggsHelper.extractDoubleBucketedValues;
+
+public class ChangePointAggregator extends SiblingPipelineAggregator {
+
+    static final double P_VALUE_THRESHOLD = 0.025;
+    private static final int MINIMUM_BUCKETS = 10;
+    private static final int MAXIMUM_CANDIDATE_CHANGE_POINTS = 1000;
+    private static final KolmogorovSmirnovTest KOLMOGOROV_SMIRNOV_TEST = new KolmogorovSmirnovTest();
+
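+    // Selects the candidate change point indices: at least MINIMUM_BUCKETS (or 10% of the series)
+    // are kept on each side, and the candidates are subsampled with a step so that at most
+    // MAXIMUM_CANDIDATE_CHANGE_POINTS are tested. The step is returned alongside the indices.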
+    static Tuple<int[], Integer> candidateChangePoints(double[] values) {
+        int minValues = Math.max((int) (0.1 * values.length + 0.5), MINIMUM_BUCKETS);
+        if (values.length - 2 * minValues <= MAXIMUM_CANDIDATE_CHANGE_POINTS) {
+            return Tuple.tuple(IntStream.range(minValues, values.length - minValues).toArray(), 1);
+        } else {
+            int step = (int) Math.ceil((double) (values.length - 2 * minValues) / MAXIMUM_CANDIDATE_CHANGE_POINTS);
+            return Tuple.tuple(IntStream.range(minValues, values.length - minValues).filter(i -> i % step == 0).toArray(), step);
+        }
+    }
+
+    public ChangePointAggregator(String name, String bucketsPath, Map<String, Object> metadata) {
+        super(name, new String[] { bucketsPath }, metadata);
+    }
+
+    @Override
+    public InternalAggregation doReduce(Aggregations aggregations, AggregationReduceContext context) {
+        MlAggsHelper.DoubleBucketValues maybeBucketsValue = extractDoubleBucketedValues(
+            bucketsPaths()[0],
+            aggregations,
+            BucketHelpers.GapPolicy.SKIP,
+            true
+        ).orElseThrow(
+            () -> new AggregationExecutionException(
+                "unable to find valid bucket values in bucket path [" + bucketsPaths()[0] + "] for agg [" + name() + "]"
+            )
+        );
+        if (maybeBucketsValue.getValues().length < (2 * MINIMUM_BUCKETS) + 2) {
+            throw new AggregationExecutionException(
+                "not enough buckets to calculate change_point. Requires at least [" + ((2 * MINIMUM_BUCKETS) + 2) + "]"
+            );
+        }
+        Tuple<int[], Integer> candidatePoints = candidateChangePoints(maybeBucketsValue.getValues());
+        ChangeType changeType = changePValue(maybeBucketsValue, candidatePoints, P_VALUE_THRESHOLD);
+        if (changeType.pValue() > P_VALUE_THRESHOLD) {
+            changeType = maxDeviationKdePValue(maybeBucketsValue, P_VALUE_THRESHOLD);
+        }
+        ChangePointBucket changePointBucket = null;
+        if (changeType.changePoint() >= 0) {
+            changePointBucket = extractBucket(bucketsPaths()[0], aggregations, changeType.changePoint()).map(
+                b -> new ChangePointBucket(b.getKey(), b.getDocCount(), (InternalAggregations) b.getAggregations())
+            ).orElse(null);
+        }
+
+        return new InternalChangePointAggregation(name(), metadata(), changePointBucket, changeType);
+    }
+
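+    // Looks for an isolated spike or dip: fits a kernel density estimate to the series (excluding
+    // the neighborhoods of the minimum and maximum) and checks whether either extreme value is
+    // significantly unlikely under that density.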
+    static ChangeType maxDeviationKdePValue(MlAggsHelper.DoubleBucketValues bucketValues, double pValueThreshold) {
+        double[] timeWindow = bucketValues.getValues();
+        double variance = RunningStats.from(timeWindow, i -> 1.0).variance();
+        if (variance == 0.0) {
+            return new ChangeType.Stationary();
+        }
+        int minIndex = 0;
+        double minValue = Double.MAX_VALUE;
+        int maxIndex = 0;
+        double maxValue = -Double.MAX_VALUE;
+        for (int i = 0; i < timeWindow.length; i++) {
+            if (timeWindow[i] < minValue) {
+                minValue = timeWindow[i];
+                minIndex = i;
+            }
+            if (timeWindow[i] > maxValue) {
+                maxValue = timeWindow[i];
+                maxIndex = i;
+            } else if (timeWindow[i] == maxValue) {
+                maxIndex = i;
+            }
+        }
+        KDE dist = new KDE(timeWindow, minIndex, maxIndex);
+        KDE.ValueAndMagnitude cdf = dist.cdf(minValue);
+        KDE.ValueAndMagnitude sf = dist.sf(maxValue);
+
+        if (cdf.isMoreSignificant(sf, timeWindow.length) && cdf.significance(timeWindow.length) * 2 < pValueThreshold) {
+            return new ChangeType.Dip(cdf.significance(timeWindow.length) * 2, bucketValues.getBucketIndex(minIndex));
+        }
+        if (sf.significance(timeWindow.length) * 2 < pValueThreshold) {
+            return new ChangeType.Spike(sf.significance(timeWindow.length) * 2, bucketValues.getBucketIndex(maxIndex));
+        }
+        return new ChangeType.Stationary();
+
+    }
+
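+    // Tests a sequence of alternative hypotheses against the current null model using F-tests
+    // (non-stationary trend, step change, trend change), updating the null whenever an alternative
+    // is accepted, and finally checks for a distribution change with a Kolmogorov-Smirnov test.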
+    static ChangeType changePValue(
+        MlAggsHelper.DoubleBucketValues bucketValues,
+        Tuple<int[], Integer> candidateChangePointsAndStep,
+        double pValueThreshold
+    ) {
+        double[] timeWindow = bucketValues.getValues();
+        double totalUnweightedVariance = RunningStats.from(timeWindow, i -> 1.0).variance();
+        ChangeType changeType = new ChangeType.Stationary();
+        if (totalUnweightedVariance == 0.0) {
+            return changeType;
+        }
+        double[] timeWindowWeights = outlierWeights(timeWindow);
+        int[] candidateChangePoints = candidateChangePointsAndStep.v1();
+        int step = candidateChangePointsAndStep.v2();
+        double totalVariance = RunningStats.from(timeWindow, i -> timeWindowWeights[i]).variance();
+        double vNull = totalVariance;
+        if (totalVariance == 0.0) {
+            return changeType;
+        }
+        double n = timeWindow.length;
+        double dfNull = n - 1;
+        LeastSquaresOnlineRegression allLeastSquares = new LeastSquaresOnlineRegression(2);
+        for (int i = 0; i < timeWindow.length; i++) {
+            allLeastSquares.add(i, timeWindow[i], timeWindowWeights[i]);
+        }
+        double rValue = allLeastSquares.rSquared();
+
+        double vAlt = totalVariance * (1 - Math.abs(rValue));
+        double dfAlt = n - 3;
+        double pValueVsNull = fTestPValue(vNull, dfNull, vAlt, dfAlt);
+        if (pValueVsNull < pValueThreshold && Math.abs(rValue) >= 0.5) {
+            double pValueVsStationary = fTestPValue(totalVariance, n - 1, vAlt, dfAlt);
+            SimpleRegression regression = new SimpleRegression();
+            for (int i = 0; i < timeWindow.length; i++) {
+                regression.addData(i, timeWindow[i]);
+            }
+            double slope = regression.getSlope();
+            changeType = new ChangeType.NonStationary(pValueVsStationary, rValue, slope < 0 ? "decreasing" : "increasing");
+            vNull = vAlt;
+            dfNull = dfAlt;
+        }
+        RunningStats lowerRange = new RunningStats();
+        RunningStats upperRange = new RunningStats();
+        // Initialize running stats so that they are only missing the individual changepoint values
+        upperRange.addValues(timeWindow, i -> timeWindowWeights[i], candidateChangePoints[0], timeWindow.length);
+        lowerRange.addValues(timeWindow, i -> timeWindowWeights[i], 0, candidateChangePoints[0]);
+        vAlt = Double.MAX_VALUE;
+        Set<Integer> discoveredChangePoints = new HashSet<>(3, 1.0f);
+        int changePoint = candidateChangePoints[candidateChangePoints.length - 1] + 1;
+        for (int cp : candidateChangePoints) {
+            double maybeVAlt = (cp * lowerRange.variance() + (n - cp) * upperRange.variance()) / n;
+            if (maybeVAlt < vAlt) {
+                vAlt = maybeVAlt;
+                changePoint = cp;
+            }
+            lowerRange.addValues(timeWindow, i -> timeWindowWeights[i], cp, cp + step);
+            upperRange.removeValues(timeWindow, i -> timeWindowWeights[i], cp, cp + step);
+        }
+        discoveredChangePoints.add(changePoint);
+        dfAlt = n - 2;
+
+        pValueVsNull = independentTrialsPValue(fTestPValue(vNull, dfNull, vAlt, dfAlt), candidateChangePoints.length);
+        if (pValueVsNull < pValueThreshold) {
+            changeType = new ChangeType.StepChange(pValueVsNull, bucketValues.getBucketIndex(changePoint));
+            vNull = vAlt;
+            dfNull = dfAlt;
+        }
+
+        VarianceAndRValue vAndR = new VarianceAndRValue(Double.MAX_VALUE, Double.MAX_VALUE);
+        changePoint = candidateChangePoints[candidateChangePoints.length - 1] + 1;
+        lowerRange = new RunningStats();
+        upperRange = new RunningStats();
+        // Initialize running stats so that they are only missing the individual changepoint values
+        upperRange.addValues(timeWindow, i -> timeWindowWeights[i], candidateChangePoints[0], timeWindow.length);
+        lowerRange.addValues(timeWindow, i -> timeWindowWeights[i], 0, candidateChangePoints[0]);
+        LeastSquaresOnlineRegression lowerLeastSquares = new LeastSquaresOnlineRegression(2);
+        LeastSquaresOnlineRegression upperLeastSquares = new LeastSquaresOnlineRegression(2);
+        for (int i = 0; i < candidateChangePoints[0]; i++) {
+            lowerLeastSquares.add(i, timeWindow[i], timeWindowWeights[i]);
+        }
+        for (int i = candidateChangePoints[0], x = 0; i < timeWindow.length; i++, x++) {
+            upperLeastSquares.add(x, timeWindow[i], timeWindowWeights[i]);
+        }
+        int upperMovingWindow = 0;
+        for (int cp : candidateChangePoints) {
+            double lowerRangeVar = lowerRange.variance();
+            double upperRangeVar = upperRange.variance();
+            double rv1 = lowerLeastSquares.rSquared();
+            double rv2 = upperLeastSquares.rSquared();
+            double v1 = lowerRangeVar * (1 - Math.abs(rv1));
+            double v2 = upperRangeVar * (1 - Math.abs(rv2));
+            VarianceAndRValue varianceAndRValue = new VarianceAndRValue((cp * v1 + (n - cp) * v2) / n, (cp * rv1 + (n - cp) * rv2) / n);
+            if (varianceAndRValue.compareTo(vAndR) < 0) {
+                vAndR = varianceAndRValue;
+                changePoint = cp;
+            }
+            for (int i = 0; i < step; i++) {
+                lowerRange.addValue(timeWindow[i + cp], timeWindowWeights[i + cp]);
+                upperRange.removeValue(timeWindow[i + cp], timeWindowWeights[i + cp]);
+                lowerLeastSquares.add(i + cp, timeWindow[i + cp], timeWindowWeights[i + cp]);
+                upperLeastSquares.remove(i + upperMovingWindow, timeWindow[i + cp], timeWindowWeights[i + cp]);
+                upperMovingWindow++;
+            }
+        }
+        discoveredChangePoints.add(changePoint);
+
+        dfAlt = n - 6;
+        pValueVsNull = independentTrialsPValue(fTestPValue(vNull, dfNull, vAndR.variance, dfAlt), candidateChangePoints.length);
+        if (pValueVsNull < pValueThreshold && Math.abs(vAndR.rValue) >= 0.5) {
+            double pValueVsStationary = independentTrialsPValue(
+                fTestPValue(totalVariance, n - 1, vAndR.variance, dfAlt),
+                candidateChangePoints.length
+            );
+            changeType = new ChangeType.TrendChange(pValueVsStationary, vAndR.rValue, bucketValues.getBucketIndex(changePoint));
+        }
+
+        if (changeType.pValue() > 1e-5) {
+            double diff = 0.0;
+            changePoint = -1;
+            lowerRange = new RunningStats();
+            upperRange = new RunningStats();
+            // Initialize running stats so that they are only missing the individual changepoint values
+            upperRange.addValues(timeWindow, i -> timeWindowWeights[i], candidateChangePoints[0], timeWindow.length);
+            lowerRange.addValues(timeWindow, i -> timeWindowWeights[i], 0, candidateChangePoints[0]);
+            for (int cp : candidateChangePoints) {
+                double otherDiff = Math.min(cp, timeWindow.length - cp) * (0.9 * Math.abs(lowerRange.mean() - upperRange.mean())) + 0.1
+                    * Math.abs(lowerRange.std() - upperRange.std());
+                if (otherDiff >= diff) {
+                    changePoint = cp;
+                    diff = otherDiff;
+                }
+                lowerRange.addValues(timeWindow, i -> timeWindowWeights[i], cp, cp + step);
+                upperRange.removeValues(timeWindow, i -> timeWindowWeights[i], cp, cp + step);
+            }
+            discoveredChangePoints.add(changePoint);
+            double pValue = 1;
+            for (int i : discoveredChangePoints) {
+                double ksTestPValue = KOLMOGOROV_SMIRNOV_TEST.kolmogorovSmirnovTest(
+                    Arrays.copyOfRange(timeWindow, 0, i),
+                    Arrays.copyOfRange(timeWindow, i, timeWindow.length)
+                );
+                if (ksTestPValue < pValue) {
+                    changePoint = i;
+                    pValue = ksTestPValue;
+                }
+            }
+            pValue = independentTrialsPValue(pValue, candidateChangePoints.length);
+            if (pValue < Math.min(pValueThreshold, 0.1 * changeType.pValue())) {
+                changeType = new ChangeType.DistributionChange(pValue, bucketValues.getBucketIndex(changePoint));
+            }
+        }
+        return changeType;
+    }
+
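+    // Down-weights the most extreme ~2.5% of values at each tail (weight 0.01 instead of 1.0) so
+    // isolated outliers do not dominate the variance and regression estimates.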
+    static double[] outlierWeights(double[] values) {
+        int i = (int) Math.ceil(0.025 * values.length);
+        double[] weights = Arrays.copyOf(values, values.length);
+        Arrays.sort(weights);
+        double a = weights[i];
+        double b = weights[values.length - i];
+        for (int j = 0; j < values.length; j++) {
+            if (values[j] < b && values[j] >= a) {
+                weights[j] = 1.0;
+            } else {
+                weights[j] = 0.01;
+            }
+        }
+        return weights;
+    }
+
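+    // Šidák-style correction: the probability of seeing a p-value this small at least once across
+    // nTrials independent candidate change points. For very small p-values the Bonferroni-like
+    // product nTrials * pValue is used to avoid numerical cancellation.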
+    static double independentTrialsPValue(double pValue, int nTrials) {
+        return pValue > 1e-10 ? 1.0 - Math.pow(1.0 - pValue, nTrials) : nTrials * pValue;
+    }
+
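+    // Two-sided p-value of an F-test comparing the residual variance of the alternative hypothesis
+    // against that of the null hypothesis, given their respective degrees of freedom.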
+    static double fTestPValue(double vNull, double dfNull, double varianceAlt, double dfAlt) {
+        if (varianceAlt == vNull) {
+            return 1.0;
+        }
+        if (varianceAlt == 0.0) {
+            return 0.0;
+        }
+        double F = dfAlt / dfNull * vNull / varianceAlt;
+        double sf = fDistribSf(dfNull, dfAlt, F);
+        return Math.min(2 * sf, 1.0);
+    }
+
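+    // Weighted running statistics (mean, variance, standard deviation) that support adding and
+    // removing values incrementally as the candidate change point advances through the series.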
+    static class RunningStats {
+        double sumOfSqrs;
+        double sum;
+        double count;
+
+        static RunningStats from(double[] values, IntToDoubleFunction weightFunction) {
+            return new RunningStats().addValues(values, weightFunction, 0, values.length);
+        }
+
+        RunningStats() {}
+
+        double variance() {
+            return Math.max((sumOfSqrs - ((sum * sum) / count)) / count, 0.0);
+        }
+
+        double mean() {
+            return sum / count;
+        }
+
+        double std() {
+            return Math.sqrt(variance());
+        }
+
+        RunningStats addValues(double[] value, IntToDoubleFunction weightFunction, int start, int end) {
+            for (int i = start; i < value.length && i < end; i++) {
+                addValue(value[i], weightFunction.applyAsDouble(i));
+            }
+            return this;
+        }
+
+        RunningStats addValue(double value, double weight) {
+            sumOfSqrs += (value * value * weight);
+            count += weight;
+            sum += (value * weight);
+            return this;
+        }
+
+        RunningStats removeValue(double value, double weight) {
+            sumOfSqrs = Math.max(sumOfSqrs - value * value * weight, 0);
+            count = Math.max(count - weight, 0);
+            sum -= (value * weight);
+            return this;
+        }
+
+        RunningStats removeValues(double[] value, IntToDoubleFunction weightFunction, int start, int end) {
+            for (int i = start; i < value.length && i < end; i++) {
+                removeValue(value[i], weightFunction.applyAsDouble(i));
+            }
+            return this;
+        }
+    }
+
+    static record VarianceAndRValue(double variance, double rValue) implements Comparable<VarianceAndRValue> {
+        @Override
+        public int compareTo(VarianceAndRValue o) {
+            int v = Double.compare(variance, o.variance);
+            if (v == 0) {
+                return Double.compare(rValue, o.rValue);
+            }
+            return v;
+        }
+
+        public VarianceAndRValue min(VarianceAndRValue other) {
+            if (this.compareTo(other) <= 0) {
+                return this;
+            }
+            return other;
+        }
+    }
+
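+    // Survival function (1 - CDF) of the F distribution with the given degrees of freedom,
+    // computed via the regularized incomplete beta function.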
+    static double fDistribSf(double numeratorDegreesOfFreedom, double denominatorDegreesOfFreedom, double x) {
+        if (x <= 0) {
+            return 1;
+        } else if (Double.isInfinite(x) || Double.isNaN(x)) {
+            return 0;
+        }
+
+        return Beta.regularizedBeta(
+            denominatorDegreesOfFreedom / (denominatorDegreesOfFreedom + numeratorDegreesOfFreedom * x),
+            0.5 * denominatorDegreesOfFreedom,
+            0.5 * numeratorDegreesOfFreedom
+        );
+    }
+
+}

+ 87 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointBucket.java

@@ -0,0 +1,87 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.search.aggregations.Aggregation;
+import org.elasticsearch.search.aggregations.Aggregations;
+import org.elasticsearch.search.aggregations.InternalAggregations;
+import org.elasticsearch.search.aggregations.InternalMultiBucketAggregation;
+import org.elasticsearch.xcontent.XContentBuilder;
+
+import java.io.IOException;
+import java.util.Objects;
+
+public class ChangePointBucket extends InternalMultiBucketAggregation.InternalBucket {
+    private final Object key;
+    private final long docCount;
+    private final InternalAggregations aggregations;
+
+    public ChangePointBucket(Object key, long docCount, InternalAggregations aggregations) {
+        this.key = key;
+        this.docCount = docCount;
+        this.aggregations = aggregations;
+    }
+
+    public ChangePointBucket(StreamInput in) throws IOException {
+        this.key = in.readGenericValue();
+        this.docCount = in.readVLong();
+        this.aggregations = InternalAggregations.readFrom(in);
+    }
+
+    @Override
+    public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+        builder.startObject();
+        builder.field(Aggregation.CommonFields.KEY.getPreferredName(), key);
+        builder.field(Aggregation.CommonFields.DOC_COUNT.getPreferredName(), docCount);
+        aggregations.toXContentInternal(builder, params);
+        builder.endObject();
+        return builder;
+    }
+
+    @Override
+    public void writeTo(StreamOutput out) throws IOException {
+        out.writeGenericValue(key);
+        out.writeVLong(docCount);
+        aggregations.writeTo(out);
+    }
+
+    @Override
+    public Object getKey() {
+        return key;
+    }
+
+    @Override
+    public String getKeyAsString() {
+        return key.toString();
+    }
+
+    @Override
+    public long getDocCount() {
+        return docCount;
+    }
+
+    @Override
+    public Aggregations getAggregations() {
+        return aggregations;
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) return true;
+        if (o == null || getClass() != o.getClass()) return false;
+        ChangePointBucket that = (ChangePointBucket) o;
+        return docCount == that.docCount && Objects.equals(key, that.key) && Objects.equals(aggregations, that.aggregations);
+    }
+
+    @Override
+    public int hashCode() {
+        return Objects.hash(key, docCount, aggregations);
+    }
+}

+ 27 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointNamedContentProvider.java

@@ -0,0 +1,27 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
+
+import java.util.List;
+
+public final class ChangePointNamedContentProvider {
+
+    public List<NamedWriteableRegistry.Entry> getNamedWriteables() {
+        return List.of(
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.StepChange.NAME, ChangeType.StepChange::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.Spike.NAME, ChangeType.Spike::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.TrendChange.NAME, ChangeType.TrendChange::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.Dip.NAME, ChangeType.Dip::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.DistributionChange.NAME, ChangeType.DistributionChange::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.Stationary.NAME, ChangeType.Stationary::new),
+            new NamedWriteableRegistry.Entry(ChangeType.class, ChangeType.NonStationary.NAME, ChangeType.NonStationary::new)
+        );
+    }
+}

+ 342 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangeType.java

@@ -0,0 +1,342 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.io.stream.NamedWriteable;
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.xcontent.XContentBuilder;
+import org.elasticsearch.xpack.core.ml.utils.NamedXContentObject;
+
+import java.io.IOException;
+import java.util.Objects;
+
+/**
+ * Writeable for all the change point types
+ */
+public interface ChangeType extends NamedWriteable, NamedXContentObject {
+
+    default int changePoint() {
+        return -1;
+    }
+
+    default double pValue() {
+        return 1.0;
+    }
+
+    abstract class AbstractChangePoint implements ChangeType {
+        private final double pValue;
+        private final int changePoint;
+
+        protected AbstractChangePoint(double pValue, int changePoint) {
+            this.pValue = pValue;
+            this.changePoint = changePoint;
+        }
+
+        @Override
+        public double pValue() {
+            return pValue;
+        }
+
+        @Override
+        public int changePoint() {
+            return changePoint;
+        }
+
+        public AbstractChangePoint(StreamInput in) throws IOException {
+            pValue = in.readDouble();
+            changePoint = in.readVInt();
+        }
+
+        @Override
+        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+            return builder.startObject().field("p_value", pValue).field("change_point", changePoint).endObject();
+        }
+
+        @Override
+        public String getWriteableName() {
+            return getName();
+        }
+
+        @Override
+        public void writeTo(StreamOutput out) throws IOException {
+            out.writeDouble(pValue);
+            out.writeVInt(changePoint);
+        }
+
+        @Override
+        public boolean equals(Object o) {
+            if (this == o) return true;
+            if (o == null || getClass() != o.getClass()) return false;
+            AbstractChangePoint that = (AbstractChangePoint) o;
+            return Double.compare(that.pValue, pValue) == 0 && changePoint == that.changePoint;
+        }
+
+        @Override
+        public int hashCode() {
+            return Objects.hash(pValue, changePoint);
+        }
+    }
+
+    /**
+     * Indicates that no change has occurred
+     */
+    class Stationary implements ChangeType {
+        public static final String NAME = "stationary";
+
+        public Stationary() {}
+
+        public Stationary(StreamInput input) {}
+
+        @Override
+        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+            return builder.startObject().endObject();
+        }
+
+        @Override
+        public String getWriteableName() {
+            return getName();
+        }
+
+        @Override
+        public void writeTo(StreamOutput out) throws IOException {}
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+
+        @Override
+        public int hashCode() {
+            return Objects.hashCode(getClass());
+        }
+
+        @Override
+        public boolean equals(Object obj) {
+            if (this == obj) return true;
+            return obj != null && obj.getClass() == getClass();
+        }
+    }
+
+    /**
+     * Indicates a step change occurred
+     */
+    class StepChange extends AbstractChangePoint {
+        public static final String NAME = "step_change";
+
+        public StepChange(double pValue, int changePoint) {
+            super(pValue, changePoint);
+        }
+
+        public StepChange(StreamInput in) throws IOException {
+            super(in);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+    }
+
+    /**
+     * Indicates a distribution change occurred
+     */
+    class DistributionChange extends AbstractChangePoint {
+        public static final String NAME = "distribution_change";
+
+        public DistributionChange(double pValue, int changePoint) {
+            super(pValue, changePoint);
+        }
+
+        public DistributionChange(StreamInput in) throws IOException {
+            super(in);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+
+    }
+
+    /**
+     * Indicates that the values are not from a stationary distribution (there is an overall trend), but no change point was found
+     */
+    class NonStationary implements ChangeType {
+        public static final String NAME = "non_stationary";
+        private final double pValue;
+        private final double rValue;
+        private final String trend;
+
+        public NonStationary(double pValue, double rValue, String trend) {
+            this.pValue = pValue;
+            this.rValue = rValue;
+            this.trend = trend;
+        }
+
+        public NonStationary(StreamInput in) throws IOException {
+            pValue = in.readDouble();
+            rValue = in.readDouble();
+            trend = in.readString();
+        }
+
+        public String getTrend() {
+            return trend;
+        }
+
+        @Override
+        public double pValue() {
+            return pValue;
+        }
+
+        @Override
+        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+            return builder.startObject().field("p_value", pValue).field("r_value", rValue).field("trend", trend).endObject();
+        }
+
+        @Override
+        public String getWriteableName() {
+            return getName();
+        }
+
+        @Override
+        public void writeTo(StreamOutput out) throws IOException {
+            out.writeDouble(pValue);
+            out.writeDouble(rValue);
+            out.writeString(trend);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+
+        @Override
+        public boolean equals(Object o) {
+            if (this == o) return true;
+            if (o == null || getClass() != o.getClass()) return false;
+            NonStationary that = (NonStationary) o;
+            return Double.compare(that.pValue, pValue) == 0
+                && Double.compare(that.rValue, rValue) == 0
+                && Objects.equals(trend, that.trend);
+        }
+
+        @Override
+        public int hashCode() {
+            return Objects.hash(pValue, rValue, trend);
+        }
+    }
+
+    /**
+     * Indicates a trend change occurred
+     */
+    class TrendChange implements ChangeType {
+        public static final String NAME = "trend_change";
+        private final double pValue;
+        private final double rValue;
+        private final int changePoint;
+
+        public TrendChange(double pValue, double rValue, int changePoint) {
+            this.pValue = pValue;
+            this.rValue = rValue;
+            this.changePoint = changePoint;
+        }
+
+        public TrendChange(StreamInput in) throws IOException {
+            pValue = in.readDouble();
+            rValue = in.readDouble();
+            changePoint = in.readVInt();
+        }
+
+        @Override
+        public double pValue() {
+            return pValue;
+        }
+
+        @Override
+        public int changePoint() {
+            return changePoint;
+        }
+
+        @Override
+        public XContentBuilder toXContent(XContentBuilder builder, Params params) throws IOException {
+            return builder.startObject().field("p_value", pValue).field("r_value", rValue).field("change_point", changePoint).endObject();
+        }
+
+        @Override
+        public String getWriteableName() {
+            return getName();
+        }
+
+        @Override
+        public void writeTo(StreamOutput out) throws IOException {
+            out.writeDouble(pValue);
+            out.writeDouble(rValue);
+            out.writeVInt(changePoint);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+
+        @Override
+        public boolean equals(Object o) {
+            if (this == o) return true;
+            if (o == null || getClass() != o.getClass()) return false;
+            TrendChange that = (TrendChange) o;
+            return Double.compare(that.pValue, pValue) == 0 && Double.compare(that.rValue, rValue) == 0 && changePoint == that.changePoint;
+        }
+
+        @Override
+        public int hashCode() {
+            return Objects.hash(pValue, rValue, changePoint);
+        }
+    }
+
+    /**
+     * Indicates a spike occurred
+     */
+    class Spike extends AbstractChangePoint {
+        public static final String NAME = "spike";
+
+        public Spike(double pValue, int changePoint) {
+            super(pValue, changePoint);
+        }
+
+        public Spike(StreamInput in) throws IOException {
+            super(in);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+    }
+
+    /**
+     * Indicates a dip occurred
+     */
+    class Dip extends AbstractChangePoint {
+        public static final String NAME = "dip";
+
+        public Dip(double pValue, int changePoint) {
+            super(pValue, changePoint);
+        }
+
+        public Dip(StreamInput in) throws IOException {
+            super(in);
+        }
+
+        @Override
+        public String getName() {
+            return NAME;
+        }
+    }
+
+}

+ 89 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/InternalChangePointAggregation.java

@@ -0,0 +1,89 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
+import org.elasticsearch.search.aggregations.AggregationReduceContext;
+import org.elasticsearch.search.aggregations.InternalAggregation;
+import org.elasticsearch.xcontent.XContentBuilder;
+import org.elasticsearch.xpack.core.ml.utils.NamedXContentObjectHelper;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+
+public class InternalChangePointAggregation extends InternalAggregation {
+
+    private final ChangePointBucket bucket;
+    private final ChangeType changeType;
+
+    public InternalChangePointAggregation(String name, Map<String, Object> metadata, ChangePointBucket bucket, ChangeType changeType) {
+        super(name, metadata);
+        this.bucket = bucket;
+        this.changeType = changeType;
+    }
+
+    public InternalChangePointAggregation(StreamInput in) throws IOException {
+        super(in);
+        if (in.readBoolean()) {
+            this.bucket = new ChangePointBucket(in);
+        } else {
+            this.bucket = null;
+        }
+        this.changeType = in.readNamedWriteable(ChangeType.class);
+    }
+
+    public ChangePointBucket getBucket() {
+        return bucket;
+    }
+
+    public ChangeType getChangeType() {
+        return changeType;
+    }
+
+    @Override
+    public String getWriteableName() {
+        return ChangePointAggregationBuilder.NAME.getPreferredName();
+    }
+
+    @Override
+    protected void doWriteTo(StreamOutput out) throws IOException {
+        if (bucket != null) {
+            out.writeBoolean(true);
+            bucket.writeTo(out);
+        } else {
+            out.writeBoolean(false);
+        }
+        out.writeNamedWriteable(changeType);
+    }
+
+    @Override
+    public InternalAggregation reduce(List<InternalAggregation> aggregations, AggregationReduceContext reduceContext) {
+        throw new UnsupportedOperationException("Reducing a change_point aggregation is not supported");
+    }
+
+    @Override
+    protected boolean mustReduceOnSingleInternalAgg() {
+        return false;
+    }
+
+    @Override
+    public Object getProperty(List<String> path) {
+        return null;
+    }
+
+    @Override
+    public XContentBuilder doXContentBody(XContentBuilder builder, Params params) throws IOException {
+        if (bucket != null) {
+            builder.field("bucket", bucket);
+        }
+        NamedXContentObjectHelper.writeNamedObject(builder, params, "type", changeType);
+        return builder;
+    }
+}

+ 156 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/KDE.java

@@ -0,0 +1,156 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.apache.commons.math3.distribution.NormalDistribution;
+import org.apache.commons.math3.special.Erf;
+import org.apache.commons.math3.util.FastMath;
+import org.elasticsearch.common.Randomness;
+
+import java.util.Arrays;
+import java.util.List;
+import java.util.stream.Collectors;
+import java.util.stream.DoubleStream;
+import java.util.stream.IntStream;
+
+import static org.apache.commons.math3.stat.StatUtils.variance;
+
+/**
+ * Kernel Density Estimator
+ */
+final class KDE {
+    private static final double SQRT2 = FastMath.sqrt(2.0);
+    private static final double ESTIMATOR_EPS = 1e-10;
+
+    /**
+     * Fit KDE choosing bandwidth by maximum likelihood cross validation.
+     * @param orderedValues the provided values, sorted
+     * @return the maximum likelihood bandwidth
+     */
+    private static double maxLikelihoodBandwidth(double[] orderedValues) {
+        int step = Math.max((int) (orderedValues.length / 10.0 + 0.5), 2);
+        IntStream.Builder trainingIndicesBuilder = IntStream.builder();
+        IntStream.Builder testIndicesBuilder = IntStream.builder();
+        for (int i = 0; i < orderedValues.length; i += step) {
+            int adjStep = Math.min(i + step, orderedValues.length) - i;
+            List<Integer> indices = IntStream.range(i, i + adjStep).boxed().collect(Collectors.toList());
+            Randomness.shuffle(indices);
+            int n = Math.min(adjStep / 2, 4);
+            indices.stream().limit(n).forEach(trainingIndicesBuilder::add);
+            indices.stream().skip(n).forEach(testIndicesBuilder::add);
+        }
+        int[] trainingIndices = trainingIndicesBuilder.build().toArray();
+        int[] testIndices = testIndicesBuilder.build().toArray();
+        Arrays.sort(trainingIndices);
+        Arrays.sort(testIndices);
+        double[] xTrain = IntStream.of(trainingIndices).mapToDouble(i -> orderedValues[i]).toArray();
+        double maxLogLikelihood = -Double.MAX_VALUE;
+        double result = 0;
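+        // Grid search over candidate bandwidths from 2% to 40% of the observed value range, keeping the one
+        // that maximises the log-likelihood of the held-out points under the training-set kernels.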
+        for (int i = 0; i < 20; ++i) {
+            double bandwidth = 0.02 * (i + 1) * (orderedValues[orderedValues.length - 1] - orderedValues[0]);
+            double logBandwidth = Math.log(bandwidth);
+            double logLikelihood = IntStream.of(testIndices)
+                .mapToDouble(j -> logLikelihood(xTrain, bandwidth, logBandwidth, orderedValues[j]))
+                .sum();
+            if (logLikelihood >= maxLogLikelihood) {
+                maxLogLikelihood = logLikelihood;
+                result = bandwidth;
+            }
+        }
+        return result;
+    }
+
+    private static int lowerBound(double[] xs, double x) {
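+        // Index at which x could be inserted while keeping xs sorted: a matching index if x is present,
+        // otherwise the insertion point reported by binarySearch.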
+        int retVal = Arrays.binarySearch(xs, x);
+        if (retVal < 0) {
+            retVal = -1 - retVal;
+        }
+        return retVal;
+    }
+
+    private static double logLikelihood(double[] xs, double bandwidth, double logBandwidth, double x) {
+        int a = lowerBound(xs, x - 3.0 * bandwidth);
+        int b = lowerBound(xs, x + 3.0 * bandwidth);
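+        // Kernels further than three bandwidths from x contribute negligibly, so only that window is summed.
+        // The maximum log-pdf is factored out below (log-sum-exp) to avoid underflow when exponentiating.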
+        double[] logPdfs = IntStream.range(Math.max(Math.min(a, b - 1), 0), Math.min(Math.max(b, a + 1), xs.length)).mapToDouble(i -> {
+            double y = (x - xs[i]) / bandwidth;
+            return -0.5 * y * y - logBandwidth;
+        }).toArray();
+        double maxLogPdf = DoubleStream.of(logPdfs).max().orElseThrow();
+        double result = DoubleStream.of(logPdfs).map(logPdf -> Math.exp(logPdf - maxLogPdf)).sum();
+        return Math.log(result) + maxLogPdf;
+    }
+
+    private final double[] orderedValues;
+    private final double bandwidth;
+
+    KDE(double[] values, int minIndex, int maxIndex) {
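+        // Drop a small window (about 2.5% of the series) around the supplied min and max indices, presumably so
+        // that the candidate change points themselves do not distort the density estimated from the remaining values.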
+        int excluded = (int) (0.025 * ((double) values.length) + 0.5);
+        double[] orderedValues = new double[values.length];
+        int j = 0;
+        for (int i = 0; i < values.length; i++) {
+            if ((i >= minIndex - excluded && i <= minIndex + excluded) || (i >= maxIndex - excluded && i <= maxIndex + excluded)) {
+                continue;
+            }
+            orderedValues[j++] = values[i];
+        }
+        this.orderedValues = Arrays.copyOf(orderedValues, j);
+        Arrays.sort(this.orderedValues);
+        double var = variance(this.orderedValues);
+        this.bandwidth = var > 0 ? maxLikelihoodBandwidth(this.orderedValues) : 0.01 * (values[maxIndex] - values[minIndex]);
+    }
+
+    ValueAndMagnitude cdf(double x) {
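+        // Only kernels within roughly four bandwidths of x contribute to the CDF; the magnitude reported
+        // alongside it is the distance from x to the nearest retained sample.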
+        int a = lowerBound(orderedValues, x - 4.0 * bandwidth);
+        int b = lowerBound(orderedValues, x + 4.0 * bandwidth);
+        double cdf = 0.0;
+        double diff = Double.MAX_VALUE;
+        for (int i = a; i < Math.min(Math.max(b, a + 1), orderedValues.length); i++) {
+            cdf += new NormalDistribution(orderedValues[i], bandwidth).cumulativeProbability(x);
+            diff = Math.min(Math.abs(orderedValues[i] - x), diff);
+        }
+        cdf /= orderedValues.length;
+        return new ValueAndMagnitude(cdf, diff);
+    }
+
+    ValueAndMagnitude sf(double x) {
+        int a = lowerBound(orderedValues, x - 4.0 * bandwidth);
+        int b = lowerBound(orderedValues, x + 4.0 * bandwidth);
+        double sf = 0.0;
+        double diff = Double.MAX_VALUE;
+        for (int i = Math.max(Math.min(a, b - 1), 0); i < b; i++) {
+            sf += normSf(orderedValues[i], bandwidth, x);
+            diff = Math.min(Math.abs(orderedValues[i] - x), diff);
+        }
+        sf /= orderedValues.length;
+        return new ValueAndMagnitude(sf, diff);
+    }
+
+    static double normSf(double mean, double standardDeviation, double x) {
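+        // Gaussian survival function P(X > x), short-circuited far out in the tails where erfc would
+        // effectively return 0 or 2 anyway.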
+        final double dev = x - mean;
+        if (Math.abs(dev) > 40 * standardDeviation) {
+            return dev > 0 ? 0.0d : 1.0d;
+        }
+        return 0.5 * Erf.erfc(dev / (standardDeviation * SQRT2));
+    }
+
+    static record ValueAndMagnitude(double value, double magnitude) {
+        boolean isMoreSignificant(ValueAndMagnitude o, int numberOfTestedValues) {
+            int c = Double.compare(significance(numberOfTestedValues), o.significance(numberOfTestedValues));
+            if (c != 0) {
+                return c < 0;
+            } else {
+                return magnitude > o.magnitude;
+            }
+        }
+
+        double significance(int numberOfTestedValues) {
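+            // Adjust the single-test p-value for having tested numberOfTestedValues candidates: 1 - (1 - p)^n,
+            // falling back to the first-order approximation n * p when p is tiny to avoid cancellation error.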
+            return value > ESTIMATOR_EPS ? 1 - Math.pow(1 - value, numberOfTestedValues) : numberOfTestedValues * value;
+        }
+    }
+
+}

+ 152 - 0
x-pack/plugin/ml/src/main/java/org/elasticsearch/xpack/ml/aggs/changepoint/LeastSquaresOnlineRegression.java

@@ -0,0 +1,152 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.apache.commons.math3.linear.Array2DRowRealMatrix;
+import org.apache.commons.math3.linear.RealMatrix;
+import org.apache.commons.math3.linear.SingularValueDecomposition;
+
+import java.util.Arrays;
+import java.util.OptionalDouble;
+
+class LeastSquaresOnlineRegression {
+
+    private static final double SINGLE_VALUE_DECOMPOSITION_EPS = 1e+15;
+
+    private final RunningStatistics statistics;
+    private final Array2DRowRealMatrix Nx;
+    private final Array2DRowRealMatrix Ny;
+    private final Array2DRowRealMatrix Nz;
+    private final int N;
+
+    LeastSquaresOnlineRegression(int degrees) {
+        this.N = degrees + 1;
+        statistics = new RunningStatistics(3 * N);
+        this.Nx = new Array2DRowRealMatrix(this.N, this.N);
+        this.Ny = new Array2DRowRealMatrix(this.N, 1);
+        this.Nz = new Array2DRowRealMatrix(this.N, 1);
+    }
+
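+    // Coefficient of determination computed from the accumulated sufficient statistics, retrying with
+    // progressively lower polynomial orders when the full system is too ill-conditioned to solve.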
+    double rSquared() {
+        double result = 0;
+        if (statistics.count <= 0.0) {
+            return result;
+        }
+        double var = statistics.stats[3 * N - 1] - statistics.stats[2 * N - 1] * statistics.stats[2 * N - 1];
+        double residualVariance = var;
+        int n = N + 1;
+        boolean done = false;
+        while (--n > 0 && done == false) {
+            if (n == 1) {
+                return result;
+            } else if (n == this.N) {
+                OptionalDouble maybeResidualVar = residualVariance(N, Nx, Ny, Nz);
+                if (maybeResidualVar.isPresent()) {
+                    residualVariance = maybeResidualVar.getAsDouble();
+                    done = true;
+                }
+            } else {
+                Array2DRowRealMatrix x = new Array2DRowRealMatrix(n, n);
+                Array2DRowRealMatrix y = new Array2DRowRealMatrix(n, 1);
+                Array2DRowRealMatrix z = new Array2DRowRealMatrix(n, 1);
+                OptionalDouble maybeResidualVar = residualVariance(n, x, y, z);
+                if (maybeResidualVar.isPresent()) {
+                    residualVariance = maybeResidualVar.getAsDouble();
+                    done = true;
+                }
+            }
+        }
+        return Math.min(Math.max(1.0 - residualVariance / var, 0.0), 1.0);
+    }
+
+    private double[] statisticAdj(double x, double y) {
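+        // Sufficient statistics contributed by one observation, laid out as: powers x^0 .. x^(2N-2),
+        // then the cross moments x^i * y for i < N, and finally y^2. Note the second loop's hard-coded start
+        // index of 3 assumes N == 3, i.e. the quadratic fit this aggregation appears to use.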
+        double[] d = new double[3 * N];
+        double xi = 1.0;
+        for (int i = 0; i < N; ++i, xi *= x) {
+            d[i] = xi;
+            d[i + 2 * N - 1] = xi * y;
+        }
+        for (int i = 3; i < 2 * N - 1; ++i, xi *= x) {
+            d[i] = xi;
+        }
+        d[3 * N - 1] = y * y;
+        return d;
+    }
+
+    void add(double x, double y, double weight) {
+        statistics.add(statisticAdj(x, y), weight);
+    }
+
+    void remove(double x, double y, double weight) {
+        statistics.remove(statisticAdj(x, y), weight);
+    }
+
+    private OptionalDouble residualVariance(int n, Array2DRowRealMatrix x, Array2DRowRealMatrix y, Array2DRowRealMatrix z) {
+        if (n == 1) {
+            return OptionalDouble.of(statistics.stats[3 * N - 1] - statistics.stats[2 * N - 1] * statistics.stats[2 * N - 1]);
+        }
+        for (int i = 0; i < n; ++i) {
+            x.setEntry(i, i, statistics.stats[i + i]);
+            y.setEntry(i, 0, statistics.stats[i + 2 * N - 1]);
+            z.setEntry(i, 0, statistics.stats[i]);
+            for (int j = i + 1; j < n; ++j) {
+                x.setEntry(i, j, statistics.stats[i + j]);
+                x.setEntry(j, i, statistics.stats[i + j]);
+            }
+        }
+
+        SingularValueDecomposition svd = new SingularValueDecomposition(x);
+        double[] singularValues = svd.getSingularValues();
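+        // Bail out when the normal-equations matrix is ill-conditioned (ratio of largest to smallest singular
+        // value above the threshold); the caller can then retry with a lower polynomial order.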
+        if (singularValues[0] > SINGLE_VALUE_DECOMPOSITION_EPS * singularValues[n - 1]) {
+            return OptionalDouble.empty();
+        }
+        RealMatrix r = svd.getSolver().solve(y);
+        RealMatrix yr = y.transpose().multiply(r);
+        RealMatrix zr = z.transpose().multiply(r);
+        double t = statistics.stats[2 * N - 1] - zr.getEntry(0, 0);
+        return OptionalDouble.of((statistics.stats[3 * N - 1] - yr.getEntry(0, 0)) - (t * t));
+    }
+
+    private static class RunningStatistics {
+        private double count;
+        private final double[] stats;
+
+        RunningStatistics(int size) {
+            count = 0;
+            stats = new double[size];
+        }
+
+        void add(double[] values, double weight) {
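+            // Exact update of the weighted mean: new = (old * previousCount + weight * values) / updatedCount.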
+            assert values.length == stats.length
+                : "passed values for add are not of expected length; unable to update statistics for online least squares regression";
+            count += weight;
+            double alpha = weight / count;
+            double beta = 1 - alpha;
+
+            for (int i = 0; i < stats.length; i++) {
+                stats[i] = stats[i] * beta + alpha * values[i];
+            }
+        }
+
+        void remove(double[] values, double weight) {
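+            // Inverse of add(): rescale the mean back up to the reduced total weight and subtract the removed
+            // observation's contribution; everything is zeroed once the total weight reaches zero.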
+            assert values.length == stats.length
+                : "passed values for removal are not of expected length; unable to update statistics for online least squares regression";
+            count = Math.max(count - weight, 0);
+            if (count == 0) {
+                Arrays.fill(stats, 0);
+                return;
+            }
+            double alpha = weight / count;
+            double beta = 1 + alpha;
+
+            for (int i = 0; i < stats.length; i++) {
+                stats[i] = stats[i] * beta - alpha * values[i];
+            }
+        }
+    }
+}

+ 28 - 0
x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregationBuilderTests.java

@@ -0,0 +1,28 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.plugins.SearchPlugin;
+import org.elasticsearch.search.aggregations.BasePipelineAggregationTestCase;
+import org.elasticsearch.xpack.ml.MachineLearning;
+
+import java.util.Collections;
+import java.util.List;
+
+public class ChangePointAggregationBuilderTests extends BasePipelineAggregationTestCase<ChangePointAggregationBuilder> {
+    @Override
+    protected List<SearchPlugin> plugins() {
+        return Collections.singletonList(new MachineLearning(Settings.EMPTY));
+    }
+
+    @Override
+    protected ChangePointAggregationBuilder createTestAggregatorFactory() {
+        return new ChangePointAggregationBuilder(randomAlphaOfLength(10), randomAlphaOfLength(10));
+    }
+}

+ 203 - 0
x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/ChangePointAggregatorTests.java

@@ -0,0 +1,203 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.apache.commons.math3.distribution.NormalDistribution;
+import org.apache.commons.math3.random.RandomGeneratorFactory;
+import org.apache.lucene.document.NumericDocValuesField;
+import org.apache.lucene.document.SortedNumericDocValuesField;
+import org.apache.lucene.search.MatchAllDocsQuery;
+import org.apache.lucene.tests.index.RandomIndexWriter;
+import org.apache.lucene.util.NumericUtils;
+import org.elasticsearch.common.Randomness;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.index.query.MatchAllQueryBuilder;
+import org.elasticsearch.plugins.SearchPlugin;
+import org.elasticsearch.search.aggregations.AggregationBuilders;
+import org.elasticsearch.search.aggregations.AggregatorTestCase;
+import org.elasticsearch.search.aggregations.bucket.filter.FilterAggregationBuilder;
+import org.elasticsearch.search.aggregations.bucket.filter.InternalFilter;
+import org.elasticsearch.search.aggregations.bucket.histogram.DateHistogramAggregationBuilder;
+import org.elasticsearch.search.aggregations.bucket.histogram.DateHistogramInterval;
+import org.elasticsearch.xpack.ml.MachineLearning;
+import org.elasticsearch.xpack.ml.aggs.MlAggsHelper;
+
+import java.io.IOException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.concurrent.atomic.AtomicInteger;
+import java.util.function.Consumer;
+import java.util.stream.DoubleStream;
+
+import static org.hamcrest.Matchers.anyOf;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.instanceOf;
+
+public class ChangePointAggregatorTests extends AggregatorTestCase {
+
+    @Override
+    protected List<SearchPlugin> getSearchPlugins() {
+        return List.of(new MachineLearning(Settings.EMPTY));
+    }
+
+    private static final DateHistogramInterval INTERVAL = DateHistogramInterval.minutes(1);
+    private static final String NUMERIC_FIELD_NAME = "value";
+    private static final String TIME_FIELD_NAME = "timestamp";
+
+    public void testNoChange() throws IOException {
+        double[] bucketValues = DoubleStream.generate(() -> 10).limit(100).toArray();
+        testChangeType(
+            bucketValues,
+            changeType -> assertThat(Arrays.toString(bucketValues), changeType, instanceOf(ChangeType.Stationary.class))
+        );
+    }
+
+    public void testSlopeUp() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 2);
+        AtomicInteger i = new AtomicInteger();
+        double[] bucketValues = DoubleStream.generate(() -> i.addAndGet(1) + normal.sample()).limit(40).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(changeType, instanceOf(ChangeType.NonStationary.class));
+            assertThat(Arrays.toString(bucketValues), ((ChangeType.NonStationary) changeType).getTrend(), equalTo("increasing"));
+        });
+    }
+
+    public void testSlopeDown() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 2);
+        AtomicInteger i = new AtomicInteger(40);
+        double[] bucketValues = DoubleStream.generate(() -> i.decrementAndGet() + normal.sample()).limit(40).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(changeType, instanceOf(ChangeType.NonStationary.class));
+            assertThat(Arrays.toString(bucketValues), ((ChangeType.NonStationary) changeType).getTrend(), equalTo("decreasing"));
+        });
+    }
+
+    public void testSlopeChange() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 1);
+        AtomicInteger i = new AtomicInteger();
+        double[] bucketValues = DoubleStream.concat(
+            DoubleStream.generate(() -> 10 + normal.sample()).limit(30),
+            DoubleStream.generate(() -> (11 + 2 * i.incrementAndGet()) + normal.sample()).limit(20)
+        ).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(
+                Arrays.toString(bucketValues),
+                changeType,
+                anyOf(instanceOf(ChangeType.TrendChange.class), instanceOf(ChangeType.NonStationary.class))
+            );
+            if (changeType instanceof ChangeType.NonStationary nonStationary) {
+                assertThat(nonStationary.getTrend(), equalTo("increasing"));
+            }
+        });
+    }
+
+    public void testSpike() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 2);
+        double[] bucketValues = DoubleStream.concat(
+            DoubleStream.generate(() -> 10 + normal.sample()).limit(40),
+            DoubleStream.concat(DoubleStream.of(30 + normal.sample()), DoubleStream.generate(() -> 10 + normal.sample()).limit(40))
+        ).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(
+                Arrays.toString(bucketValues),
+                changeType,
+                anyOf(instanceOf(ChangeType.Spike.class), instanceOf(ChangeType.DistributionChange.class))
+            );
+            if (changeType instanceof ChangeType.Spike) {
+                assertThat(changeType.changePoint(), equalTo(40));
+            }
+        });
+    }
+
+    public void testDip() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 1);
+        double[] bucketValues = DoubleStream.concat(
+            DoubleStream.generate(() -> 100 + normal.sample()).limit(40),
+            DoubleStream.concat(DoubleStream.of(30 + normal.sample()), DoubleStream.generate(() -> 100 + normal.sample()).limit(40))
+        ).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(
+                Arrays.toString(bucketValues),
+                changeType,
+                anyOf(instanceOf(ChangeType.Dip.class), instanceOf(ChangeType.DistributionChange.class))
+            );
+            if (changeType instanceof ChangeType.Dip) {
+                assertThat(changeType.changePoint(), equalTo(40));
+            }
+        });
+    }
+
+    public void testStepChange() throws IOException {
+        NormalDistribution normal = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 0, 1);
+        double[] bucketValues = DoubleStream.concat(
+            DoubleStream.generate(() -> 10 + normal.sample()).limit(20),
+            DoubleStream.generate(() -> 30 + normal.sample()).limit(20)
+        ).toArray();
+        testChangeType(bucketValues, changeType -> {
+            assertThat(Arrays.toString(bucketValues), changeType, instanceOf(ChangeType.StepChange.class));
+            assertThat(changeType.changePoint(), equalTo(20));
+        });
+    }
+
+    public void testDistributionChange() throws IOException {
+        NormalDistribution first = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 50, 1);
+        NormalDistribution second = new NormalDistribution(RandomGeneratorFactory.createRandomGenerator(Randomness.get()), 50, 5);
+        double[] bucketValues = DoubleStream.concat(
+            DoubleStream.generate(first::sample).limit(50),
+            DoubleStream.generate(second::sample).limit(50)
+        ).toArray();
+        testChangeType(
+            bucketValues,
+            changeType -> assertThat(
+                Arrays.toString(bucketValues),
+                changeType,
+                anyOf(
+                    // Due to the random nature of the values generated, any of these could be detected
+                    // Distribution change is a "catch anything weird" if previous checks didn't find anything
+                    instanceOf(ChangeType.DistributionChange.class),
+                    instanceOf(ChangeType.Stationary.class),
+                    instanceOf(ChangeType.Spike.class),
+                    instanceOf(ChangeType.Dip.class),
+                    instanceOf(ChangeType.TrendChange.class)
+                )
+            )
+        );
+    }
+
+    void testChangeType(double[] bucketValues, Consumer<ChangeType> changeTypeAssertions) throws IOException {
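+        // Drives the aggregator end to end: index one document per bucket value, build a date_histogram of max
+        // values over them, and run the change_point sibling aggregation against that histogram's output.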
+        FilterAggregationBuilder dummy = AggregationBuilders.filter("dummy", new MatchAllQueryBuilder())
+            .subAggregation(
+                new DateHistogramAggregationBuilder("time").field(TIME_FIELD_NAME)
+                    .fixedInterval(INTERVAL)
+                    .subAggregation(AggregationBuilders.max("max").field(NUMERIC_FIELD_NAME))
+            )
+            .subAggregation(new ChangePointAggregationBuilder("changes", "time>max"));
+        testCase(dummy, new MatchAllDocsQuery(), w -> writeTestDocs(w, bucketValues), (InternalFilter result) -> {
+            InternalChangePointAggregation agg = result.getAggregations().get("changes");
+            changeTypeAssertions.accept(agg.getChangeType());
+        }, longField(TIME_FIELD_NAME), doubleField(NUMERIC_FIELD_NAME));
+    }
+
+    private static void writeTestDocs(RandomIndexWriter w, double[] bucketValues) throws IOException {
+        long epochTimestamp = 0;
+        for (double bucketValue : bucketValues) {
+            w.addDocument(
+                Arrays.asList(
+                    new NumericDocValuesField(NUMERIC_FIELD_NAME, NumericUtils.doubleToSortableLong(bucketValue)),
+                    new SortedNumericDocValuesField(TIME_FIELD_NAME, epochTimestamp)
+                )
+            );
+            epochTimestamp += INTERVAL.estimateMillis();
+        }
+    }
+
+    private static MlAggsHelper.DoubleBucketValues values(double[] values) {
+        return new MlAggsHelper.DoubleBucketValues(new long[0], values, new int[0]);
+    }
+
+}

+ 52 - 0
x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/InternalChangePointAggregationTests.java

@@ -0,0 +1,52 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
+import org.elasticsearch.common.io.stream.Writeable;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.search.SearchModule;
+import org.elasticsearch.search.aggregations.InternalAggregations;
+import org.elasticsearch.test.AbstractWireSerializingTestCase;
+
+import java.util.ArrayList;
+import java.util.Collections;
+import java.util.List;
+
+public class InternalChangePointAggregationTests extends AbstractWireSerializingTestCase<InternalChangePointAggregation> {
+
+    @Override
+    protected NamedWriteableRegistry getNamedWriteableRegistry() {
+        List<NamedWriteableRegistry.Entry> namedWriteables = new ArrayList<>();
+        namedWriteables.addAll(new ChangePointNamedContentProvider().getNamedWriteables());
+        namedWriteables.addAll(new SearchModule(Settings.EMPTY, Collections.emptyList()).getNamedWriteables());
+        return new NamedWriteableRegistry(namedWriteables);
+    }
+
+    @Override
+    protected Writeable.Reader<InternalChangePointAggregation> instanceReader() {
+        return InternalChangePointAggregation::new;
+    }
+
+    @Override
+    protected InternalChangePointAggregation createTestInstance() {
+        return new InternalChangePointAggregation(
+            randomAlphaOfLength(10),
+            Collections.singletonMap("foo", "bar"),
+            randomBoolean() ? null : new ChangePointBucket(randomAlphaOfLength(10), randomNonNegativeLong(), InternalAggregations.EMPTY),
+            randomFrom(
+                new ChangeType.Stationary(),
+                new ChangeType.NonStationary(randomDouble(), randomDouble(), randomAlphaOfLength(10)),
+                new ChangeType.Dip(randomDouble(), randomInt(1000)),
+                new ChangeType.Spike(randomDouble(), randomInt(1000)),
+                new ChangeType.TrendChange(randomDouble(), randomDouble(), randomInt(1000)),
+                new ChangeType.DistributionChange(randomDouble(), randomInt(1000))
+            )
+        );
+    }
+}

+ 67 - 0
x-pack/plugin/ml/src/test/java/org/elasticsearch/xpack/ml/aggs/changepoint/LeastSquaresOnlineRegressionTests.java

@@ -0,0 +1,67 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.ml.aggs.changepoint;
+
+import org.apache.commons.math3.stat.regression.OLSMultipleLinearRegression;
+import org.elasticsearch.test.ESTestCase;
+
+import static org.hamcrest.Matchers.closeTo;
+
+public class LeastSquaresOnlineRegressionTests extends ESTestCase {
+
+    public void testAddRemove() {
+        LeastSquaresOnlineRegression lsLess = new LeastSquaresOnlineRegression(2);
+        LeastSquaresOnlineRegression lsAll = new LeastSquaresOnlineRegression(2);
+        double[] x = new double[] { 1.0, 4.0, 2.0, 15.0, 20.0 };
+        double[] xLess = new double[] { 1.0, 4.0, 2.0, 15.0 };
+        double[] y = new double[] { 2.0, 8.0, 7.0, 22.0, 23.0 };
+        double[] yLess = new double[] { 2.0, 8.0, 7.0, 22.0 };
+        for (int i = 0; i < y.length; i++) {
+            lsAll.add(x[i], y[i], 1.0);
+        }
+        for (int i = 0; i < yLess.length; i++) {
+            lsLess.add(xLess[i], yLess[i], 1.0);
+        }
+        double rsAll = lsAll.rSquared();
+        double rsLess = lsLess.rSquared();
+
+        lsAll.remove(x[x.length - 1], y[y.length - 1], 1.0);
+        lsLess.add(x[x.length - 1], y[y.length - 1], 1.0);
+
+        assertThat(rsAll, closeTo(lsLess.rSquared(), 1e-12));
+        assertThat(rsLess, closeTo(lsAll.rSquared(), 1e-12));
+    }
+
+    public void testOnlineRegression() {
+        LeastSquaresOnlineRegression ls = new LeastSquaresOnlineRegression(2);
+        OLSMultipleLinearRegression linearRegression = new OLSMultipleLinearRegression(0);
+        double[] x = new double[] { 1.0, 4.0, 2.0, 15.0, 20.0 };
+        double[] y = new double[] { 2.0, 8.0, 7.0, 22.0, 23.0 };
+        double[][] xs = new double[y.length][];
+        for (int i = 0; i < y.length; i++) {
+            xs[i] = new double[] { x[i], x[i] * x[i] };
+            ls.add(x[i], y[i], 1.0);
+        }
+        linearRegression.newSampleData(y, xs);
+        double slowRSquared = linearRegression.calculateRSquared();
+        double rs = ls.rSquared();
+        assertThat(rs, closeTo(slowRSquared, 1e-10));
+
+        // Test removing the last value
+        xs = new double[y.length - 1][];
+        for (int i = 0; i < y.length - 1; i++) {
+            xs[i] = new double[] { x[i], x[i] * x[i] };
+        }
+        linearRegression.newSampleData(new double[] { 2.0, 8.0, 7.0, 22.0 }, xs);
+        slowRSquared = linearRegression.calculateRSquared();
+        ls.remove(x[x.length - 1], y[y.length - 1], 1.0);
+        rs = ls.rSquared();
+        assertThat(rs, closeTo(slowRSquared, 1e-10));
+    }
+
+}

+ 213 - 0
x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/ml/change_point_agg.yml

@@ -0,0 +1,213 @@
+setup:
+  - skip:
+      features: headers
+  - do:
+      headers:
+        Authorization: "Basic eF9wYWNrX3Jlc3RfdXNlcjp4LXBhY2stdGVzdC1wYXNzd29yZA==" # run as x_pack_rest_user, i.e. the test setup superuser
+      indices.create:
+        index: store
+        body:
+          mappings:
+            properties:
+              cost:
+                type: integer
+              time:
+                type: date
+
+  - do:
+      headers:
+        Authorization: "Basic eF9wYWNrX3Jlc3RfdXNlcjp4LXBhY2stdGVzdC1wYXNzd29yZA==" # run as x_pack_rest_user, i.e. the test setup superuser
+      indices.create:
+        index: empty-store
+        body:
+          mappings:
+            properties:
+              cost:
+                type: integer
+              time:
+                type: date
+
+  - do:
+      headers:
+        Authorization: "Basic eF9wYWNrX3Jlc3RfdXNlcjp4LXBhY2stdGVzdC1wYXNzd29yZA==" # run as x_pack_rest_user, i.e. the test setup superuser
+        Content-Type: application/json
+      bulk:
+        index: store
+        refresh: true
+        body: |
+          {"index":{}}
+          {"cost":200,"time":1587501233000}
+          {"index":{}}
+          {"cost":200,"time":1587501243000}
+          {"index":{}}
+          {"cost":200,"time":1587501253000}
+          {"index":{}}
+          {"cost":250,"time":1587501263000}
+          {"index":{}}
+          {"cost":250,"time":1587501273000}
+          {"index":{}}
+          {"cost":580,"time":1587501283000}
+          {"index":{}}
+          {"cost":600,"time":1587501293000}
+          {"index":{}}
+          {"cost":600,"time":1587501303000}
+          {"index":{}}
+          {"cost":600,"time":1587501313000}
+          {"index":{}}
+          {"cost":600,"time":1587501313000}
+          {"index":{}}
+          {"cost":600,"time":1587501323000}
+          {"index":{}}
+          {"cost":600,"time":1587501333000}
+          {"index":{}}
+          {"cost":600,"time":1587501343000}
+          {"index":{}}
+          {"cost":600,"time":1587501353000}
+          {"index":{}}
+          {"cost":600,"time":1587501363000}
+          {"index":{}}
+          {"cost":600,"time":1587501373000}
+          {"index":{}}
+          {"cost":600,"time":1587501383000}
+          {"index":{}}
+          {"cost":600,"time":1587501393000}
+          {"index":{}}
+          {"cost":600,"time":1587501403000}
+          {"index":{}}
+          {"cost":600,"time":1587501413000}
+          {"index":{}}
+          {"cost":600,"time":1587501423000}
+          {"index":{}}
+          {"cost":600,"time":1587501433000}
+          {"index":{}}
+          {"cost":600,"time":1587501443000}
+          {"index":{}}
+          {"cost":600,"time":1587501453000}
+
+---
+"Test change_point agg simple":
+
+  - do:
+      search:
+        index: store
+        size: 0
+        body: >
+          {
+            "aggs": {
+              "date": {
+                "date_histogram": {
+                  "field": "time",
+                  "fixed_interval": "1s"
+                },
+                "aggs": {
+                  "avg": {
+                    "avg": {
+                      "field": "cost"
+                    }
+                  }
+                }
+              },
+              "change_point": {
+                "change_point": {
+                  "buckets_path": "date>avg"
+                }
+              }
+            }
+          }
+  - is_true: aggregations.change_point.type.trend_change
+  - is_true: aggregations.change_point.type.trend_change.p_value
+  - is_true: aggregations.change_point.type.trend_change.r_value
+
+---
+"Test change_point with missing buckets_path":
+
+  - do:
+      catch: /Required \[buckets_path\]/
+      search:
+        index: store
+        size: 0
+        body: >
+          {
+            "aggs": {
+              "date": {
+                "date_histogram": {
+                  "field": "time",
+                  "fixed_interval": "1s"
+                },
+                "aggs": {
+                  "avg": {
+                    "avg": {
+                      "field": "cost"
+                    }
+                  }
+                }
+              },
+              "change_point": {
+                "change_point": {
+                }
+              }
+            }
+          }
+
+---
+"Test change_point with bad buckets_path":
+
+  - do:
+      catch: /No aggregation found for path \[foo\]/
+      search:
+        index: store
+        size: 0
+        body: >
+          {
+            "aggs": {
+              "date": {
+                "date_histogram": {
+                  "field": "time",
+                  "fixed_interval": "1s"
+                },
+                "aggs": {
+                  "avg": {
+                    "avg": {
+                      "field": "cost"
+                    }
+                  }
+                }
+              },
+              "change_point": {
+                "change_point": {
+                  "buckets_path": "foo"
+                }
+              }
+            }
+          }
+---
+"Test change_point with too few buckets":
+
+  - do:
+      catch: /not enough buckets to calculate change_point. Requires at least \[22\]/
+      search:
+        index: empty-store
+        size: 0
+        body: >
+          {
+            "aggs": {
+              "date": {
+                "date_histogram": {
+                  "field": "time",
+                  "fixed_interval": "1s"
+                },
+                "aggs": {
+                  "avg": {
+                    "avg": {
+                      "field": "cost"
+                    }
+                  }
+                }
+              },
+              "change_point": {
+                "change_point": {
+                  "buckets_path": "date"
+                }
+              }
+            }
+          }