
[8.x] Backport to_aggregate_metric_double and sorting (#126438)

* [ES|QL] ToAggregateMetricDouble function (#124595)

This commit adds a conversion function from numerics (and aggregate
metric doubles) to aggregate metric doubles.

It is most useful when you have multiple indices, where one index uses
aggregate metric double (e.g. a downsampled index) and another uses a
normal numeric type like long or double (e.g. an index prior to
downsampling).
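Conceptually, the conversion collapses one or more numeric input values into the four aggregate_metric_double sub-fields. A minimal Python sketch of that summary semantics (the function name here is illustrative; the real implementation is the Java `ToAggregateMetricDouble` class added below):

```python
def to_aggregate_metric_double(values):
    """Summarize numeric value(s) into aggregate_metric_double sub-fields.

    Sketch of the conversion semantics only; not the actual Java code.
    """
    vals = values if isinstance(values, list) else [values]
    vals = [float(v) for v in vals]
    return {
        "min": min(vals),
        "max": max(vals),
        "sum": sum(vals),
        "value_count": len(vals),
    }

# These mirror the CSV-spec examples added in this PR:
print(to_aggregate_metric_double(3892095203))
print(to_aggregate_metric_double([5032, 11111, 40814]))
```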

* remove old docs

* [ES|QL] Add ToAggregateMetricDouble example (#125518)

Adds AggregateMetricDouble to the ES|QL CSV tests and examples of how to
use the ToAggregateMetricDouble function

* [ES|QL] Fix sorting when aggregate_metric_double present (#125191)

Previously, if an aggregate_metric_double field was present among the fields
and you sorted on any field in ES|QL (not necessarily on the aggregate
metric itself), the results would break.

This commit does not add support for sorting _on_ aggregate_metric_double
(it is unclear which aspect of it would be sorted), but it fixes the
previous behavior.
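In other words, the composite value only needs to be carried through TopN unmodified while some other field provides the sort key; sorting on the composite itself stays rejected. A rough Python analogy (row shapes are illustrative):

```python
# Rows pair a sortable key with an aggregate_metric_double-like payload
# that TopN must pass through without interpreting.
rows = [
    ("host-b", {"min": 1.0, "max": 4.0, "sum": 7.0, "value_count": 3}),
    ("host-a", {"min": 2.0, "max": 2.0, "sum": 2.0, "value_count": 1}),
]

# Sorting on the plain field works; the payload is never compared.
ordered = sorted(rows, key=lambda row: row[0])

# Sorting on the aggregate metric itself fails, mirroring
# DataType.isSortable() now returning false for AGGREGATE_METRIC_DOUBLE.
try:
    sorted(rows, key=lambda row: row[1])
except TypeError:
    print("aggregate_metric_double is not sortable")
```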

* drop old style docs again

* add new style docs

* [CI] Auto commit changes from spotless

---------

Co-authored-by: elasticsearchmachine <infra-root+elasticsearchmachine@elastic.co>
Larisa Motova 6 months ago
Parent
Commit
2163ad5d80
35 changed files with 1342 additions and 79 deletions
  1. docs/changelog/124595.yaml (+5 -0)
  2. docs/changelog/125191.yaml (+5 -0)
  3. docs/reference/esql/functions/description/to_aggregate_metric_double.asciidoc (+5 -0)
  4. docs/reference/esql/functions/examples/to_aggregate_metric_double.asciidoc (+22 -0)
  5. docs/reference/esql/functions/kibana/definition/to_aggregate_metric_double.json (+13 -0)
  6. docs/reference/esql/functions/kibana/docs/to_aggregate_metric_double.md (+11 -0)
  7. docs/reference/esql/functions/layout/to_aggregate_metric_double.asciidoc (+15 -0)
  8. docs/reference/esql/functions/parameters/to_aggregate_metric_double.asciidoc (+6 -0)
  9. docs/reference/esql/functions/signature/to_aggregate_metric_double.svg (+1 -0)
  10. docs/reference/esql/functions/types/to_aggregate_metric_double.asciidoc (+9 -0)
  11. x-pack/plugin/esql-core/src/main/java/org/elasticsearch/xpack/esql/core/type/DataType.java (+1 -2)
  12. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/data/AggregateMetricDoubleBlockBuilder.java (+38 -1)
  13. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/data/BlockUtils.java (+13 -1)
  14. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ResultBuilder.java (+1 -0)
  15. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ResultBuilderForComposite.java (+61 -0)
  16. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ValueExtractor.java (+2 -0)
  17. x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ValueExtractorForComposite.java (+57 -0)
  18. x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/CsvAssert.java (+7 -0)
  19. x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/CsvTestUtils.java (+14 -1)
  20. x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/EsqlTestUtils.java (+10 -2)
  21. x-pack/plugin/esql/qa/testFixtures/src/main/resources/convert.csv-spec (+28 -0)
  22. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/action/EsqlCapabilities.java (+11 -1)
  23. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/ExpressionWritables.java (+2 -0)
  24. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/function/EsqlFunctionRegistry.java (+2 -0)
  25. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/function/scalar/convert/ToAggregateMetricDouble.java (+570 -0)
  26. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/planner/LocalExecutionPlanner.java (+3 -3)
  27. x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/type/EsqlDataTypeConverter.java (+55 -0)
  28. x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/SerializationTestUtils.java (+9 -0)
  29. x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/expression/function/scalar/convert/ToAggregateMetricDoubleTests.java (+91 -0)
  30. x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/plan/physical/AbstractPhysicalPlanSerializationTests.java (+2 -0)
  31. x-pack/plugin/mapper-aggregate-metric/src/main/java/org/elasticsearch/xpack/aggregatemetric/mapper/AggregateMetricDoubleFieldMapper.java (+4 -4)
  32. x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/40_tsdb.yml (+98 -0)
  33. x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/40_unsupported_types.yml (+64 -62)
  34. x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/46_downsample.yml (+105 -0)
  35. x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/60_usage.yml (+2 -2)

+ 5 - 0
docs/changelog/124595.yaml

@@ -0,0 +1,5 @@
+pr: 124595
+summary: '`ToAggregateMetricDouble` function'
+area: "ES|QL"
+type: enhancement
+issues: []

+ 5 - 0
docs/changelog/125191.yaml

@@ -0,0 +1,5 @@
+pr: 125191
+summary: Fix sorting when `aggregate_metric_double` present
+area: ES|QL
+type: enhancement
+issues: []

+ 5 - 0
docs/reference/esql/functions/description/to_aggregate_metric_double.asciidoc

@@ -0,0 +1,5 @@
+// This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+
+*Description*
+
+Encode a numeric to an aggregate_metric_double.

+ 22 - 0
docs/reference/esql/functions/examples/to_aggregate_metric_double.asciidoc

@@ -0,0 +1,22 @@
+// This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+
+*Examples*
+
+[source.merge.styled,esql]
+----
+include::{esql-specs}/convert.csv-spec[tag=toAggregateMetricDouble]
+----
+[%header.monospaced.styled,format=dsv,separator=|]
+|===
+include::{esql-specs}/convert.csv-spec[tag=toAggregateMetricDouble-result]
+|===
+The expression also accepts multi-values
+[source.merge.styled,esql]
+----
+include::{esql-specs}/convert.csv-spec[tag=toAggregateMetricDoubleMv]
+----
+[%header.monospaced.styled,format=dsv,separator=|]
+|===
+include::{esql-specs}/convert.csv-spec[tag=toAggregateMetricDoubleMv-result]
+|===
+

+ 13 - 0
docs/reference/esql/functions/kibana/definition/to_aggregate_metric_double.json

@@ -0,0 +1,13 @@
+{
+  "comment" : "This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.",
+  "type" : "eval",
+  "name" : "to_aggregate_metric_double",
+  "description" : "Encode a numeric to an aggregate_metric_double.",
+  "signatures" : [ ],
+  "examples" : [
+    "ROW x = 3892095203\n| EVAL agg_metric = TO_AGGREGATE_METRIC_DOUBLE(x)",
+    "ROW x = [5032, 11111, 40814]\n| EVAL agg_metric = TO_AGGREGATE_METRIC_DOUBLE(x)"
+  ],
+  "preview" : false,
+  "snapshot_only" : false
+}

+ 11 - 0
docs/reference/esql/functions/kibana/docs/to_aggregate_metric_double.md

@@ -0,0 +1,11 @@
+<!--
+This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+-->
+
+### TO_AGGREGATE_METRIC_DOUBLE
+Encode a numeric to an aggregate_metric_double.
+
+```
+ROW x = 3892095203
+| EVAL agg_metric = TO_AGGREGATE_METRIC_DOUBLE(x)
+```

+ 15 - 0
docs/reference/esql/functions/layout/to_aggregate_metric_double.asciidoc

@@ -0,0 +1,15 @@
+// This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+
+[discrete]
+[[esql-to_aggregate_metric_double]]
+=== `TO_AGGREGATE_METRIC_DOUBLE`
+
+*Syntax*
+
+[.text-center]
+image::esql/functions/signature/to_aggregate_metric_double.svg[Embedded,opts=inline]
+
+include::../parameters/to_aggregate_metric_double.asciidoc[]
+include::../description/to_aggregate_metric_double.asciidoc[]
+include::../types/to_aggregate_metric_double.asciidoc[]
+include::../examples/to_aggregate_metric_double.asciidoc[]

+ 6 - 0
docs/reference/esql/functions/parameters/to_aggregate_metric_double.asciidoc

@@ -0,0 +1,6 @@
+// This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+
+*Parameters*
+
+`number`::
+Input value. The input can be a single- or multi-valued column or an expression.

+ 1 - 0
docs/reference/esql/functions/signature/to_aggregate_metric_double.svg

@@ -0,0 +1 @@
+<svg version="1.1" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns="http://www.w3.org/2000/svg" width="528" height="46" viewbox="0 0 528 46"><defs><style type="text/css">#guide .c{fill:none;stroke:#222222;}#guide .k{fill:#000000;font-family:Roboto Mono,Sans-serif;font-size:20px;}#guide .s{fill:#e4f4ff;stroke:#222222;}#guide .syn{fill:#8D8D8D;font-family:Roboto Mono,Sans-serif;font-size:20px;}</style></defs><path class="c" d="M0 31h5m332 0h10m32 0h10m92 0h10m32 0h5"/><rect class="s" x="5" y="5" width="332" height="36"/><text class="k" x="15" y="31">TO_AGGREGATE_METRIC_DOUBLE</text><rect class="s" x="347" y="5" width="32" height="36" rx="7"/><text class="syn" x="357" y="31">(</text><rect class="s" x="389" y="5" width="92" height="36" rx="7"/><text class="k" x="399" y="31">number</text><rect class="s" x="491" y="5" width="32" height="36" rx="7"/><text class="syn" x="501" y="31">)</text></svg>

+ 9 - 0
docs/reference/esql/functions/types/to_aggregate_metric_double.asciidoc

@@ -0,0 +1,9 @@
+// This is generated by ESQL's AbstractFunctionTestCase. Do no edit it. See ../README.md for how to regenerate it.
+
+*Supported types*
+
+[%header.monospaced.styled,format=dsv,separator=|]
+|===
+number | result
+aggregate_metric_double
+|===

+ 1 - 2
x-pack/plugin/esql-core/src/main/java/org/elasticsearch/xpack/esql/core/type/DataType.java

@@ -557,7 +557,6 @@ public enum DataType {
             && t != SOURCE
             && t != HALF_FLOAT
             && t != PARTIAL_AGG
-            && t != AGGREGATE_METRIC_DOUBLE
             && t.isCounter() == false;
     }
 
@@ -578,7 +577,7 @@ public enum DataType {
     }
 
     public static boolean isSortable(DataType t) {
-        return false == (t == SOURCE || isCounter(t) || isSpatial(t));
+        return false == (t == SOURCE || isCounter(t) || isSpatial(t) || t == AGGREGATE_METRIC_DOUBLE);
     }
 
     public String nameUpper() {

+ 38 - 1
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/data/AggregateMetricDoubleBlockBuilder.java

@@ -7,9 +7,17 @@
 
 package org.elasticsearch.compute.data;
 
+import org.elasticsearch.TransportVersion;
+import org.elasticsearch.TransportVersions;
+import org.elasticsearch.common.io.stream.GenericNamedWriteable;
+import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.common.io.stream.StreamOutput;
 import org.elasticsearch.core.Releasables;
 import org.elasticsearch.index.mapper.BlockLoader;
 
+import java.io.IOException;
+
 public class AggregateMetricDoubleBlockBuilder extends AbstractBlockBuilder implements BlockLoader.AggregateMetricDoubleBuilder {
 
     private DoubleBlockBuilder minBuilder;
@@ -161,11 +169,40 @@ public class AggregateMetricDoubleBlockBuilder extends AbstractBlockBuilder impl
         }
     }
 
-    public record AggregateMetricDoubleLiteral(Double min, Double max, Double sum, Integer count) {
+    public record AggregateMetricDoubleLiteral(Double min, Double max, Double sum, Integer count) implements GenericNamedWriteable {
         public AggregateMetricDoubleLiteral {
             min = min.isNaN() ? null : min;
             max = max.isNaN() ? null : max;
             sum = sum.isNaN() ? null : sum;
         }
+
+        public static final NamedWriteableRegistry.Entry ENTRY = new NamedWriteableRegistry.Entry(
+            GenericNamedWriteable.class,
+            "AggregateMetricDoubleLiteral",
+            AggregateMetricDoubleLiteral::new
+        );
+
+        @Override
+        public String getWriteableName() {
+            return "AggregateMetricDoubleLiteral";
+        }
+
+        public AggregateMetricDoubleLiteral(StreamInput input) throws IOException {
+            this(input.readOptionalDouble(), input.readOptionalDouble(), input.readOptionalDouble(), input.readOptionalInt());
+        }
+
+        @Override
+        public void writeTo(StreamOutput out) throws IOException {
+            out.writeOptionalDouble(min);
+            out.writeOptionalDouble(max);
+            out.writeOptionalDouble(sum);
+            out.writeOptionalInt(count);
+        }
+
+        @Override
+        public TransportVersion getMinimalSupportedVersion() {
+            return TransportVersions.ESQL_AGGREGATE_METRIC_DOUBLE_LITERAL;
+        }
+
     }
 }
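For context, the compact constructor of `AggregateMetricDoubleLiteral` above normalizes NaN sub-values to null so they serialize as absent optionals. A Python sketch of that normalization (names here are illustrative):

```python
import math

def normalize(min_, max_, sum_, count):
    """Mirror the record's compact constructor: NaN metric values
    become None (null); the count passes through unchanged."""
    def none_if_nan(v):
        return None if v is not None and math.isnan(v) else v
    return (none_if_nan(min_), none_if_nan(max_), none_if_nan(sum_), count)

print(normalize(float("nan"), 3.0, float("nan"), 2))
```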

+ 13 - 1
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/data/BlockUtils.java

@@ -285,7 +285,19 @@ public final class BlockUtils {
                 DocVector v = ((DocBlock) block).asVector();
                 yield new Doc(v.shards().getInt(offset), v.segments().getInt(offset), v.docs().getInt(offset));
             }
-            case COMPOSITE -> throw new IllegalArgumentException("can't read values from composite blocks");
+            case COMPOSITE -> {
+                CompositeBlock compositeBlock = (CompositeBlock) block;
+                var minBlock = (DoubleBlock) compositeBlock.getBlock(AggregateMetricDoubleBlockBuilder.Metric.MIN.getIndex());
+                var maxBlock = (DoubleBlock) compositeBlock.getBlock(AggregateMetricDoubleBlockBuilder.Metric.MAX.getIndex());
+                var sumBlock = (DoubleBlock) compositeBlock.getBlock(AggregateMetricDoubleBlockBuilder.Metric.SUM.getIndex());
+                var countBlock = (IntBlock) compositeBlock.getBlock(AggregateMetricDoubleBlockBuilder.Metric.COUNT.getIndex());
+                yield new AggregateMetricDoubleLiteral(
+                    minBlock.getDouble(offset),
+                    maxBlock.getDouble(offset),
+                    sumBlock.getDouble(offset),
+                    countBlock.getInt(offset)
+                );
+            }
             case UNKNOWN -> throw new IllegalArgumentException("can't read values from [" + block + "]");
         };
     }

+ 1 - 0
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ResultBuilder.java

@@ -54,6 +54,7 @@ interface ResultBuilder extends Releasable {
             case DOUBLE -> new ResultBuilderForDouble(blockFactory, encoder, inKey, positions);
             case NULL -> new ResultBuilderForNull(blockFactory);
             case DOC -> new ResultBuilderForDoc(blockFactory, positions);
+            case COMPOSITE -> new ResultBuilderForComposite(blockFactory, positions);
             default -> {
                 assert false : "Result builder for [" + elementType + "]";
                 throw new UnsupportedOperationException("Result builder for [" + elementType + "]");

+ 61 - 0
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ResultBuilderForComposite.java

@@ -0,0 +1,61 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.compute.operator.topn;
+
+import org.apache.lucene.util.BytesRef;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
+import org.elasticsearch.compute.data.Block;
+import org.elasticsearch.compute.data.BlockFactory;
+import org.elasticsearch.index.mapper.BlockLoader;
+
+import java.util.List;
+
+public class ResultBuilderForComposite implements ResultBuilder {
+
+    private final AggregateMetricDoubleBlockBuilder builder;
+
+    ResultBuilderForComposite(BlockFactory blockFactory, int positions) {
+        this.builder = blockFactory.newAggregateMetricDoubleBlockBuilder(positions);
+    }
+
+    @Override
+    public void decodeKey(BytesRef keys) {
+        throw new AssertionError("Composite Block can't be a key");
+    }
+
+    @Override
+    public void decodeValue(BytesRef values) {
+        for (BlockLoader.DoubleBuilder subBuilder : List.of(builder.min(), builder.max(), builder.sum())) {
+            if (TopNEncoder.DEFAULT_UNSORTABLE.decodeBoolean(values)) {
+                subBuilder.appendDouble(TopNEncoder.DEFAULT_UNSORTABLE.decodeDouble(values));
+            } else {
+                subBuilder.appendNull();
+            }
+        }
+        if (TopNEncoder.DEFAULT_UNSORTABLE.decodeBoolean(values)) {
+            builder.count().appendInt(TopNEncoder.DEFAULT_UNSORTABLE.decodeInt(values));
+        } else {
+            builder.count().appendNull();
+        }
+    }
+
+    @Override
+    public Block build() {
+        return builder.build();
+    }
+
+    @Override
+    public String toString() {
+        return "ResultBuilderForComposite";
+    }
+
+    @Override
+    public void close() {
+        builder.close();
+    }
+}

+ 2 - 0
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ValueExtractor.java

@@ -10,6 +10,7 @@ package org.elasticsearch.compute.operator.topn;
 import org.elasticsearch.compute.data.Block;
 import org.elasticsearch.compute.data.BooleanBlock;
 import org.elasticsearch.compute.data.BytesRefBlock;
+import org.elasticsearch.compute.data.CompositeBlock;
 import org.elasticsearch.compute.data.DocBlock;
 import org.elasticsearch.compute.data.DoubleBlock;
 import org.elasticsearch.compute.data.ElementType;
@@ -40,6 +41,7 @@ interface ValueExtractor {
             case DOUBLE -> ValueExtractorForDouble.extractorFor(encoder, inKey, (DoubleBlock) block);
             case NULL -> new ValueExtractorForNull();
             case DOC -> new ValueExtractorForDoc(encoder, ((DocBlock) block).asVector());
+            case COMPOSITE -> new ValueExtractorForComposite(encoder, (CompositeBlock) block);
             default -> {
                 assert false : "No value extractor for [" + block.elementType() + "]";
                 throw new UnsupportedOperationException("No value extractor for [" + block.elementType() + "]");

+ 57 - 0
x-pack/plugin/esql/compute/src/main/java/org/elasticsearch/compute/operator/topn/ValueExtractorForComposite.java

@@ -0,0 +1,57 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.compute.operator.topn;
+
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
+import org.elasticsearch.compute.data.CompositeBlock;
+import org.elasticsearch.compute.data.DoubleBlock;
+import org.elasticsearch.compute.data.IntBlock;
+import org.elasticsearch.compute.operator.BreakingBytesRefBuilder;
+
+import java.util.List;
+
+public class ValueExtractorForComposite implements ValueExtractor {
+    private final CompositeBlock block;
+
+    ValueExtractorForComposite(TopNEncoder encoder, CompositeBlock block) {
+        assert encoder == TopNEncoder.DEFAULT_UNSORTABLE;
+        this.block = block;
+    }
+
+    @Override
+    public void writeValue(BreakingBytesRefBuilder values, int position) {
+        if (block.getBlockCount() != AggregateMetricDoubleBlockBuilder.Metric.values().length) {
+            throw new UnsupportedOperationException("Composite Blocks for non-aggregate-metric-doubles do not have value extractors");
+        }
+        for (AggregateMetricDoubleBlockBuilder.Metric metric : List.of(
+            AggregateMetricDoubleBlockBuilder.Metric.MIN,
+            AggregateMetricDoubleBlockBuilder.Metric.MAX,
+            AggregateMetricDoubleBlockBuilder.Metric.SUM
+        )) {
+            DoubleBlock doubleBlock = block.getBlock(metric.getIndex());
+            if (doubleBlock.isNull(position)) {
+                TopNEncoder.DEFAULT_UNSORTABLE.encodeBoolean(false, values);
+            } else {
+                TopNEncoder.DEFAULT_UNSORTABLE.encodeBoolean(true, values);
+                TopNEncoder.DEFAULT_UNSORTABLE.encodeDouble(doubleBlock.getDouble(position), values);
+            }
+        }
+        IntBlock intBlock = block.getBlock(AggregateMetricDoubleBlockBuilder.Metric.COUNT.getIndex());
+        if (intBlock.isNull(position)) {
+            TopNEncoder.DEFAULT_UNSORTABLE.encodeBoolean(false, values);
+        } else {
+            TopNEncoder.DEFAULT_UNSORTABLE.encodeBoolean(true, values);
+            TopNEncoder.DEFAULT_UNSORTABLE.encodeInt(intBlock.getInt(position), values);
+        }
+    }
+
+    @Override
+    public String toString() {
+        return "ValueExtractorForComposite";
+    }
+}
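The extractor above and `ResultBuilderForComposite` agree on a simple wire format: for each of min/max/sum, a presence flag followed by the double if present, then a flag plus int for the count. A byte-level Python sketch of that round trip (encoding details are illustrative, not the actual `TopNEncoder` layout):

```python
import struct

def encode(min_, max_, sum_, count):
    """Presence flag, then the value only if present: ?d ?d ?d ?i."""
    out = b""
    for v in (min_, max_, sum_):
        out += struct.pack("<?", v is not None)
        if v is not None:
            out += struct.pack("<d", v)
    out += struct.pack("<?", count is not None)
    if count is not None:
        out += struct.pack("<i", count)
    return out

def decode(buf):
    """Symmetric decode, as ResultBuilderForComposite does per sub-builder."""
    vals, off = [], 0
    for fmt, size in (("<d", 8), ("<d", 8), ("<d", 8), ("<i", 4)):
        present = struct.unpack_from("<?", buf, off)[0]
        off += 1
        if present:
            vals.append(struct.unpack_from(fmt, buf, off)[0])
            off += size
        else:
            vals.append(None)
    return tuple(vals)

print(decode(encode(1.5, None, 4.5, 3)))
```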

+ 7 - 0
x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/CsvAssert.java

@@ -10,6 +10,7 @@ package org.elasticsearch.xpack.esql;
 import org.apache.lucene.util.BytesRef;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.time.DateFormatter;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.compute.data.Page;
 import org.elasticsearch.logging.Logger;
 import org.elasticsearch.search.DocValueFormat;
@@ -40,6 +41,7 @@ import static org.elasticsearch.xpack.esql.core.util.DateUtils.UTC_DATE_TIME_FOR
 import static org.elasticsearch.xpack.esql.core.util.NumericUtils.unsignedLongAsNumber;
 import static org.elasticsearch.xpack.esql.core.util.SpatialCoordinateTypes.CARTESIAN;
 import static org.elasticsearch.xpack.esql.core.util.SpatialCoordinateTypes.GEO;
+import static org.elasticsearch.xpack.esql.type.EsqlDataTypeConverter.aggregateMetricDoubleLiteralToString;
 import static org.hamcrest.MatcherAssert.assertThat;
 import static org.hamcrest.Matchers.instanceOf;
 import static org.junit.Assert.assertEquals;
@@ -405,6 +407,11 @@ public final class CsvAssert {
             case VERSION -> // convert BytesRef-packed Version to String
                 rebuildExpected(expectedValue, BytesRef.class, x -> new Version((BytesRef) x).toString());
             case UNSIGNED_LONG -> rebuildExpected(expectedValue, Long.class, x -> unsignedLongAsNumber((long) x));
+            case AGGREGATE_METRIC_DOUBLE -> rebuildExpected(
+                expectedValue,
+                AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral.class,
+                x -> aggregateMetricDoubleLiteralToString((AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral) x)
+            );
             default -> expectedValue;
         };
     }

+ 14 - 1
x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/CsvTestUtils.java

@@ -15,6 +15,7 @@ import org.elasticsearch.common.network.InetAddresses;
 import org.elasticsearch.common.time.DateFormatters;
 import org.elasticsearch.common.time.DateUtils;
 import org.elasticsearch.common.util.BigArrays;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.compute.data.Block;
 import org.elasticsearch.compute.data.BlockFactory;
 import org.elasticsearch.compute.data.BlockUtils;
@@ -62,6 +63,7 @@ import static org.elasticsearch.xpack.esql.core.util.DateUtils.UTC_DATE_TIME_FOR
 import static org.elasticsearch.xpack.esql.core.util.NumericUtils.asLongUnsigned;
 import static org.elasticsearch.xpack.esql.core.util.SpatialCoordinateTypes.CARTESIAN;
 import static org.elasticsearch.xpack.esql.core.util.SpatialCoordinateTypes.GEO;
+import static org.elasticsearch.xpack.esql.type.EsqlDataTypeConverter.stringToAggregateMetricDoubleLiteral;
 
 public final class CsvTestUtils {
     private static final int MAX_WIDTH = 80;
@@ -480,6 +482,10 @@ public final class CsvTestUtils {
         CARTESIAN_POINT(x -> x == null ? null : CARTESIAN.wktToWkb(x), BytesRef.class),
         GEO_SHAPE(x -> x == null ? null : GEO.wktToWkb(x), BytesRef.class),
         CARTESIAN_SHAPE(x -> x == null ? null : CARTESIAN.wktToWkb(x), BytesRef.class),
+        AGGREGATE_METRIC_DOUBLE(
+            x -> x == null ? null : stringToAggregateMetricDoubleLiteral(x),
+            AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral.class
+        ),
         UNSUPPORTED(Type::convertUnsupported, Void.class);
 
         private static Void convertUnsupported(String s) {
@@ -560,11 +566,18 @@ public final class CsvTestUtils {
                 case BYTES_REF -> bytesRefBlockType(actualType);
                 case BOOLEAN -> BOOLEAN;
                 case DOC -> throw new IllegalArgumentException("can't assert on doc blocks");
-                case COMPOSITE -> throw new IllegalArgumentException("can't assert on composite blocks");
+                case COMPOSITE -> compositeBlockType(actualType);
                 case UNKNOWN -> throw new IllegalArgumentException("Unknown block types cannot be handled");
             };
         }
 
+        private static Type compositeBlockType(Type actualType) {
+            return switch (actualType) {
+                case AGGREGATE_METRIC_DOUBLE -> actualType;
+                default -> throw new IllegalArgumentException("can't assert on composite blocks that aren't aggregate metric doubles");
+            };
+        }
+
         private static Type bytesRefBlockType(Type actualType) {
             return switch (actualType) {
                 case NULL -> NULL;

+ 10 - 2
x-pack/plugin/esql/qa/testFixtures/src/main/java/org/elasticsearch/xpack/esql/EsqlTestUtils.java

@@ -21,6 +21,7 @@ import org.elasticsearch.common.bytes.BytesReference;
 import org.elasticsearch.common.regex.Regex;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.util.BigArrays;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.compute.data.BlockFactory;
 import org.elasticsearch.compute.data.BlockUtils;
 import org.elasticsearch.compute.data.BytesRefBlock;
@@ -786,6 +787,12 @@ public final class EsqlTestUtils {
             case CARTESIAN_POINT -> CARTESIAN.asWkb(ShapeTestUtils.randomPoint());
             case GEO_SHAPE -> GEO.asWkb(GeometryTestUtils.randomGeometry(randomBoolean()));
             case CARTESIAN_SHAPE -> CARTESIAN.asWkb(ShapeTestUtils.randomGeometry(randomBoolean()));
+            case AGGREGATE_METRIC_DOUBLE -> new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral(
+                randomDouble(),
+                randomDouble(),
+                randomDouble(),
+                randomInt()
+            );
             case NULL -> null;
             case SOURCE -> {
                 try {
@@ -796,8 +803,9 @@ public final class EsqlTestUtils {
                     throw new UncheckedIOException(e);
                 }
             }
-            case UNSUPPORTED, OBJECT, DOC_DATA_TYPE, TSID_DATA_TYPE, PARTIAL_AGG, AGGREGATE_METRIC_DOUBLE ->
-                throw new IllegalArgumentException("can't make random values for [" + type.typeName() + "]");
+            case UNSUPPORTED, OBJECT, DOC_DATA_TYPE, TSID_DATA_TYPE, PARTIAL_AGG -> throw new IllegalArgumentException(
+                "can't make random values for [" + type.typeName() + "]"
+            );
         }, type);
     }
 

+ 28 - 0
x-pack/plugin/esql/qa/testFixtures/src/main/resources/convert.csv-spec

@@ -451,3 +451,31 @@ emp_no:integer  | birth_date:datetime
 10097           | 1952-02-27T00:00:00.000Z
 10100           | 1953-04-21T00:00:00.000Z
 ;
+
+convertToAggregateMetricDouble
+required_capability: aggregate_metric_double_convert_to
+//tag::toAggregateMetricDouble[]
+ROW x = 3892095203
+| EVAL agg_metric = TO_AGGREGATE_METRIC_DOUBLE(x)
+//end::toAggregateMetricDouble[]
+;
+
+//tag::toAggregateMetricDouble-result[]
+x:long     | agg_metric:aggregate_metric_double
+3892095203 | {"min":3892095203.0,"max":3892095203.0,"sum":3892095203.0,"value_count":1}
+//end::toAggregateMetricDouble-result[]
+;
+
+convertToAggregateMetricDoubleMv
+required_capability: aggregate_metric_double_convert_to
+//tag::toAggregateMetricDoubleMv[]
+ROW x = [5032, 11111, 40814]
+| EVAL agg_metric = TO_AGGREGATE_METRIC_DOUBLE(x)
+//end::toAggregateMetricDoubleMv[]
+;
+
+//tag::toAggregateMetricDoubleMv-result[]
+x:integer            | agg_metric:aggregate_metric_double
+[5032, 11111, 40814] | {"min":5032.0,"max":40814.0,"sum":56957.0,"value_count":3}
+//end::toAggregateMetricDoubleMv-result[]
+;

+ 11 - 1
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/action/EsqlCapabilities.java

@@ -772,7 +772,17 @@ public class EsqlCapabilities {
         /**
          * Supercedes {@link Cap#MAKE_NUMBER_OF_CHANNELS_CONSISTENT_WITH_LAYOUT}.
          */
-        FIX_REPLACE_MISSING_FIELD_WITH_NULL_DUPLICATE_NAME_ID_IN_LAYOUT;
+        FIX_REPLACE_MISSING_FIELD_WITH_NULL_DUPLICATE_NAME_ID_IN_LAYOUT,
+
+        /**
+         * Support for to_aggregate_metric_double function
+         */
+        AGGREGATE_METRIC_DOUBLE_CONVERT_TO(AGGREGATE_METRIC_DOUBLE_FEATURE_FLAG),
+
+        /**
+         * Support for sorting when aggregate_metric_doubles are present
+         */
+        AGGREGATE_METRIC_DOUBLE_SORTING(AGGREGATE_METRIC_DOUBLE_FEATURE_FLAG);
 
         private final boolean enabled;
 

+ 2 - 0
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/ExpressionWritables.java

@@ -14,6 +14,7 @@ import org.elasticsearch.xpack.esql.expression.function.aggregate.AggregateWrita
 import org.elasticsearch.xpack.esql.expression.function.fulltext.FullTextWritables;
 import org.elasticsearch.xpack.esql.expression.function.scalar.ScalarFunctionWritables;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.FromBase64;
+import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToAggregateMetricDouble;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToBase64;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToBoolean;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToCartesianPoint;
@@ -180,6 +181,7 @@ public class ExpressionWritables {
         entries.add(StY.ENTRY);
         entries.add(Tan.ENTRY);
         entries.add(Tanh.ENTRY);
+        entries.add(ToAggregateMetricDouble.ENTRY);
         entries.add(ToBase64.ENTRY);
         entries.add(ToBoolean.ENTRY);
         entries.add(ToCartesianPoint.ENTRY);

+ 2 - 0
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/function/EsqlFunctionRegistry.java

@@ -42,6 +42,7 @@ import org.elasticsearch.xpack.esql.expression.function.scalar.conditional.Case;
 import org.elasticsearch.xpack.esql.expression.function.scalar.conditional.Greatest;
 import org.elasticsearch.xpack.esql.expression.function.scalar.conditional.Least;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.FromBase64;
+import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToAggregateMetricDouble;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToBase64;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToBoolean;
 import org.elasticsearch.xpack.esql.expression.function.scalar.convert.ToCartesianPoint;
@@ -376,6 +377,7 @@ public class EsqlFunctionRegistry {
             // conversion functions
             new FunctionDefinition[] {
                 def(FromBase64.class, FromBase64::new, "from_base64"),
+                def(ToAggregateMetricDouble.class, ToAggregateMetricDouble::new, "to_aggregate_metric_double", "to_aggregatemetricdouble"),
                 def(ToBase64.class, ToBase64::new, "to_base64"),
                 def(ToBoolean.class, ToBoolean::new, "to_boolean", "to_bool"),
                 def(ToCartesianPoint.class, ToCartesianPoint::new, "to_cartesianpoint"),

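The registry entry above exposes the function under `to_aggregate_metric_double` (and the alias `to_aggregatemetricdouble`). A hedged sketch of the use case from the PR description: querying a raw index alongside a downsampled one, where the raw index stores a plain numeric and the downsampled index stores an aggregate_metric_double. The index and field names below are illustrative, not from this commit.

```esql
// Hypothetical indices: metrics-raw uses a double field,
// metrics-downsampled uses aggregate_metric_double for the same field.
FROM metrics-raw, metrics-downsampled
| EVAL cpu = TO_AGGREGATE_METRIC_DOUBLE(cpu)
| KEEP @timestamp, cpu
```

Without the conversion, the two mappings of `cpu` would conflict across indices; converting the numeric side unifies the column type.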
+ 570 - 0
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/expression/function/scalar/convert/ToAggregateMetricDouble.java

@@ -0,0 +1,570 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.esql.expression.function.scalar.convert;
+
+import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
+import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
+import org.elasticsearch.compute.data.Block;
+import org.elasticsearch.compute.data.BlockFactory;
+import org.elasticsearch.compute.data.CompositeBlock;
+import org.elasticsearch.compute.data.DoubleBlock;
+import org.elasticsearch.compute.data.DoubleVector;
+import org.elasticsearch.compute.data.IntBlock;
+import org.elasticsearch.compute.data.IntVector;
+import org.elasticsearch.compute.data.LongBlock;
+import org.elasticsearch.compute.data.LongVector;
+import org.elasticsearch.compute.data.Page;
+import org.elasticsearch.compute.data.Vector;
+import org.elasticsearch.compute.operator.DriverContext;
+import org.elasticsearch.compute.operator.EvalOperator;
+import org.elasticsearch.core.Releasable;
+import org.elasticsearch.core.Releasables;
+import org.elasticsearch.search.aggregations.metrics.CompensatedSum;
+import org.elasticsearch.xpack.esql.core.expression.Expression;
+import org.elasticsearch.xpack.esql.core.tree.NodeInfo;
+import org.elasticsearch.xpack.esql.core.tree.Source;
+import org.elasticsearch.xpack.esql.core.type.DataType;
+import org.elasticsearch.xpack.esql.expression.function.Example;
+import org.elasticsearch.xpack.esql.expression.function.FunctionInfo;
+import org.elasticsearch.xpack.esql.expression.function.Param;
+import org.elasticsearch.xpack.esql.type.EsqlDataTypeConverter;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Map;
+
+import static org.elasticsearch.xpack.esql.core.expression.TypeResolutions.ParamOrdinal.DEFAULT;
+import static org.elasticsearch.xpack.esql.core.expression.TypeResolutions.isType;
+import static org.elasticsearch.xpack.esql.core.type.DataType.AGGREGATE_METRIC_DOUBLE;
+import static org.elasticsearch.xpack.esql.core.type.DataType.DOUBLE;
+import static org.elasticsearch.xpack.esql.core.type.DataType.INTEGER;
+import static org.elasticsearch.xpack.esql.core.type.DataType.LONG;
+import static org.elasticsearch.xpack.esql.core.type.DataType.UNSIGNED_LONG;
+
+public class ToAggregateMetricDouble extends AbstractConvertFunction {
+
+    private static final Map<DataType, AbstractConvertFunction.BuildFactory> EVALUATORS = Map.ofEntries(
+        Map.entry(AGGREGATE_METRIC_DOUBLE, (source, fieldEval) -> fieldEval),
+        Map.entry(DOUBLE, DoubleFactory::new),
+        Map.entry(INTEGER, IntFactory::new),
+        Map.entry(LONG, LongFactory::new),
+        Map.entry(UNSIGNED_LONG, UnsignedLongFactory::new)
+    );
+
+    public static final NamedWriteableRegistry.Entry ENTRY = new NamedWriteableRegistry.Entry(
+        Expression.class,
+        "ToAggregateMetricDouble",
+        ToAggregateMetricDouble::new
+    );
+
+    @FunctionInfo(
+        returnType = "aggregate_metric_double",
+        description = "Encode a numeric to an aggregate_metric_double.",
+        examples = {
+            @Example(file = "convert", tag = "toAggregateMetricDouble"),
+            @Example(description = "The expression also accepts multi-values", file = "convert", tag = "toAggregateMetricDoubleMv") }
+    )
+    public ToAggregateMetricDouble(
+        Source source,
+        @Param(
+            name = "number",
+            type = { "double", "long", "unsigned_long", "integer", "aggregate_metric_double" },
+            description = "Input value. The input can be a single- or multi-valued column or an expression."
+        ) Expression field
+    ) {
+        super(source, field);
+    }
+
+    private ToAggregateMetricDouble(StreamInput in) throws IOException {
+        super(in);
+    }
+
+    @Override
+    public String getWriteableName() {
+        return ENTRY.name;
+    }
+
+    @Override
+    protected TypeResolution resolveType() {
+        if (childrenResolved() == false) {
+            return new TypeResolution("Unresolved children");
+        }
+        return isType(
+            field,
+            dt -> dt == DataType.AGGREGATE_METRIC_DOUBLE || dt == DataType.DOUBLE || dt == LONG || dt == INTEGER || dt == UNSIGNED_LONG,
+            sourceText(),
+            DEFAULT,
+            "numeric or aggregate_metric_double"
+        );
+    }
+
+    @Override
+    public DataType dataType() {
+        return AGGREGATE_METRIC_DOUBLE;
+    }
+
+    @Override
+    public Expression replaceChildren(List<Expression> newChildren) {
+        return new ToAggregateMetricDouble(source(), newChildren.get(0));
+    }
+
+    @Override
+    protected NodeInfo<? extends Expression> info() {
+        return NodeInfo.create(this, ToAggregateMetricDouble::new, field);
+    }
+
+    @Override
+    protected Map<DataType, AbstractConvertFunction.BuildFactory> factories() {
+        return EVALUATORS;
+    }
+
+    private static class AggregateMetricDoubleVectorBuilder implements Releasable {
+        private final DoubleVector.FixedBuilder valuesBuilder;
+        private final BlockFactory blockFactory;
+
+        private AggregateMetricDoubleVectorBuilder(int estimatedSize, BlockFactory blockFactory) {
+            this.blockFactory = blockFactory;
+            this.valuesBuilder = blockFactory.newDoubleVectorFixedBuilder(estimatedSize);
+        }
+
+        private void appendValue(double value) {
+            valuesBuilder.appendDouble(value);
+        }
+
+        private Block build() {
+            Block[] blocks = new Block[4];
+            Block block;
+            boolean success = false;
+            try {
+                block = valuesBuilder.build().asBlock();
+                blocks[AggregateMetricDoubleBlockBuilder.Metric.MIN.getIndex()] = block;
+                blocks[AggregateMetricDoubleBlockBuilder.Metric.MAX.getIndex()] = block;
+                block.incRef();
+                blocks[AggregateMetricDoubleBlockBuilder.Metric.SUM.getIndex()] = block;
+                block.incRef();
+                blocks[AggregateMetricDoubleBlockBuilder.Metric.COUNT.getIndex()] = blockFactory.newConstantIntBlockWith(
+                    1,
+                    block.getPositionCount()
+                );
+                CompositeBlock compositeBlock = new CompositeBlock(blocks);
+                success = true;
+                return compositeBlock;
+            } finally {
+                if (success == false) {
+                    Releasables.closeExpectNoException(blocks);
+                }
+            }
+        }
+
+        @Override
+        public void close() {
+            Releasables.closeExpectNoException(valuesBuilder);
+        }
+    }
+
+    public static class DoubleFactory implements EvalOperator.ExpressionEvaluator.Factory {
+        private final Source source;
+
+        private final EvalOperator.ExpressionEvaluator.Factory fieldEvaluator;
+
+        public DoubleFactory(Source source, EvalOperator.ExpressionEvaluator.Factory fieldEvaluator) {
+            this.fieldEvaluator = fieldEvaluator;
+            this.source = source;
+        }
+
+        @Override
+        public String toString() {
+            return "ToAggregateMetricDoubleFromDoubleEvaluator[" + "field=" + fieldEvaluator + "]";
+        }
+
+        @Override
+        public EvalOperator.ExpressionEvaluator get(DriverContext context) {
+            final EvalOperator.ExpressionEvaluator eval = fieldEvaluator.get(context);
+
+            return new EvalOperator.ExpressionEvaluator() {
+                private Block evalBlock(Block block) {
+                    int positionCount = block.getPositionCount();
+                    DoubleBlock doubleBlock = (DoubleBlock) block;
+                    try (
+                        AggregateMetricDoubleBlockBuilder builder = context.blockFactory()
+                            .newAggregateMetricDoubleBlockBuilder(positionCount)
+                    ) {
+                        CompensatedSum compensatedSum = new CompensatedSum();
+                        for (int p = 0; p < positionCount; p++) {
+                            int valueCount = doubleBlock.getValueCount(p);
+                            if (valueCount == 0) {
+                                builder.appendNull();
+                                continue;
+                            }
+                            int start = doubleBlock.getFirstValueIndex(p);
+                            int end = start + valueCount;
+                            if (valueCount == 1) {
+                                double current = doubleBlock.getDouble(start);
+                                builder.min().appendDouble(current);
+                                builder.max().appendDouble(current);
+                                builder.sum().appendDouble(current);
+                                builder.count().appendInt(valueCount);
+                                continue;
+                            }
+                            double min = Double.POSITIVE_INFINITY;
+                            double max = Double.NEGATIVE_INFINITY;
+                            for (int i = start; i < end; i++) {
+                                double current = doubleBlock.getDouble(i);
+                                min = Math.min(min, current);
+                                max = Math.max(max, current);
+                                compensatedSum.add(current);
+                            }
+                            builder.min().appendDouble(min);
+                            builder.max().appendDouble(max);
+                            builder.sum().appendDouble(compensatedSum.value());
+                            builder.count().appendInt(valueCount);
+                            compensatedSum.reset(0, 0);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                private Block evalVector(Vector vector) {
+                    int positionCount = vector.getPositionCount();
+                    DoubleVector doubleVector = (DoubleVector) vector;
+                    try (
+                        AggregateMetricDoubleVectorBuilder builder = new AggregateMetricDoubleVectorBuilder(
+                            positionCount,
+                            context.blockFactory()
+                        )
+                    ) {
+                        for (int p = 0; p < positionCount; p++) {
+                            double value = doubleVector.getDouble(p);
+                            builder.appendValue(value);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                @Override
+                public Block eval(Page page) {
+                    try (Block block = eval.eval(page)) {
+                        Vector vector = block.asVector();
+                        return vector == null ? evalBlock(block) : evalVector(vector);
+                    }
+                }
+
+                @Override
+                public void close() {
+                    Releasables.closeExpectNoException(eval);
+                }
+
+                @Override
+                public String toString() {
+                    return "ToAggregateMetricDoubleFromDoubleEvaluator[field=" + eval + "]";
+                }
+            };
+        }
+    }
+
+    public static class IntFactory implements EvalOperator.ExpressionEvaluator.Factory {
+        private final Source source;
+
+        private final EvalOperator.ExpressionEvaluator.Factory fieldEvaluator;
+
+        public IntFactory(Source source, EvalOperator.ExpressionEvaluator.Factory fieldEvaluator) {
+            this.fieldEvaluator = fieldEvaluator;
+            this.source = source;
+        }
+
+        @Override
+        public String toString() {
+            return "ToAggregateMetricDoubleFromIntEvaluator[" + "field=" + fieldEvaluator + "]";
+        }
+
+        @Override
+        public EvalOperator.ExpressionEvaluator get(DriverContext context) {
+            final EvalOperator.ExpressionEvaluator eval = fieldEvaluator.get(context);
+
+            return new EvalOperator.ExpressionEvaluator() {
+                @Override
+                public Block eval(Page page) {
+                    try (Block block = eval.eval(page)) {
+                        Vector vector = block.asVector();
+                        return vector == null ? evalBlock(block) : evalVector(vector);
+                    }
+                }
+
+                private Block evalBlock(Block block) {
+                    int positionCount = block.getPositionCount();
+                    IntBlock intBlock = (IntBlock) block;
+                    try (
+                        AggregateMetricDoubleBlockBuilder builder = context.blockFactory()
+                            .newAggregateMetricDoubleBlockBuilder(positionCount)
+                    ) {
+                        CompensatedSum sum = new CompensatedSum();
+                        for (int p = 0; p < positionCount; p++) {
+                            int valueCount = intBlock.getValueCount(p);
+                            int start = intBlock.getFirstValueIndex(p);
+                            int end = start + valueCount;
+                            if (valueCount == 0) {
+                                builder.appendNull();
+                                continue;
+                            }
+                            if (valueCount == 1) {
+                                double current = intBlock.getInt(start);
+                                builder.min().appendDouble(current);
+                                builder.max().appendDouble(current);
+                                builder.sum().appendDouble(current);
+                                builder.count().appendInt(valueCount);
+                                continue;
+                            }
+                            double min = Double.POSITIVE_INFINITY;
+                            double max = Double.NEGATIVE_INFINITY;
+                            for (int i = start; i < end; i++) {
+                                double current = intBlock.getInt(i);
+                                min = Math.min(min, current);
+                                max = Math.max(max, current);
+                                sum.add(current);
+                            }
+                            builder.min().appendDouble(min);
+                            builder.max().appendDouble(max);
+                            builder.sum().appendDouble(sum.value());
+                            builder.count().appendInt(valueCount);
+                            sum.reset(0, 0);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                private Block evalVector(Vector vector) {
+                    int positionCount = vector.getPositionCount();
+                    IntVector intVector = (IntVector) vector;
+                    try (
+                        AggregateMetricDoubleVectorBuilder builder = new AggregateMetricDoubleVectorBuilder(
+                            positionCount,
+                            context.blockFactory()
+                        )
+                    ) {
+                        for (int p = 0; p < positionCount; p++) {
+                            double value = intVector.getInt(p);
+                            builder.appendValue(value);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                @Override
+                public void close() {
+                    Releasables.closeExpectNoException(eval);
+                }
+
+                @Override
+                public String toString() {
+                    return "ToAggregateMetricDoubleFromIntEvaluator[field=" + eval + "]";
+                }
+            };
+        }
+    }
+
+    public static class LongFactory implements EvalOperator.ExpressionEvaluator.Factory {
+        private final Source source;
+
+        private final EvalOperator.ExpressionEvaluator.Factory fieldEvaluator;
+
+        public LongFactory(Source source, EvalOperator.ExpressionEvaluator.Factory fieldEvaluator) {
+            this.fieldEvaluator = fieldEvaluator;
+            this.source = source;
+        }
+
+        @Override
+        public String toString() {
+            return "ToAggregateMetricDoubleFromLongEvaluator[" + "field=" + fieldEvaluator + "]";
+        }
+
+        @Override
+        public EvalOperator.ExpressionEvaluator get(DriverContext context) {
+            final EvalOperator.ExpressionEvaluator eval = fieldEvaluator.get(context);
+
+            return new EvalOperator.ExpressionEvaluator() {
+                private Block evalBlock(Block block) {
+                    int positionCount = block.getPositionCount();
+                    LongBlock longBlock = (LongBlock) block;
+                    try (
+                        AggregateMetricDoubleBlockBuilder builder = context.blockFactory()
+                            .newAggregateMetricDoubleBlockBuilder(positionCount)
+                    ) {
+                        CompensatedSum sum = new CompensatedSum();
+                        for (int p = 0; p < positionCount; p++) {
+                            int valueCount = longBlock.getValueCount(p);
+                            int start = longBlock.getFirstValueIndex(p);
+                            int end = start + valueCount;
+                            if (valueCount == 0) {
+                                builder.appendNull();
+                                continue;
+                            }
+                            if (valueCount == 1) {
+                                double current = longBlock.getLong(start);
+                                builder.min().appendDouble(current);
+                                builder.max().appendDouble(current);
+                                builder.sum().appendDouble(current);
+                                builder.count().appendInt(valueCount);
+                                continue;
+                            }
+                            double min = Double.POSITIVE_INFINITY;
+                            double max = Double.NEGATIVE_INFINITY;
+                            for (int i = start; i < end; i++) {
+                                double current = longBlock.getLong(i);
+                                min = Math.min(min, current);
+                                max = Math.max(max, current);
+                                sum.add(current);
+                            }
+                            builder.min().appendDouble(min);
+                            builder.max().appendDouble(max);
+                            builder.sum().appendDouble(sum.value());
+                            builder.count().appendInt(valueCount);
+                            sum.reset(0, 0);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                private Block evalVector(Vector vector) {
+                    int positionCount = vector.getPositionCount();
+                    LongVector longVector = (LongVector) vector;
+                    try (
+                        AggregateMetricDoubleVectorBuilder builder = new AggregateMetricDoubleVectorBuilder(
+                            positionCount,
+                            context.blockFactory()
+                        )
+                    ) {
+                        for (int p = 0; p < positionCount; p++) {
+                            double value = longVector.getLong(p);
+                            builder.appendValue(value);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                @Override
+                public Block eval(Page page) {
+                    try (Block block = eval.eval(page)) {
+                        Vector vector = block.asVector();
+                        return vector == null ? evalBlock(block) : evalVector(vector);
+                    }
+                }
+
+                @Override
+                public void close() {
+                    Releasables.closeExpectNoException(eval);
+                }
+
+                @Override
+                public String toString() {
+                    return "ToAggregateMetricDoubleFromLongEvaluator[field=" + eval + "]";
+                }
+            };
+        }
+    }
+
+    public static class UnsignedLongFactory implements EvalOperator.ExpressionEvaluator.Factory {
+        private final Source source;
+
+        private final EvalOperator.ExpressionEvaluator.Factory fieldEvaluator;
+
+        public UnsignedLongFactory(Source source, EvalOperator.ExpressionEvaluator.Factory fieldEvaluator) {
+            this.fieldEvaluator = fieldEvaluator;
+            this.source = source;
+        }
+
+        @Override
+        public String toString() {
+            return "ToAggregateMetricDoubleFromUnsignedLongEvaluator[" + "field=" + fieldEvaluator + "]";
+        }
+
+        @Override
+        public EvalOperator.ExpressionEvaluator get(DriverContext context) {
+            final EvalOperator.ExpressionEvaluator eval = fieldEvaluator.get(context);
+
+            return new EvalOperator.ExpressionEvaluator() {
+                private Block evalBlock(Block block) {
+                    int positionCount = block.getPositionCount();
+                    LongBlock longBlock = (LongBlock) block;
+                    try (
+                        AggregateMetricDoubleBlockBuilder builder = context.blockFactory()
+                            .newAggregateMetricDoubleBlockBuilder(positionCount)
+                    ) {
+                        CompensatedSum sum = new CompensatedSum();
+                        for (int p = 0; p < positionCount; p++) {
+                            int valueCount = longBlock.getValueCount(p);
+                            int start = longBlock.getFirstValueIndex(p);
+                            int end = start + valueCount;
+                            if (valueCount == 0) {
+                                builder.appendNull();
+                                continue;
+                            }
+                            if (valueCount == 1) {
+                                double current = EsqlDataTypeConverter.unsignedLongToDouble(longBlock.getLong(start));
+                                builder.min().appendDouble(current);
+                                builder.max().appendDouble(current);
+                                builder.sum().appendDouble(current);
+                                builder.count().appendInt(valueCount);
+                                continue;
+                            }
+                            double min = Double.POSITIVE_INFINITY;
+                            double max = Double.NEGATIVE_INFINITY;
+                            for (int i = start; i < end; i++) {
+                                double current = EsqlDataTypeConverter.unsignedLongToDouble(longBlock.getLong(i));
+                                min = Math.min(min, current);
+                                max = Math.max(max, current);
+                                sum.add(current);
+                            }
+                            builder.min().appendDouble(min);
+                            builder.max().appendDouble(max);
+                            builder.sum().appendDouble(sum.value());
+                            builder.count().appendInt(valueCount);
+                            sum.reset(0, 0);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                private Block evalVector(Vector vector) {
+                    int positionCount = vector.getPositionCount();
+                    LongVector longVector = (LongVector) vector;
+                    try (
+                        AggregateMetricDoubleVectorBuilder builder = new AggregateMetricDoubleVectorBuilder(
+                            positionCount,
+                            context.blockFactory()
+                        )
+                    ) {
+                        for (int p = 0; p < positionCount; p++) {
+                            double value = EsqlDataTypeConverter.unsignedLongToDouble(longVector.getLong(p));
+                            builder.appendValue(value);
+                        }
+                        return builder.build();
+                    }
+                }
+
+                @Override
+                public Block eval(Page page) {
+                    try (Block block = eval.eval(page)) {
+                        Vector vector = block.asVector();
+                        return vector == null ? evalBlock(block) : evalVector(vector);
+                    }
+                }
+
+                @Override
+                public void close() {
+                    Releasables.closeExpectNoException(eval);
+                }
+
+                @Override
+                public String toString() {
+                    return "ToAggregateMetricDoubleFromUnsignedLongEvaluator[field=" + eval + "]";
+                }
+            };
+        }
+    }
+}

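Each factory above performs the same per-position reduction when a field is multi-valued: track min and max, accumulate the sum with compensated (Kahan-style) summation, and record the value count. A minimal, self-contained sketch of that reduction, with a stand-in for `org.elasticsearch.search.aggregations.metrics.CompensatedSum` (the names here are illustrative, not part of this commit):

```java
import java.util.List;

public class AggregateMetricSketch {

    // Minimal Kahan-style compensated summation, a stand-in for
    // Elasticsearch's CompensatedSum used for the sum sub-metric.
    static final class CompensatedSum {
        private double value;
        private double delta;

        void add(double v) {
            double correctedNext = v - delta;
            double newValue = value + correctedNext;
            delta = (newValue - value) - correctedNext;
            value = newValue;
        }

        double value() {
            return value;
        }
    }

    // The four sub-metrics an aggregate_metric_double position carries.
    record AggregateMetric(double min, double max, double sum, int count) {}

    // Reduce one multi-valued position to min/max/sum/count,
    // mirroring what evalBlock does for valueCount > 1.
    static AggregateMetric reduce(List<Double> values) {
        double min = Double.POSITIVE_INFINITY;
        double max = Double.NEGATIVE_INFINITY;
        CompensatedSum sum = new CompensatedSum();
        for (double v : values) {
            min = Math.min(min, v);
            max = Math.max(max, v);
            sum.add(v);
        }
        return new AggregateMetric(min, max, sum.value(), values.size());
    }

    public static void main(String[] args) {
        AggregateMetric m = reduce(List.of(5.0, 1.0, 3.0));
        System.out.println(m.min() + " " + m.max() + " " + m.sum() + " " + m.count());
    }
}
```

Note the single-value fast path in the real evaluators skips this loop entirely, and the vector path (`evalVector`) goes further: since a vector has exactly one value per position, min, max, and sum share one `DoubleBlock` and the count is a constant-1 block.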
+ 3 - 3
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/planner/LocalExecutionPlanner.java

@@ -372,10 +372,10 @@ public class LocalExecutionPlanner {
                 case VERSION -> TopNEncoder.VERSION;
                 case BOOLEAN, NULL, BYTE, SHORT, INTEGER, LONG, DOUBLE, FLOAT, HALF_FLOAT, DATETIME, DATE_NANOS, DATE_PERIOD, TIME_DURATION,
                     OBJECT, SCALED_FLOAT, UNSIGNED_LONG, DOC_DATA_TYPE, TSID_DATA_TYPE -> TopNEncoder.DEFAULT_SORTABLE;
-                case GEO_POINT, CARTESIAN_POINT, GEO_SHAPE, CARTESIAN_SHAPE, COUNTER_LONG, COUNTER_INTEGER, COUNTER_DOUBLE, SOURCE ->
-                    TopNEncoder.DEFAULT_UNSORTABLE;
+                case GEO_POINT, CARTESIAN_POINT, GEO_SHAPE, CARTESIAN_SHAPE, COUNTER_LONG, COUNTER_INTEGER, COUNTER_DOUBLE, SOURCE,
+                    AGGREGATE_METRIC_DOUBLE -> TopNEncoder.DEFAULT_UNSORTABLE;
                 // unsupported fields are encoded as BytesRef, we'll use the same encoder; all values should be null at this point
-                case PARTIAL_AGG, UNSUPPORTED, AGGREGATE_METRIC_DOUBLE -> TopNEncoder.UNSUPPORTED;
+                case PARTIAL_AGG, UNSUPPORTED -> TopNEncoder.UNSUPPORTED;
             };
         }
         List<TopNOperator.SortOrder> orders = topNExec.order().stream().map(order -> {

+ 55 - 0
x-pack/plugin/esql/src/main/java/org/elasticsearch/xpack/esql/type/EsqlDataTypeConverter.java

@@ -16,6 +16,7 @@ import org.elasticsearch.common.lucene.BytesRefs;
 import org.elasticsearch.common.time.DateFormatter;
 import org.elasticsearch.common.time.DateFormatters;
 import org.elasticsearch.common.time.DateUtils;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder.Metric;
 import org.elasticsearch.compute.data.CompositeBlock;
 import org.elasticsearch.compute.data.DoubleBlock;
@@ -708,6 +709,60 @@ public class EsqlDataTypeConverter {
         }
     }
 
+    public static String aggregateMetricDoubleLiteralToString(AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral aggMetric) {
+        try (XContentBuilder builder = JsonXContent.contentBuilder()) {
+            builder.startObject();
+            if (aggMetric.min() != null) {
+                builder.field(Metric.MIN.getLabel(), aggMetric.min());
+            }
+            if (aggMetric.max() != null) {
+                builder.field(Metric.MAX.getLabel(), aggMetric.max());
+            }
+            if (aggMetric.sum() != null) {
+                builder.field(Metric.SUM.getLabel(), aggMetric.sum());
+            }
+            if (aggMetric.count() != null) {
+                builder.field(Metric.COUNT.getLabel(), aggMetric.count());
+            }
+            builder.endObject();
+            return Strings.toString(builder);
+        } catch (IOException e) {
+            throw new IllegalStateException("error rendering aggregate metric double", e);
+        }
+    }
+
+    public static AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral stringToAggregateMetricDoubleLiteral(String s) {
+        Double min = null;
+        Double max = null;
+        Double sum = null;
+        Integer count = null;
+        String[] values = s.substring(1, s.length() - 1).split(",");
+        for (String v : values) {
+            var pair = v.split(":");
+            String type = pair[0];
+            String number = pair[1];
+            switch (type) {
+                case "min":
+                    min = Double.parseDouble(number);
+                    break;
+                case "max":
+                    max = Double.parseDouble(number);
+                    break;
+                case "sum":
+                    sum = Double.parseDouble(number);
+                    break;
+                case "value_count":
+                    count = Integer.parseInt(number);
+                    break;
+                default:
+                    throw new IllegalArgumentException(
+                        "Received a metric that wasn't min, max, sum, or value_count: " + type + " with value: " + number
+                    );
+            }
+        }
+        return new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral(min, max, sum, count);
+    }
+
     public enum EsqlConverter implements Converter {
 
         STRING_TO_DATE_PERIOD(x -> EsqlDataTypeConverter.parseTemporalAmount(x, DataType.DATE_PERIOD)),

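The two converters above round-trip a literal through a compact `{label:number,...}` string, where the labels come from `AggregateMetricDoubleBlockBuilder.Metric` and any absent sub-metric is simply omitted. A hedged, self-contained sketch of the parsing side, assuming the `min`/`max`/`sum`/`value_count` labels shown in the diff (the class and method names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class AggregateMetricLiteralParse {

    // Parse "{min:-2.0,max:7.5,sum:11.0,value_count:4}" into its sub-metrics,
    // mirroring stringToAggregateMetricDoubleLiteral. Missing keys stay absent.
    static Map<String, Double> parse(String s) {
        Map<String, Double> metrics = new HashMap<>();
        // Strip the surrounding braces, then split "label:number" pairs.
        for (String pair : s.substring(1, s.length() - 1).split(",")) {
            String[] kv = pair.split(":");
            switch (kv[0]) {
                case "min", "max", "sum", "value_count" -> metrics.put(kv[0], Double.parseDouble(kv[1]));
                default -> throw new IllegalArgumentException("unknown metric: " + kv[0]);
            }
        }
        return metrics;
    }

    public static void main(String[] args) {
        Map<String, Double> m = parse("{min:-2.0,max:7.5,sum:11.0,value_count:4}");
        System.out.println(m.get("min") + " " + m.get("max") + " " + m.get("sum") + " " + m.get("value_count").intValue());
    }
}
```

The real serializer emits JSON via `XContentBuilder`, so the parser's simple `split` works only because none of the labels or numbers contain commas or colons beyond the delimiters.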
+ 9 - 0
x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/SerializationTestUtils.java

@@ -10,9 +10,11 @@ package org.elasticsearch.xpack.esql;
 import org.elasticsearch.common.bytes.BytesReference;
 import org.elasticsearch.common.io.stream.ByteBufferStreamInput;
 import org.elasticsearch.common.io.stream.BytesStreamOutput;
+import org.elasticsearch.common.io.stream.GenericNamedWriteable;
 import org.elasticsearch.common.io.stream.NamedWriteableAwareStreamInput;
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
 import org.elasticsearch.common.io.stream.StreamInput;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.index.query.BoolQueryBuilder;
 import org.elasticsearch.index.query.ExistsQueryBuilder;
 import org.elasticsearch.index.query.MatchAllQueryBuilder;
@@ -112,6 +114,13 @@ public class SerializationTestUtils {
         entries.add(SingleValueQuery.ENTRY);
         entries.addAll(ExpressionWritables.getNamedWriteables());
         entries.addAll(PlanWritables.getNamedWriteables());
+        entries.add(
+            new NamedWriteableRegistry.Entry(
+                GenericNamedWriteable.class,
+                AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral.ENTRY.name,
+                AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral::new
+            )
+        );
         return new NamedWriteableRegistry(entries);
     }
 }

+ 91 - 0
x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/expression/function/scalar/convert/ToAggregateMetricDoubleTests.java

@@ -0,0 +1,91 @@
+/*
+ * Copyright Elasticsearch B.V. and/or licensed to Elasticsearch B.V. under one
+ * or more contributor license agreements. Licensed under the Elastic License
+ * 2.0; you may not use this file except in compliance with the Elastic License
+ * 2.0.
+ */
+
+package org.elasticsearch.xpack.esql.expression.function.scalar.convert;
+
+import com.carrotsearch.randomizedtesting.annotations.Name;
+import com.carrotsearch.randomizedtesting.annotations.ParametersFactory;
+
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
+import org.elasticsearch.xpack.esql.core.expression.Expression;
+import org.elasticsearch.xpack.esql.core.tree.Source;
+import org.elasticsearch.xpack.esql.core.type.DataType;
+import org.elasticsearch.xpack.esql.expression.function.AbstractScalarFunctionTestCase;
+import org.elasticsearch.xpack.esql.expression.function.FunctionName;
+import org.elasticsearch.xpack.esql.expression.function.TestCaseSupplier;
+
+import java.math.BigInteger;
+import java.util.ArrayList;
+import java.util.List;
+import java.util.function.Supplier;
+
+import static java.util.Collections.emptyList;
+
+@FunctionName("to_aggregate_metric_double")
+public class ToAggregateMetricDoubleTests extends AbstractScalarFunctionTestCase {
+    public ToAggregateMetricDoubleTests(@Name("TestCase") Supplier<TestCaseSupplier.TestCase> testCaseSupplier) {
+        this.testCase = testCaseSupplier.get();
+    }
+
+    @Override
+    protected Expression build(Source source, List<Expression> args) {
+        if (args.get(0).dataType() == DataType.AGGREGATE_METRIC_DOUBLE) {
+            assumeTrue("Test sometimes wraps literals as fields", args.get(0).foldable());
+        }
+        return new ToAggregateMetricDouble(source, args.get(0));
+    }
+
+    @ParametersFactory
+    public static Iterable<Object[]> parameters() {
+        final String evaluatorStringLeft = "ToAggregateMetricDoubleFrom";
+        final String evaluatorStringRight = "Evaluator[field=Attribute[channel=0]]";
+        final List<TestCaseSupplier> suppliers = new ArrayList<>();
+
+        TestCaseSupplier.forUnaryInt(
+            suppliers,
+            evaluatorStringLeft + "Int" + evaluatorStringRight,
+            DataType.AGGREGATE_METRIC_DOUBLE,
+            i -> new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral((double) i, (double) i, (double) i, 1),
+            Integer.MIN_VALUE,
+            Integer.MAX_VALUE,
+            emptyList()
+        );
+        TestCaseSupplier.forUnaryLong(
+            suppliers,
+            evaluatorStringLeft + "Long" + evaluatorStringRight,
+            DataType.AGGREGATE_METRIC_DOUBLE,
+            l -> new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral((double) l, (double) l, (double) l, 1),
+            Long.MIN_VALUE,
+            Long.MAX_VALUE,
+            emptyList()
+        );
+        TestCaseSupplier.forUnaryUnsignedLong(
+            suppliers,
+            evaluatorStringLeft + "UnsignedLong" + evaluatorStringRight,
+            DataType.AGGREGATE_METRIC_DOUBLE,
+            ul -> {
+                var newVal = ul.doubleValue();
+                return new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral(newVal, newVal, newVal, 1);
+            },
+            BigInteger.ZERO,
+            UNSIGNED_LONG_MAX,
+            emptyList()
+        );
+        TestCaseSupplier.forUnaryDouble(
+            suppliers,
+            evaluatorStringLeft + "Double" + evaluatorStringRight,
+            DataType.AGGREGATE_METRIC_DOUBLE,
+            d -> new AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral(d, d, d, 1),
+            Double.NEGATIVE_INFINITY,
+            Double.POSITIVE_INFINITY,
+            emptyList()
+        );
+
+        return parameterSuppliersFromTypedDataWithDefaultChecksNoErrors(true, suppliers);
+    }
+
+}
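The test cases above map every numeric input `i` to a literal `(i, i, i, 1)`, which is the whole semantics of the conversion: `TO_AGGREGATE_METRIC_DOUBLE` collapses a numeric (possibly multi-valued) field into the four sub-metrics min/max/sum/value_count. A minimal sketch of that semantics in plain Java — `AggLiteral` is a hypothetical stand-in, not the real `AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral`:

```java
// Hypothetical sketch of the conversion semantics only; the real literal type
// lives in AggregateMetricDoubleBlockBuilder and is serialized as a named writeable.
public class AggregateMetricSketch {

    public record AggLiteral(Double min, Double max, Double sum, Integer count) {}

    // A single value v becomes {min: v, max: v, sum: v, value_count: 1};
    // a multi-valued field folds all values into one literal.
    public static AggLiteral toAggregateMetricDouble(double[] values) {
        if (values.length == 0) {
            return new AggLiteral(null, null, null, 0); // assumption: empty -> null sub-metrics
        }
        double min = Double.POSITIVE_INFINITY, max = Double.NEGATIVE_INFINITY, sum = 0.0;
        for (double v : values) {
            min = Math.min(min, v);
            max = Math.max(max, v);
            sum += v;
        }
        return new AggLiteral(min, max, sum, values.length);
    }

    public static void main(String[] args) {
        // Same input as the "to_aggregate_metric_double with multi_values" YAML test below:
        AggLiteral a = toAggregateMetricDouble(new double[] { 20385, 182941, -10958 });
        System.out.println(a); // min=-10958.0, max=182941.0, sum=192368.0, count=3
    }
}
```

This matches the YAML expectation `{"min":-10958.0,"max":182941.0,"sum":192368.0,"value_count":3}` for the multi-valued long field.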

+ 2 - 0
x-pack/plugin/esql/src/test/java/org/elasticsearch/xpack/esql/plan/physical/AbstractPhysicalPlanSerializationTests.java

@@ -9,6 +9,7 @@ package org.elasticsearch.xpack.esql.plan.physical;
 
 import org.elasticsearch.common.io.stream.NamedWriteableRegistry;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.compute.data.AggregateMetricDoubleBlockBuilder;
 import org.elasticsearch.search.SearchModule;
 import org.elasticsearch.xpack.esql.core.tree.Node;
 import org.elasticsearch.xpack.esql.expression.ExpressionWritables;
@@ -51,6 +52,7 @@ public abstract class AbstractPhysicalPlanSerializationTests<T extends PhysicalP
         entries.addAll(ExpressionWritables.allExpressions());
         entries.addAll(new SearchModule(Settings.EMPTY, List.of()).getNamedWriteables()); // Query builders
         entries.add(Add.ENTRY); // Used by the eval tests
+        entries.add(AggregateMetricDoubleBlockBuilder.AggregateMetricDoubleLiteral.ENTRY);
         return new NamedWriteableRegistry(entries);
     }
 

+ 4 - 4
x-pack/plugin/mapper-aggregate-metric/src/main/java/org/elasticsearch/xpack/aggregatemetric/mapper/AggregateMetricDoubleFieldMapper.java

@@ -627,22 +627,22 @@ public class AggregateMetricDoubleFieldMapper extends FieldMapper {
                     }
 
                     private void readSingleRow(int docId, AggregateMetricDoubleBuilder builder) throws IOException {
-                        if (minValues.advanceExact(docId)) {
+                        if (minValues != null && minValues.advanceExact(docId)) {
                             builder.min().appendDouble(NumericUtils.sortableLongToDouble(minValues.longValue()));
                         } else {
                             builder.min().appendNull();
                         }
-                        if (maxValues.advanceExact(docId)) {
+                        if (maxValues != null && maxValues.advanceExact(docId)) {
                             builder.max().appendDouble(NumericUtils.sortableLongToDouble(maxValues.longValue()));
                         } else {
                             builder.max().appendNull();
                         }
-                        if (sumValues.advanceExact(docId)) {
+                        if (sumValues != null && sumValues.advanceExact(docId)) {
                             builder.sum().appendDouble(NumericUtils.sortableLongToDouble(sumValues.longValue()));
                         } else {
                             builder.sum().appendNull();
                         }
-                        if (valueCountValues.advanceExact(docId)) {
+                        if (valueCountValues != null && valueCountValues.advanceExact(docId)) {
                             builder.count().appendInt(Math.toIntExact(valueCountValues.longValue()));
                         } else {
                             builder.count().appendNull();
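The mapper change above guards each sub-metric's doc-values reader with a null check, because an `aggregate_metric_double` field may be mapped with only a subset of metrics (e.g. min/max without sum), in which case the corresponding reader is absent. A toy version of that pattern, using a hypothetical `DoubleReader` stand-in rather than the real Lucene doc-values classes:

```java
import java.util.OptionalDouble;

public class NullSafeSubMetrics {

    // Hypothetical stand-in for a per-sub-metric doc-values reader; it is null
    // when that sub-metric was never indexed for the field.
    public interface DoubleReader {
        boolean advanceExact(int docId);
        double value();
    }

    // Mirrors the fix: a missing reader (null) behaves like a document with no
    // value for this sub-metric, so the caller appends null instead of hitting an NPE.
    public static OptionalDouble read(DoubleReader reader, int docId) {
        if (reader != null && reader.advanceExact(docId)) {
            return OptionalDouble.of(reader.value());
        }
        return OptionalDouble.empty();
    }

    public static void main(String[] args) {
        DoubleReader min = new DoubleReader() {
            public boolean advanceExact(int docId) { return docId == 0; }
            public double value() { return 1.5; }
        };
        System.out.println(read(min, 0));   // OptionalDouble[1.5]
        System.out.println(read(null, 0));  // OptionalDouble.empty
    }
}
```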

+ 98 - 0
x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/40_tsdb.yml

@@ -377,6 +377,54 @@ grouping stats on aggregate_metric_double:
   - match: {values.1.3: 16.0}
   - match: {values.1.4: "B"}
 
+---
+sorting with aggregate_metric_double with partial submetrics:
+  - requires:
+      test_runner_features: [capabilities]
+      capabilities:
+        - method: POST
+          path: /_query
+          parameters: []
+          capabilities: [aggregate_metric_double_sorting]
+      reason: "Support for sorting when aggregate_metric_double present"
+  - do:
+      allowed_warnings_regex:
+        - "No limit defined, adding default limit of \\[.*\\]"
+      esql.query:
+        body:
+          query: 'FROM test3 | SORT @timestamp | KEEP @timestamp, agg_metric'
+
+  - length: {values: 4}
+  - length: {values.0: 2}
+  - match: {columns.0.name: "@timestamp"}
+  - match: {columns.0.type: "date"}
+  - match: {columns.1.name: "agg_metric"}
+  - match: {columns.1.type: "aggregate_metric_double"}
+  - match: {values.0.0: "2021-04-28T19:50:04.467Z"}
+  - match: {values.1.0: "2021-04-28T19:50:24.467Z"}
+  - match: {values.2.0: "2021-04-28T19:50:44.467Z"}
+  - match: {values.3.0: "2021-04-28T19:51:04.467Z"}
+  - match: {values.0.1: '{"min":-3.0,"max":1.0}'}
+  - match: {values.1.1: '{"min":3.0,"max":10.0}'}
+  - match: {values.2.1: '{"min":2.0,"max":17.0}'}
+  - match: {values.3.1: null}
+
+---
+aggregate_metric_double unsortable:
+  - requires:
+      test_runner_features: [capabilities]
+      capabilities:
+        - method: POST
+          path: /_query
+          parameters: []
+          capabilities: [aggregate_metric_double_sorting]
+      reason: "Support for sorting when aggregate_metric_double present"
+  - do:
+      catch: /cannot sort on aggregate_metric_double/
+      esql.query:
+        body:
+          query: 'FROM test2 | sort agg_metric'
+
 ---
 stats on aggregate_metric_double with partial submetrics:
   - requires:
@@ -611,3 +659,53 @@ _source:
               rx: 530600088
               tx: 1434577921
             uid: df3145b3-0563-4d3b-a0f7-897eb2876ea9
+
+---
+to_aggregate_metric_double with multi_values:
+  - requires:
+      test_runner_features: [ capabilities ]
+      capabilities:
+        - method: POST
+          path: /_query
+          parameters: [ ]
+          capabilities: [ aggregate_metric_double_convert_to ]
+      reason: "Support for to_aggregate_metric_double function"
+
+  - do:
+      indices.create:
+        index: convert_test
+        body:
+          mappings:
+            properties:
+              "some_long_field":
+                type: long
+              "some_double_field":
+                type: double
+              "some_int_field":
+                type: integer
+              "some_unsigned_long_field":
+                type: unsigned_long
+  - do:
+      bulk:
+        refresh: true
+        index: new_test
+        body:
+          - {"index": {}}
+          - {"some_long_field": [20385, 182941, -10958], "some_double_field": [195.1, 102.444], "some_int_field": [64, 121, 498, 1456], "some_unsigned_long_field": [13985, 19418924, 123]}
+  - do:
+      esql.query:
+        body:
+          query: 'FROM new_test | EVAL from_long=TO_AGGREGATE_METRIC_DOUBLE(some_long_field), from_double=TO_AGGREGATE_METRIC_DOUBLE(some_double_field), from_int=TO_AGGREGATE_METRIC_DOUBLE(some_int_field), from_ulong=TO_AGGREGATE_METRIC_DOUBLE(some_unsigned_long_field) | KEEP from_long, from_double, from_int, from_ulong | LIMIT 1'
+
+  - match: {columns.0.name: "from_long"}
+  - match: {columns.0.type: "aggregate_metric_double"}
+  - match: {columns.1.name: "from_double"}
+  - match: {columns.1.type: "aggregate_metric_double"}
+  - match: {columns.2.name: "from_int"}
+  - match: {columns.2.type: "aggregate_metric_double"}
+  - match: {columns.3.name: "from_ulong"}
+  - match: {columns.3.type: "aggregate_metric_double"}
+  - match: {values.0.0: '{"min":-10958.0,"max":182941.0,"sum":192368.0,"value_count":3}'}
+  - match: {values.0.1: '{"min":102.44400024414062,"max":195.10000610351562,"sum":297.54400634765625,"value_count":2}'}
+  - match: {values.0.2: '{"min":64.0,"max":1456.0,"sum":2139.0,"value_count":4}'}
+  - match: {values.0.3: '{"min":123.0,"max":1.9418924E7,"sum":1.9433032E7,"value_count":3}'}

+ 64 - 62
x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/40_unsupported_types.yml

@@ -308,8 +308,8 @@ unsupported with sort:
         - method: POST
           path: /_query
           parameters: [ ]
-          capabilities: [ aggregate_metric_double ]
-      reason: "support for aggregate_metric_double type"
+          capabilities: [ aggregate_metric_double_sorting ]
+      reason: "support for sorting when aggregate_metric_double present"
 
   - do:
       allowed_warnings_regex:
@@ -317,95 +317,97 @@ unsupported with sort:
         - "No limit defined, adding default limit of \\[.*\\]"
       esql.query:
         body:
-          query: 'from test | drop aggregate_metric_double | sort some_doc.bar'
+          query: 'from test | sort some_doc.bar'
 
-  - match: { columns.0.name: binary }
-  - match: { columns.0.type: unsupported }
-  - match: { columns.1.name: completion }
+  - match: { columns.0.name: aggregate_metric_double }
+  - match: { columns.0.type: aggregate_metric_double }
+  - match: { columns.1.name: binary }
   - match: { columns.1.type: unsupported }
-  - match: { columns.2.name: date_nanos }
-  - match: { columns.2.type: date_nanos }
-  - match: { columns.3.name: date_range }
-  - match: { columns.3.type: unsupported }
-  - match: { columns.4.name: dense_vector }
+  - match: { columns.2.name: completion }
+  - match: { columns.2.type: unsupported }
+  - match: { columns.3.name: date_nanos }
+  - match: { columns.3.type: date_nanos }
+  - match: { columns.4.name: date_range }
   - match: { columns.4.type: unsupported }
-  - match: { columns.5.name: double_range }
+  - match: { columns.5.name: dense_vector }
   - match: { columns.5.type: unsupported }
-  - match: { columns.6.name: float_range }
+  - match: { columns.6.name: double_range }
   - match: { columns.6.type: unsupported }
-  - match: { columns.7.name: geo_point }
-  - match: { columns.7.type: geo_point }
-  - match: { columns.8.name: geo_point_alias }
+  - match: { columns.7.name: float_range }
+  - match: { columns.7.type: unsupported }
+  - match: { columns.8.name: geo_point }
   - match: { columns.8.type: geo_point }
-  - match: { columns.9.name: geo_shape }
-  - match: { columns.9.type: geo_shape }
-  - match: { columns.10.name: histogram }
-  - match: { columns.10.type: unsupported }
-  - match: { columns.11.name: integer_range }
+  - match: { columns.9.name: geo_point_alias }
+  - match: { columns.9.type: geo_point }
+  - match: { columns.10.name: geo_shape }
+  - match: { columns.10.type: geo_shape }
+  - match: { columns.11.name: histogram }
   - match: { columns.11.type: unsupported }
-  - match: { columns.12.name: ip_range }
+  - match: { columns.12.name: integer_range }
   - match: { columns.12.type: unsupported }
-  - match: { columns.13.name: long_range }
+  - match: { columns.13.name: ip_range }
   - match: { columns.13.type: unsupported }
-  - match: { columns.14.name: match_only_text }
-  - match: { columns.14.type: text }
-  - match: { columns.15.name: name }
-  - match: { columns.15.type: keyword }
-  - match: { columns.16.name: point }
-  - match: { columns.16.type: cartesian_point }
-  - match: { columns.17.name: rank_feature }
-  - match: { columns.17.type: unsupported }
-  - match: { columns.18.name: rank_features }
+  - match: { columns.14.name: long_range }
+  - match: { columns.14.type: unsupported }
+  - match: { columns.15.name: match_only_text }
+  - match: { columns.15.type: text }
+  - match: { columns.16.name: name }
+  - match: { columns.16.type: keyword }
+  - match: { columns.17.name: point }
+  - match: { columns.17.type: cartesian_point }
+  - match: { columns.18.name: rank_feature }
   - match: { columns.18.type: unsupported }
-  - match: { columns.19.name: search_as_you_type }
+  - match: { columns.19.name: rank_features }
   - match: { columns.19.type: unsupported }
-  - match: { columns.20.name: search_as_you_type._2gram }
+  - match: { columns.20.name: search_as_you_type }
   - match: { columns.20.type: unsupported }
-  - match: { columns.21.name: search_as_you_type._3gram }
+  - match: { columns.21.name: search_as_you_type._2gram }
   - match: { columns.21.type: unsupported }
-  - match: { columns.22.name: search_as_you_type._index_prefix }
+  - match: { columns.22.name: search_as_you_type._3gram }
   - match: { columns.22.type: unsupported }
-  - match: { columns.23.name: shape }
-  - match: { columns.23.type: cartesian_shape }
-  - match: { columns.24.name: some_doc.bar }
-  - match: { columns.24.type: long }
-  - match: { columns.25.name: some_doc.foo }
-  - match: { columns.25.type: keyword }
-  - match: { columns.26.name: text }
-  - match: { columns.26.type: text }
-  - match: { columns.27.name: token_count }
-  - match: { columns.27.type: integer }
+  - match: { columns.23.name: search_as_you_type._index_prefix }
+  - match: { columns.23.type: unsupported }
+  - match: { columns.24.name: shape }
+  - match: { columns.24.type: cartesian_shape }
+  - match: { columns.25.name: some_doc.bar }
+  - match: { columns.25.type: long }
+  - match: { columns.26.name: some_doc.foo }
+  - match: { columns.26.type: keyword }
+  - match: { columns.27.name: text }
+  - match: { columns.27.type: text }
+  - match: { columns.28.name: token_count }
+  - match: { columns.28.type: integer }
 
   - length: { values: 1 }
-  - match: { values.0.0: null }
+  - match: { values.0.0: '{"min":1.0,"max":3.0,"sum":10.1,"value_count":5}' }
   - match: { values.0.1: null }
-  - match: { values.0.2: "2015-01-01T12:10:30.123456789Z" }
-  - match: { values.0.3: null }
+  - match: { values.0.2: null }
+  - match: { values.0.3: "2015-01-01T12:10:30.123456789Z" }
   - match: { values.0.4: null }
   - match: { values.0.5: null }
   - match: { values.0.6: null }
-  - match: { values.0.7: "POINT (10.0 12.0)" }
+  - match: { values.0.7: null }
   - match: { values.0.8: "POINT (10.0 12.0)" }
-  - match: { values.0.9: "LINESTRING (-97.154 25.996, -97.159 25.998, -97.181 25.991, -97.187 25.985)" }
-  - match: { values.0.10: null }
+  - match: { values.0.9: "POINT (10.0 12.0)" }
+  - match: { values.0.10: "LINESTRING (-97.154 25.996, -97.159 25.998, -97.181 25.991, -97.187 25.985)" }
   - match: { values.0.11: null }
   - match: { values.0.12: null }
   - match: { values.0.13: null }
-  - match: { values.0.14: "foo bar baz" }
-  - match: { values.0.15: Alice }
-  - match: { values.0.16: "POINT (-97.15447 25.9961525)" }
-  - match: { values.0.17: null }
+  - match: { values.0.14: null }
+  - match: { values.0.15: "foo bar baz" }
+  - match: { values.0.16: Alice }
+  - match: { values.0.17: "POINT (-97.15447 25.9961525)" }
   - match: { values.0.18: null }
   - match: { values.0.19: null }
   - match: { values.0.20: null }
   - match: { values.0.21: null }
   - match: { values.0.22: null }
-  - match: { values.0.23: "LINESTRING (-377.03653 389.897676, -377.009051 389.889939)" }
-  - match: { values.0.24: 12 }
-  - match: { values.0.25: xy }
-  - match: { values.0.26: "foo bar" }
-  - match: { values.0.27: 3 }
-
+  - match: { values.0.23: null }
+  - match: { values.0.24: "LINESTRING (-377.03653 389.897676, -377.009051 389.889939)" }
+  - match: { values.0.25: 12 }
+  - match: { values.0.26: xy }
+  - match: { values.0.27: "foo bar" }
+  - match: { values.0.28: 3 }
 ---
 nested declared inline:
   - do:

+ 105 - 0
x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/46_downsample.yml

@@ -146,3 +146,108 @@ setup:
   - match: {columns.0.name: "k8s.pod.network.rx"}
   - match: {columns.0.type: "aggregate_metric_double"}
   - match: {values.0.0: '{"min":530604.0,"max":530605.0,"sum":1061209.0,"value_count":2}'}
+
+---
+"Stats from downsampled and non-downsampled index simultaneously":
+  - requires:
+      test_runner_features: [capabilities]
+      capabilities:
+        - method: POST
+          path: /_query
+          parameters: []
+          capabilities: [aggregate_metric_double_convert_to]
+      reason: "Support for to_aggregate_metric_double function"
+
+  - do:
+      indices.downsample:
+        index: test
+        target_index: test-downsample
+        body: >
+          {
+            "fixed_interval": "1h"
+          }
+  - is_true: acknowledged
+
+  - do:
+      indices.create:
+        index: test-2
+        body:
+          settings:
+            number_of_shards: 1
+            index:
+              mode: time_series
+              routing_path: [ metricset, k8s.pod.uid ]
+              time_series:
+                start_time: 2021-04-29T00:00:00Z
+                end_time: 2021-04-30T00:00:00Z
+          mappings:
+            properties:
+              "@timestamp":
+                type: date
+              metricset:
+                type: keyword
+                time_series_dimension: true
+              k8s:
+                properties:
+                  pod:
+                    properties:
+                      uid:
+                        type: keyword
+                        time_series_dimension: true
+                      name:
+                        type: keyword
+                      created_at:
+                        type: date_nanos
+                      running:
+                        type: boolean
+                      number_of_containers:
+                        type: integer
+                      ip:
+                        type: ip
+                      tags:
+                        type: keyword
+                      values:
+                        type: integer
+                      network:
+                        properties:
+                          tx:
+                            type: long
+                            time_series_metric: gauge
+                          rx:
+                            type: long
+                            time_series_metric: gauge
+
+  - do:
+      bulk:
+        refresh: true
+        index: test-2
+        body:
+          - '{"index": {}}'
+          - '{"@timestamp": "2021-04-29T21:50:04.467Z", "metricset": "pod", "k8s": {"pod": {"name": "cat", "uid":"947e4ced-1786-4e53-9e0c-5c447e959507", "ip": "10.10.55.1", "network": {"tx": 2001810, "rx": 802339}, "created_at": "2021-04-28T19:34:00.000Z", "running": false, "number_of_containers": 2, "tags": ["backend", "prod"], "values": [2, 3, 6]}}}'
+          - '{"index": {}}'
+          - '{"@timestamp": "2021-04-29T21:50:24.467Z", "metricset": "pod", "k8s": {"pod": {"name": "cat", "uid":"947e4ced-1786-4e53-9e0c-5c447e959507", "ip": "10.10.55.26", "network": {"tx": 2000177, "rx": 800479}, "created_at": "2021-04-28T19:35:00.000Z", "running": true, "number_of_containers": 2, "tags": ["backend", "prod", "us-west1"], "values": [1, 1, 3]}}}'
+          - '{"index": {}}'
+
+  - do:
+      esql.query:
+        body:
+          query: "FROM test-* |
+          WHERE k8s.pod.uid == \"947e4ced-1786-4e53-9e0c-5c447e959507\" |
+          EVAL rx = to_aggregate_metric_double(k8s.pod.network.rx) |
+          STATS max(rx), min(rx), sum(rx), count(rx) |
+          LIMIT 100"
+
+  - length: {values: 1}
+  - length: {values.0: 4}
+  - match: {columns.0.name: "max(rx)"}
+  - match: {columns.0.type: "double"}
+  - match: {columns.1.name: "min(rx)"}
+  - match: {columns.1.type: "double"}
+  - match: {columns.2.name: "sum(rx)"}
+  - match: {columns.2.type: "double"}
+  - match: {columns.3.name: "count(rx)"}
+  - match: {columns.3.type: "long"}
+  - match: {values.0.0: 803685.0}
+  - match: {values.0.1: 800479.0}
+  - match: {values.0.2: 4812452.0}
+  - match: {values.0.3: 6}

+ 2 - 2
x-pack/plugin/src/yamlRestTest/resources/rest-api-spec/test/esql/60_usage.yml

@@ -93,7 +93,7 @@ setup:
   - gt: {esql.functions.to_long: $functions_to_long}
   - match: {esql.functions.coalesce: $functions_coalesce}
  # Testing for the entire function set isn't feasible, so we just check that we return the correct count as an approximation.
-  - length: {esql.functions: 133} # check the "sister" test below for a likely update to the same esql.functions length check
+  - length: {esql.functions: 134} # check the "sister" test below for a likely update to the same esql.functions length check
 
 ---
 "Basic ESQL usage output (telemetry) non-snapshot version":
@@ -164,4 +164,4 @@ setup:
   - match: {esql.functions.cos: $functions_cos}
   - gt: {esql.functions.to_long: $functions_to_long}
   - match: {esql.functions.coalesce: $functions_coalesce}
-  - length: {esql.functions: 130} # check the "sister" test above for a likely update to the same esql.functions length check
+  - length: {esql.functions: 131} # check the "sister" test above for a likely update to the same esql.functions length check