
Filter cache: add a `_cache: auto` option and make it the default.

Up to now, caching could be controlled on all filters through the `_cache` flag,
which could be set to `true` or `false`, and the default depended on the type of
filter. For instance, `script` filters are not cached by default while `terms`
filters are. For some filters, the default is more involved: e.g. date range
filters are cached unless they use `now` in a non-rounded fashion.
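
For example, the pre-existing flag could be used like this to opt out of caching on a
`terms` filter (hypothetical field and values, shown only for illustration):

[source,js]
--------------------------------------------------
{
    "constant_score" : {
        "filter" : {
            "terms" : {
                "user" : ["kimchy", "elasticsearch"],
                "_cache" : false
            }
        }
    }
}
--------------------------------------------------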

This commit adds a third option, `auto`, which becomes the default for all
filters. Every filter now returns a cache wrapper, and the caching decision is
made at execution time, per segment (see the example after this list). Here is
the default logic:
 - if there is already a cache entry for this filter in the current segment,
   then return the cache entry.
 - else if the doc id set cannot iterate (e.g. script filters) then do not cache.
 - else if the doc id set is already cacheable and it has been used twice or
   more in the last 1000 filters then cache it.
 - else if the filter is costly (e.g. multi-term) and has been used twice or more
   in the last 1000 filters then cache it.
 - else if the doc id set is not cacheable and it has been used 5 times or more
   in the last 1000 filters, then load it into a cacheable set and cache it.
 - else return the uncached set.
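
For reference, this is what requesting the new default explicitly looks like in the
filter DSL (a hypothetical query; field name and value are made up for illustration):

[source,js]
--------------------------------------------------
{
    "filtered" : {
        "query" : { "match_all" : {} },
        "filter" : {
            "term" : {
                "user" : "kimchy",
                "_cache" : "auto"
            }
        }
    }
}
--------------------------------------------------

Omitting `_cache` entirely now has the same effect, since `auto` is the default.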

So for instance geo-distance filters and script filters are going to use this
new default and are not going to be cached, because their doc id sets cannot iterate.

Similarly, date range filters are going to use this default all the time, but
those that use `now` in a non-rounded fashion are very unlikely to be reused, so
in practice they won't be cached.
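
To make the rounding point concrete, here is a sketch of a date range filter
(hypothetical field name) whose bounds are rounded to the day with `/d`, so
consecutive requests produce the same filter and it can be reused and cached:

[source,js]
--------------------------------------------------
{
    "range" : {
        "timestamp" : {
            "gte" : "now-7d/d",
            "lt" : "now/d"
        }
    }
}
--------------------------------------------------

With `"gte" : "now-7d"` instead (no `/d` rounding), the boundary changes on every
request, so the filter is effectively never reused and therefore not cached.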

`terms`, `range`, ... filters produce cacheable doc id sets with good iterators,
so they will be cached as soon as they have been used twice.

Filters that don't produce cacheable doc id sets, such as the `term` filter, will
need to be used 5 times before being cached. This ensures that we don't spend
CPU iterating over all documents matching such filters unless we have good
evidence of reuse.

One last interesting point about this change is that it also applies to compound
filters. So if you keep repeating the same `bool` filter with the same
underlying clauses, it will be cached on its own, whereas up to now it was never
cached by default.
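
As an illustration (hypothetical field names and values), repeating a `bool` filter
such as the following across requests now makes the whole filter eligible for
caching, not just its inner clauses:

[source,js]
--------------------------------------------------
{
    "bool" : {
        "must" : [
            { "term" : { "status" : "published" } },
            { "range" : { "publish_date" : { "gte" : "2014-01-01" } } }
        ]
    }
}
--------------------------------------------------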

`_cache: true` has been changed to only cache on large segments, in order not to
pollute the cache, since small segments should not be the bottleneck anyway.
However, `_cache: false` still has the same semantics.
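
For completeness, explicitly opting in still looks the same as before; the
`_cache_key` element provides a custom caching key, which is handy for very large
filters such as a `terms` filter with many elements (values below are illustrative):

[source,js]
--------------------------------------------------
{
    "terms" : {
        "user" : ["kimchy", "elasticsearch", "lucene"],
        "_cache" : true,
        "_cache_key" : "frequent_users"
    }
}
--------------------------------------------------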

Close #8449
Adrien Grand 11 years ago
parent
commit
ce11e0ee6d
81 changed files with 608 additions and 1165 deletions
  1. + 6 - 1  docs/reference/query-dsl/filters.asciidoc
  2. + 4 - 4  docs/reference/query-dsl/filters/and-filter.asciidoc
  3. + 0 - 6  docs/reference/query-dsl/filters/bool-filter.asciidoc
  4. + 0 - 5  docs/reference/query-dsl/filters/exists-filter.asciidoc
  5. + 0 - 6  docs/reference/query-dsl/filters/missing-filter.asciidoc
  6. + 3 - 3  docs/reference/query-dsl/filters/not-filter.asciidoc
  7. + 2 - 1  docs/reference/query-dsl/filters/or-filter.asciidoc
  8. + 3 - 3  docs/reference/query-dsl/filters/prefix-filter.asciidoc
  9. + 3 - 1  docs/reference/query-dsl/filters/query-filter.asciidoc
  10. + 3 - 3  docs/reference/query-dsl/filters/range-filter.asciidoc
  11. + 2 - 2  docs/reference/query-dsl/filters/term-filter.asciidoc
  12. + 3 - 2  docs/reference/query-dsl/filters/terms-filter.asciidoc
  13. + 103 - 0  src/main/java/org/elasticsearch/index/cache/filter/AutoFilterCachingPolicy.java
  14. + 4 - 1  src/main/java/org/elasticsearch/index/cache/filter/FilterCache.java
  15. + 6 - 0  src/main/java/org/elasticsearch/index/cache/filter/FilterCacheModule.java
  16. + 6 - 3  src/main/java/org/elasticsearch/index/cache/filter/none/NoneFilterCache.java
  17. + 0 - 114  src/main/java/org/elasticsearch/index/cache/filter/support/CacheKeyFilter.java
  18. + 0 - 33  src/main/java/org/elasticsearch/index/cache/filter/support/FilterCacheValue.java
  19. + 52 - 38  src/main/java/org/elasticsearch/index/cache/filter/weighted/WeightedFilterCache.java
  20. + 16 - 62  src/main/java/org/elasticsearch/index/mapper/core/DateFieldMapper.java
  21. + 1 - 1  src/main/java/org/elasticsearch/index/mapper/internal/TypeFieldMapper.java
  22. + 3 - 1  src/main/java/org/elasticsearch/index/percolator/PercolatorQueriesRegistry.java
  23. + 8 - 7  src/main/java/org/elasticsearch/index/query/AndFilterParser.java
  24. + 8 - 7  src/main/java/org/elasticsearch/index/query/BoolFilterParser.java
  25. + 8 - 7  src/main/java/org/elasticsearch/index/query/ConstantScoreQueryParser.java
  26. + 2 - 2  src/main/java/org/elasticsearch/index/query/ExistsFilterParser.java
  27. + 8 - 7  src/main/java/org/elasticsearch/index/query/FQueryFilterParser.java
  28. + 8 - 7  src/main/java/org/elasticsearch/index/query/FilteredQueryParser.java
  29. + 8 - 7  src/main/java/org/elasticsearch/index/query/GeoBoundingBoxFilterParser.java
  30. + 8 - 7  src/main/java/org/elasticsearch/index/query/GeoDistanceFilterParser.java
  31. + 8 - 7  src/main/java/org/elasticsearch/index/query/GeoDistanceRangeFilterParser.java
  32. + 9 - 7  src/main/java/org/elasticsearch/index/query/GeoPolygonFilterParser.java
  33. + 8 - 7  src/main/java/org/elasticsearch/index/query/GeoShapeFilterParser.java
  34. + 8 - 7  src/main/java/org/elasticsearch/index/query/GeohashCellFilter.java
  35. + 2 - 2  src/main/java/org/elasticsearch/index/query/HasChildFilterParser.java
  36. + 2 - 2  src/main/java/org/elasticsearch/index/query/HasChildQueryParser.java
  37. + 2 - 2  src/main/java/org/elasticsearch/index/query/HasParentQueryParser.java
  38. + 10 - 0  src/main/java/org/elasticsearch/index/query/IndexQueryParserService.java
  39. + 5 - 5  src/main/java/org/elasticsearch/index/query/MissingFilterParser.java
  40. + 4 - 4  src/main/java/org/elasticsearch/index/query/NestedFilterParser.java
  41. + 4 - 4  src/main/java/org/elasticsearch/index/query/NotFilterParser.java
  42. + 8 - 7  src/main/java/org/elasticsearch/index/query/OrFilterParser.java
  43. + 8 - 7  src/main/java/org/elasticsearch/index/query/PrefixFilterParser.java
  44. + 29 - 11  src/main/java/org/elasticsearch/index/query/QueryParseContext.java
  45. + 10 - 21  src/main/java/org/elasticsearch/index/query/RangeFilterParser.java
  46. + 8 - 7  src/main/java/org/elasticsearch/index/query/RegexpFilterParser.java
  47. + 11 - 10  src/main/java/org/elasticsearch/index/query/ScriptFilterParser.java
  48. + 10 - 9  src/main/java/org/elasticsearch/index/query/TermFilterParser.java
  49. + 19 - 44  src/main/java/org/elasticsearch/index/query/TermsFilterParser.java
  50. + 1 - 1  src/main/java/org/elasticsearch/index/query/TopChildrenQueryParser.java
  51. + 1 - 1  src/main/java/org/elasticsearch/index/query/TypeFilterParser.java
  52. + 2 - 2  src/main/java/org/elasticsearch/index/query/support/QueryParsers.java
  53. + 2 - 1  src/main/java/org/elasticsearch/index/shard/IndexShard.java
  54. + 11 - 10  src/main/java/org/elasticsearch/indices/cache/filter/terms/IndicesTermsFilterCache.java
  55. + 2 - 1  src/main/java/org/elasticsearch/percolator/PercolatorService.java
  56. + 2 - 2  src/main/java/org/elasticsearch/search/aggregations/bucket/children/ChildrenParser.java
  57. + 3 - 5  src/main/java/org/elasticsearch/search/aggregations/bucket/children/ParentToChildrenAggregator.java
  58. + 12 - 12  src/main/java/org/elasticsearch/search/aggregations/bucket/nested/NestedAggregator.java
  59. + 1 - 1  src/main/java/org/elasticsearch/search/aggregations/bucket/nested/NestedParser.java
  60. + 1 - 1  src/main/java/org/elasticsearch/search/fetch/innerhits/InnerHitsContext.java
  61. + 1 - 1  src/main/java/org/elasticsearch/search/internal/DefaultSearchContext.java
  62. + 0 - 275  src/test/java/org/elasticsearch/index/query/IndexQueryParserFilterCachingTests.java
  63. + 2 - 6  src/test/java/org/elasticsearch/index/query/SimpleIndexQueryParserTests.java
  64. + 0 - 25  src/test/java/org/elasticsearch/index/query/date_range_in_boolean.json
  65. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached.json
  66. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now.json
  67. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now_with_rounding.json
  68. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_now.json
  69. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_now_with_rounding.json
  70. + 0 - 25  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_with_long_value.json
  71. + 0 - 26  src/test/java/org/elasticsearch/index/query/date_range_in_boolean_with_long_value_not_cached.json
  72. + 1 - 1  src/test/java/org/elasticsearch/index/search/child/AbstractChildTests.java
  73. + 6 - 5  src/test/java/org/elasticsearch/indices/stats/IndexStatsTests.java
  74. + 17 - 1  src/test/java/org/elasticsearch/search/child/SimpleChildQuerySearchTests.java
  75. + 0 - 81  src/test/java/org/elasticsearch/search/query/SimpleQueryTests.java
  76. + 13 - 5  src/test/java/org/elasticsearch/search/scriptfilter/ScriptFilterSearchTests.java
  77. + 0 - 5  src/test/java/org/elasticsearch/test/CompositeTestCluster.java
  78. + 0 - 5  src/test/java/org/elasticsearch/test/ExternalTestCluster.java
  79. + 42 - 14  src/test/java/org/elasticsearch/test/InternalTestCluster.java
  80. + 0 - 5  src/test/java/org/elasticsearch/test/TestCluster.java
  81. + 45 - 14  src/test/java/org/elasticsearch/validate/SimpleValidateQueryTests.java

+ 6 - 1
docs/reference/query-dsl/filters.asciidoc

@@ -42,7 +42,12 @@ The last type of filters are those working with other filters. The
 cached as they basically just manipulate the internal filters.
 
 All filters allow to set `_cache` element on them to explicitly control
-caching. They also allow to set `_cache_key` which will be used as the
+caching. It accepts 3 values: `true` in order to cache the filter, `false`
+to make sure that the filter will not be cached, and `auto`, which is the
+default and will decide on whether to cache the filter based on the cost
+to cache the filter and how often the filter has been used.
+
+Filters also allow to set `_cache_key` which will be used as the
 caching key for that filter. This can be handy when using very large
 filters (like a terms filter with many elements in it).
 

+ 4 - 4
docs/reference/query-dsl/filters/and-filter.asciidoc

@@ -33,10 +33,10 @@ filters. Can be placed within queries that accept a filter.
 [float]
 ==== Caching
 
-The result of the filter is not cached by default. The `_cache` can be
-set to `true` in order to cache it (though usually not needed). Since
-the `_cache` element requires to be set on the `and` filter itself, the
-structure then changes a bit to have the filters provided within a
+The result of the filter is only cached by default if there is evidence of
+reuse. It is possible to opt-in explicitely for caching by setting `_cache`
+to `true`. Since the `_cache` element requires to be set on the `and` filter
+itself, the structure then changes a bit to have the filters provided within a
 `filters` element:
 
 [source,js]

+ 0 - 6
docs/reference/query-dsl/filters/bool-filter.asciidoc

@@ -41,9 +41,3 @@ accept a filter.
 }    
 --------------------------------------------------
 
-[float]
-==== Caching
-
-The result of the `bool` filter is not cached by default (though
-internal filters might be). The `_cache` can be set to `true` in order
-to enable caching.

+ 0 - 5
docs/reference/query-dsl/filters/exists-filter.asciidoc

@@ -74,8 +74,3 @@ no values in the `user` field and thus would not match the `exists` filter:
 { "foo": "bar" }
 --------------------------------------------------
 
-
-[float]
-==== Caching
-
-The result of the filter is always cached.

+ 0 - 6
docs/reference/query-dsl/filters/missing-filter.asciidoc

@@ -130,9 +130,3 @@ When set to `false` (the default), these documents will not be included.
 --
 
 NOTE: Either `existence` or `null_value` or both must be set to `true`.
-
-
-[float]
-==== Caching
-
-The result of the filter is always cached.

+ 3 - 3
docs/reference/query-dsl/filters/not-filter.asciidoc

@@ -53,9 +53,9 @@ Or, in a longer form with a `filter` element:
 [float]
 ==== Caching
 
-The result of the filter is not cached by default. The `_cache` can be
-set to `true` in order to cache it (though usually not needed). Here is
-an example:
+The result of the filter is only cached if there is evidence of reuse.
+The `_cache` can be set to `true` in order to cache it (though usually
+not needed). Here is an example:
 
 [source,js]
 --------------------------------------------------

+ 2 - 1
docs/reference/query-dsl/filters/or-filter.asciidoc

@@ -28,7 +28,8 @@ filters. Can be placed within queries that accept a filter.
 [float]
 ==== Caching
 
-The result of the filter is not cached by default. The `_cache` can be
+The result of the filter is only cached by default if there is evidence
+of reuse. The `_cache` can be
 set to `true` in order to cache it (though usually not needed). Since
 the `_cache` element requires to be set on the `or` filter itself, the
 structure then changes a bit to have the filters provided within a

+ 3 - 3
docs/reference/query-dsl/filters/prefix-filter.asciidoc

@@ -19,8 +19,8 @@ a filter. Can be placed within queries that accept a filter.
 [float]
 ==== Caching
 
-The result of the filter is cached by default. The `_cache` can be set
-to `false` in order not to cache it. Here is an example:
+The result of the filter is cached by default if there is evidence of reuse.
+The `_cache` can be set to `true` in order to cache it. Here is an example:
 
 [source,js]
 --------------------------------------------------
@@ -29,7 +29,7 @@ to `false` in order not to cache it. Here is an example:
         "filter" : {
             "prefix" : { 
                 "user" : "ki",
-                "_cache" : false
+                "_cache" : true
             }
         }
     }

+ 3 - 1
docs/reference/query-dsl/filters/query-filter.asciidoc

@@ -22,7 +22,9 @@ that accept a filter.
 [float]
 ==== Caching
 
-The result of the filter is not cached by default. The `_cache` can be
+The result of the filter is only cached by default if there is evidence of reuse.
+
+The `_cache` can be
 set to `true` to cache the *result* of the filter. This is handy when
 the same query is used on several (many) other queries. Note, the
 process of caching the first execution is higher when not caching (since

+ 3 - 3
docs/reference/query-dsl/filters/range-filter.asciidoc

@@ -98,8 +98,8 @@ you're already aggregating or sorting by.
 [float]
 ==== Caching
 
-The result of the filter is only automatically cached by default if the `execution` is set to `index`. The
+The result of the filter is only cached by default if there is evidence of reuse. The
 `_cache` can be set to `false` to turn it off.
 
-If the `now` date math expression is used without rounding then a range filter will never be cached even if `_cache` is
-set to `true`. Also any filter that wraps this filter will never be cached.
+Having the `now` expression used without rounding will make the filter unlikely to be
+cached since reuse is very unlikely.

+ 2 - 2
docs/reference/query-dsl/filters/term-filter.asciidoc

@@ -20,8 +20,8 @@ accept a filter, for example:
 [float]
 ==== Caching
 
-The result of the filter is automatically cached by default. The
-`_cache` can be set to `false` to turn it off. Here is an example:
+The result of the filter is only cached by default if there is evidence of reuse.
+The `_cache` can be set to `false` to turn it off. Here is an example:
 
 [source,js]
 --------------------------------------------------

+ 3 - 2
docs/reference/query-dsl/filters/terms-filter.asciidoc

@@ -86,8 +86,9 @@ For example:
 [float]
 ==== Caching
 
-The result of the filter is automatically cached by default. The
-`_cache` can be set to `false` to turn it off.
+The result of the filter is cached if there is evidence of reuse. It is
+possible to enable caching explicitely by setting `_cache` to `true` and
+to disable caching by setting `_cache` to `false`.
 
 [float]
 ==== Terms lookup mechanism

+ 103 - 0
src/main/java/org/elasticsearch/index/cache/filter/AutoFilterCachingPolicy.java

@@ -0,0 +1,103 @@
+/*
+ * Licensed to Elasticsearch under one or more contributor
+ * license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright
+ * ownership. Elasticsearch licenses this file to you under
+ * the Apache License, Version 2.0 (the "License"); you may
+ * not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied.  See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+package org.elasticsearch.index.cache.filter;
+
+import org.apache.lucene.index.LeafReaderContext;
+import org.apache.lucene.search.DocIdSet;
+import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
+import org.apache.lucene.search.UsageTrackingFilterCachingPolicy;
+import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.docset.DocIdSets;
+import org.elasticsearch.common.settings.ImmutableSettings;
+import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.index.AbstractIndexComponent;
+import org.elasticsearch.index.Index;
+import org.elasticsearch.index.settings.IndexSettings;
+
+import java.io.IOException;
+
+/**
+ * This class is a wrapper around {@link UsageTrackingFilterCachingPolicy}
+ * which wires parameters through index settings and makes sure to not
+ * cache {@link DocIdSet}s which have a {@link DocIdSets#isBroken(DocIdSetIterator) broken}
+ * iterator.
+ */
+public class AutoFilterCachingPolicy extends AbstractIndexComponent implements FilterCachingPolicy {
+
+    // These settings don't have the purpose of being documented. They are only here so that
+    // if anyone ever hits an issue with elasticsearch that is due to the value of one of these
+    // parameters, then it might be possible to temporarily work around the issue without having
+    // to wait for a new release
+
+    // number of times a filter that is expensive to compute should be seen before the doc id sets are cached
+    public static final String MIN_FREQUENCY_COSTLY = "index.cache.filter.policy.min_frequency.costly";
+    // number of times a filter that produces cacheable filters should be seen before the doc id sets are cached
+    public static final String MIN_FREQUENCY_CACHEABLE = "index.cache.filter.policy.min_frequency.cacheable";
+    // same for filters that produce doc id sets that are not directly cacheable
+    public static final String MIN_FREQUENCY_OTHER = "index.cache.filter.policy.min_frequency.other";
+    // sources of segments that should be cached
+    public static final String MIN_SEGMENT_SIZE_RATIO = "index.cache.filter.policy.min_segment_size_ratio";
+    // size of the history to keep for filters. A filter will be cached if it has been seen more than a given
+    // number of times (depending on the filter, the segment and the produced DocIdSet) in the most
+    // ${history_size} recently used filters
+    public static final String HISTORY_SIZE = "index.cache.filter.policy.history_size";
+
+    public static Settings AGGRESSIVE_CACHING_SETTINGS = ImmutableSettings.builder()
+            .put(MIN_FREQUENCY_CACHEABLE, 1)
+            .put(MIN_FREQUENCY_COSTLY, 1)
+            .put(MIN_FREQUENCY_OTHER, 1)
+            .put(MIN_SEGMENT_SIZE_RATIO, 0.000000001f)
+            .build();
+
+    private final FilterCachingPolicy in;
+
+    @Inject
+    public AutoFilterCachingPolicy(Index index, @IndexSettings Settings indexSettings) {
+        super(index, indexSettings);
+        final int historySize = indexSettings.getAsInt(HISTORY_SIZE, 1000);
+        // cache aggressively filters that produce sets that are already cacheable,
+        // ie. if the filter has been used twice or more among the most 1000 recently
+        // used filters
+        final int minFrequencyCacheable = indexSettings.getAsInt(MIN_FREQUENCY_CACHEABLE, 2);
+        // cache aggressively filters whose getDocIdSet method is costly
+        final int minFrequencyCostly = indexSettings.getAsInt(MIN_FREQUENCY_COSTLY, 2);
+        // be a bit less aggressive when the produced doc id sets are not cacheable
+        final int minFrequencyOther = indexSettings.getAsInt(MIN_FREQUENCY_OTHER, 5);
+        final float minSegmentSizeRatio = indexSettings.getAsFloat(MIN_SEGMENT_SIZE_RATIO, 0.01f);
+        in = new UsageTrackingFilterCachingPolicy(minSegmentSizeRatio, historySize, minFrequencyCostly, minFrequencyCacheable, minFrequencyOther);
+    }
+
+    @Override
+    public void onUse(Filter filter) {
+        in.onUse(filter);
+    }
+
+    @Override
+    public boolean shouldCache(Filter filter, LeafReaderContext context, DocIdSet set) throws IOException {
+        if (set != null && DocIdSets.isBroken(set.iterator())) {
+            // O(maxDoc) to cache, no thanks.
+            return false;
+        }
+
+        return in.shouldCache(filter, context, set);
+    }
+
+}
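
The comment in the class above notes that these settings are not meant to be documented.
Purely to illustrate where they would live, they could be overridden through the index
settings, for instance at index creation time (hypothetical index, arbitrary values):

[source,js]
--------------------------------------------------
{
    "settings" : {
        "index.cache.filter.policy.history_size" : 2000,
        "index.cache.filter.policy.min_frequency.cacheable" : 3,
        "index.cache.filter.policy.min_frequency.costly" : 3,
        "index.cache.filter.policy.min_frequency.other" : 10,
        "index.cache.filter.policy.min_segment_size_ratio" : 0.01
    }
}
--------------------------------------------------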

+ 4 - 1
src/main/java/org/elasticsearch/index/cache/filter/FilterCache.java

@@ -20,7 +20,10 @@
 package org.elasticsearch.index.cache.filter;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
+import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.component.CloseableComponent;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.index.IndexComponent;
 import org.elasticsearch.index.IndexService;
 
@@ -44,7 +47,7 @@ public interface FilterCache extends IndexComponent, CloseableComponent {
 
     String type();
 
-    Filter cache(Filter filterToCache);
+    Filter cache(Filter filterToCache, @Nullable HashedBytesRef cacheKey, FilterCachingPolicy policy);
 
     void clear(Object reader);
 

+ 6 - 0
src/main/java/org/elasticsearch/index/cache/filter/FilterCacheModule.java

@@ -19,6 +19,7 @@
 
 package org.elasticsearch.index.cache.filter;
 
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.inject.AbstractModule;
 import org.elasticsearch.common.inject.Scopes;
 import org.elasticsearch.common.settings.Settings;
@@ -44,5 +45,10 @@ public class FilterCacheModule extends AbstractModule {
         bind(FilterCache.class)
                 .to(settings.getAsClass(FilterCacheSettings.FILTER_CACHE_TYPE, WeightedFilterCache.class, "org.elasticsearch.index.cache.filter.", "FilterCache"))
                 .in(Scopes.SINGLETON);
+        // the filter cache is a node-level thing, however we want the most popular filters
+        // to be computed on a per-index basis, that is why we don't use the SINGLETON
+        // scope below
+        bind(FilterCachingPolicy.class)
+                .to(AutoFilterCachingPolicy.class);
     }
 }

+ 6 - 3
src/main/java/org/elasticsearch/index/cache/filter/none/NoneFilterCache.java

@@ -20,12 +20,15 @@
 package org.elasticsearch.index.cache.filter.none;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
+import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.index.AbstractIndexComponent;
 import org.elasticsearch.index.Index;
-import org.elasticsearch.index.cache.filter.FilterCache;
 import org.elasticsearch.index.IndexService;
+import org.elasticsearch.index.cache.filter.FilterCache;
 import org.elasticsearch.index.settings.IndexSettings;
 
 /**
@@ -55,7 +58,7 @@ public class NoneFilterCache extends AbstractIndexComponent implements FilterCac
     }
 
     @Override
-    public Filter cache(Filter filterToCache) {
+    public Filter cache(Filter filterToCache, @Nullable HashedBytesRef cacheKey, FilterCachingPolicy policy) {
         return filterToCache;
     }
 
@@ -73,4 +76,4 @@ public class NoneFilterCache extends AbstractIndexComponent implements FilterCac
     public void clear(Object reader) {
         // nothing to do here
     }
-}
+}

+ 0 - 114
src/main/java/org/elasticsearch/index/cache/filter/support/CacheKeyFilter.java

@@ -1,114 +0,0 @@
-/*
- * Licensed to Elasticsearch under one or more contributor
- * license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright
- * ownership. Elasticsearch licenses this file to you under
- * the Apache License, Version 2.0 (the "License"); you may
- * not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.elasticsearch.index.cache.filter.support;
-
-import org.apache.lucene.index.LeafReaderContext;
-import org.apache.lucene.search.DocIdSet;
-import org.apache.lucene.search.Filter;
-import org.apache.lucene.util.Bits;
-import org.elasticsearch.common.Strings;
-
-import java.io.IOException;
-import java.util.Arrays;
-
-public interface CacheKeyFilter {
-
-    public static class Key {
-
-        private final byte[] bytes;
-
-        // we pre-compute the hashCode for better performance (especially in IdCache)
-        private final int hashCode;
-
-        public Key(byte[] bytes) {
-            this.bytes = bytes;
-            this.hashCode = Arrays.hashCode(bytes);
-        }
-
-        public Key(String str) {
-            this(Strings.toUTF8Bytes(str));
-        }
-
-        public byte[] bytes() {
-            return this.bytes;
-        }
-
-        @Override
-        public boolean equals(Object o) {
-            if (this == o) return true;
-            if (o.getClass() != this.getClass()) {
-                return false;
-            }
-            Key bytesWrap = (Key) o;
-            return Arrays.equals(bytes, bytesWrap.bytes);
-        }
-
-        @Override
-        public int hashCode() {
-            return hashCode;
-        }
-    }
-
-    public static class Wrapper extends Filter implements CacheKeyFilter {
-
-        private final Filter filter;
-
-        private final Key key;
-
-        public Wrapper(Filter filter, Key key) {
-            this.filter = filter;
-            this.key = key;
-        }
-
-        @Override
-        public Key cacheKey() {
-            return key;
-        }
-
-        public Filter wrappedFilter() {
-            return filter;
-        }
-
-        @Override
-        public DocIdSet getDocIdSet(LeafReaderContext context, Bits acceptDocs) throws IOException {
-            return filter.getDocIdSet(context, acceptDocs);
-        }
-
-        @Override
-        public int hashCode() {
-            return filter.hashCode();
-        }
-
-        @Override
-        public boolean equals(Object obj) {
-            if (obj instanceof Wrapper == false) {
-                return false;
-            }
-            return filter.equals(((Wrapper) obj).filter);
-        }
-
-        @Override
-        public String toString() {
-            return filter.toString();
-        }
-    }
-
-    Object cacheKey();
-}

+ 0 - 33
src/main/java/org/elasticsearch/index/cache/filter/support/FilterCacheValue.java

@@ -1,33 +0,0 @@
-/*
- * Licensed to Elasticsearch under one or more contributor
- * license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright
- * ownership. Elasticsearch licenses this file to you under
- * the Apache License, Version 2.0 (the "License"); you may
- * not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.elasticsearch.index.cache.filter.support;
-
-public class FilterCacheValue<T> {
-
-    private final T value;
-
-    public FilterCacheValue(T value) {
-        this.value = value;
-    }
-
-    public T value() {
-        return value;
-    }
-}

+ 52 - 38
src/main/java/org/elasticsearch/index/cache/filter/weighted/WeightedFilterCache.java

@@ -22,32 +22,33 @@ package org.elasticsearch.index.cache.filter.weighted;
 import com.google.common.cache.Cache;
 import com.google.common.cache.RemovalListener;
 import com.google.common.cache.Weigher;
+
 import org.apache.lucene.index.IndexReader;
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.SegmentReader;
 import org.apache.lucene.search.BitsFilteredDocIdSet;
 import org.apache.lucene.search.DocIdSet;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.util.Bits;
-import org.apache.lucene.util.BytesRefBuilder;
 import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.common.Nullable;
-import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.docset.DocIdSets;
 import org.elasticsearch.common.lucene.search.CachedFilter;
 import org.elasticsearch.common.lucene.search.NoCacheFilter;
+import org.elasticsearch.common.lucene.search.ResolvableFilter;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.util.concurrent.ConcurrentCollections;
 import org.elasticsearch.index.AbstractIndexComponent;
 import org.elasticsearch.index.Index;
-import org.elasticsearch.index.cache.filter.FilterCache;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.IndexService;
+import org.elasticsearch.index.cache.filter.FilterCache;
 import org.elasticsearch.index.settings.IndexSettings;
+import org.elasticsearch.index.shard.IndexShard;
 import org.elasticsearch.index.shard.ShardId;
 import org.elasticsearch.index.shard.ShardUtils;
-import org.elasticsearch.index.shard.IndexShard;
 import org.elasticsearch.indices.cache.filter.IndicesFilterCache;
 
 import java.io.IOException;
@@ -102,11 +103,10 @@ public class WeightedFilterCache extends AbstractIndexComponent implements Filte
     @Override
     public void clear(String reason, String[] keys) {
         logger.debug("clear keys [], reason [{}]", reason, keys);
-        final BytesRefBuilder spare = new BytesRefBuilder();
         for (String key : keys) {
-            final byte[] keyBytes = Strings.toUTF8Bytes(key, spare);
+            final HashedBytesRef keyBytes = new HashedBytesRef(key);
             for (Object readerKey : seenReaders.keySet()) {
-                indicesFilterCache.cache().invalidate(new FilterCacheKey(readerKey, new CacheKeyFilter.Key(keyBytes)));
+                indicesFilterCache.cache().invalidate(new FilterCacheKey(readerKey, keyBytes));
             }
         }
     }
@@ -128,7 +128,7 @@ public class WeightedFilterCache extends AbstractIndexComponent implements Filte
     }
 
     @Override
-    public Filter cache(Filter filterToCache) {
+    public Filter cache(Filter filterToCache, @Nullable HashedBytesRef cacheKey, FilterCachingPolicy cachePolicy) {
         if (filterToCache == null) {
             return null;
         }
@@ -138,57 +138,71 @@ public class WeightedFilterCache extends AbstractIndexComponent implements Filte
         if (CachedFilter.isCached(filterToCache)) {
             return filterToCache;
         }
-        return new FilterCacheFilterWrapper(filterToCache, this);
+        if (filterToCache instanceof ResolvableFilter) {
+            throw new IllegalArgumentException("Cannot cache instances of ResolvableFilter: " + filterToCache);
+        }
+        return new FilterCacheFilterWrapper(filterToCache, cacheKey, cachePolicy, this);
     }
 
     static class FilterCacheFilterWrapper extends CachedFilter {
 
         private final Filter filter;
-
+        private final Object filterCacheKey;
+        private final FilterCachingPolicy cachePolicy;
         private final WeightedFilterCache cache;
 
-        FilterCacheFilterWrapper(Filter filter, WeightedFilterCache cache) {
+        FilterCacheFilterWrapper(Filter filter, Object cacheKey, FilterCachingPolicy cachePolicy, WeightedFilterCache cache) {
             this.filter = filter;
+            this.filterCacheKey = cacheKey != null ? cacheKey : filter;
+            this.cachePolicy = cachePolicy;
             this.cache = cache;
         }
 
-
         @Override
         public DocIdSet getDocIdSet(LeafReaderContext context, Bits acceptDocs) throws IOException {
-            Object filterKey = filter;
-            if (filter instanceof CacheKeyFilter) {
-                filterKey = ((CacheKeyFilter) filter).cacheKey();
+            if (context.ord == 0) {
+                cachePolicy.onUse(filter);
             }
-            FilterCacheKey cacheKey = new FilterCacheKey(context.reader().getCoreCacheKey(), filterKey);
+            FilterCacheKey cacheKey = new FilterCacheKey(context.reader().getCoreCacheKey(), filterCacheKey);
             Cache<FilterCacheKey, DocIdSet> innerCache = cache.indicesFilterCache.cache();
 
             DocIdSet cacheValue = innerCache.getIfPresent(cacheKey);
-            if (cacheValue == null) {
-                if (!cache.seenReaders.containsKey(context.reader().getCoreCacheKey())) {
-                    Boolean previous = cache.seenReaders.putIfAbsent(context.reader().getCoreCacheKey(), Boolean.TRUE);
-                    if (previous == null) {
-                        // we add a core closed listener only, for non core IndexReaders we rely on clear being called (percolator for example)
-                        context.reader().addCoreClosedListener(cache);
+            final DocIdSet ret;
+            if (cacheValue != null) {
+                ret = cacheValue;
+            } else {
+                final DocIdSet uncached = filter.getDocIdSet(context, null);
+                if (cachePolicy.shouldCache(filter, context, uncached)) {
+                    if (!cache.seenReaders.containsKey(context.reader().getCoreCacheKey())) {
+                        Boolean previous = cache.seenReaders.putIfAbsent(context.reader().getCoreCacheKey(), Boolean.TRUE);
+                        if (previous == null) {
+                            // we add a core closed listener only, for non core IndexReaders we rely on clear being called (percolator for example)
+                            context.reader().addCoreClosedListener(cache);
+                        }
                     }
-                }
-                // we can't pass down acceptedDocs provided, because we are caching the result, and acceptedDocs
-                // might be specific to a query. We don't pass the live docs either because a cache built for a specific
-                // generation of a segment might be reused by an older generation which has fewer deleted documents
-                cacheValue = DocIdSets.toCacheable(context.reader(), filter.getDocIdSet(context, null));
-                // we might put the same one concurrently, that's fine, it will be replaced and the removal
-                // will be called
-                ShardId shardId = ShardUtils.extractShardId(context.reader());
-                if (shardId != null) {
-                    IndexShard shard = cache.indexService.shard(shardId.id());
-                    if (shard != null) {
-                        cacheKey.removalListener = shard.filterCache();
-                        shard.filterCache().onCached(DocIdSets.sizeInBytes(cacheValue));
+                    // we can't pass down acceptedDocs provided, because we are caching the result, and acceptedDocs
+                    // might be specific to a query. We don't pass the live docs either because a cache built for a specific
+                    // generation of a segment might be reused by an older generation which has fewer deleted documents
+                    cacheValue = DocIdSets.toCacheable(context.reader(), uncached);
+                    // we might put the same one concurrently, that's fine, it will be replaced and the removal
+                    // will be called
+                    ShardId shardId = ShardUtils.extractShardId(context.reader());
+                    if (shardId != null) {
+                        IndexShard shard = cache.indexService.shard(shardId.id());
+                        if (shard != null) {
+                            cacheKey.removalListener = shard.filterCache();
+                            shard.filterCache().onCached(DocIdSets.sizeInBytes(cacheValue));
+                        }
                     }
+                    innerCache.put(cacheKey, cacheValue);
+                    ret = cacheValue;
+                } else {
+                    // uncached
+                    ret = uncached;
                 }
-                innerCache.put(cacheKey, cacheValue);
             }
 
-            return BitsFilteredDocIdSet.wrap(DocIdSets.isEmpty(cacheValue) ? null : cacheValue, acceptDocs);
+            return BitsFilteredDocIdSet.wrap(DocIdSets.isEmpty(ret) ? null : ret, acceptDocs);
         }
 
         public String toString() {

+ 16 - 62
src/main/java/org/elasticsearch/index/mapper/core/DateFieldMapper.java

@@ -39,7 +39,6 @@ import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.joda.DateMathParser;
 import org.elasticsearch.common.joda.FormatDateTimeFormatter;
 import org.elasticsearch.common.joda.Joda;
-import org.elasticsearch.common.lucene.search.NoCacheFilter;
 import org.elasticsearch.common.lucene.search.NoCacheQuery;
 import org.elasticsearch.common.lucene.search.ResolvableFilter;
 import org.elasticsearch.common.settings.Settings;
@@ -52,7 +51,11 @@ import org.elasticsearch.index.codec.docvaluesformat.DocValuesFormatProvider;
 import org.elasticsearch.index.codec.postingsformat.PostingsFormatProvider;
 import org.elasticsearch.index.fielddata.FieldDataType;
 import org.elasticsearch.index.fielddata.IndexNumericFieldData;
-import org.elasticsearch.index.mapper.*;
+import org.elasticsearch.index.mapper.Mapper;
+import org.elasticsearch.index.mapper.MapperParsingException;
+import org.elasticsearch.index.mapper.MergeContext;
+import org.elasticsearch.index.mapper.MergeMappingException;
+import org.elasticsearch.index.mapper.ParseContext;
 import org.elasticsearch.index.mapper.core.LongFieldMapper.CustomLongNumericField;
 import org.elasticsearch.index.query.QueryParseContext;
 import org.elasticsearch.index.search.NumericRangeFieldDataFilter;
@@ -342,16 +345,16 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
 
     @Override
     public Filter rangeFilter(Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable QueryParseContext context) {
-        return rangeFilter(lowerTerm, upperTerm, includeLower, includeUpper, null, null, context, null);
+        return rangeFilter(lowerTerm, upperTerm, includeLower, includeUpper, null, null, context);
     }
 
-    public Filter rangeFilter(Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser, @Nullable QueryParseContext context, @Nullable Boolean explicitCaching) {
-        return rangeFilter(null, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser, context, explicitCaching);
+    public Filter rangeFilter(Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser, @Nullable QueryParseContext context) {
+        return rangeFilter(null, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser, context);
     }
 
     @Override
     public Filter rangeFilter(QueryParseContext parseContext, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable QueryParseContext context) {
-        return rangeFilter(parseContext, lowerTerm, upperTerm, includeLower, includeUpper, null, null, context, null);
+        return rangeFilter(parseContext, lowerTerm, upperTerm, includeLower, includeUpper, null, null, context);
     }
 
     /*
@@ -360,19 +363,17 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
      * - the object to parse is a String (does not apply to ms since epoch which are UTC based time values)
      * - the String to parse does not have already a timezone defined (ie. `2014-01-01T00:00:00+03:00`)
      */
-    public Filter rangeFilter(QueryParseContext parseContext, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser, @Nullable QueryParseContext context, @Nullable Boolean explicitCaching) {
+    public Filter rangeFilter(QueryParseContext parseContext, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser, @Nullable QueryParseContext context) {
         IndexNumericFieldData fieldData = parseContext != null ? (IndexNumericFieldData) parseContext.getForField(this) : null;
         // If the current search context is null we're parsing percolator query or a index alias filter.
         if (SearchContext.current() == null) {
-            return new LateParsingFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser, explicitCaching);
+            return new LateParsingFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser);
         } else {
-            return innerRangeFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser, explicitCaching);
+            return innerRangeFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser);
         }
     }
 
-    private Filter innerRangeFilter(IndexNumericFieldData fieldData, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser, @Nullable Boolean explicitCaching) {
-        boolean cache;
-        boolean cacheable = true;
+    private Filter innerRangeFilter(IndexNumericFieldData fieldData, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, @Nullable DateTimeZone timeZone, @Nullable DateMathParser forcedDateParser) {
         Long lowerVal = null;
         Long upperVal = null;
         if (lowerTerm != null) {
@@ -380,7 +381,6 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
                 lowerVal = ((Number) lowerTerm).longValue();
             } else {
                 String value = convertToString(lowerTerm);
-                cacheable = !hasDateExpressionWithNoRounding(value);
                 lowerVal = parseToMilliseconds(value, false, timeZone, forcedDateParser);
             }
         }
@@ -389,21 +389,10 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
                 upperVal = ((Number) upperTerm).longValue();
             } else {
                 String value = convertToString(upperTerm);
-                cacheable = cacheable && !hasDateExpressionWithNoRounding(value);
                 upperVal = parseToMilliseconds(value, includeUpper, timeZone, forcedDateParser);
             }
         }
 
-        if (explicitCaching != null) {
-            if (explicitCaching) {
-                cache = cacheable;
-            } else {
-                cache = false;
-            }
-        } else {
-            cache = cacheable;
-        }
-
         Filter filter;
         if (fieldData != null) {
             filter = NumericRangeFieldDataFilter.newLongRange(fieldData, lowerVal,upperVal, includeLower, includeUpper);
@@ -413,40 +402,7 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
             );
         }
 
-        if (!cache) {
-            // We don't cache range filter if `now` date expression is used and also when a compound filter wraps
-            // a range filter with a `now` date expressions.
-            return NoCacheFilter.wrap(filter);
-        } else {
-            return filter;
-        }
-    }
-
-    private boolean hasDateExpressionWithNoRounding(String value) {
-        int index = value.indexOf("now");
-        if (index != -1) {
-            if (value.length() == 3) {
-                return true;
-            } else {
-                int indexOfPotentialRounding = index + 3;
-                if (indexOfPotentialRounding >= value.length()) {
-                    return true;
-                } else {
-                    char potentialRoundingChar;
-                    do {
-                        potentialRoundingChar = value.charAt(indexOfPotentialRounding++);
-                        if (potentialRoundingChar == '/') {
-                            return false; // We found the rounding char, so we shouldn't forcefully disable caching
-                        } else if (potentialRoundingChar == ' ') {
-                            return true; // Next token in the date math expression and no rounding found, so we should not cache.
-                        }
-                    } while (indexOfPotentialRounding < value.length());
-                    return true; // Couldn't find rounding char, so we should not cache
-                }
-            }
-        } else {
-            return false;
-        }
+        return filter;
     }
 
     @Override
@@ -606,9 +562,8 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
         final boolean includeUpper;
         final DateTimeZone timeZone;
         final DateMathParser forcedDateParser;
-        final Boolean explicitCaching;
 
-        public LateParsingFilter(IndexNumericFieldData fieldData, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, DateTimeZone timeZone, DateMathParser forcedDateParser, Boolean explicitCaching) {
+        public LateParsingFilter(IndexNumericFieldData fieldData, Object lowerTerm, Object upperTerm, boolean includeLower, boolean includeUpper, DateTimeZone timeZone, DateMathParser forcedDateParser) {
             this.fieldData = fieldData;
             this.lowerTerm = lowerTerm;
             this.upperTerm = upperTerm;
@@ -616,12 +571,11 @@ public class DateFieldMapper extends NumberFieldMapper<Long> {
             this.includeUpper = includeUpper;
             this.timeZone = timeZone;
             this.forcedDateParser = forcedDateParser;
-            this.explicitCaching = explicitCaching;
         }
 
         @Override
         public Filter resolve() {
-            return innerRangeFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser, explicitCaching);
+            return innerRangeFilter(fieldData, lowerTerm, upperTerm, includeLower, includeUpper, timeZone, forcedDateParser);
         }
     }
 

+ 1 - 1
src/main/java/org/elasticsearch/index/mapper/internal/TypeFieldMapper.java

@@ -143,7 +143,7 @@ public class TypeFieldMapper extends AbstractFieldMapper<String> implements Inte
 
     @Override
     public Query termQuery(Object value, @Nullable QueryParseContext context) {
-        return new ConstantScoreQuery(context.cacheFilter(termFilter(value, context), null));
+        return new ConstantScoreQuery(context.cacheFilter(termFilter(value, context), null, context.autoFilterCachePolicy()));
     }
 
     @Override

+ 3 - 1
src/main/java/org/elasticsearch/index/percolator/PercolatorQueriesRegistry.java

@@ -270,7 +270,9 @@ public class PercolatorQueriesRegistry extends AbstractIndexShardComponent {
             try (Engine.Searcher searcher = shard.acquireSearcher("percolator_load_queries", true)) {
                 Query query = new ConstantScoreQuery(
                         indexCache.filter().cache(
-                                new TermFilter(new Term(TypeFieldMapper.NAME, PercolatorService.TYPE_NAME))
+                                new TermFilter(new Term(TypeFieldMapper.NAME, PercolatorService.TYPE_NAME)),
+                                null,
+                                queryParserService.autoFilterCachePolicy()
                         )
                 );
                 QueriesLoaderCollector queryCollector = new QueriesLoaderCollector(PercolatorQueriesRegistry.this, logger, mapperService, indexFieldDataService);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/AndFilterParser.java

@@ -20,10 +20,11 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.AndFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -53,8 +54,8 @@ public class AndFilterParser implements FilterParser {
         ArrayList<Filter> filters = newArrayList();
         boolean filtersFound = false;
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
 
         String filterName = null;
         String currentFieldName = null;
@@ -91,11 +92,11 @@ public class AndFilterParser implements FilterParser {
                     }
                 } else if (token.isValue()) {
                     if ("_cache".equals(currentFieldName)) {
-                        cache = parser.booleanValue();
+                        cache = parseContext.parseFilterCachePolicy();
                     } else if ("_name".equals(currentFieldName)) {
                         filterName = parser.text();
                     } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                        cacheKey = new CacheKeyFilter.Key(parser.text());
+                        cacheKey = new HashedBytesRef(parser.text());
                     } else {
                         throw new QueryParsingException(parseContext.index(), "[and] filter does not support [" + currentFieldName + "]");
                     }
@@ -114,8 +115,8 @@ public class AndFilterParser implements FilterParser {
 
         // no need to cache this one
         Filter filter = new AndFilter(filters);
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, filter);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/BoolFilterParser.java

@@ -22,10 +22,11 @@ package org.elasticsearch.index.query;
 import org.apache.lucene.queries.FilterClause;
 import org.apache.lucene.search.BooleanClause;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.XBooleanFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 
@@ -51,8 +52,8 @@ public class BoolFilterParser implements FilterParser {
 
         XBooleanFilter boolFilter = new XBooleanFilter();
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
 
         String filterName = null;
         String currentFieldName = null;
@@ -115,11 +116,11 @@ public class BoolFilterParser implements FilterParser {
                 }
             } else if (token.isValue()) {
                 if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[bool] filter does not support [" + currentFieldName + "]");
                 }
@@ -136,8 +137,8 @@ public class BoolFilterParser implements FilterParser {
         }
 
         Filter filter = boolFilter;
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, filter);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/ConstantScoreQueryParser.java

@@ -21,11 +21,12 @@ package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.ConstantScoreQuery;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.Query;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 
@@ -54,8 +55,8 @@ public class ConstantScoreQueryParser implements QueryParser {
         Query query = null;
         boolean queryFound = false;
         float boost = 1.0f;
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
 
         String currentFieldName = null;
         XContentParser.Token token;
@@ -76,9 +77,9 @@ public class ConstantScoreQueryParser implements QueryParser {
                 if ("boost".equals(currentFieldName)) {
                     boost = parser.floatValue();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[constant_score] query does not support [" + currentFieldName + "]");
                 }
@@ -94,8 +95,8 @@ public class ConstantScoreQueryParser implements QueryParser {
 
         if (filter != null) {
             // cache the filter if possible needed
-            if (cache) {
-                filter = parseContext.cacheFilter(filter, cacheKey);
+            if (cache != null) {
+                filter = parseContext.cacheFilter(filter, cacheKey, cache);
             }
 
             Query query1 = new ConstantScoreQuery(filter);

+ 2 - 2
src/main/java/org/elasticsearch/index/query/ExistsFilterParser.java

@@ -24,10 +24,10 @@ import org.apache.lucene.search.BooleanClause;
 import org.apache.lucene.search.Filter;
 import org.apache.lucene.search.TermRangeFilter;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.lucene.search.XBooleanFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMappers;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.internal.FieldNamesFieldMapper;
@@ -127,7 +127,7 @@ public class ExistsFilterParser implements FilterParser {
 
         // we always cache this one, really does not change... (exists)
         // it is ok to cache under the fieldName cacheKey, since it is per segment and the mapping applies to this data on this segment...
-        Filter filter = parseContext.cacheFilter(boolFilter, new CacheKeyFilter.Key("$exists$" + fieldPattern));
+        Filter filter = parseContext.cacheFilter(boolFilter, new HashedBytesRef("$exists$" + fieldPattern), parseContext.autoFilterCachePolicy());
 
         filter = wrapSmartNameFilter(filter, nonNullFieldMappers, parseContext);
         if (filterName != null) {

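The exists filter still caches unconditionally, but now through the three-argument `cacheFilter`. A minimal sketch of caching under a synthetic per-field key; the `$exists$` prefix and the call are from the hunk above, the rest is illustrative:

    // the explicit HashedBytesRef key lets all exists filters on the same field
    // pattern share one per-segment cache entry
    Filter cached = parseContext.cacheFilter(
            boolFilter,                                        // the XBooleanFilter built above
            new HashedBytesRef("$exists$" + fieldPattern),     // synthetic per-field cache key
            parseContext.autoFilterCachePolicy());
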
+ 8 - 7
src/main/java/org/elasticsearch/index/query/FQueryFilterParser.java

@@ -20,11 +20,12 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.Query;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 
@@ -51,8 +52,8 @@ public class FQueryFilterParser implements FilterParser {
 
         Query query = null;
         boolean queryFound = false;
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
 
         String filterName = null;
         String currentFieldName = null;
@@ -71,9 +72,9 @@ public class FQueryFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[fquery] filter does not support [" + currentFieldName + "]");
                 }
@@ -86,8 +87,8 @@ public class FQueryFilterParser implements FilterParser {
             return null;
         }
         Filter filter = Queries.wrap(query, parseContext);
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, filter);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/FilteredQueryParser.java

@@ -23,6 +23,7 @@ import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.search.ConstantScoreQuery;
 import org.apache.lucene.search.DocIdSet;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.FilteredQuery;
 import org.apache.lucene.search.FilteredQuery.FilterStrategy;
 import org.apache.lucene.search.Query;
@@ -30,10 +31,10 @@ import org.apache.lucene.search.Scorer;
 import org.apache.lucene.search.Weight;
 import org.apache.lucene.util.Bits;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.docset.DocIdSets;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 
@@ -125,8 +126,8 @@ public class FilteredQueryParser implements QueryParser {
         Filter filter = null;
         boolean filterFound = false;
         float boost = 1.0f;
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String queryName = null;
 
         String currentFieldName = null;
@@ -172,9 +173,9 @@ public class FilteredQueryParser implements QueryParser {
                 } else if ("boost".equals(currentFieldName)) {
                     boost = parser.floatValue();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[filtered] query does not support [" + currentFieldName + "]");
                 }
@@ -202,8 +203,8 @@ public class FilteredQueryParser implements QueryParser {
         }
 
         // cache if required
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         // if its a match_all query, use constant_score

+ 8 - 7
src/main/java/org/elasticsearch/index/query/GeoBoundingBoxFilterParser.java

@@ -20,12 +20,13 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.ElasticsearchParseException;
 import org.elasticsearch.common.geo.GeoPoint;
 import org.elasticsearch.common.geo.GeoUtils;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.fielddata.IndexGeoPointFieldData;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
@@ -73,8 +74,8 @@ public class GeoBoundingBoxFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
 
         double top = Double.NaN;
@@ -140,9 +141,9 @@ public class GeoBoundingBoxFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if ("normalize".equals(currentFieldName)) {
                     normalize = parser.booleanValue();
                 } else if ("type".equals(currentFieldName)) {
@@ -188,8 +189,8 @@ public class GeoBoundingBoxFilterParser implements FilterParser {
             throw new QueryParsingException(parseContext.index(), "geo bounding box type [" + type + "] not supported, either 'indexed' or 'memory' are allowed");
         }
 
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         filter = wrapSmartNameFilter(filter, smartMappers, parseContext);
         if (filterName != null) {

+ 8 - 7
src/main/java/org/elasticsearch/index/query/GeoDistanceFilterParser.java

@@ -20,14 +20,15 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.geo.GeoDistance;
 import org.elasticsearch.common.geo.GeoHashUtils;
 import org.elasticsearch.common.geo.GeoPoint;
 import org.elasticsearch.common.geo.GeoUtils;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.unit.DistanceUnit;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.fielddata.IndexGeoPointFieldData;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
@@ -65,8 +66,8 @@ public class GeoDistanceFilterParser implements FilterParser {
 
         XContentParser.Token token;
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String filterName = null;
         String currentFieldName = null;
         GeoPoint point = new GeoPoint();
@@ -126,9 +127,9 @@ public class GeoDistanceFilterParser implements FilterParser {
                 } else if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if ("optimize_bbox".equals(currentFieldName) || "optimizeBbox".equals(currentFieldName)) {
                     optimizeBbox = parser.textOrNull();
                 } else if ("normalize".equals(currentFieldName)) {
@@ -167,8 +168,8 @@ public class GeoDistanceFilterParser implements FilterParser {
 
         IndexGeoPointFieldData indexFieldData = parseContext.getForField(mapper);
         Filter filter = new GeoDistanceFilter(point.lat(), point.lon(), distance, geoDistance, indexFieldData, geoMapper, optimizeBbox);
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         filter = wrapSmartNameFilter(filter, smartMappers, parseContext);
         if (filterName != null) {

+ 8 - 7
src/main/java/org/elasticsearch/index/query/GeoDistanceRangeFilterParser.java

@@ -20,14 +20,15 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.geo.GeoDistance;
 import org.elasticsearch.common.geo.GeoHashUtils;
 import org.elasticsearch.common.geo.GeoPoint;
 import org.elasticsearch.common.geo.GeoUtils;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.unit.DistanceUnit;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.fielddata.IndexGeoPointFieldData;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
@@ -65,8 +66,8 @@ public class GeoDistanceRangeFilterParser implements FilterParser {
 
         XContentParser.Token token;
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String filterName = null;
         String currentFieldName = null;
         GeoPoint point = new GeoPoint();
@@ -157,9 +158,9 @@ public class GeoDistanceRangeFilterParser implements FilterParser {
                 } else if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if ("optimize_bbox".equals(currentFieldName) || "optimizeBbox".equals(currentFieldName)) {
                     optimizeBbox = parser.textOrNull();
                 } else if ("normalize".equals(currentFieldName)) {
@@ -207,8 +208,8 @@ public class GeoDistanceRangeFilterParser implements FilterParser {
 
         IndexGeoPointFieldData indexFieldData = parseContext.getForField(mapper);
         Filter filter = new GeoDistanceRangeFilter(point, from, to, includeLower, includeUpper, geoDistance, geoMapper, indexFieldData, optimizeBbox);
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         filter = wrapSmartNameFilter(filter, smartMappers, parseContext);
         if (filterName != null) {

+ 9 - 7
src/main/java/org/elasticsearch/index/query/GeoPolygonFilterParser.java

@@ -20,13 +20,15 @@
 package org.elasticsearch.index.query;
 
 import com.google.common.collect.Lists;
+
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.geo.GeoPoint;
 import org.elasticsearch.common.geo.GeoUtils;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.common.xcontent.XContentParser.Token;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.fielddata.IndexGeoPointFieldData;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
@@ -68,8 +70,8 @@ public class GeoPolygonFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
 
         List<GeoPoint> shell = Lists.newArrayList();
@@ -106,9 +108,9 @@ public class GeoPolygonFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if ("normalize".equals(currentFieldName)) {
                     normalizeLat = parser.booleanValue();
                     normalizeLon = parser.booleanValue();
@@ -152,8 +154,8 @@ public class GeoPolygonFilterParser implements FilterParser {
 
         IndexGeoPointFieldData indexFieldData = parseContext.getForField(mapper);
         Filter filter = new GeoPolygonFilter(indexFieldData, shell.toArray(new GeoPoint[shell.size()]));
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         filter = wrapSmartNameFilter(filter, smartMappers, parseContext);
         if (filterName != null) {

+ 8 - 7
src/main/java/org/elasticsearch/index/query/GeoShapeFilterParser.java

@@ -23,15 +23,16 @@ import com.spatial4j.core.shape.Shape;
 
 import org.apache.lucene.search.BooleanClause;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.spatial.prefix.PrefixTreeStrategy;
 import org.apache.lucene.spatial.prefix.RecursivePrefixTreeStrategy;
 import org.elasticsearch.common.geo.ShapeRelation;
 import org.elasticsearch.common.geo.builders.ShapeBuilder;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.inject.internal.Nullable;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.XBooleanFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.geo.GeoShapeFieldMapper;
@@ -85,8 +86,8 @@ public class GeoShapeFilterParser implements FilterParser {
         ShapeRelation shapeRelation = ShapeRelation.INTERSECTS;
         String strategyName = null;
         ShapeBuilder shape = null;
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String filterName = null;
 
         String id = null;
@@ -148,9 +149,9 @@ public class GeoShapeFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[geo_shape] filter does not support [" + currentFieldName + "]");
                 }
@@ -194,8 +195,8 @@ public class GeoShapeFilterParser implements FilterParser {
             filter = strategy.makeFilter(GeoShapeQueryParser.getArgs(shape, shapeRelation));
         }
 
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/GeohashCellFilter.java

@@ -20,6 +20,7 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.ElasticsearchIllegalArgumentException;
 import org.elasticsearch.ElasticsearchParseException;
 import org.elasticsearch.common.Nullable;
@@ -28,11 +29,11 @@ import org.elasticsearch.common.geo.GeoHashUtils;
 import org.elasticsearch.common.geo.GeoPoint;
 import org.elasticsearch.common.geo.GeoUtils;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.unit.DistanceUnit;
 import org.elasticsearch.common.xcontent.XContentBuilder;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.common.xcontent.XContentParser.Token;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.core.StringFieldMapper;
@@ -214,8 +215,8 @@ public class GeohashCellFilter {
             String geohash = null;
             int levels = -1;
             boolean neighbors = false;
-            boolean cache = false;
-            CacheKeyFilter.Key cacheKey = null;
+            FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+            HashedBytesRef cacheKey = null;
 
 
             XContentParser.Token token;
@@ -240,10 +241,10 @@ public class GeohashCellFilter {
                         neighbors = parser.booleanValue();
                     } else if (CACHE.equals(field)) {
                         parser.nextToken();
-                        cache = parser.booleanValue();
+                        cache = parseContext.parseFilterCachePolicy();
                     } else if (CACHE_KEY.equals(field)) {
                         parser.nextToken();
-                        cacheKey = new CacheKeyFilter.Key(parser.text());
+                        cacheKey = new HashedBytesRef(parser.text());
                     } else {
                         fieldName = field;
                         token = parser.nextToken();
@@ -295,8 +296,8 @@ public class GeohashCellFilter {
                 filter = create(parseContext, geoMapper, geohash, null);
             }
 
-            if (cache) {
-                filter = parseContext.cacheFilter(filter, cacheKey);
+            if (cache != null) {
+                filter = parseContext.cacheFilter(filter, cacheKey, cache);
             }
 
             return filter;

+ 2 - 2
src/main/java/org/elasticsearch/index/query/HasChildFilterParser.java

@@ -138,7 +138,7 @@ public class HasChildFilterParser implements FilterParser {
         String parentType = parentFieldMapper.type();
 
         // wrap the query with type query
-        query = new FilteredQuery(query, parseContext.cacheFilter(childDocMapper.typeFilter(), null));
+        query = new FilteredQuery(query, parseContext.cacheFilter(childDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()));
 
         DocumentMapper parentDocMapper = parseContext.mapperService().documentMapper(parentType);
         if (parentDocMapper == null) {
@@ -154,7 +154,7 @@ public class HasChildFilterParser implements FilterParser {
             nonNestedDocsFilter = parseContext.bitsetFilter(NonNestedDocsFilter.INSTANCE);
         }
 
-        Filter parentFilter = parseContext.cacheFilter(parentDocMapper.typeFilter(), null);
+        Filter parentFilter = parseContext.cacheFilter(parentDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy());
         ParentChildIndexFieldData parentChildIndexFieldData = parseContext.getForField(parentFieldMapper);
 
         Query childrenQuery;

+ 2 - 2
src/main/java/org/elasticsearch/index/query/HasChildQueryParser.java

@@ -153,10 +153,10 @@ public class HasChildQueryParser implements QueryParser {
         }
 
         // wrap the query with type query
-        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(childDocMapper.typeFilter(), null));
+        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(childDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()));
 
         Query query;
-        Filter parentFilter = parseContext.cacheFilter(parentDocMapper.typeFilter(), null);
+        Filter parentFilter = parseContext.cacheFilter(parentDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy());
         ParentChildIndexFieldData parentChildIndexFieldData = parseContext.getForField(parentFieldMapper);
         if (minChildren > 1 || maxChildren > 0 || scoreType != ScoreType.NONE) {
             query = new ChildrenQuery(parentChildIndexFieldData, parentType, childType, parentFilter, innerQuery, scoreType, minChildren,

+ 2 - 2
src/main/java/org/elasticsearch/index/query/HasParentQueryParser.java

@@ -180,8 +180,8 @@ public class HasParentQueryParser implements QueryParser {
         }
 
         // wrap the query with type query
-        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(parentDocMapper.typeFilter(), null));
-        Filter childrenFilter = parseContext.cacheFilter(new NotFilter(parentFilter), null);
+        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(parentDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()));
+        Filter childrenFilter = parseContext.cacheFilter(new NotFilter(parentFilter), null, parseContext.autoFilterCachePolicy());
         if (score) {
             return new ParentQuery(parentChildIndexFieldData, innerQuery, parentDocMapper.type(), childrenFilter);
         } else {

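Internal callers such as the parent/child parsers above now pass the auto policy explicitly and may still use a null cache key. A short sketch; the `typeFilter` and `cacheFilter` calls are those shown in the hunks, the variable names are illustrative:

    // cache a document-type filter without an explicit _cache_key; whether an
    // entry is actually kept is up to the caching policy
    Filter parentFilter = parseContext.cacheFilter(
            parentDocMapper.typeFilter(),
            null,                                    // no explicit cache key
            parseContext.autoFilterCachePolicy());
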
+ 10 - 0
src/main/java/org/elasticsearch/index/query/IndexQueryParserService.java

@@ -20,7 +20,9 @@
 package org.elasticsearch.index.query;
 
 import com.google.common.collect.ImmutableMap;
+
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.util.CloseableThreadLocal;
 import org.elasticsearch.ElasticsearchException;
@@ -92,6 +94,8 @@ public class IndexQueryParserService extends AbstractIndexComponent {
 
     final BitsetFilterCache bitsetFilterCache;
 
+    final FilterCachingPolicy autoFilterCachePolicy;
+
     private final Map<String, QueryParser> queryParsers;
 
     private final Map<String, FilterParser> filterParsers;
@@ -107,6 +111,7 @@ public class IndexQueryParserService extends AbstractIndexComponent {
                                    ScriptService scriptService, AnalysisService analysisService,
                                    MapperService mapperService, IndexCache indexCache, IndexFieldDataService fieldDataService,
                                    BitsetFilterCache bitsetFilterCache,
+                                   FilterCachingPolicy autoFilterCachePolicy,
                                    @Nullable SimilarityService similarityService,
                                    @Nullable Map<String, QueryParserFactory> namedQueryParsers,
                                    @Nullable Map<String, FilterParserFactory> namedFilterParsers) {
@@ -118,6 +123,7 @@ public class IndexQueryParserService extends AbstractIndexComponent {
         this.indexCache = indexCache;
         this.fieldDataService = fieldDataService;
         this.bitsetFilterCache = bitsetFilterCache;
+        this.autoFilterCachePolicy = autoFilterCachePolicy;
 
         this.defaultField = indexSettings.get(DEFAULT_FIELD, AllFieldMapper.NAME);
         this.queryStringLenient = indexSettings.getAsBoolean(QUERY_STRING_LENIENT, false);
@@ -179,6 +185,10 @@ public class IndexQueryParserService extends AbstractIndexComponent {
         return this.defaultField;
     }
 
+    public FilterCachingPolicy autoFilterCachePolicy() {
+        return autoFilterCachePolicy;
+    }
+
     public boolean queryStringLenient() {
         return this.queryStringLenient;
     }

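Because the policy is now an injected dependency of IndexQueryParserService, it is also an extension point. Below is a minimal sketch of a custom FilterCachingPolicy, assuming Lucene 5.x's interface with `onUse(Filter)` and `shouldCache(Filter, LeafReaderContext, DocIdSet)` methods; this is only an illustration, not the auto policy that this change wires in:

    import java.io.IOException;

    import org.apache.lucene.index.LeafReaderContext;
    import org.apache.lucene.search.DocIdSet;
    import org.apache.lucene.search.Filter;
    import org.apache.lucene.search.FilterCachingPolicy;

    /** Trivial policy that never caches anything (illustration only). */
    public class NeverCachePolicy implements FilterCachingPolicy {

        @Override
        public void onUse(Filter filter) {
            // nothing to track, this policy never caches
        }

        @Override
        public boolean shouldCache(Filter filter, LeafReaderContext context, DocIdSet set) throws IOException {
            return false;
        }
    }
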
+ 5 - 5
src/main/java/org/elasticsearch/index/query/MissingFilterParser.java

@@ -24,11 +24,11 @@ import org.apache.lucene.search.BooleanClause;
 import org.apache.lucene.search.Filter;
 import org.apache.lucene.search.TermRangeFilter;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.NotFilter;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.lucene.search.XBooleanFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMappers;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.internal.FieldNamesFieldMapper;
@@ -147,10 +147,10 @@ public class MissingFilterParser implements FilterParser {
 
             // we always cache this one, really does not change... (exists)
             // it is ok to cache under the fieldName cacheKey, since it is per segment and the mapping applies to this data on this segment...
-            existenceFilter = parseContext.cacheFilter(boolFilter, new CacheKeyFilter.Key("$exists$" + fieldPattern));
+            existenceFilter = parseContext.cacheFilter(boolFilter, new HashedBytesRef("$exists$" + fieldPattern), parseContext.autoFilterCachePolicy());
             existenceFilter = new NotFilter(existenceFilter);
             // cache the not filter as well, so it will be faster
-            existenceFilter = parseContext.cacheFilter(existenceFilter, new CacheKeyFilter.Key("$missing$" + fieldPattern));
+            existenceFilter = parseContext.cacheFilter(existenceFilter, new HashedBytesRef("$missing$" + fieldPattern), parseContext.autoFilterCachePolicy());
         }
 
         if (nullValue) {
@@ -160,7 +160,7 @@ public class MissingFilterParser implements FilterParser {
                     nullFilter = smartNameFieldMappers.mapper().nullValueFilter();
                     if (nullFilter != null) {
                         // cache the null value filter as well, so it will be faster
-                        nullFilter = parseContext.cacheFilter(nullFilter, new CacheKeyFilter.Key("$null$" + fieldPattern));
+                        nullFilter = parseContext.cacheFilter(nullFilter, new HashedBytesRef("$null$" + fieldPattern), parseContext.autoFilterCachePolicy());
                     }
                 }
             }
@@ -173,7 +173,7 @@ public class MissingFilterParser implements FilterParser {
                 combined.add(existenceFilter, BooleanClause.Occur.SHOULD);
                 combined.add(nullFilter, BooleanClause.Occur.SHOULD);
                 // cache the combined filter as well, so it will be faster
-                filter = parseContext.cacheFilter(combined, null);
+                filter = parseContext.cacheFilter(combined, null, parseContext.autoFilterCachePolicy());
             } else {
                 filter = nullFilter;
             }

+ 4 - 4
src/main/java/org/elasticsearch/index/query/NestedFilterParser.java

@@ -28,9 +28,9 @@ import org.apache.lucene.search.join.ScoreMode;
 import org.apache.lucene.search.join.ToParentBlockJoinQuery;
 import org.elasticsearch.common.Strings;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.object.ObjectMapper;
 import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
@@ -61,7 +61,7 @@ public class NestedFilterParser implements FilterParser {
         float boost = 1.0f;
         String path = null;
         boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        HashedBytesRef cacheKey = null;
         String filterName = null;
 
         // we need a late binding filter so we can inject a parent nested filter inner nested queries
@@ -96,7 +96,7 @@ public class NestedFilterParser implements FilterParser {
                     } else if ("_cache".equals(currentFieldName)) {
                         cache = parser.booleanValue();
                     } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                        cacheKey = new CacheKeyFilter.Key(parser.text());
+                        cacheKey = new HashedBytesRef(parser.text());
                     } else {
                         throw new QueryParsingException(parseContext.index(), "[nested] filter does not support [" + currentFieldName + "]");
                     }
@@ -151,7 +151,7 @@ public class NestedFilterParser implements FilterParser {
             Filter nestedFilter = Queries.wrap(new ToParentBlockJoinQuery(query, parentFilter, ScoreMode.None), parseContext);
 
             if (cache) {
-                nestedFilter = parseContext.cacheFilter(nestedFilter, cacheKey);
+                nestedFilter = parseContext.cacheFilter(nestedFilter, cacheKey, parseContext.autoFilterCachePolicy());
             }
             if (filterName != null) {
                 parseContext.addNamedFilter(filterName, nestedFilter);

+ 4 - 4
src/main/java/org/elasticsearch/index/query/NotFilterParser.java

@@ -21,9 +21,9 @@ package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.NotFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 
@@ -50,7 +50,7 @@ public class NotFilterParser implements FilterParser {
         Filter filter = null;
         boolean filterFound = false;
         boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        HashedBytesRef cacheKey = null;
 
         String filterName = null;
         String currentFieldName = null;
@@ -77,7 +77,7 @@ public class NotFilterParser implements FilterParser {
                 } else if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[not] filter does not support [" + currentFieldName + "]");
                 }
@@ -94,7 +94,7 @@ public class NotFilterParser implements FilterParser {
 
         Filter notFilter = new NotFilter(filter);
         if (cache) {
-            notFilter = parseContext.cacheFilter(notFilter, cacheKey);
+            notFilter = parseContext.cacheFilter(notFilter, cacheKey, parseContext.autoFilterCachePolicy());
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, notFilter);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/OrFilterParser.java

@@ -20,10 +20,11 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.OrFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.io.IOException;
 import java.util.ArrayList;
@@ -53,8 +54,8 @@ public class OrFilterParser implements FilterParser {
         ArrayList<Filter> filters = newArrayList();
         boolean filtersFound = false;
 
-        boolean cache = false;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
 
         String filterName = null;
         String currentFieldName = null;
@@ -91,11 +92,11 @@ public class OrFilterParser implements FilterParser {
                     }
                 } else if (token.isValue()) {
                     if ("_cache".equals(currentFieldName)) {
-                        cache = parser.booleanValue();
+                        cache = parseContext.parseFilterCachePolicy();
                     } else if ("_name".equals(currentFieldName)) {
                         filterName = parser.text();
                     } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                        cacheKey = new CacheKeyFilter.Key(parser.text());
+                        cacheKey = new HashedBytesRef(parser.text());
                     } else {
                         throw new QueryParsingException(parseContext.index(), "[or] filter does not support [" + currentFieldName + "]");
                     }
@@ -113,8 +114,8 @@ public class OrFilterParser implements FilterParser {
 
         // cache if required
         Filter filter = new OrFilter(filters);
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, filter);

+ 8 - 7
src/main/java/org/elasticsearch/index/query/PrefixFilterParser.java

@@ -21,11 +21,12 @@ package org.elasticsearch.index.query;
 
 import org.apache.lucene.index.Term;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.PrefixFilter;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.lucene.BytesRefs;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.MapperService;
 
 import java.io.IOException;
@@ -52,8 +53,8 @@ public class PrefixFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        boolean cache = true;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
         Object value = null;
 
@@ -67,9 +68,9 @@ public class PrefixFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     fieldName = currentFieldName;
                     value = parser.objectBytes();
@@ -100,8 +101,8 @@ public class PrefixFilterParser implements FilterParser {
             filter = new PrefixFilter(new Term(fieldName, BytesRefs.toBytesRef(value)));
         }
 
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);

+ 29 - 11
src/main/java/org/elasticsearch/index/query/QueryParseContext.java

@@ -21,11 +21,13 @@ package org.elasticsearch.index.query;
 
 import com.google.common.collect.ImmutableMap;
 import com.google.common.collect.Maps;
+
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.queryparser.classic.MapperQueryParser;
 import org.apache.lucene.queryparser.classic.QueryParserSettings;
 import org.apache.lucene.search.DocIdSet;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.Query;
 import org.apache.lucene.search.join.BitDocIdSetFilter;
 import org.apache.lucene.search.similarities.Similarity;
@@ -33,6 +35,7 @@ import org.apache.lucene.util.Bits;
 import org.elasticsearch.Version;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.ParseField;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.NoCacheFilter;
 import org.elasticsearch.common.lucene.search.NoCacheQuery;
 import org.elasticsearch.common.lucene.search.Queries;
@@ -40,7 +43,6 @@ import org.elasticsearch.common.lucene.search.ResolvableFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
 import org.elasticsearch.index.Index;
 import org.elasticsearch.index.analysis.AnalysisService;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.cache.query.parser.QueryParserCache;
 import org.elasticsearch.index.fielddata.IndexFieldData;
 import org.elasticsearch.index.mapper.FieldMapper;
@@ -53,7 +55,11 @@ import org.elasticsearch.search.internal.SearchContext;
 import org.elasticsearch.search.lookup.SearchLookup;
 
 import java.io.IOException;
-import java.util.*;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.EnumSet;
+import java.util.List;
+import java.util.Map;
 
 /**
  *
@@ -174,6 +180,24 @@ public class QueryParseContext {
         return indexQueryParser.defaultField();
     }
 
+    public FilterCachingPolicy autoFilterCachePolicy() {
+        return indexQueryParser.autoFilterCachePolicy();
+    }
+
+    public FilterCachingPolicy parseFilterCachePolicy() throws IOException {
+        final String text = parser.textOrNull();
+        if (text == null || text.equals("auto")) {
+            return autoFilterCachePolicy();
+        } else if (parser.booleanValue()) {
+            // cache without conditions on how many times the filter has been
+            // used or what the produced DocIdSet looks like, but ONLY on large
+            // segments to not pollute the cache
+            return FilterCachingPolicy.CacheOnLargeSegments.DEFAULT;
+        } else {
+            return null;
+        }
+    }
+
     public boolean queryStringLenient() {
         return indexQueryParser.queryStringLenient();
     }
@@ -187,7 +211,7 @@ public class QueryParseContext {
         return indexQueryParser.bitsetFilterCache.getBitDocIdSetFilter(filter);
     }
 
-    public Filter cacheFilter(Filter filter, @Nullable final CacheKeyFilter.Key cacheKey) {
+    public Filter cacheFilter(Filter filter, final @Nullable HashedBytesRef cacheKey, final FilterCachingPolicy cachePolicy) {
         if (filter == null) {
             return null;
         }
@@ -205,18 +229,12 @@ public class QueryParseContext {
                     if (filter == null) {
                         return null;
                     }
-                    if (cacheKey != null) {
-                        filter = new CacheKeyFilter.Wrapper(filter, cacheKey);
-                    }
-                    filter = indexQueryParser.indexCache.filter().cache(filter);
+                    filter = indexQueryParser.indexCache.filter().cache(filter, cacheKey, cachePolicy);
                     return filter.getDocIdSet(atomicReaderContext, bits);
                 }
             };
         } else {
-            if (cacheKey != null) {
-                filter = new CacheKeyFilter.Wrapper(filter, cacheKey);
-            }
-            return indexQueryParser.indexCache.filter().cache(filter);
+            return indexQueryParser.indexCache.filter().cache(filter, cacheKey, cachePolicy);
         }
     }
 

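For parser authors, the contract of `parseFilterCachePolicy()` above boils down to the following; the surrounding snippet is illustrative:

    // the parser must be positioned on the value of the `_cache` field:
    //   "auto" (or a null value) -> parseContext.autoFilterCachePolicy()
    //   true                     -> FilterCachingPolicy.CacheOnLargeSegments.DEFAULT
    //   false                    -> null, in which case callers skip cacheFilter() entirely
    FilterCachingPolicy policy = parseContext.parseFilterCachePolicy();
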
+ 10 - 21
src/main/java/org/elasticsearch/index/query/RangeFilterParser.java

@@ -20,13 +20,14 @@
 package org.elasticsearch.index.query;
 
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.TermRangeFilter;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.joda.DateMathParser;
 import org.elasticsearch.common.joda.Joda;
 import org.elasticsearch.common.lucene.BytesRefs;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.core.DateFieldMapper;
@@ -57,8 +58,8 @@ public class RangeFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        Boolean cache = null;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
         Object from = null;
         Object to = null;
@@ -113,9 +114,9 @@ public class RangeFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if ("execution".equals(currentFieldName)) {
                     execution = parser.text();
                 } else {
@@ -129,20 +130,16 @@ public class RangeFilterParser implements FilterParser {
         }
 
         Filter filter = null;
-        Boolean explicitlyCached = cache;
         MapperService.SmartNameFieldMappers smartNameFieldMappers = parseContext.smartFieldMappers(fieldName);
         if (smartNameFieldMappers != null) {
             if (smartNameFieldMappers.hasMapper()) {
                 if (execution.equals("index")) {
-                    if (cache == null) {
-                        cache = true;
-                    }
                     FieldMapper mapper = smartNameFieldMappers.mapper();
                     if (mapper instanceof DateFieldMapper) {
                         if ((from instanceof Number || to instanceof Number) && timeZone != null) {
                             throw new QueryParsingException(parseContext.index(), "[range] time_zone when using ms since epoch format as it's UTC based can not be applied to [" + fieldName + "]");
                         }
-                        filter = ((DateFieldMapper) mapper).rangeFilter(from, to, includeLower, includeUpper, timeZone, forcedDateParser, parseContext, explicitlyCached);
+                        filter = ((DateFieldMapper) mapper).rangeFilter(from, to, includeLower, includeUpper, timeZone, forcedDateParser, parseContext);
                     } else  {
                         if (timeZone != null) {
                             throw new QueryParsingException(parseContext.index(), "[range] time_zone can not be applied to non date field [" + fieldName + "]");
@@ -150,9 +147,6 @@ public class RangeFilterParser implements FilterParser {
                         filter = mapper.rangeFilter(from, to, includeLower, includeUpper, parseContext);
                     }
                 } else if ("fielddata".equals(execution)) {
-                    if (cache == null) {
-                        cache = false;
-                    }
                     FieldMapper mapper = smartNameFieldMappers.mapper();
                     if (!(mapper instanceof NumberFieldMapper)) {
                         throw new QueryParsingException(parseContext.index(), "[range] filter field [" + fieldName + "] is not a numeric type");
@@ -161,7 +155,7 @@ public class RangeFilterParser implements FilterParser {
                         if ((from instanceof Number || to instanceof Number) && timeZone != null) {
                             throw new QueryParsingException(parseContext.index(), "[range] time_zone when using ms since epoch format as it's UTC based can not be applied to [" + fieldName + "]");
                         }
-                        filter = ((DateFieldMapper) mapper).rangeFilter(parseContext, from, to, includeLower, includeUpper, timeZone, forcedDateParser, parseContext, explicitlyCached);
+                        filter = ((DateFieldMapper) mapper).rangeFilter(parseContext, from, to, includeLower, includeUpper, timeZone, forcedDateParser, parseContext);
                     } else {
                         if (timeZone != null) {
                             throw new QueryParsingException(parseContext.index(), "[range] time_zone can not be applied to non date field [" + fieldName + "]");
@@ -175,16 +169,11 @@ public class RangeFilterParser implements FilterParser {
         }
 
         if (filter == null) {
-            if (cache == null) {
-                cache = true;
-            }
             filter = new TermRangeFilter(fieldName, BytesRefs.toBytesRef(from), BytesRefs.toBytesRef(to), includeLower, includeUpper);
         }
 
-        if (explicitlyCached == null || explicitlyCached) {
-            if (cache) {
-                filter = parseContext.cacheFilter(filter, cacheKey);
-            }
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);

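With the per-execution cache defaults and the `explicitlyCached` flag removed, both range executions now share a single cache decision at the end of the parser. A condensed sketch of the resulting flow; the `rangeFilter` signatures are the ones shown in the hunk, the branch structure is simplified:

    Filter filter;
    if ("index".equals(execution)) {
        filter = ((DateFieldMapper) mapper).rangeFilter(from, to, includeLower, includeUpper,
                timeZone, forcedDateParser, parseContext);
    } else { // "fielddata"
        filter = ((DateFieldMapper) mapper).rangeFilter(parseContext, from, to, includeLower, includeUpper,
                timeZone, forcedDateParser, parseContext);
    }
    if (cache != null) {
        filter = parseContext.cacheFilter(filter, cacheKey, cache);   // one decision for both executions
    }
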
+ 8 - 7
src/main/java/org/elasticsearch/index/query/RegexpFilterParser.java

@@ -21,12 +21,13 @@ package org.elasticsearch.index.query;
 
 import org.apache.lucene.index.Term;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.util.automaton.Operations;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.lucene.BytesRefs;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.RegexpFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.MapperService;
 
 import java.io.IOException;
@@ -53,8 +54,8 @@ public class RegexpFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        boolean cache = true;
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
         String secondaryFieldName = null;
         Object value = null;
@@ -92,9 +93,9 @@ public class RegexpFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     secondaryFieldName = currentFieldName;
                     secondaryValue = parser.objectBytes();
@@ -130,8 +131,8 @@ public class RegexpFilterParser implements FilterParser {
             filter = new RegexpFilter(new Term(fieldName, BytesRefs.toBytesRef(value)), flagsValue, maxDeterminizedStates);
         }
 
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);

+ 11 - 10
src/main/java/org/elasticsearch/index/query/ScriptFilterParser.java

@@ -19,26 +19,27 @@
 
 package org.elasticsearch.index.query;
 
-import java.io.IOException;
-import java.util.Map;
-
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.search.BitsFilteredDocIdSet;
 import org.apache.lucene.search.DocIdSet;
 import org.apache.lucene.search.DocValuesDocIdSet;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.util.Bits;
 import org.elasticsearch.ElasticsearchIllegalArgumentException;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.script.ScriptParameterParser;
 import org.elasticsearch.script.ScriptParameterParser.ScriptParameterValue;
 import org.elasticsearch.script.ScriptService;
 import org.elasticsearch.script.SearchScript;
 import org.elasticsearch.search.lookup.SearchLookup;
 
+import java.io.IOException;
+import java.util.Map;
+
 import static com.google.common.collect.Maps.newHashMap;
 
 /**
@@ -64,8 +65,8 @@ public class ScriptFilterParser implements FilterParser {
 
         XContentParser.Token token;
 
-        boolean cache = false; // no need to cache it by default, changes a lot?
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         // also, when caching, since isCacheable is false, it will result in loading the whole bit set...
         String script = null;
         String scriptLang = null;
@@ -88,9 +89,9 @@ public class ScriptFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else if (!scriptParameterParser.token(currentFieldName, token, parser)){
                     throw new QueryParsingException(parseContext.index(), "[script] filter does not support [" + currentFieldName + "]");
                 }
@@ -112,8 +113,8 @@ public class ScriptFilterParser implements FilterParser {
         }
 
         Filter filter = new ScriptFilter(scriptLang, script, scriptType, params, parseContext.scriptService(), parseContext.lookup());
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
         if (filterName != null) {
             parseContext.addNamedFilter(filterName, filter);

+ 10 - 9
src/main/java/org/elasticsearch/index/query/TermFilterParser.java

@@ -22,10 +22,11 @@ package org.elasticsearch.index.query;
 import org.apache.lucene.index.Term;
 import org.apache.lucene.queries.TermFilter;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.lucene.BytesRefs;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.MapperService;
 
 import java.io.IOException;
@@ -52,8 +53,8 @@ public class TermFilterParser implements FilterParser {
     public Filter parse(QueryParseContext parseContext) throws IOException, QueryParsingException {
         XContentParser parser = parseContext.parser();
 
-        boolean cache = true; // since usually term filter is on repeating terms, cache it by default
-        CacheKeyFilter.Key cacheKey = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
+        HashedBytesRef cacheKey = null;
         String fieldName = null;
         Object value = null;
 
@@ -77,9 +78,9 @@ public class TermFilterParser implements FilterParser {
                         } else if ("_name".equals(currentFieldName)) {
                             filterName = parser.text();
                         } else if ("_cache".equals(currentFieldName)) {
-                            cache = parser.booleanValue();
+                            cache = parseContext.parseFilterCachePolicy();
                         } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                            cacheKey = new CacheKeyFilter.Key(parser.text());
+                            cacheKey = new HashedBytesRef(parser.text());
                         } else {
                             throw new QueryParsingException(parseContext.index(), "[term] filter does not support [" + currentFieldName + "]");
                         }
@@ -89,9 +90,9 @@ public class TermFilterParser implements FilterParser {
                 if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     fieldName = currentFieldName;
                     value = parser.objectBytes();
@@ -125,8 +126,8 @@ public class TermFilterParser implements FilterParser {
             filter = new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(value)));
         }
 
-        if (cache) {
-            filter = parseContext.cacheFilter(filter, cacheKey);
+        if (cache != null) {
+            filter = parseContext.cacheFilter(filter, cacheKey, cache);
         }
 
         filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);
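
With the eager boolean default gone, a `term` filter is no longer inserted into the cache at parse time; whether it eventually gets cached is up to the policy, which can take recent reuse of the filter into account. The sketch below is a rough, self-contained model of that kind of recent-usage bookkeeping; the class and method names and the history size are assumptions for illustration, not Elasticsearch's actual `AutoFilterCachingPolicy`.

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;

    // Illustrative only: counts how often a filter (identified by an arbitrary key) appeared
    // among the most recently used filters, the kind of signal a usage-based caching policy
    // can rely on before deciding that caching is worth the cost.
    public final class RecentFilterUsage {

        private final int historySize;
        private final Deque<Object> history = new ArrayDeque<>();
        private final Map<Object, Integer> counts = new HashMap<>();

        public RecentFilterUsage(int historySize) {
            this.historySize = historySize;
        }

        /** Records one use of the filter and returns how many recent uses it now accounts for. */
        public int recordUse(Object filterKey) {
            history.addLast(filterKey);
            counts.merge(filterKey, 1, Integer::sum);
            if (history.size() > historySize) {
                Object evicted = history.removeFirst();
                if (counts.merge(evicted, -1, Integer::sum) == 0) {
                    counts.remove(evicted);
                }
            }
            return counts.get(filterKey);
        }

        public static void main(String[] args) {
            RecentFilterUsage usage = new RecentFilterUsage(1000);
            usage.recordUse("term:user=kimchy");
            System.out.println(usage.recordUse("term:user=kimchy"));  // 2
        }
    }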

+ 19 - 44
src/main/java/org/elasticsearch/index/query/TermsFilterParser.java

@@ -20,20 +20,22 @@
 package org.elasticsearch.index.query;
 
 import com.google.common.collect.Lists;
+
 import org.apache.lucene.index.Term;
 import org.apache.lucene.queries.TermFilter;
 import org.apache.lucene.queries.TermsFilter;
 import org.apache.lucene.search.BooleanClause;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.util.BytesRef;
 import org.elasticsearch.common.inject.Inject;
 import org.elasticsearch.common.lucene.BytesRefs;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.AndFilter;
 import org.elasticsearch.common.lucene.search.OrFilter;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.lucene.search.XBooleanFilter;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.FieldMapper;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.indices.cache.filter.terms.IndicesTermsFilterCache;
@@ -81,7 +83,7 @@ public class TermsFilterParser implements FilterParser {
         XContentParser parser = parseContext.parser();
 
         MapperService.SmartNameFieldMappers smartNameFieldMappers;
-        Boolean cache = null;
+        FilterCachingPolicy cache = parseContext.autoFilterCachePolicy();
         String filterName = null;
         String currentFieldName = null;
 
@@ -92,7 +94,7 @@ public class TermsFilterParser implements FilterParser {
         String lookupRouting = null;
         boolean lookupCache = true;
 
-        CacheKeyFilter.Key cacheKey = null;
+        HashedBytesRef cacheKey = null;
         XContentParser.Token token;
         String execution = EXECUTION_VALUE_PLAIN;
         List<Object> terms = Lists.newArrayList();
@@ -151,9 +153,9 @@ public class TermsFilterParser implements FilterParser {
                 } else if ("_name".equals(currentFieldName)) {
                     filterName = parser.text();
                 } else if ("_cache".equals(currentFieldName)) {
-                    cache = parser.booleanValue();
+                    cache = parseContext.parseFilterCachePolicy();
                 } else if ("_cache_key".equals(currentFieldName) || "_cacheKey".equals(currentFieldName)) {
-                    cacheKey = new CacheKeyFilter.Key(parser.text());
+                    cacheKey = new HashedBytesRef(parser.text());
                 } else {
                     throw new QueryParsingException(parseContext.index(), "[terms] filter does not support [" + currentFieldName + "]");
                 }
@@ -194,8 +196,8 @@ public class TermsFilterParser implements FilterParser {
             }
 
             // cache the whole filter by default, or if explicitly told to
-            if (cache == null || cache) {
-                filter = parseContext.cacheFilter(filter, cacheKey);
+            if (cache != null) {
+                filter = parseContext.cacheFilter(filter, cacheKey, cache);
             }
             return filter;
         }
@@ -216,10 +218,6 @@ public class TermsFilterParser implements FilterParser {
                     }
                     filter = new TermsFilter(fieldName, filterValues);
                 }
-                // cache the whole filter by default, or if explicitly told to
-                if (cache == null || cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_FIELDDATA.equals(execution)) {
                 // if there are no mappings, then nothing has been indexing yet against this shard, so we can return
                 // no match (but not cached!), since the FieldDataTermsFilter relies on a mapping...
@@ -228,25 +226,18 @@ public class TermsFilterParser implements FilterParser {
                 }
 
                 filter = fieldMapper.fieldDataTermsFilter(terms, parseContext);
-                if (cache != null && cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_BOOL.equals(execution)) {
                 XBooleanFilter boolFiler = new XBooleanFilter();
                 if (fieldMapper != null) {
                     for (Object term : terms) {
-                        boolFiler.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null), BooleanClause.Occur.SHOULD);
+                        boolFiler.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null, parseContext.autoFilterCachePolicy()), BooleanClause.Occur.SHOULD);
                     }
                 } else {
                     for (Object term : terms) {
-                        boolFiler.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null), BooleanClause.Occur.SHOULD);
+                        boolFiler.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null, parseContext.autoFilterCachePolicy()), BooleanClause.Occur.SHOULD);
                     }
                 }
                 filter = boolFiler;
-                // only cache if explicitly told to, since we cache inner filters
-                if (cache != null && cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_BOOL_NOCACHE.equals(execution)) {
                 XBooleanFilter boolFiler = new XBooleanFilter();
                 if (fieldMapper != null) {
@@ -259,26 +250,18 @@ public class TermsFilterParser implements FilterParser {
                     }
                 }
                 filter = boolFiler;
-                // cache the whole filter by default, or if explicitly told to
-                if (cache == null || cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_AND.equals(execution)) {
                 List<Filter> filters = Lists.newArrayList();
                 if (fieldMapper != null) {
                     for (Object term : terms) {
-                        filters.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null));
+                        filters.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null, parseContext.autoFilterCachePolicy()));
                     }
                 } else {
                     for (Object term : terms) {
-                        filters.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null));
+                        filters.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null, parseContext.autoFilterCachePolicy()));
                     }
                 }
                 filter = new AndFilter(filters);
-                // only cache if explicitly told to, since we cache inner filters
-                if (cache != null && cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_AND_NOCACHE.equals(execution)) {
                 List<Filter> filters = Lists.newArrayList();
                 if (fieldMapper != null) {
@@ -291,26 +274,18 @@ public class TermsFilterParser implements FilterParser {
                     }
                 }
                 filter = new AndFilter(filters);
-                // cache the whole filter by default, or if explicitly told to
-                if (cache == null || cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_OR.equals(execution)) {
                 List<Filter> filters = Lists.newArrayList();
                 if (fieldMapper != null) {
                     for (Object term : terms) {
-                        filters.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null));
+                        filters.add(parseContext.cacheFilter(fieldMapper.termFilter(term, parseContext), null, parseContext.autoFilterCachePolicy()));
                     }
                 } else {
                     for (Object term : terms) {
-                        filters.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null));
+                        filters.add(parseContext.cacheFilter(new TermFilter(new Term(fieldName, BytesRefs.toBytesRef(term))), null, parseContext.autoFilterCachePolicy()));
                     }
                 }
                 filter = new OrFilter(filters);
-                // only cache if explicitly told to, since we cache inner filters
-                if (cache != null && cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else if (EXECUTION_VALUE_OR_NOCACHE.equals(execution)) {
                 List<Filter> filters = Lists.newArrayList();
                 if (fieldMapper != null) {
@@ -323,14 +298,14 @@ public class TermsFilterParser implements FilterParser {
                     }
                 }
                 filter = new OrFilter(filters);
-                // cache the whole filter by default, or if explicitly told to
-                if (cache == null || cache) {
-                    filter = parseContext.cacheFilter(filter, cacheKey);
-                }
             } else {
                 throw new QueryParsingException(parseContext.index(), "terms filter execution value [" + execution + "] not supported");
             }
 
+            if (cache != null) {
+                filter = parseContext.cacheFilter(filter, cacheKey, cache);
+            }
+
             filter = wrapSmartNameFilter(filter, smartNameFieldMappers, parseContext);
             if (filterName != null) {
                 parseContext.addNamedFilter(filterName, filter);
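
After this change the `terms` filter no longer special-cases caching per execution mode: every branch just builds its filter, and a single `cacheFilter(filter, cacheKey, cache)` call at the end wraps it according to the resolved policy. A minimal, self-contained model of such a deferred, per-segment wrapper is sketched below; the types are simplified stand-ins (plain arrays and strings), not Lucene's `Filter`/`DocIdSet`.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Supplier;

    // Illustrative only: a wrapper that is always returned from parsing but only decides
    // whether to cache when a given (stand-in) segment actually asks for the matches.
    public final class DeferredCachingFilter {

        public interface Policy {
            boolean shouldCache(String segmentName, String filterKey);
        }

        private final String filterKey;
        private final Policy policy;
        private final Map<String, int[]> cachePerSegment = new HashMap<>();

        public DeferredCachingFilter(String filterKey, Policy policy) {
            this.filterKey = filterKey;
            this.policy = policy;
        }

        /** Returns the matching doc ids for a segment, caching them only if the policy agrees. */
        public int[] getMatches(String segmentName, Supplier<int[]> compute) {
            int[] cached = cachePerSegment.get(segmentName);
            if (cached != null) {
                return cached;                        // reuse the per-segment cache entry
            }
            int[] matches = compute.get();
            if (policy.shouldCache(segmentName, filterKey)) {
                cachePerSegment.put(segmentName, matches);
            }
            return matches;
        }

        public static void main(String[] args) {
            DeferredCachingFilter filter =
                    new DeferredCachingFilter("terms:tag", (segment, key) -> true);
            filter.getMatches("_0", () -> new int[] {1, 4, 7});              // computed, then cached
            System.out.println(filter.getMatches("_0", () -> null).length);  // 3, served from cache
        }
    }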

+ 1 - 1
src/main/java/org/elasticsearch/index/query/TopChildrenQueryParser.java

@@ -133,7 +133,7 @@ public class TopChildrenQueryParser implements QueryParser {
 
         innerQuery.setBoost(boost);
         // wrap the query with type query
-        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(childDocMapper.typeFilter(), null));
+        innerQuery = new FilteredQuery(innerQuery, parseContext.cacheFilter(childDocMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()));
         ParentChildIndexFieldData parentChildIndexFieldData = parseContext.getForField(parentFieldMapper);
         TopChildrenQuery query = new TopChildrenQuery(parentChildIndexFieldData, innerQuery, childType, parentType, scoreType, factor, incrementalFactor, nonNestedDocsFilter);
         if (queryName != null) {

+ 1 - 1
src/main/java/org/elasticsearch/index/query/TypeFilterParser.java

@@ -71,6 +71,6 @@ public class TypeFilterParser implements FilterParser {
         } else {
             filter = documentMapper.typeFilter();
         }
-        return parseContext.cacheFilter(filter, null);
+        return parseContext.cacheFilter(filter, null, parseContext.autoFilterCachePolicy());
     }
 }

+ 2 - 2
src/main/java/org/elasticsearch/index/query/support/QueryParsers.java

@@ -105,7 +105,7 @@ public final class QueryParsers {
             return query;
         }
         DocumentMapper docMapper = smartFieldMappers.docMapper();
-        return new FilteredQuery(query, parseContext.cacheFilter(docMapper.typeFilter(), null));
+        return new FilteredQuery(query, parseContext.cacheFilter(docMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()));
     }
 
     public static Filter wrapSmartNameFilter(Filter filter, @Nullable MapperService.SmartNameFieldMappers smartFieldMappers,
@@ -117,6 +117,6 @@ public final class QueryParsers {
             return filter;
         }
         DocumentMapper docMapper = smartFieldMappers.docMapper();
-        return new AndFilter(ImmutableList.of(parseContext.cacheFilter(docMapper.typeFilter(), null), filter));
+        return new AndFilter(ImmutableList.of(parseContext.cacheFilter(docMapper.typeFilter(), null, parseContext.autoFilterCachePolicy()), filter));
     }
 }

+ 2 - 1
src/main/java/org/elasticsearch/index/shard/IndexShard.java

@@ -20,6 +20,7 @@
 package org.elasticsearch.index.shard;
 
 import com.google.common.base.Charsets;
+
 import org.apache.lucene.codecs.PostingsFormat;
 import org.apache.lucene.index.CheckIndex;
 import org.apache.lucene.search.Filter;
@@ -866,7 +867,7 @@ public class IndexShard extends AbstractIndexShardComponent implements IndexShar
     private Query filterQueryIfNeeded(Query query, String[] types) {
         Filter searchFilter = mapperService.searchFilter(types);
         if (searchFilter != null) {
-            query = new FilteredQuery(query, indexCache.filter().cache(searchFilter));
+            query = new FilteredQuery(query, indexCache.filter().cache(searchFilter, null, indexService.queryParserService().autoFilterCachePolicy()));
         }
         return query;
     }

+ 11 - 10
src/main/java/org/elasticsearch/indices/cache/filter/terms/IndicesTermsFilterCache.java

@@ -22,6 +22,7 @@ package org.elasticsearch.indices.cache.filter.terms;
 import com.google.common.cache.Cache;
 import com.google.common.cache.CacheBuilder;
 import com.google.common.cache.Weigher;
+
 import org.apache.lucene.search.Filter;
 import org.apache.lucene.util.BytesRef;
 import org.elasticsearch.ElasticsearchException;
@@ -31,13 +32,13 @@ import org.elasticsearch.client.Client;
 import org.elasticsearch.common.Nullable;
 import org.elasticsearch.common.component.AbstractComponent;
 import org.elasticsearch.common.inject.Inject;
+import org.elasticsearch.common.lucene.HashedBytesRef;
 import org.elasticsearch.common.lucene.search.Queries;
 import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.ByteSizeUnit;
 import org.elasticsearch.common.unit.ByteSizeValue;
 import org.elasticsearch.common.unit.TimeValue;
 import org.elasticsearch.common.xcontent.support.XContentMapValues;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 
 import java.util.List;
 import java.util.concurrent.Callable;
@@ -52,7 +53,7 @@ public class IndicesTermsFilterCache extends AbstractComponent {
 
     private final Client client;
 
-    private final Cache<BytesRef, TermsFilterValue> cache;
+    private final Cache<HashedBytesRef, TermsFilterValue> cache;
 
     @Inject
     public IndicesTermsFilterCache(Settings settings, Client client) {
@@ -63,7 +64,7 @@ public class IndicesTermsFilterCache extends AbstractComponent {
         TimeValue expireAfterWrite = componentSettings.getAsTime("expire_after_write", null);
         TimeValue expireAfterAccess = componentSettings.getAsTime("expire_after_access", null);
 
-        CacheBuilder<BytesRef, TermsFilterValue> builder = CacheBuilder.newBuilder()
+        CacheBuilder<HashedBytesRef, TermsFilterValue> builder = CacheBuilder.newBuilder()
                 .maximumWeight(size.bytes())
                 .weigher(new TermsFilterValueWeigher());
 
@@ -78,16 +79,16 @@ public class IndicesTermsFilterCache extends AbstractComponent {
     }
 
     @Nullable
-    public Filter termsFilter(final TermsLookup lookup, boolean cacheLookup, @Nullable CacheKeyFilter.Key cacheKey) throws RuntimeException {
+    public Filter termsFilter(final TermsLookup lookup, boolean cacheLookup, @Nullable HashedBytesRef cacheKey) throws RuntimeException {
         if (!cacheLookup) {
             return buildTermsFilterValue(lookup).filter;
         }
 
-        BytesRef key;
+        HashedBytesRef key;
         if (cacheKey != null) {
-            key = new BytesRef(cacheKey.bytes());
+            key = cacheKey;
         } else {
-            key = new BytesRef(lookup.toString());
+            key = new HashedBytesRef(lookup.toString());
         }
         try {
             return cache.get(key, new Callable<TermsFilterValue>() {
@@ -141,11 +142,11 @@ public class IndicesTermsFilterCache extends AbstractComponent {
         }
     }
 
-    static class TermsFilterValueWeigher implements Weigher<BytesRef, TermsFilterValue> {
+    static class TermsFilterValueWeigher implements Weigher<HashedBytesRef, TermsFilterValue> {
 
         @Override
-        public int weigh(BytesRef key, TermsFilterValue value) {
-            return (int) (key.length + value.sizeInBytes);
+        public int weigh(HashedBytesRef key, TermsFilterValue value) {
+            return (int) (key.bytes.length + value.sizeInBytes);
         }
     }
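
The terms lookup cache itself is unchanged apart from its key: a `HashedBytesRef` carries its hash alongside the bytes, so the key is hashed once and the weigher reads the raw bytes through `key.bytes`. Here is a small, self-contained sketch of a weight-bounded Guava cache keyed by such a hash-precomputing wrapper; `HashedKey` and the weight formula are simplified assumptions, not `HashedBytesRef` or the real `TermsFilterValueWeigher`.

    import com.google.common.cache.Cache;
    import com.google.common.cache.CacheBuilder;
    import com.google.common.cache.Weigher;

    import java.nio.charset.StandardCharsets;
    import java.util.Arrays;

    // Illustrative only: a cache bounded by estimated memory (not entry count) whose key
    // wraps a byte[] and precomputes its hash so lookups do not re-hash the bytes.
    public final class WeightBoundedCacheExample {

        static final class HashedKey {
            final byte[] bytes;
            final int hash;

            HashedKey(String value) {
                this.bytes = value.getBytes(StandardCharsets.UTF_8);
                this.hash = Arrays.hashCode(bytes);   // computed once, reused below
            }

            @Override
            public int hashCode() {
                return hash;
            }

            @Override
            public boolean equals(Object other) {
                return other instanceof HashedKey && Arrays.equals(bytes, ((HashedKey) other).bytes);
            }
        }

        public static void main(String[] args) {
            CacheBuilder<HashedKey, long[]> builder = CacheBuilder.newBuilder()
                    .maximumWeight(10 * 1024 * 1024)              // ~10 MB of estimated weight
                    .weigher(new Weigher<HashedKey, long[]>() {
                        @Override
                        public int weigh(HashedKey key, long[] value) {
                            return key.bytes.length + value.length * Long.BYTES;
                        }
                    });
            Cache<HashedKey, long[]> cache = builder.build();

            cache.put(new HashedKey("lookup:index/type/id/path"), new long[] {1L, 2L, 3L});
            System.out.println(cache.size());   // 1
        }
    }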
 

+ 2 - 1
src/main/java/org/elasticsearch/percolator/PercolatorService.java

@@ -20,6 +20,7 @@ package org.elasticsearch.percolator;
 
 import com.carrotsearch.hppc.ByteObjectOpenHashMap;
 import com.google.common.collect.ImmutableMap;
+
 import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.index.ReaderUtil;
 import org.apache.lucene.index.memory.ExtendedMemoryIndex;
@@ -781,7 +782,7 @@ public class PercolatorService extends AbstractComponent {
 
     private void queryBasedPercolating(Engine.Searcher percolatorSearcher, PercolateContext context, QueryCollector percolateCollector) throws IOException {
         Filter percolatorTypeFilter = context.indexService().mapperService().documentMapper(TYPE_NAME).typeFilter();
-        percolatorTypeFilter = context.indexService().cache().filter().cache(percolatorTypeFilter);
+        percolatorTypeFilter = context.indexService().cache().filter().cache(percolatorTypeFilter, null, context.indexService().queryParserService().autoFilterCachePolicy());
         FilteredQuery query = new FilteredQuery(context.percolateQuery(), percolatorTypeFilter);
         percolatorSearcher.searcher().search(query, percolateCollector);
         for (Collector queryCollector : percolateCollector.aggregatorCollector) {

+ 2 - 2
src/main/java/org/elasticsearch/search/aggregations/bucket/children/ChildrenParser.java

@@ -82,8 +82,8 @@ public class ChildrenParser implements Aggregator.Parser {
             throw new SearchParseException(context, "[children]  Type [" + childType + "] points to a non existent parent type [" + parentType + "]");
         }
 
-        Filter parentFilter = context.filterCache().cache(parentDocMapper.typeFilter());
-        Filter childFilter = context.filterCache().cache(childDocMapper.typeFilter());
+        Filter parentFilter = context.filterCache().cache(parentDocMapper.typeFilter(), null, context.queryParserService().autoFilterCachePolicy());
+        Filter childFilter = context.filterCache().cache(childDocMapper.typeFilter(), null, context.queryParserService().autoFilterCachePolicy());
 
         ParentChildIndexFieldData parentChildIndexFieldData = context.fieldData().getForField(parentFieldMapper);
         ValuesSourceConfig<ValuesSource.Bytes.WithOrdinals.ParentChild> config = new ValuesSourceConfig<>(ValuesSource.Bytes.WithOrdinals.ParentChild.class);

+ 3 - 5
src/main/java/org/elasticsearch/search/aggregations/bucket/children/ParentToChildrenAggregator.java

@@ -75,11 +75,9 @@ public class ParentToChildrenAggregator extends SingleBucketAggregator implement
                                       ValuesSource.Bytes.WithOrdinals.ParentChild valuesSource, long maxOrd, Map<String, Object> metaData) {
         super(name, factories, aggregationContext, parent, metaData);
         this.parentType = parentType;
-        // The child filter doesn't rely on random access it just used to iterate over all docs with a specific type,
-        // so use the filter cache instead. When the filter cache is smarter with what filter impl to pick we can benefit
-        // from it here
-        this.childFilter = aggregationContext.searchContext().filterCache().cache(childFilter);
-        this.parentFilter = aggregationContext.searchContext().filterCache().cache(parentFilter);
+        // these two filters are cached in the parser
+        this.childFilter = childFilter;
+        this.parentFilter = parentFilter;
         this.parentOrdToBuckets = aggregationContext.bigArrays().newLongArray(maxOrd, false);
         this.parentOrdToBuckets.fill(0, maxOrd, -1);
         this.parentOrdToOtherBuckets = new LongObjectPagedHashMap<>(aggregationContext.bigArrays());

+ 12 - 12
src/main/java/org/elasticsearch/search/aggregations/bucket/nested/NestedAggregator.java

@@ -22,6 +22,7 @@ import org.apache.lucene.index.LeafReaderContext;
 import org.apache.lucene.search.DocIdSet;
 import org.apache.lucene.search.DocIdSetIterator;
 import org.apache.lucene.search.Filter;
+import org.apache.lucene.search.FilterCachingPolicy;
 import org.apache.lucene.search.join.BitDocIdSetFilter;
 import org.apache.lucene.util.BitDocIdSet;
 import org.apache.lucene.util.BitSet;
@@ -30,7 +31,11 @@ import org.elasticsearch.common.lucene.docset.DocIdSets;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.object.ObjectMapper;
 import org.elasticsearch.index.search.nested.NonNestedDocsFilter;
-import org.elasticsearch.search.aggregations.*;
+import org.elasticsearch.search.aggregations.AggregationExecutionException;
+import org.elasticsearch.search.aggregations.Aggregator;
+import org.elasticsearch.search.aggregations.AggregatorFactories;
+import org.elasticsearch.search.aggregations.AggregatorFactory;
+import org.elasticsearch.search.aggregations.InternalAggregation;
 import org.elasticsearch.search.aggregations.bucket.SingleBucketAggregator;
 import org.elasticsearch.search.aggregations.support.AggregationContext;
 import org.elasticsearch.search.internal.SearchContext;
@@ -51,7 +56,7 @@ public class NestedAggregator extends SingleBucketAggregator implements ReaderCo
     private DocIdSetIterator childDocs;
     private BitSet parentDocs;
 
-    public NestedAggregator(String name, AggregatorFactories factories, String nestedPath, AggregationContext aggregationContext, Aggregator parentAggregator, Map<String, Object> metaData) {
+    public NestedAggregator(String name, AggregatorFactories factories, String nestedPath, AggregationContext aggregationContext, Aggregator parentAggregator, Map<String, Object> metaData, FilterCachingPolicy filterCachingPolicy) {
         super(name, factories, aggregationContext, parentAggregator, metaData);
         this.nestedPath = nestedPath;
         this.parentAggregator = parentAggregator;
@@ -67,14 +72,7 @@ public class NestedAggregator extends SingleBucketAggregator implements ReaderCo
             throw new AggregationExecutionException("[nested] nested path [" + nestedPath + "] is not nested");
         }
 
-        // TODO: Revise the cache usage for childFilter
-        // Typical usage of the childFilter in this agg is that not all parent docs match and because this agg executes
-        // in order we are maybe better off not caching? We can then iterate over the posting list and benefit from skip pointers.
-        // Even if caching does make sense it is likely that it shouldn't be forced as is today, but based on heuristics that
-        // the filter cache maintains that the childFilter should be cached.
-
-        // By caching the childFilter we're consistent with other features and previous versions.
-        childFilter = aggregationContext.searchContext().filterCache().cache(objectMapper.nestedTypeFilter());
+        childFilter = aggregationContext.searchContext().filterCache().cache(objectMapper.nestedTypeFilter(), null, filterCachingPolicy);
         // The childDocs need to be consumed in docId order, this ensures that:
         aggregationContext.ensureScoreDocsInOrder();
     }
@@ -164,15 +162,17 @@ public class NestedAggregator extends SingleBucketAggregator implements ReaderCo
     public static class Factory extends AggregatorFactory {
 
         private final String path;
+        private final FilterCachingPolicy filterCachingPolicy;
 
-        public Factory(String name, String path) {
+        public Factory(String name, String path, FilterCachingPolicy filterCachingPolicy) {
             super(name, InternalNested.TYPE.name());
             this.path = path;
+            this.filterCachingPolicy = filterCachingPolicy;
         }
 
         @Override
         public Aggregator createInternal(AggregationContext context, Aggregator parent, long expectedBucketsCount, Map<String, Object> metaData) {
-            return new NestedAggregator(name, factories, path, context, parent, metaData);
+            return new NestedAggregator(name, factories, path, context, parent, metaData, filterCachingPolicy);
         }
     }
 }
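
The nested aggregator shows how the policy travels through the code: rather than unconditionally caching its child filter, the aggregator now receives a `FilterCachingPolicy` from its factory, which in turn gets it from the parser. A tiny, self-contained model of that constructor-injection pattern follows; all names here are hypothetical stand-ins.

    // Illustrative only: the component no longer hard-codes "always cache"; the caller
    // injects the policy that will make the decision.
    public final class PolicyInjectionExample {

        interface CachingPolicy {
            boolean shouldCache(String filterKey);
        }

        static final class NestedAggFactory {
            private final String path;
            private final CachingPolicy policy;

            NestedAggFactory(String path, CachingPolicy policy) {
                this.path = path;       // which nested path to aggregate on
                this.policy = policy;   // forwarded to the aggregator it creates
            }

            String describeAggregator() {
                return "nested[" + path + "], cache child filter: " + policy.shouldCache(path);
            }
        }

        public static void main(String[] args) {
            CachingPolicy policy = filterKey -> filterKey.startsWith("comments");  // stand-in decision
            System.out.println(new NestedAggFactory("comments.replies", policy).describeAggregator());
        }
    }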

+ 1 - 1
src/main/java/org/elasticsearch/search/aggregations/bucket/nested/NestedParser.java

@@ -61,6 +61,6 @@ public class NestedParser implements Aggregator.Parser {
             throw new SearchParseException(context, "Missing [path] field for nested aggregation [" + aggregationName + "]");
         }
 
-        return new NestedAggregator.Factory(aggregationName, path);
+        return new NestedAggregator.Factory(aggregationName, path, context.queryParserService().autoFilterCachePolicy());
     }
 }

+ 1 - 1
src/main/java/org/elasticsearch/search/fetch/innerhits/InnerHitsContext.java

@@ -125,7 +125,7 @@ public final class InnerHitsContext {
                 rawParentFilter = parentObjectMapper.nestedTypeFilter();
             }
             BitDocIdSetFilter parentFilter = context.bitsetFilterCache().getBitDocIdSetFilter(rawParentFilter);
-            Filter childFilter = context.filterCache().cache(childObjectMapper.nestedTypeFilter());
+            Filter childFilter = context.filterCache().cache(childObjectMapper.nestedTypeFilter(), null, context.queryParserService().autoFilterCachePolicy());
             try {
                 Query q = new FilteredQuery(query, new NestedChildrenFilter(parentFilter, childFilter, hitContext));
                 context.searcher().search(q, topDocsCollector);

+ 1 - 1
src/main/java/org/elasticsearch/search/internal/DefaultSearchContext.java

@@ -248,7 +248,7 @@ public class DefaultSearchContext extends SearchContext {
         if (filter == null) {
             return aliasFilter;
         } else {
-            filter = filterCache().cache(filter);
+            filter = filterCache().cache(filter, null, indexService.queryParserService().autoFilterCachePolicy());
             if (aliasFilter != null) {
                 return new AndFilter(ImmutableList.of(filter, aliasFilter));
             }

+ 0 - 275
src/test/java/org/elasticsearch/index/query/IndexQueryParserFilterCachingTests.java

@@ -1,275 +0,0 @@
-/*
- * Licensed to Elasticsearch under one or more contributor
- * license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright
- * ownership. Elasticsearch licenses this file to you under
- * the Apache License, Version 2.0 (the "License"); you may
- * not use this file except in compliance with the License.
- * You may obtain a copy of the License at
- *
- *    http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing,
- * software distributed under the License is distributed on an
- * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
- * KIND, either express or implied.  See the License for the
- * specific language governing permissions and limitations
- * under the License.
- */
-
-package org.elasticsearch.index.query;
-
-
-import org.apache.lucene.search.ConstantScoreQuery;
-import org.apache.lucene.search.Query;
-import org.elasticsearch.common.bytes.BytesArray;
-import org.elasticsearch.common.compress.CompressedString;
-import org.elasticsearch.common.inject.Injector;
-import org.elasticsearch.common.lease.Releasables;
-import org.elasticsearch.common.lucene.search.AndFilter;
-import org.elasticsearch.common.lucene.search.CachedFilter;
-import org.elasticsearch.common.lucene.search.NoCacheFilter;
-import org.elasticsearch.common.lucene.search.XBooleanFilter;
-import org.elasticsearch.common.settings.ImmutableSettings;
-import org.elasticsearch.common.settings.Settings;
-import org.elasticsearch.index.mapper.MapperService;
-import org.elasticsearch.index.IndexService;
-import org.elasticsearch.search.internal.SearchContext;
-import org.elasticsearch.test.ElasticsearchSingleNodeTest;
-import org.elasticsearch.test.TestSearchContext;
-import org.junit.After;
-import org.junit.Before;
-import org.junit.Test;
-
-import java.io.IOException;
-
-import static org.elasticsearch.common.io.Streams.copyToBytesFromClasspath;
-import static org.elasticsearch.common.io.Streams.copyToStringFromClasspath;
-import static org.hamcrest.Matchers.instanceOf;
-import static org.hamcrest.Matchers.is;
-
-/**
- *
- */
-public class IndexQueryParserFilterCachingTests extends ElasticsearchSingleNodeTest {
-
-    private Injector injector;
-    private IndexQueryParserService queryParser;
-
-    @Before
-    public void setup() throws IOException {
-        Settings settings = ImmutableSettings.settingsBuilder()
-                .put("index.cache.filter.type", "weighted")
-                .put("name", "IndexQueryParserFilterCachingTests")
-                .build();
-        IndexService indexService = createIndex("test", settings);
-        injector = indexService.injector();
-
-        MapperService mapperService = indexService.mapperService();
-        String mapping = copyToStringFromClasspath("/org/elasticsearch/index/query/mapping.json");
-        mapperService.merge("person", new CompressedString(mapping), true);
-        String childMapping = copyToStringFromClasspath("/org/elasticsearch/index/query/child-mapping.json");
-        mapperService.merge("child", new CompressedString(childMapping), true);
-        mapperService.documentMapper("person").parse(new BytesArray(copyToBytesFromClasspath("/org/elasticsearch/index/query/data.json")));
-        queryParser = injector.getInstance(IndexQueryParserService.class);
-        SearchContext.setCurrent(new TestSearchContext());
-    }
-
-    @After
-    public void removeSearchContext() {
-        SearchContext current = SearchContext.current();
-        SearchContext.removeCurrent();
-        Releasables.close(current);
-    }
-
-    private IndexQueryParserService queryParser() throws IOException {
-        return this.queryParser;
-    }
-
-    /**
-     * Runner to test our cache cases when using date range filter
-     * @param lte could be null
-     * @param gte could be null
-     * @param forcedCache true if we want to force the cache, false if we want to force no cache, null either
-     * @param expectedCache true if we expect a cached filter
-     */
-    private void testDateRangeFilterCache(IndexQueryParserService queryParser, Object gte, Object lte, Boolean forcedCache, boolean expectedCache) {
-        RangeFilterBuilder filterBuilder = FilterBuilders.rangeFilter("born")
-                .gte(gte)
-                .lte(lte);
-        if (forcedCache != null) {
-            filterBuilder.cache(forcedCache);
-        }
-
-        Query parsedQuery = queryParser.parse(QueryBuilders.constantScoreQuery(filterBuilder)).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-
-
-        if (expectedCache) {
-            if (((ConstantScoreQuery)parsedQuery).getFilter() instanceof CachedFilter) {
-                logger.info("gte [{}], lte [{}], _cache [{}] is cached", gte, lte, forcedCache);
-            } else {
-                logger.warn("gte [{}], lte [{}], _cache [{}] should be cached", gte, lte, forcedCache);
-            }
-        } else {
-            if (((ConstantScoreQuery)parsedQuery).getFilter() instanceof NoCacheFilter) {
-                logger.info("gte [{}], lte [{}], _cache [{}] is not cached", gte, lte, forcedCache);
-            } else {
-                logger.warn("gte [{}], lte [{}], _cache [{}] should not be cached", gte, lte, forcedCache);
-            }
-        }
-
-       if (expectedCache) {
-            assertThat(((ConstantScoreQuery)parsedQuery).getFilter(), instanceOf(CachedFilter.class));
-        } else {
-            assertThat(((ConstantScoreQuery)parsedQuery).getFilter(), instanceOf(NoCacheFilter.class));
-        }
-    }
-
-    /**
-     * We test all possible combinations for range date filter cache
-     */
-    @Test
-    public void testDateRangeFilterCache() throws IOException {
-        IndexQueryParserService queryParser = queryParser();
-
-        testDateRangeFilterCache(queryParser, null, null, null, true);
-        testDateRangeFilterCache(queryParser, null, null, true, true);
-        testDateRangeFilterCache(queryParser, null, null, false, false);
-        testDateRangeFilterCache(queryParser, "now", null, null, false);
-        testDateRangeFilterCache(queryParser, null, "now", null, false);
-        testDateRangeFilterCache(queryParser, "now", "now", null, false);
-        testDateRangeFilterCache(queryParser, "now/d", null, null, true);
-        testDateRangeFilterCache(queryParser, null, "now/d", null, true);
-        testDateRangeFilterCache(queryParser, "now/d", "now/d", null, true);
-        testDateRangeFilterCache(queryParser, "2012-01-01", null, null, true);
-        testDateRangeFilterCache(queryParser, null, "2012-01-01", null, true);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "2012-01-01", null, true);
-        testDateRangeFilterCache(queryParser, "now", "2012-01-01", null, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now", null, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now/d", null, true);
-        testDateRangeFilterCache(queryParser, "now/d", "2012-01-01", null, true);
-        testDateRangeFilterCache(queryParser, null, 1577836800, null, true);
-        testDateRangeFilterCache(queryParser, 1325376000, null, null, true);
-        testDateRangeFilterCache(queryParser, 1325376000, 1577836800, null, true);
-        testDateRangeFilterCache(queryParser, "now", 1577836800, null, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now", null, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now/d", null, true);
-        testDateRangeFilterCache(queryParser, "now/d", 1577836800, null, true);
-        testDateRangeFilterCache(queryParser, "now", null, true, false);
-        testDateRangeFilterCache(queryParser, null, "now", true, false);
-        testDateRangeFilterCache(queryParser, "now", "now", true, false);
-        testDateRangeFilterCache(queryParser, "now/d", null, true, true);
-        testDateRangeFilterCache(queryParser, null, "now/d", true, true);
-        testDateRangeFilterCache(queryParser, "now/d", "now/d", true, true);
-        testDateRangeFilterCache(queryParser, "2012-01-01", null, true, true);
-        testDateRangeFilterCache(queryParser, null, "2012-01-01", true, true);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "2012-01-01", true, true);
-        testDateRangeFilterCache(queryParser, "now", "2012-01-01", true, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now", true, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now/d", true, true);
-        testDateRangeFilterCache(queryParser, "now/d", "2012-01-01", true, true);
-        testDateRangeFilterCache(queryParser, null, 1577836800, true, true);
-        testDateRangeFilterCache(queryParser, 1325376000, null, true, true);
-        testDateRangeFilterCache(queryParser, 1325376000, 1577836800, true, true);
-        testDateRangeFilterCache(queryParser, "now", 1577836800, true, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now", true, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now/d", true, true);
-        testDateRangeFilterCache(queryParser, "now/d", 1577836800, true, true);
-        testDateRangeFilterCache(queryParser, "now", null, false, false);
-        testDateRangeFilterCache(queryParser, null, "now", false, false);
-        testDateRangeFilterCache(queryParser, "now", "now", false, false);
-        testDateRangeFilterCache(queryParser, "now/d", null, false, false);
-        testDateRangeFilterCache(queryParser, null, "now/d", false, false);
-        testDateRangeFilterCache(queryParser, "now/d", "now/d", false, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", null, false, false);
-        testDateRangeFilterCache(queryParser, null, "2012-01-01", false, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "2012-01-01", false, false);
-        testDateRangeFilterCache(queryParser, "now", "2012-01-01", false, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now", false, false);
-        testDateRangeFilterCache(queryParser, "2012-01-01", "now/d", false, false);
-        testDateRangeFilterCache(queryParser, "now/d", "2012-01-01", false, false);
-        testDateRangeFilterCache(queryParser, null, 1577836800, false, false);
-        testDateRangeFilterCache(queryParser, 1325376000, null, false, false);
-        testDateRangeFilterCache(queryParser, 1325376000, 1577836800, false, false);
-        testDateRangeFilterCache(queryParser, "now", 1577836800, false, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now", false, false);
-        testDateRangeFilterCache(queryParser, 1325376000, "now/d", false, false);
-        testDateRangeFilterCache(queryParser, "now/d", 1577836800, false, false);
-    }
-
-    @Test
-    public void testNoFilterParsing() throws IOException {
-        IndexQueryParserService queryParser = queryParser();
-        String query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean.json");
-        Query parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(XBooleanFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().get(1).getFilter(), instanceOf(NoCacheFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().size(), is(2));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_with_long_value.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(XBooleanFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().get(1).getFilter(), instanceOf(CachedFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().size(), is(2));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_with_long_value_not_cached.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(XBooleanFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().get(1).getFilter(), instanceOf(NoCacheFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().size(), is(2));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_cached_now.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(XBooleanFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().get(1).getFilter(), instanceOf(NoCacheFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().size(), is(2));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(XBooleanFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().get(1).getFilter(), instanceOf(NoCacheFilter.class));
-        assertThat(((XBooleanFilter) ((ConstantScoreQuery) parsedQuery).getFilter()).clauses().size(), is(2));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_cached.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(CachedFilter.class));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_cached_now_with_rounding.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(CachedFilter.class));
-
-        query = copyToStringFromClasspath("/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now_with_rounding.json");
-        parsedQuery = queryParser.parse(query).query();
-        assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-        assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(CachedFilter.class));
-
-        try {
-            SearchContext.setCurrent(new TestSearchContext());
-            query = copyToStringFromClasspath("/org/elasticsearch/index/query/has-child.json");
-            parsedQuery = queryParser.parse(query).query();
-            assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-            assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(NoCacheFilter.class));
-
-            query = copyToStringFromClasspath("/org/elasticsearch/index/query/and-filter-cache.json");
-            parsedQuery = queryParser.parse(query).query();
-            assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-            assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(CachedFilter.class));
-
-            query = copyToStringFromClasspath("/org/elasticsearch/index/query/has-child-in-and-filter-cached.json");
-            parsedQuery = queryParser.parse(query).query();
-            assertThat(parsedQuery, instanceOf(ConstantScoreQuery.class));
-            assertThat(((ConstantScoreQuery) parsedQuery).getFilter(), instanceOf(AndFilter.class));
-        } finally {
-            SearchContext.removeCurrent();
-        }
-    }
-
-}

+ 2 - 6
src/test/java/org/elasticsearch/index/query/SimpleIndexQueryParserTests.java

@@ -51,7 +51,6 @@ import org.elasticsearch.common.unit.Fuzziness;
 import org.elasticsearch.common.xcontent.XContentFactory;
 import org.elasticsearch.common.xcontent.XContentHelper;
 import org.elasticsearch.common.xcontent.XContentParser;
-import org.elasticsearch.index.cache.filter.support.CacheKeyFilter;
 import org.elasticsearch.index.mapper.MapperService;
 import org.elasticsearch.index.mapper.core.NumberFieldMapper;
 import org.elasticsearch.index.search.NumericRangeFieldDataFilter;
@@ -687,11 +686,8 @@ public class SimpleIndexQueryParserTests extends ElasticsearchSingleNodeTest {
         ParsedQuery parsedQuery = queryParser.parse(query);
         assertThat(parsedQuery.query(), instanceOf(FilteredQuery.class));
         Filter filter = ((FilteredQuery) parsedQuery.query()).getFilter();
-        assertThat(filter, instanceOf(CacheKeyFilter.Wrapper.class));
-        CacheKeyFilter.Wrapper wrapper = (CacheKeyFilter.Wrapper) filter;
-        assertThat(new BytesRef(wrapper.cacheKey().bytes()).utf8ToString(), equalTo("key"));
-        assertThat(wrapper.wrappedFilter(), instanceOf(RegexpFilter.class));
-        RegexpFilter regexpFilter = (RegexpFilter) wrapper.wrappedFilter();
+        assertThat(filter, instanceOf(RegexpFilter.class));
+        RegexpFilter regexpFilter = (RegexpFilter) filter;
         assertThat(regexpFilter.field(), equalTo("name.first"));
         assertThat(regexpFilter.regexp(), equalTo("s.*y"));
         assertThat(regexpFilter.flags(), equalTo(INTERSECTION.value() | COMPLEMENT.value() | EMPTY.value()));

+ 0 - 25
src/test/java/org/elasticsearch/index/query/date_range_in_boolean.json

@@ -1,25 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "now"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "_cache" : true,
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "2013-01-01"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "_cache" : true,
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "now+1m+1s"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_complex_now_with_rounding.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "_cache" : true,
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "now+1m+1s/m"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_now.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "_cache" : true,
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "now"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_cached_now_with_rounding.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "_cache" : true,
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": "2012-01-01",
-                                "lte": "now/d"
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 25
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_with_long_value.json

@@ -1,25 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "born" : {
-                                "gte": 1325376000,
-                                "lte": 1577836800
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 0 - 26
src/test/java/org/elasticsearch/index/query/date_range_in_boolean_with_long_value_not_cached.json

@@ -1,26 +0,0 @@
-{
-    "constant_score": {
-        "filter": {
-            "bool": {
-                "must": [
-                    {
-                        "term": {
-                            "foo": {
-                                "value": "bar"
-                            }
-                        }
-                    },
-                    {
-                        "range" : {
-                            "_cache" : false,
-                            "born" : {
-                                "gte": 1325376000,
-                                "lte": 1577836800
-                            }
-                        }
-                    }
-                ]
-            }
-        }
-    }
-}

+ 1 - 1
src/test/java/org/elasticsearch/index/search/child/AbstractChildTests.java

@@ -129,7 +129,7 @@ public abstract class AbstractChildTests extends ElasticsearchSingleNodeLuceneTe
     }
 
     static Filter wrap(Filter filter) {
-        return SearchContext.current().filterCache().cache(filter);
+        return SearchContext.current().filterCache().cache(filter, null, SearchContext.current().indexShard().indexService().queryParserService().autoFilterCachePolicy());
     }
 
     static BitDocIdSetFilter wrapWithBitSetFilter(Filter filter) {

+ 6 - 5
src/test/java/org/elasticsearch/indices/stats/IndexStatsTests.java

@@ -19,7 +19,6 @@
 
 package org.elasticsearch.indices.stats;
 
-import org.apache.lucene.util.LuceneTestCase;
 import org.apache.lucene.util.Version;
 import org.elasticsearch.action.admin.cluster.node.stats.NodesStatsResponse;
 import org.elasticsearch.action.admin.indices.stats.CommonStats;
@@ -38,6 +37,7 @@ import org.elasticsearch.common.io.stream.BytesStreamInput;
 import org.elasticsearch.common.io.stream.BytesStreamOutput;
 import org.elasticsearch.common.settings.ImmutableSettings;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.index.cache.filter.AutoFilterCachingPolicy;
 import org.elasticsearch.index.merge.policy.TieredMergePolicyProvider;
 import org.elasticsearch.index.merge.scheduler.ConcurrentMergeSchedulerProvider;
 import org.elasticsearch.index.query.FilterBuilders;
@@ -75,6 +75,7 @@ public class IndexStatsTests extends ElasticsearchIntegrationTest {
         return ImmutableSettings.settingsBuilder().put(super.nodeSettings(nodeOrdinal))
                 .put("indices.cache.filter.clean_interval", "1ms")
                 .put(IndicesQueryCache.INDICES_CACHE_QUERY_CLEAN_INTERVAL, "1ms")
+                .put(AutoFilterCachingPolicy.AGGRESSIVE_CACHING_SETTINGS)
                 .build();
     }
 
@@ -93,9 +94,9 @@ public class IndexStatsTests extends ElasticsearchIntegrationTest {
         SearchResponse searchResponse = client().prepareSearch().setQuery(filteredQuery(matchAllQuery(), FilterBuilders.termFilter("field", "value").cacheKey("test_key"))).execute().actionGet();
         assertThat(searchResponse.getHits().getHits().length, equalTo(1));
         nodesStats = client().admin().cluster().prepareNodesStats().setIndices(true).execute().actionGet();
-        assertThat(nodesStats.getNodes()[0].getIndices().getFilterCache().getMemorySizeInBytes() + nodesStats.getNodes()[1].getIndices().getFilterCache().getMemorySizeInBytes(), internalCluster().hasFilterCache() ? greaterThan(0l) : is(0L));
+        assertThat(nodesStats.getNodes()[0].getIndices().getFilterCache().getMemorySizeInBytes() + nodesStats.getNodes()[1].getIndices().getFilterCache().getMemorySizeInBytes(), greaterThan(0l));
         indicesStats = client().admin().indices().prepareStats("test").clear().setFilterCache(true).execute().actionGet();
-        assertThat(indicesStats.getTotal().getFilterCache().getMemorySizeInBytes(), internalCluster().hasFilterCache() ? greaterThan(0l) : is(0L));
+        assertThat(indicesStats.getTotal().getFilterCache().getMemorySizeInBytes(), greaterThan(0l));
 
         client().admin().indices().prepareClearCache().setFilterKeys("test_key").execute().actionGet();
         nodesStats = client().admin().cluster().prepareNodesStats().setIndices(true).execute().actionGet();
@@ -184,13 +185,13 @@ public class IndexStatsTests extends ElasticsearchIntegrationTest {
         nodesStats = client().admin().cluster().prepareNodesStats().setIndices(true)
                 .execute().actionGet();
         assertThat(nodesStats.getNodes()[0].getIndices().getFieldData().getMemorySizeInBytes() + nodesStats.getNodes()[1].getIndices().getFieldData().getMemorySizeInBytes(), greaterThan(0l));
-        assertThat(nodesStats.getNodes()[0].getIndices().getFilterCache().getMemorySizeInBytes() + nodesStats.getNodes()[1].getIndices().getFilterCache().getMemorySizeInBytes(), internalCluster().hasFilterCache() ? greaterThan(0l) : is(0L));
+        assertThat(nodesStats.getNodes()[0].getIndices().getFilterCache().getMemorySizeInBytes() + nodesStats.getNodes()[1].getIndices().getFilterCache().getMemorySizeInBytes(), greaterThan(0l));
 
         indicesStats = client().admin().indices().prepareStats("test")
                 .clear().setFieldData(true).setFilterCache(true)
                 .execute().actionGet();
         assertThat(indicesStats.getTotal().getFieldData().getMemorySizeInBytes(), greaterThan(0l));
-        assertThat(indicesStats.getTotal().getFilterCache().getMemorySizeInBytes(), internalCluster().hasFilterCache() ? greaterThan(0l) : is(0L));
+        assertThat(indicesStats.getTotal().getFilterCache().getMemorySizeInBytes(), greaterThan(0l));
 
         client().admin().indices().prepareClearCache().execute().actionGet();
         Thread.sleep(100); // Make sure the filter cache entries have been removed...

+ 17 - 1
src/test/java/org/elasticsearch/search/child/SimpleChildQuerySearchTests.java

@@ -34,8 +34,12 @@ import org.elasticsearch.action.search.SearchType;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.common.lucene.search.function.CombineFunction;
 import org.elasticsearch.common.settings.ImmutableSettings;
+import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.TimeValue;
 import org.elasticsearch.common.xcontent.XContentFactory;
+import org.elasticsearch.index.cache.filter.AutoFilterCachingPolicy;
+import org.elasticsearch.index.cache.filter.FilterCacheModule;
+import org.elasticsearch.index.cache.filter.weighted.WeightedFilterCache;
 import org.elasticsearch.index.fielddata.FieldDataType;
 import org.elasticsearch.index.mapper.FieldMapper.Loading;
 import org.elasticsearch.index.mapper.MergeMappingException;
@@ -49,6 +53,8 @@ import org.elasticsearch.search.aggregations.bucket.terms.Terms;
 import org.elasticsearch.search.sort.SortBuilders;
 import org.elasticsearch.search.sort.SortOrder;
 import org.elasticsearch.test.ElasticsearchIntegrationTest;
+import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
+import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
 import org.hamcrest.Matchers;
 import org.junit.Test;
 
@@ -73,8 +79,18 @@ import static org.hamcrest.Matchers.*;
 /**
  *
  */
+@ClusterScope(scope = Scope.SUITE)
 public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
 
+    @Override
+    protected Settings nodeSettings(int nodeOrdinal) {
+        return ImmutableSettings.settingsBuilder().put(super.nodeSettings(nodeOrdinal))
+                // aggressive filter caching so that we can assert on the filter cache size
+                .put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, WeightedFilterCache.class)
+                .put(AutoFilterCachingPolicy.AGGRESSIVE_CACHING_SETTINGS)
+                .build();
+    }
+
     @Test
     public void multiLevelChild() throws Exception {
         assertAcked(prepareCreate("test")
@@ -2052,7 +2068,7 @@ public class SimpleChildQuerySearchTests extends ElasticsearchIntegrationTest {
 
         // filter cache should not contain anything, because has_child and has_parent can't be cached.
         statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), cluster().hasFilterCache() ? greaterThan(initialCacheSize) : is(initialCacheSize));
+        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), greaterThan(initialCacheSize));
     }
 
     // https://github.com/elasticsearch/elasticsearch/issues/5783

+ 0 - 81
src/test/java/org/elasticsearch/search/query/SimpleQueryTests.java

@@ -23,7 +23,6 @@ import org.apache.lucene.util.English;
 import org.elasticsearch.ElasticsearchException;
 import org.elasticsearch.Version;
 import org.elasticsearch.action.admin.indices.create.CreateIndexRequestBuilder;
-import org.elasticsearch.action.admin.indices.stats.IndicesStatsResponse;
 import org.elasticsearch.action.index.IndexRequestBuilder;
 import org.elasticsearch.action.search.SearchPhaseExecutionException;
 import org.elasticsearch.action.search.SearchResponse;
@@ -2134,86 +2133,6 @@ public class SimpleQueryTests extends ElasticsearchIntegrationTest {
         assertHitCount(client().prepareCount("test").setQuery(rangeQuery("field").lte(-999999999999L)).get(), 3);
     }
 
-    @Test
-    public void testRangeFilterNoCacheWithNow() throws Exception {
-        assertAcked(prepareCreate("test")
-                //no replicas to make sure we always hit the very same shard and verify the caching behaviour
-                .setSettings(ImmutableSettings.builder().put(indexSettings()).put(SETTING_NUMBER_OF_REPLICAS, 0))
-                .addMapping("type1", "date", "type=date,format=YYYY-mm-dd"));
-        ensureGreen();
-
-        client().prepareIndex("test", "type1", "1").setSource("date", "2014-01-01", "field", "value")
-                .setRefresh(true)
-                .get();
-
-        SearchResponse searchResponse = client().prepareSearch("test")
-                .setQuery(QueryBuilders.filteredQuery(matchAllQuery(), FilterBuilders.rangeFilter("date").from("2013-01-01").to("now")))
-                .get();
-        assertHitCount(searchResponse, 1l);
-
-        // filter cache should not contain any thing, b/c `now` is used in `to`.
-        IndicesStatsResponse statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), equalTo(0l));
-
-        searchResponse = client().prepareSearch("test")
-                .setQuery(QueryBuilders.filteredQuery(
-                        matchAllQuery(),
-                        FilterBuilders.boolFilter().cache(true)
-                                .must(FilterBuilders.matchAllFilter())
-                                .must(FilterBuilders.rangeFilter("date").from("2013-01-01").to("now"))
-                ))
-                .get();
-        assertHitCount(searchResponse, 1l);
-
-        // filter cache should not contain any thing, b/c `now` is used in `to`.
-        statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), equalTo(0l));
-
-
-        searchResponse = client().prepareSearch("test")
-                .setQuery(QueryBuilders.filteredQuery(
-                        matchAllQuery(),
-                        FilterBuilders.boolFilter().cache(true)
-                                .must(FilterBuilders.matchAllFilter())
-                                .must(FilterBuilders.rangeFilter("date").from("2013-01-01").to("now/d").cache(true))
-                ))
-                .get();
-        assertHitCount(searchResponse, 1l);
-        // Now with rounding is used, so we must have something in filter cache
-        statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        long filtercacheSize = statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes();
-        assertThat(filtercacheSize, cluster().hasFilterCache() ? greaterThan(0l) : is(0L));
-
-        searchResponse = client().prepareSearch("test")
-                .setQuery(QueryBuilders.filteredQuery(
-                        matchAllQuery(),
-                        FilterBuilders.boolFilter().cache(true)
-                                .must(FilterBuilders.termFilter("field", "value").cache(true))
-                                .must(FilterBuilders.rangeFilter("date").from("2013-01-01").to("now"))
-                ))
-                .get();
-        assertHitCount(searchResponse, 1l);
-
-        // and because we use term filter, it is also added to filter cache, so it should contain more than before
-        statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), cluster().hasFilterCache() ? greaterThan(filtercacheSize) : is(filtercacheSize));
-        filtercacheSize = statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes();
-
-        searchResponse = client().prepareSearch("test")
-                .setQuery(QueryBuilders.filteredQuery(
-                        matchAllQuery(),
-                        FilterBuilders.boolFilter().cache(true)
-                                .must(FilterBuilders.matchAllFilter())
-                                .must(FilterBuilders.rangeFilter("date").from("2013-01-01").to("now").cache(true))
-                ))
-                .get();
-        assertHitCount(searchResponse, 1l);
-
-        // The range filter is now explicitly cached but we don't want to cache now even if the user asked for it
-        statsResponse = client().admin().indices().prepareStats("test").clear().setFilterCache(true).get();
-        assertThat(statsResponse.getIndex("test").getTotal().getFilterCache().getMemorySizeInBytes(), is(filtercacheSize));
-    }
-
     @Test
     public void testRangeFilterWithTimeZone() throws Exception {
         assertAcked(prepareCreate("test")

+ 13 - 5
src/test/java/org/elasticsearch/search/scriptfilter/ScriptFilterSearchTests.java

@@ -23,6 +23,9 @@ import org.elasticsearch.action.search.SearchResponse;
 import org.elasticsearch.cluster.metadata.IndexMetaData;
 import org.elasticsearch.common.settings.ImmutableSettings;
 import org.elasticsearch.common.settings.Settings;
+import org.elasticsearch.index.cache.filter.AutoFilterCachingPolicy;
+import org.elasticsearch.index.cache.filter.FilterCacheModule;
+import org.elasticsearch.index.cache.filter.weighted.WeightedFilterCache;
 import org.elasticsearch.script.groovy.GroovyScriptEngineService;
 import org.elasticsearch.search.sort.SortOrder;
 import org.elasticsearch.test.ElasticsearchIntegrationTest;
@@ -44,7 +47,12 @@ public class ScriptFilterSearchTests extends ElasticsearchIntegrationTest {
 
     @Override
     protected Settings nodeSettings(int nodeOrdinal) {
-        return ImmutableSettings.settingsBuilder().put(super.nodeSettings(nodeOrdinal)).put(GroovyScriptEngineService.GROOVY_SCRIPT_SANDBOX_ENABLED, false).build();
+        return ImmutableSettings.settingsBuilder().put(super.nodeSettings(nodeOrdinal))
+                .put(GroovyScriptEngineService.GROOVY_SCRIPT_SANDBOX_ENABLED, false)
+                // aggressive filter caching so that we can assert on the number of iterations of the script filters
+                .put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, WeightedFilterCache.class)
+                .put(AutoFilterCachingPolicy.AGGRESSIVE_CACHING_SETTINGS)
+                .build();
     }
 
     @Test
@@ -132,7 +140,7 @@ public class ScriptFilterSearchTests extends ElasticsearchIntegrationTest {
                 .execute().actionGet();
 
         assertThat(response.getHits().totalHits(), equalTo(1l));
-        assertThat(scriptCounter.get(), equalTo(internalCluster().hasFilterCache() ? 3 : 1));
+        assertThat(scriptCounter.get(), equalTo(3));
 
         scriptCounter.set(0);
         logger.info("running script filter the second time");
@@ -141,7 +149,7 @@ public class ScriptFilterSearchTests extends ElasticsearchIntegrationTest {
                 .execute().actionGet();
 
         assertThat(response.getHits().totalHits(), equalTo(1l));
-        assertThat(scriptCounter.get(), equalTo(cluster().hasFilterCache() ? 0 : 1));
+        assertThat(scriptCounter.get(), equalTo(0));
 
         scriptCounter.set(0);
         logger.info("running script filter with new parameters");
@@ -150,7 +158,7 @@ public class ScriptFilterSearchTests extends ElasticsearchIntegrationTest {
                 .execute().actionGet();
 
         assertThat(response.getHits().totalHits(), equalTo(1l));
-        assertThat(scriptCounter.get(), equalTo(cluster().hasFilterCache() ? 3 : 1));
+        assertThat(scriptCounter.get(), equalTo(3));
 
         scriptCounter.set(0);
         logger.info("running script filter with same parameters");
@@ -159,6 +167,6 @@ public class ScriptFilterSearchTests extends ElasticsearchIntegrationTest {
                 .execute().actionGet();
 
         assertThat(response.getHits().totalHits(), equalTo(3l));
-        assertThat(scriptCounter.get(), equalTo(cluster().hasFilterCache() ? 0 : 3));
+        assertThat(scriptCounter.get(), equalTo(0));
     }
 }
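
A note on the new counter expectations: forcing the cache on makes them deterministic, because building the cache entry evaluates the script once for every indexed document (three here, judging by the final `totalHits` assertion), while a repeat run with the same script and parameters is served from the cache and evaluates it zero times:

    cache miss (first run, or new parameters): 3 evaluations, regardless of how many documents match
    cache hit  (same script and parameters):   0 evaluations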

+ 0 - 5
src/test/java/org/elasticsearch/test/CompositeTestCluster.java

@@ -250,11 +250,6 @@ public class CompositeTestCluster extends TestCluster {
         }
     }
 
-    @Override
-    public boolean hasFilterCache() {
-        return true;
-    }
-
     @Override
     public String getClusterName() {
         return cluster.getClusterName();

+ 0 - 5
src/test/java/org/elasticsearch/test/ExternalTestCluster.java

@@ -164,11 +164,6 @@ public final class ExternalTestCluster extends TestCluster {
         return Lists.newArrayList(client).iterator();
     }
 
-    @Override
-    public boolean hasFilterCache() {
-        return true; // default
-    }
-
     @Override
     public String getClusterName() {
         return clusterName;

+ 42 - 14
src/test/java/org/elasticsearch/test/InternalTestCluster.java

@@ -25,10 +25,16 @@ import com.carrotsearch.randomizedtesting.generators.RandomPicks;
 import com.carrotsearch.randomizedtesting.generators.RandomStrings;
 import com.google.common.base.Predicate;
 import com.google.common.base.Predicates;
-import com.google.common.collect.*;
+import com.google.common.collect.Collections2;
+import com.google.common.collect.Iterables;
+import com.google.common.collect.Iterators;
+import com.google.common.collect.Lists;
+import com.google.common.collect.Maps;
+import com.google.common.collect.Sets;
 import com.google.common.util.concurrent.Futures;
 import com.google.common.util.concurrent.ListenableFuture;
 import com.google.common.util.concurrent.SettableFuture;
+
 import org.apache.lucene.util.AbstractRandomizedTest;
 import org.apache.lucene.util.IOUtils;
 import org.elasticsearch.ElasticsearchException;
@@ -70,6 +76,7 @@ import org.elasticsearch.common.util.concurrent.EsExecutors;
 import org.elasticsearch.env.NodeEnvironment;
 import org.elasticsearch.http.HttpServerTransport;
 import org.elasticsearch.index.IndexService;
+import org.elasticsearch.index.cache.filter.AutoFilterCachingPolicy;
 import org.elasticsearch.index.cache.filter.FilterCacheModule;
 import org.elasticsearch.index.cache.filter.none.NoneFilterCache;
 import org.elasticsearch.index.cache.filter.weighted.WeightedFilterCache;
@@ -102,19 +109,34 @@ import java.io.IOException;
 import java.net.InetSocketAddress;
 import java.nio.file.Path;
 import java.nio.file.Paths;
-import java.util.*;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.HashSet;
+import java.util.Iterator;
+import java.util.List;
+import java.util.Map;
+import java.util.NavigableMap;
+import java.util.Random;
+import java.util.Set;
+import java.util.TreeMap;
 import java.util.concurrent.ExecutorService;
 import java.util.concurrent.TimeUnit;
 import java.util.concurrent.atomic.AtomicBoolean;
 import java.util.concurrent.atomic.AtomicInteger;
 
 import static junit.framework.Assert.fail;
-import static org.apache.lucene.util.LuceneTestCase.*;
+import static org.apache.lucene.util.LuceneTestCase.TEST_NIGHTLY;
+import static org.apache.lucene.util.LuceneTestCase.rarely;
+import static org.apache.lucene.util.LuceneTestCase.usually;
 import static org.elasticsearch.common.settings.ImmutableSettings.settingsBuilder;
 import static org.elasticsearch.node.NodeBuilder.nodeBuilder;
 import static org.elasticsearch.test.ElasticsearchTestCase.assertBusy;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoTimeout;
-import static org.hamcrest.Matchers.*;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.greaterThan;
+import static org.hamcrest.Matchers.greaterThanOrEqualTo;
 import static org.junit.Assert.assertFalse;
 import static org.junit.Assert.assertThat;
 
@@ -191,8 +213,6 @@ public final class InternalTestCluster extends TestCluster {
 
     private final ExecutorService executor;
 
-    private final boolean hasFilterCache;
-
     /**
      * All nodes started by the cluster will have their name set to nodePrefix followed by a positive number
      */
@@ -301,7 +321,6 @@ public final class InternalTestCluster extends TestCluster {
         builder.put(RecoverySettings.INDICES_RECOVERY_RETRY_DELAY, TimeValue.timeValueMillis(RandomInts.randomIntBetween(random, 20, 50)));
         defaultSettings = builder.build();
         executor = EsExecutors.newCached(0, TimeUnit.SECONDS, EsExecutors.daemonThreadFactory("test_" + clusterName));
-        this.hasFilterCache = random.nextBoolean();
     }
 
     public static String nodeMode() {
@@ -340,8 +359,7 @@ public final class InternalTestCluster extends TestCluster {
 
     private Settings getSettings(int nodeOrdinal, long nodeSeed, Settings others) {
         Builder builder = ImmutableSettings.settingsBuilder().put(defaultSettings)
-                .put(getRandomNodeSettings(nodeSeed))
-                .put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, hasFilterCache() ? WeightedFilterCache.class : NoneFilterCache.class);
+                .put(getRandomNodeSettings(nodeSeed));
         Settings settings = settingsSource.node(nodeOrdinal);
         if (settings != null) {
             if (settings.get(ClusterName.SETTING) != null) {
@@ -428,6 +446,21 @@ public final class InternalTestCluster extends TestCluster {
             builder.put(HierarchyCircuitBreakerService.FIELDDATA_CIRCUIT_BREAKER_TYPE_SETTING, "noop");
         }
 
+        if (random.nextBoolean()) {
+            builder.put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, random.nextBoolean() ? WeightedFilterCache.class : NoneFilterCache.class);
+        }
+
+        if (random.nextBoolean()) {
+            final int freqCacheable = 1 + random.nextInt(5);
+            final int freqCostly = 1 + random.nextInt(5);
+            final int freqOther = Math.max(freqCacheable, freqCostly) + random.nextInt(2);
+            builder.put(AutoFilterCachingPolicy.HISTORY_SIZE, 3 + random.nextInt(100));
+            builder.put(AutoFilterCachingPolicy.MIN_FREQUENCY_CACHEABLE, freqCacheable);
+            builder.put(AutoFilterCachingPolicy.MIN_FREQUENCY_COSTLY, freqCostly);
+            builder.put(AutoFilterCachingPolicy.MIN_FREQUENCY_OTHER, freqOther);
+            builder.put(AutoFilterCachingPolicy.MIN_SEGMENT_SIZE_RATIO, random.nextFloat());
+        }
+
         return builder.build();
     }
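
One detail of the randomization above that is easy to miss: `freqOther` is constrained to be at least as large as both `freqCacheable` and `freqCostly`. For example, with illustrative draws of `freqCacheable = 2` and `freqCostly = 3`:

    int freqOther = Math.max(2, 3) + random.nextInt(2); // 3 + (0 or 1) => 3 or 4, never below the other two thresholds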
 
@@ -1452,11 +1485,6 @@ public final class InternalTestCluster extends TestCluster {
         return benchNodeAndClients().size();
     }
 
-    @Override
-    public boolean hasFilterCache() {
-        return hasFilterCache;
-    }
-
     public void setDisruptionScheme(ServiceDisruptionScheme scheme) {
         clearDisruptionScheme();
         scheme.applyToCluster(this);

+ 0 - 5
src/test/java/org/elasticsearch/test/TestCluster.java

@@ -208,11 +208,6 @@ public abstract class TestCluster implements Iterable<Client>, Closeable {
      */
     public abstract void ensureEstimatedStats();
 
-    /**
-     * Return whether or not this cluster can cache filters.
-     */
-    public abstract boolean hasFilterCache();
-
     /**
      * Returns the cluster name
      */
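
With the abstract `hasFilterCache()` hook gone, tests that assert on filter cache contents no longer branch on the cluster's behaviour; they opt into a deterministic cache through node settings, as several hunks above do. A condensed sketch of that pattern (the override is shown out of its class context; the constants are the ones used elsewhere in this commit):

    @Override
    protected Settings nodeSettings(int nodeOrdinal) {
        return ImmutableSettings.settingsBuilder()
                .put(super.nodeSettings(nodeOrdinal))
                // use a real cache and cache aggressively so that cache-size assertions are stable
                .put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, WeightedFilterCache.class)
                .put(AutoFilterCachingPolicy.AGGRESSIVE_CACHING_SETTINGS)
                .build();
    }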

+ 45 - 14
src/test/java/org/elasticsearch/validate/SimpleValidateQueryTests.java

@@ -19,24 +19,32 @@
 package org.elasticsearch.validate;
 
 import com.google.common.base.Charsets;
+
 import org.elasticsearch.action.admin.indices.alias.Alias;
 import org.elasticsearch.action.admin.indices.validate.query.ValidateQueryResponse;
 import org.elasticsearch.client.Client;
 import org.elasticsearch.common.bytes.BytesArray;
 import org.elasticsearch.common.geo.GeoDistance;
 import org.elasticsearch.common.settings.ImmutableSettings;
+import org.elasticsearch.common.settings.Settings;
 import org.elasticsearch.common.unit.DistanceUnit;
 import org.elasticsearch.common.xcontent.XContentFactory;
+import org.elasticsearch.index.cache.filter.AutoFilterCachingPolicy;
+import org.elasticsearch.index.cache.filter.FilterCacheModule;
+import org.elasticsearch.index.cache.filter.none.NoneFilterCache;
+import org.elasticsearch.index.cache.filter.weighted.WeightedFilterCache;
 import org.elasticsearch.index.query.FilterBuilders;
 import org.elasticsearch.index.query.QueryBuilder;
 import org.elasticsearch.index.query.QueryBuilders;
 import org.elasticsearch.indices.IndexMissingException;
 import org.elasticsearch.test.ElasticsearchIntegrationTest;
 import org.elasticsearch.test.ElasticsearchIntegrationTest.ClusterScope;
+import org.elasticsearch.test.ElasticsearchIntegrationTest.Scope;
 import org.hamcrest.Matcher;
 import org.joda.time.DateTime;
 import org.joda.time.DateTimeZone;
 import org.joda.time.format.ISODateTimeFormat;
+import org.junit.BeforeClass;
 import org.junit.Test;
 
 import java.io.IOException;
@@ -44,14 +52,37 @@ import java.io.IOException;
 import static org.elasticsearch.index.query.QueryBuilders.queryStringQuery;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertAcked;
 import static org.elasticsearch.test.hamcrest.ElasticsearchAssertions.assertNoFailures;
-import static org.hamcrest.Matchers.*;
+import static org.hamcrest.Matchers.containsString;
+import static org.hamcrest.Matchers.equalTo;
+import static org.hamcrest.Matchers.nullValue;
 
 /**
  *
  */
-@ClusterScope(randomDynamicTemplates = false)
+@ClusterScope(randomDynamicTemplates = false, scope = Scope.SUITE)
 public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
 
+    static Boolean hasFilterCache;
+
+    @BeforeClass
+    public static void enableFilterCache() {
+        assert hasFilterCache == null;
+        hasFilterCache = randomBoolean();
+    }
+
+    @Override
+    protected Settings nodeSettings(int nodeOrdinal) {
+        ImmutableSettings.Builder builder = ImmutableSettings.settingsBuilder().put(super.nodeSettings(nodeOrdinal));
+        if (hasFilterCache) {
+            // cache everything
+            builder.put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, WeightedFilterCache.class)
+                   .put(AutoFilterCachingPolicy.AGGRESSIVE_CACHING_SETTINGS);
+        } else {
+            builder.put(FilterCacheModule.FilterCacheSettings.FILTER_CACHE_TYPE, NoneFilterCache.class);
+        }
+        return builder.build();
+    }
+
     @Test
     public void simpleValidateQuery() throws Exception {
         createIndex("test");
@@ -79,7 +110,7 @@ public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
 
     private static String filter(String uncachedFilter) {
         String filter = uncachedFilter;
-        if (cluster().hasFilterCache()) {
+        if (hasFilterCache) {
             filter = "cache(" + filter + ")";
         }
         return filter;
@@ -132,14 +163,14 @@ public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
                         FilterBuilders.termFilter("bar", "2"),
                         FilterBuilders.termFilter("baz", "3")
                 )
-        ), equalTo("filtered(filtered(foo:1)->" + filter("bar:[2 TO 2]") + " " + filter("baz:3") + ")->" + typeFilter));
+        ), equalTo("filtered(filtered(foo:1)->" + filter(filter("bar:[2 TO 2]") + " " + filter("baz:3")) + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.filteredQuery(
                 QueryBuilders.termQuery("foo", "1"),
                 FilterBuilders.orFilter(
                         FilterBuilders.termFilter("bar", "2")
                 )
-        ), equalTo("filtered(filtered(foo:1)->" + filter("bar:[2 TO 2]") + ")->" + typeFilter));
+        ), equalTo("filtered(filtered(foo:1)->" + filter(filter("bar:[2 TO 2]")) + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.filteredQuery(
                 QueryBuilders.matchAllQuery(),
@@ -148,28 +179,28 @@ public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
                         .addPoint(30, -80)
                         .addPoint(20, -90)
                         .addPoint(40, -70)    // closing polygon
-        ), equalTo("filtered(ConstantScore(GeoPolygonFilter(pin.location, [[40.0, -70.0], [30.0, -80.0], [20.0, -90.0], [40.0, -70.0]])))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoPolygonFilter(pin.location, [[40.0, -70.0], [30.0, -80.0], [20.0, -90.0], [40.0, -70.0]]))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.geoBoundingBoxFilter("pin.location")
                 .topLeft(40, -80)
                 .bottomRight(20, -70)
-        ), equalTo("filtered(ConstantScore(GeoBoundingBoxFilter(pin.location, [40.0, -80.0], [20.0, -70.0])))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoBoundingBoxFilter(pin.location, [40.0, -80.0], [20.0, -70.0]))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.geoDistanceFilter("pin.location")
                 .lat(10).lon(20).distance(15, DistanceUnit.DEFAULT).geoDistance(GeoDistance.PLANE)
-        ), equalTo("filtered(ConstantScore(GeoDistanceFilter(pin.location, PLANE, 15.0, 10.0, 20.0)))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoDistanceFilter(pin.location, PLANE, 15.0, 10.0, 20.0))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.geoDistanceFilter("pin.location")
                 .lat(10).lon(20).distance(15, DistanceUnit.DEFAULT).geoDistance(GeoDistance.PLANE)
-        ), equalTo("filtered(ConstantScore(GeoDistanceFilter(pin.location, PLANE, 15.0, 10.0, 20.0)))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoDistanceFilter(pin.location, PLANE, 15.0, 10.0, 20.0))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.geoDistanceRangeFilter("pin.location")
                 .lat(10).lon(20).from("15m").to("25m").geoDistance(GeoDistance.PLANE)
-        ), equalTo("filtered(ConstantScore(GeoDistanceRangeFilter(pin.location, PLANE, [15.0 - 25.0], 10.0, 20.0)))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoDistanceRangeFilter(pin.location, PLANE, [15.0 - 25.0], 10.0, 20.0))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.geoDistanceRangeFilter("pin.location")
                 .lat(10).lon(20).from("15miles").to("25miles").geoDistance(GeoDistance.PLANE)
-        ), equalTo("filtered(ConstantScore(GeoDistanceRangeFilter(pin.location, PLANE, [" + DistanceUnit.DEFAULT.convert(15.0, DistanceUnit.MILES) + " - " + DistanceUnit.DEFAULT.convert(25.0, DistanceUnit.MILES) + "], 10.0, 20.0)))->" + typeFilter));
+        ), equalTo("filtered(ConstantScore(" + filter("GeoDistanceRangeFilter(pin.location, PLANE, [" + DistanceUnit.DEFAULT.convert(15.0, DistanceUnit.MILES) + " - " + DistanceUnit.DEFAULT.convert(25.0, DistanceUnit.MILES) + "], 10.0, 20.0))") + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.filteredQuery(
                 QueryBuilders.termQuery("foo", "1"),
@@ -177,13 +208,13 @@ public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
                         FilterBuilders.termFilter("bar", "2"),
                         FilterBuilders.termFilter("baz", "3")
                 )
-        ), equalTo("filtered(filtered(foo:1)->+" + filter("bar:[2 TO 2]") + " +" + filter("baz:3") + ")->" + typeFilter));
+        ), equalTo("filtered(filtered(foo:1)->" + filter("+" + filter("bar:[2 TO 2]") + " +" + filter("baz:3")) + ")->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.termsFilter("foo", "1", "2", "3")),
                 equalTo("filtered(ConstantScore(" + filter("foo:1 foo:2 foo:3") + "))->" + typeFilter));
 
         assertExplanation(QueryBuilders.constantScoreQuery(FilterBuilders.notFilter(FilterBuilders.termFilter("foo", "bar"))),
-                equalTo("filtered(ConstantScore(NotFilter(" + filter("foo:bar") + ")))->" + typeFilter));
+                equalTo("filtered(ConstantScore(" + filter("NotFilter(" + filter("foo:bar") + ")") + "))->" + typeFilter));
 
         assertExplanation(QueryBuilders.filteredQuery(
                 QueryBuilders.termQuery("foo", "1"),
@@ -196,7 +227,7 @@ public class SimpleValidateQueryTests extends ElasticsearchIntegrationTest {
         assertExplanation(QueryBuilders.filteredQuery(
                 QueryBuilders.termQuery("foo", "1"),
                 FilterBuilders.scriptFilter("true")
-        ), equalTo("filtered(filtered(foo:1)->ScriptFilter(true))->" + typeFilter));
+        ), equalTo("filtered(filtered(foo:1)->" + filter("ScriptFilter(true)") + ")->" + typeFilter));
 
     }