Commit
Merge branch 'opensearch-project:main' into tokenManager
stephen-crawford authored May 15, 2023
2 parents 6cd0353 + 9e6e041 commit 53f58a3
Showing 169 changed files with 3,330 additions and 585 deletions.
17 changes: 15 additions & 2 deletions CHANGELOG.md
@@ -10,6 +10,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Allow mmap to use new JDK-19 preview APIs in Apache Lucene 9.4+ ([#5151](https://github.com/opensearch-project/OpenSearch/pull/5151))
- Add events correlation engine plugin ([#6854](https://github.com/opensearch-project/OpenSearch/issues/6854))
- Add connectToNodeAsExtension in TransportService ([#6866](https://github.com/opensearch-project/OpenSearch/pull/6866))
- Adds ExtensionsManager.lookupExtensionSettingsById ([#7466](https://github.com/opensearch-project/OpenSearch/pull/7466))

### Dependencies
- Bump `log4j-core` from 2.18.0 to 2.19.0
@@ -39,7 +40,6 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `org.apache.commons:commons-compress` from 1.22 to 1.23.0
- Bump `org.apache.commons:commons-configuration2` from 2.8.0 to 2.9.0
- Bump `com.netflix.nebula:nebula-publishing-plugin` from 19.2.0 to 20.3.0
- Bump `org.apache.commons:commons-compress` from 1.22 to 1.23.0
- Bump `com.diffplug.spotless` from 6.17.0 to 6.18.0
- Bump `io.opencensus:opencensus-api` from 0.18.0 to 0.31.1 ([#7291](https://github.com/opensearch-project/OpenSearch/pull/7291))

@@ -74,6 +74,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Fix compression support for h2c protocol ([#4944](https://github.com/opensearch-project/OpenSearch/pull/4944))
- Support OpenSSL Provider with default Netty allocator ([#5460](https://github.com/opensearch-project/OpenSearch/pull/5460))
- Replaces ZipInputStream with ZipFile to fix Zip Slip vulnerability ([#7230](https://github.com/opensearch-project/OpenSearch/pull/7230))
- Add missing validation/parsing of SearchBackpressureMode of SearchBackpressureSettings ([#7541](https://github.com/opensearch-project/OpenSearch/pull/7541))

### Security

@@ -82,9 +83,12 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- [Extensions] Moving Extensions APIs to support cross versions via protobuf. ([#7402](https://github.com/opensearch-project/OpenSearch/issues/7402))
- [Extensions] Add IdentityPlugin into core to support Extension identities ([#7246](https://github.com/opensearch-project/OpenSearch/pull/7246))
- Add connectToNodeAsExtension in TransportService ([#6866](https://github.com/opensearch-project/OpenSearch/pull/6866))
- [Search Pipelines] Accept pipelines defined in search source ([#7253](https://github.com/opensearch-project/OpenSearch/pull/7253))
- Add descending order search optimization through reverse segment read. ([#7244](https://github.com/opensearch-project/OpenSearch/pull/7244))
- Add 'unsigned_long' numeric field type ([#6237](https://github.com/opensearch-project/OpenSearch/pull/6237))
- Add back primary shard preference for queries ([#7375](https://github.com/opensearch-project/OpenSearch/pull/7375))
- Add descending order search optimization through reverse segment read. ([#7244](https://github.com/opensearch-project/OpenSearch/pull/7244))
- Adds ExtensionsManager.lookupExtensionSettingsById ([#7466](https://github.com/opensearch-project/OpenSearch/pull/7466))

### Dependencies
- Bump `com.netflix.nebula:gradle-info-plugin` from 12.0.0 to 12.1.0
@@ -98,19 +102,28 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
- Bump `commons-io:commons-io` from 2.7 to 2.11.0
- Bump `org.apache.shiro:shiro-core` from 1.9.1 to 1.11.0 ([#7397](https://github.com/opensearch-project/OpenSearch/pull/7397))
- Bump `jetty-server` in hdfs-fixture from 9.4.49.v20220914 to 9.4.51.v20230217 ([#7405](https://github.com/opensearch-project/OpenSearch/pull/7405))
- OpenJDK Update (April 2023 Patch releases) ([#7448](https://github.com/opensearch-project/OpenSearch/pull/7448))
- Bump `org.apache.commons:commons-compress` from 1.22 to 1.23.0 (#7462)
- Bump `com.azure:azure-core` from 1.34.0 to 1.39.0
- Bump `com.networknt:json-schema-validator` from 1.0.78 to 1.0.81 (#7460)
- Bump Apache Lucene to 9.6.0 ([#7505](https://github.com/opensearch-project/OpenSearch/pull/7505))
- Bump `com.google.cloud:google-cloud-core-http` from 1.93.3 to 2.17.0 (#7488)
- Bump `com.google.guava:guava` from 30.1.1-jre to 31.1-jre (#7565)

### Changed
- Enable `./gradlew build` on MacOS by disabling bcw tests ([#7303](https://github.com/opensearch-project/OpenSearch/pull/7303))
- Moved concurrent-search from sandbox plugin to server module behind feature flag ([#7203](https://github.com/opensearch-project/OpenSearch/pull/7203))
- Allow access to indices cache clear APIs for read only indexes ([#7303](https://github.com/opensearch-project/OpenSearch/pull/7303))

### Deprecated

### Removed

### Fixed
- Replaces ZipInputStream with ZipFile to fix Zip Slip vulnerability ([#7230](https://github.com/opensearch-project/OpenSearch/pull/7230))
- Add missing validation/parsing of SearchBackpressureMode of SearchBackpressureSettings ([#7541](https://github.com/opensearch-project/OpenSearch/pull/7541))

### Security

[Unreleased 3.0]: https://github.com/opensearch-project/OpenSearch/compare/2.x...HEAD
[Unreleased 2.x]: https://github.com/opensearch-project/OpenSearch/compare/2.7...2.x
9 changes: 5 additions & 4 deletions DEVELOPER_GUIDE.md
@@ -614,8 +614,9 @@ Pass a list of files or directories to limit your search.

### Lucene Snapshots

The Github workflow in [lucene-snapshots.yml](.github/workflows/lucene-snapshots.yml) is a Github worfklow executable by maintainers to build a top-down snapshot build of lucene.
The Github workflow in [lucene-snapshots.yml](.github/workflows/lucene-snapshots.yml) is a GitHub workflow executable by maintainers to build a top-down snapshot build of Lucene.
These snapshots are available to test compatibility with upcoming changes to Lucene by updating the version at [version.properties](buildsrc/version.properties) with the `version-snapshot-sha` version. Example: `lucene = 10.0.0-snapshot-2e941fc`.
Note that these snapshots do not follow the Maven [naming convention](https://maven.apache.org/guides/getting-started/index.html#what-is-a-snapshot-version) with a (case sensitive) SNAPSHOT suffix, so these artifacts are considered "releases" by build systems such as the `mavenContent` repository filter in Gradle or `releases` artifact policies in Maven.

### Flaky Tests

@@ -626,6 +627,6 @@ If you encounter a build/test failure in CI that is unrelated to the change in y
1. Follow failed CI links, and locate the failing test(s).
2. Copy-paste the failure into a comment of your PR.
3. Search through [issues](https://github.com/opensearch-project/OpenSearch/issues?q=is%3Aopen+is%3Aissue+label%3A%22flaky-test%22) using the name of the failed test for whether this is a known flaky test.
5. If an existing issue is found, paste a link to the known issue in a comment to your PR.
6. If no existing issue is found, open one.
7. Retry CI via the GitHub UX or by pushing an update to your PR.
4. If an existing issue is found, paste a link to the known issue in a comment to your PR.
5. If no existing issue is found, open one.
6. Retry CI via the GitHub UX or by pushing an update to your PR.
2 changes: 1 addition & 1 deletion buildSrc/build.gradle
@@ -118,7 +118,7 @@ dependencies {
api 'com.avast.gradle:gradle-docker-compose-plugin:0.16.12'
api "org.yaml:snakeyaml:${props.getProperty('snakeyaml')}"
api 'org.apache.maven:maven-model:3.9.1'
api 'com.networknt:json-schema-validator:1.0.78'
api 'com.networknt:json-schema-validator:1.0.81'
api "com.fasterxml.jackson.core:jackson-databind:${props.getProperty('jackson_databind')}"

testFixturesApi "junit:junit:${props.getProperty('junit')}"
2 changes: 1 addition & 1 deletion buildSrc/version.properties
@@ -1,5 +1,5 @@
opensearch = 3.0.0
lucene = 9.6.0-snapshot-a3ae27f
lucene = 9.6.0

bundled_jdk_vendor = adoptium
bundled_jdk = 19.0.2+7
5 changes: 4 additions & 1 deletion libs/common/src/main/java/org/opensearch/common/Numbers.java
@@ -140,6 +140,9 @@ public static BigInteger toUnsignedLongExact(Number value) {
private static BigDecimal BIGDECIMAL_GREATER_THAN_LONG_MAX_VALUE = BigDecimal.valueOf(Long.MAX_VALUE).add(BigDecimal.ONE);
private static BigDecimal BIGDECIMAL_LESS_THAN_LONG_MIN_VALUE = BigDecimal.valueOf(Long.MIN_VALUE).subtract(BigDecimal.ONE);
private static BigDecimal BIGDECIMAL_GREATER_THAN_USIGNED_LONG_MAX_VALUE = new BigDecimal(MAX_UNSIGNED_LONG_VALUE).add(BigDecimal.ONE);
private static BigDecimal BIGDECIMAL_LESS_THAN_USIGNED_LONG_MIN_VALUE = new BigDecimal(MIN_UNSIGNED_LONG_VALUE).subtract(
BigDecimal.ONE
);

/** Return the long that {@code stringValue} stores or throws an exception if the
* stored value cannot be converted to a long that stores the exact same
@@ -180,7 +183,7 @@ public static BigInteger toUnsignedLong(String stringValue, boolean coerce) {
try {
BigDecimal bigDecimalValue = new BigDecimal(stringValue);
if (bigDecimalValue.compareTo(BIGDECIMAL_GREATER_THAN_USIGNED_LONG_MAX_VALUE) >= 0
|| bigDecimalValue.compareTo(BigDecimal.ZERO) <= 0) {
|| bigDecimalValue.compareTo(BIGDECIMAL_LESS_THAN_USIGNED_LONG_MIN_VALUE) <= 0) {
throw new IllegalArgumentException("Value [" + stringValue + "] is out of range for an unsigned long");
}
bigIntegerValue = coerce ? bigDecimalValue.toBigInteger() : bigDecimalValue.toBigIntegerExact();
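The hunk above fixes the lower bound of the range check: the old comparison against BigDecimal.ZERO rejected any value <= 0, which wrongly excluded 0 itself, the smallest legal unsigned long. Comparing against MIN − 1 (that is, −1) keeps 0 in range while still rejecting negatives. A standalone sketch of the fixed check follows; the constants only mirror the ones in the diff, and this is not the OpenSearch class itself:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

public final class UnsignedLongRangeCheckDemo {
    // 2^64 - 1, the largest unsigned long; 0 is the smallest.
    private static final BigInteger MAX_UNSIGNED_LONG_VALUE = BigInteger.ONE.shiftLeft(64).subtract(BigInteger.ONE);
    // One step past each end of the valid range, mirroring the constants in the diff.
    private static final BigDecimal GREATER_THAN_MAX = new BigDecimal(MAX_UNSIGNED_LONG_VALUE).add(BigDecimal.ONE);
    private static final BigDecimal LESS_THAN_MIN = BigDecimal.ZERO.subtract(BigDecimal.ONE);

    static BigInteger toUnsignedLong(String stringValue, boolean coerce) {
        BigDecimal value = new BigDecimal(stringValue);
        // Fixed check: reject only values >= 2^64 or <= -1, so "0" now passes.
        if (value.compareTo(GREATER_THAN_MAX) >= 0 || value.compareTo(LESS_THAN_MIN) <= 0) {
            throw new IllegalArgumentException("Value [" + stringValue + "] is out of range for an unsigned long");
        }
        return coerce ? value.toBigInteger() : value.toBigIntegerExact();
    }

    public static void main(String[] args) {
        System.out.println(toUnsignedLong("0", false));                    // accepted after the fix
        System.out.println(toUnsignedLong("18446744073709551615", false)); // 2^64 - 1, accepted
        try {
            toUnsignedLong("-1", false);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());                            // rejected at the lower bound
        }
    }
}
```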

This file was deleted.

1 change: 1 addition & 0 deletions libs/core/licenses/lucene-core-9.6.0.jar.sha1
@@ -0,0 +1 @@
2c08c7a491e9d033bb4806e0a45496e3a0667217
2 changes: 1 addition & 1 deletion libs/core/src/main/java/org/opensearch/Version.java
Expand Up @@ -87,7 +87,7 @@ public class Version implements Comparable<Version>, ToXContentFragment {
public static final Version V_2_6_1 = new Version(2060199, org.apache.lucene.util.Version.LUCENE_9_5_0);
public static final Version V_2_7_0 = new Version(2070099, org.apache.lucene.util.Version.LUCENE_9_5_0);
public static final Version V_2_7_1 = new Version(2070199, org.apache.lucene.util.Version.LUCENE_9_5_0);
public static final Version V_2_8_0 = new Version(2080099, org.apache.lucene.util.Version.LUCENE_9_5_0);
public static final Version V_2_8_0 = new Version(2080099, org.apache.lucene.util.Version.LUCENE_9_6_0);
public static final Version V_3_0_0 = new Version(3000099, org.apache.lucene.util.Version.LUCENE_9_6_0);
public static final Version CURRENT = V_3_0_0;

Expand Down
19 changes: 19 additions & 0 deletions libs/core/src/main/java/org/opensearch/core/common/io/stream/BaseStreamInput.java
@@ -0,0 +1,19 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/
package org.opensearch.core.common.io.stream;

import java.io.InputStream;

/**
* Foundation class for reading core types off the transport stream
*
* todo: refactor {@code StreamInput} primitive readers to this class
*
* @opensearch.internal
*/
public abstract class BaseStreamInput extends InputStream {}
19 changes: 19 additions & 0 deletions libs/core/src/main/java/org/opensearch/core/common/io/stream/BaseStreamOutput.java
@@ -0,0 +1,19 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/
package org.opensearch.core.common.io.stream;

import java.io.OutputStream;

/**
* Foundation class for writing core types over the transport stream
*
* todo: refactor {@code StreamOutput} primitive writers to this class
*
* @opensearch.internal
*/
public abstract class BaseStreamOutput extends OutputStream {}
130 changes: 130 additions & 0 deletions libs/core/src/main/java/org/opensearch/core/common/io/stream/BaseWriteable.java
@@ -0,0 +1,130 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/
package org.opensearch.core.common.io.stream;

import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
* Implementers can be written to a {@code StreamOutput} and read from a {@code StreamInput}. This allows them to be "thrown
* across the wire" using OpenSearch's internal protocol. If the implementer also implements equals and hashCode then a copy made by
* serializing and deserializing must be equal and have the same hashCode. It isn't required that such a copy be entirely unchanged.
*
* @opensearch.internal
*/
public interface BaseWriteable<S extends BaseStreamOutput> {
/**
* A WriteableRegistry registers {@link Writer} methods for writing data types over a
* {@link BaseStreamOutput} channel and {@link Reader} methods for reading data from a
* {@link BaseStreamInput} channel.
*
* @opensearch.internal
*/
class WriteableRegistry {
private static final Map<Class<?>, Writer<? extends BaseStreamOutput, ?>> WRITER_REGISTRY = new ConcurrentHashMap<>();
private static final Map<Byte, Reader<? extends BaseStreamInput, ?>> READER_REGISTRY = new ConcurrentHashMap<>();

/**
* registers a streamable writer
*
* @opensearch.internal
*/
public static <W extends Writer<? extends BaseStreamOutput, ?>> void registerWriter(final Class<?> clazz, final W writer) {
if (WRITER_REGISTRY.containsKey(clazz)) {
throw new IllegalArgumentException("Streamable writer already registered for type [" + clazz.getName() + "]");
}
WRITER_REGISTRY.put(clazz, writer);
}

/**
* registers a streamable reader
*
* @opensearch.internal
*/
public static <R extends Reader<? extends BaseStreamInput, ?>> void registerReader(final byte ordinal, final R reader) {
if (READER_REGISTRY.containsKey(ordinal)) {
throw new IllegalArgumentException("Streamable reader already registered for ordinal [" + (int) ordinal + "]");
}
READER_REGISTRY.put(ordinal, reader);
}

/**
* Returns the registered writer keyed by the class type
*/
@SuppressWarnings("unchecked")
public static <W extends Writer<? extends BaseStreamOutput, ?>> W getWriter(final Class<?> clazz) {
return (W) WRITER_REGISTRY.get(clazz);
}

/**
* Returns the registered reader keyed by the unique ordinal
*/
@SuppressWarnings("unchecked")
public static <R extends Reader<? extends BaseStreamInput, ?>> R getReader(final byte b) {
return (R) READER_REGISTRY.get(b);
}
}

/**
* Write this into the {@linkplain BaseStreamOutput}.
*/
void writeTo(final S out) throws IOException;

/**
* Reference to a method that can write some object to a {@link BaseStreamOutput}.
* <p>
* By convention this is a method from {@link BaseStreamOutput} itself (e.g., {@code StreamOutput#writeString}). If the value can be
* {@code null}, then the "optional" variant of methods should be used!
* <p>
* Most classes should implement {@code Writeable} and the {@code Writeable#writeTo(BaseStreamOutput)} method should <em>use</em>
* {@link BaseStreamOutput} methods directly or this indirectly:
* <pre><code>
* public void writeTo(StreamOutput out) throws IOException {
* out.writeVInt(someValue);
* out.writeMapOfLists(someMap, StreamOutput::writeString, StreamOutput::writeString);
* }
* </code></pre>
*/
@FunctionalInterface
interface Writer<S extends BaseStreamOutput, V> {

/**
* Write {@code V}-type {@code value} to the {@code out}put stream.
*
* @param out Output to write the {@code value} to
* @param value The value to add
*/
void write(final S out, V value) throws IOException;
}

/**
* Reference to a method that can read some object from a stream. By convention this is a constructor that takes
* {@linkplain BaseStreamInput} as an argument for most classes and a static method for things like enums. Returning null from one of these
* is always wrong - for that we use methods like {@code StreamInput#readOptionalWriteable(Reader)}.
* <p>
* As most classes will implement this via a constructor (or a static method in the case of enumerations), it's something that should
* look like:
* <pre><code>
* public MyClass(final StreamInput in) throws IOException {
* this.someValue = in.readVInt();
* this.someMap = in.readMapOfLists(StreamInput::readString, StreamInput::readString);
* }
* </code></pre>
*/
@FunctionalInterface
interface Reader<S extends BaseStreamInput, V> {

/**
* Read {@code V}-type value from a stream.
*
* @param in Input to read the value from
*/
V read(final S in) throws IOException;
}
}
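The WriteableRegistry above keys writers by class and readers by a byte ordinal, and refuses duplicate registrations. A minimal usage sketch, assuming the three new classes in this commit are on the classpath; the String handlers, the ordinal value, and the single-byte length prefix are illustrative only, not part of the commit:

```java
import java.nio.charset.StandardCharsets;

import org.opensearch.core.common.io.stream.BaseStreamInput;
import org.opensearch.core.common.io.stream.BaseStreamOutput;
import org.opensearch.core.common.io.stream.BaseWriteable;
import org.opensearch.core.common.io.stream.BaseWriteable.WriteableRegistry;

public final class WriteableRegistryExample {
    // Hypothetical ordinal; callers are expected to manage their own ordinal scheme.
    private static final byte STRING_ORDINAL = 1;

    public static void registerStringHandlers() {
        // Write side, keyed by class. BaseStreamOutput only exposes OutputStream
        // methods so far, so this writer emits raw UTF-8 bytes behind a toy
        // single-byte length prefix (strings under 128 bytes only).
        BaseWriteable.Writer<BaseStreamOutput, String> writer = (out, value) -> {
            byte[] bytes = value.getBytes(StandardCharsets.UTF_8);
            out.write(bytes.length);
            out.write(bytes);
        };
        WriteableRegistry.registerWriter(String.class, writer);

        // Read side, keyed by ordinal. A second registration for the same
        // ordinal would throw IllegalArgumentException.
        BaseWriteable.Reader<BaseStreamInput, String> reader = in -> {
            byte[] bytes = new byte[in.read()];
            int n = in.read(bytes);
            return new String(bytes, 0, Math.max(n, 0), StandardCharsets.UTF_8);
        };
        WriteableRegistry.registerReader(STRING_ORDINAL, reader);
    }
}
```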
9 changes: 9 additions & 0 deletions libs/core/src/main/java/org/opensearch/core/common/io/stream/package-info.java
@@ -0,0 +1,9 @@
/*
* SPDX-License-Identifier: Apache-2.0
*
* The OpenSearch Contributors require contributions made to
* this file be licensed under the Apache-2.0 license or a
* compatible open source license.
*/
/** Core transport stream classes */
package org.opensearch.core.common.io.stream;
@@ -33,7 +33,6 @@

import org.opensearch.common.io.stream.StreamInput;
import org.opensearch.common.io.stream.StreamOutput;
import org.opensearch.common.io.stream.Writeable;
import org.opensearch.common.util.LongObjectPagedHashMap;
import org.opensearch.core.xcontent.XContentBuilder;
import org.opensearch.search.aggregations.InternalAggregation;
@@ -69,7 +68,7 @@ protected BaseGeoGrid(String name, int requiredSize, List<BaseGeoGridBucket> buc
this.buckets = buckets;
}

protected abstract Writeable.Reader<B> getBucketReader();
protected abstract Reader<B> getBucketReader();

/**
* Read from a stream.
@@ -76,7 +76,7 @@ protected InternalGeoHashGridBucket createBucket(long hashAsLong, long docCount,
}

@Override
protected Reader getBucketReader() {
protected Reader<InternalGeoHashGridBucket> getBucketReader() {
return InternalGeoHashGridBucket::new;
}

@@ -76,7 +76,7 @@ protected InternalGeoTileGridBucket createBucket(long hashAsLong, long docCount,
}

@Override
protected Reader getBucketReader() {
protected Reader<InternalGeoTileGridBucket> getBucketReader() {
return InternalGeoTileGridBucket::new;
}


This file was deleted.

@@ -0,0 +1 @@
f2b28bb17fa6a1415233b1db98bd6fd371afc9b3
@@ -618,7 +618,7 @@ public SpanQuery spanPrefixQuery(String value, SpanMultiTermQueryWrapper.SpanRew
private final PrefixFieldMapper prefixField;
private final ShingleFieldMapper[] shingleFields;

private final Builder builder;
private final IndexAnalyzers indexAnalyzers;

public SearchAsYouTypeFieldMapper(
String simpleName,
@@ -636,7 +636,7 @@ public SearchAsYouTypeFieldMapper(
this.store = builder.store.getValue();
this.indexOptions = builder.indexOptions.getValue();
this.termVectors = builder.termVectors.getValue();
this.builder = builder;
this.indexAnalyzers = builder.analyzers.indexAnalyzers;
}

@Override
@@ -663,7 +663,7 @@ protected String contentType() {

@Override
public ParametrizedFieldMapper.Builder getMergeBuilder() {
return new Builder(simpleName(), builder.analyzers.indexAnalyzers).init(this);
return new Builder(simpleName(), this.indexAnalyzers).init(this);
}

public static String getShingleFieldName(String parentField, int shingleSize) {
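The mapper previously retained its entire Builder only so getMergeBuilder() could reach builder.analyzers.indexAnalyzers; storing just the IndexAnalyzers narrows what the live mapper keeps reachable. A generic sketch of that pattern, using hypothetical Widget/Builder names rather than OpenSearch types:

```java
// Hypothetical Widget/Builder pair illustrating the refactor: copy out the one
// field the built object needs instead of retaining the whole builder.
public final class Widget {
    // Before: private final Builder builder;   // keeps every builder field reachable
    private final String label;                 // after: only what later calls use

    private Widget(Builder builder) {
        this.label = builder.label;
    }

    // Equivalent of getMergeBuilder(): rebuild a builder from the retained field.
    public Builder toBuilder() {
        return new Builder(label);
    }

    public static final class Builder {
        private final String label;

        public Builder(String label) {
            this.label = label;
        }

        public Widget build() {
            return new Widget(this);
        }
    }
}
```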