Support for Elasticsearch 7.7 (#1192)
parent be6257380e
commit 51169d5efa
docs/changelog.asciidoc (new file, 191 lines)
@@ -0,0 +1,191 @@
[[changelog-client]]
== Changelog

=== 7.7.0

==== Features

===== Support for Elasticsearch `v7.7`.

You can find all the API changes https://www.elastic.co/guide/en/elasticsearch/reference/7.7/release-notes-7.7.0.html[here].
===== Introduced client helpers - https://github.com/elastic/elasticsearch-js/pull/1107[#1107]

From now on, the client comes with a handy collection of helpers to give you a more comfortable experience with some APIs.

CAUTION: The client helpers are experimental, and the API may change in the next minor releases.

The following helpers have been introduced (a minimal usage sketch follows the list):

- `client.helpers.bulk`
- `client.helpers.search`
- `client.helpers.scrollSearch`
- `client.helpers.scrollDocuments`
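A minimal usage sketch (assuming a locally running cluster and an existing `my-index` index, both placeholders for this example):

[source,js]
----
const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })

async function run () {
  // the search helper returns only the documents of the result, not the whole response
  const documents = await client.helpers.search({
    index: 'my-index',
    body: { query: { match_all: {} } }
  })
  console.log(documents)
}

run().catch(console.log)
----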
===== The `ConnectionPool.getConnection` now always returns a `Connection` - https://github.com/elastic/elasticsearch-js/pull/1127[#1127]

What does this mean? It means that you will see fewer `NoLivingConnectionsError` errors, which can now only be caused by a selector or filter that is too strict.
To improve the debugging experience, the `NoLivingConnectionsError` error message has been updated.
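As a sketch of how the error surfaces, assuming a `nodeFilter` that deliberately rejects every node (so no connection can ever be selected) and a placeholder `my-index` index:

[source,js]
----
const { Client, errors } = require('@elastic/elasticsearch')

const client = new Client({
  node: 'http://localhost:9200',
  nodeFilter (node) {
    return false // too strict on purpose: every node is filtered out
  }
})

client.search({
  index: 'my-index',
  body: { query: { match_all: {} } }
}, (err, result) => {
  if (err instanceof errors.NoLivingConnectionsError) {
    console.log(err.message) // the updated, more descriptive message
  }
})
----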
===== Abortable promises - https://github.com/elastic/elasticsearch-js/pull/1141[#1141]

From now on, it will be possible to abort a request generated with the promise-style API. If you abort a request generated from a promise, the promise will be rejected with a `RequestAbortedError`.

[source,js]
----
const promise = client.search({
  body: {
    query: { match_all: {} }
  }
})

promise
  .then(console.log)
  .catch(console.log)

promise.abort()
----
===== Major refactor of the Type Definitions - https://github.com/elastic/elasticsearch-js/pull/1119[#1119] https://github.com/elastic/elasticsearch-js/issues/1130[#1130] https://github.com/elastic/elasticsearch-js/pull/1132[#1132]

Now every API makes better use of generics and overloading, so you can define the request and response bodies via generics if you want to (by default they are `Record<string, any>`).

[source,ts]
----
// request and response bodies are generics
client.search(...)
// response body is `SearchResponse` and request body is generic
client.search<SearchResponse>(...)
// request body is `SearchBody` and response body is `SearchResponse`
client.search<SearchResponse, SearchBody>(...)
----

This *should* not be a breaking change, as every generic defaults to `any`. Some users' code might still break, but our tests did not detect any such case, probably because they were not robust enough. However, given the gigantic improvement in the developer experience, we have decided to release this change in the 7.x line.
==== Fixes

===== The `ConnectionPool.update` method now cleans the `dead` list - https://github.com/elastic/elasticsearch-js/issues/1122[#1122] https://github.com/elastic/elasticsearch-js/pull/1127[#1127]

It could happen that updating the connections list while running a sniff left the `dead` list in a dirty state. Now `ConnectionPool.update` cleans up the `dead` list every time, which makes far more sense given that all the new connections are alive.
===== `ConnectionPool.markDead` should ignore connections that no longer exist - https://github.com/elastic/elasticsearch-js/pull/1159[#1159]

It might happen that `markDead` is called just after a pool update; in that case the client was adding to the `dead` list a node that no longer exists, causing unhandled exceptions later.
===== Do not retry a request if the body is a stream - https://github.com/elastic/elasticsearch-js/pull/1143[#1143]

The client should not retry if it's sending a stream body: to send it again it would need to keep an in-memory copy of the stream, and since it doesn't know the stream's size in advance, it risks using too much memory.
Furthermore, copying the stream every time is a very expensive operation.
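As a sketch of the scenario (the `./document.json` file and `my-index` index are placeholders), a request like the following will not be retried, because the stream body cannot be replayed:

[source,js]
----
const { createReadStream } = require('fs')
const { Client } = require('@elastic/elasticsearch')

const client = new Client({ node: 'http://localhost:9200' })

client.index({
  index: 'my-index',
  // a stream body is sent as-is and cannot be buffered again for a retry
  body: createReadStream('./document.json')
}, (err, result) => {
  if (err) console.log(err)
})
----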
===== Return an error if the request has been aborted - https://github.com/elastic/elasticsearch-js/pull/1141[#1141]

Until now, aborting a request stopped the HTTP request but never called the callback or resolved the promise to notify the user. This was a bug, because it could lead to dangerous memory leaks. From now on, if the user calls the `request.abort()` method, the callback-style API will be called with a `RequestAbortedError`, and the promise will be rejected with a `RequestAbortedError` as well.
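A minimal sketch of the new behavior with the callback-style API (the request is aborted immediately, just for demonstration):

[source,js]
----
const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })

const request = client.search({
  index: 'my-index',
  body: { query: { match_all: {} } }
}, (err, result) => {
  // once aborted, the callback is invoked with a RequestAbortedError
  if (err) console.log(err.name) // 'RequestAbortedError'
})

request.abort()
----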
=== 7.6.1

**Fixes:**

- Secure json parsing - https://github.com/elastic/elasticsearch-js/pull/1110[#1110]
- ApiKey should take precedence over basic auth - https://github.com/elastic/elasticsearch-js/pull/1115[#1115]

**Documentation:**

- Fix typo in api reference - https://github.com/elastic/elasticsearch-js/pull/1109[#1109]

=== 7.6.0

Support for Elasticsearch `v7.6`.

=== 7.5.1

**Fixes:**

- Skip compression in case of empty string body - https://github.com/elastic/elasticsearch-js/pull/1080[#1080]
- Fix typo in NoLivingConnectionsError - https://github.com/elastic/elasticsearch-js/pull/1045[#1045]
- Change TransportRequestOptions.ignore to number[] - https://github.com/elastic/elasticsearch-js/pull/1053[#1053]
- ClientOptions["cloud"] should have optional auth fields - https://github.com/elastic/elasticsearch-js/pull/1032[#1032]

**Documentation:**

- Docs: Return super in example Transport subclass - https://github.com/elastic/elasticsearch-js/pull/980[#980]
- Add examples to reference - https://github.com/elastic/elasticsearch-js/pull/1076[#1076]
- Added new examples - https://github.com/elastic/elasticsearch-js/pull/1031[#1031]

=== 7.5.0

Support for Elasticsearch `v7.5`.

**Features**

- X-Opaque-Id support - https://github.com/elastic/elasticsearch-js/pull/997[#997]

=== 7.4.0

Support for Elasticsearch `v7.4`.

**Fixes:**

- Fix issue; node roles are defaulting to true when undefined is breaking usage of nodeFilter option - https://github.com/elastic/elasticsearch-js/pull/967[#967]

**Documentation:**

- Updated API reference doc - https://github.com/elastic/elasticsearch-js/pull/945[#945] https://github.com/elastic/elasticsearch-js/pull/969[#969]
- Fix inaccurate description sniffEndpoint - https://github.com/elastic/elasticsearch-js/pull/959[#959]

**Internals:**

- Update code generation - https://github.com/elastic/elasticsearch-js/pull/969[#969]

=== 7.3.0

Support for Elasticsearch `v7.3`.

**Features:**

- Added `auth` option - https://github.com/elastic/elasticsearch-js/pull/908[#908]
- Added support for `ApiKey` authentication - https://github.com/elastic/elasticsearch-js/pull/908[#908]

**Fixes:**

- fix(Typings): sniffInterval can also be boolean - https://github.com/elastic/elasticsearch-js/pull/914[#914]

**Internals:**

- Refactored connection pool - https://github.com/elastic/elasticsearch-js/pull/913[#913]

**Documentation:**

- Better reference code examples - https://github.com/elastic/elasticsearch-js/pull/920[#920]
- Improve README - https://github.com/elastic/elasticsearch-js/pull/909[#909]

=== 7.2.0

Support for Elasticsearch `v7.2`

**Fixes:**

- Remove auth data from inspect and toJSON in connection class - https://github.com/elastic/elasticsearch-js/pull/887[#887]

=== 7.1.0

Support for Elasticsearch `v7.1`

**Fixes:**

- Support for non-friendly chars in url username and password - https://github.com/elastic/elasticsearch-js/pull/858[#858]
- Patch deprecated parameters - https://github.com/elastic/elasticsearch-js/pull/851[#851]

=== 7.0.1

**Fixes:**

- Fix TypeScript export *(issue https://github.com/elastic/elasticsearch-js/pull/841[#841])* - https://github.com/elastic/elasticsearch-js/pull/842[#842]
- Fix http and https port handling *(issue https://github.com/elastic/elasticsearch-js/pull/843[#843])* - https://github.com/elastic/elasticsearch-js/pull/845[#845]
- Fix TypeScript definition *(issue https://github.com/elastic/elasticsearch-js/pull/803[#803])* - https://github.com/elastic/elasticsearch-js/pull/846[#846]
- Added toJSON method to Connection class *(issue https://github.com/elastic/elasticsearch-js/pull/848[#848])* - https://github.com/elastic/elasticsearch-js/pull/849[#849]

=== 7.0.0

Support for Elasticsearch `v7.0`

- Stable release.
@@ -4,6 +4,8 @@
The `bulk` API makes it possible to perform many index/delete operations in a
single API call. This can greatly increase the indexing speed.

NOTE: Did you know that we provide a helper for sending bulk requests? You can find it {jsclient}/client-helpers.html[here].

[source,js]
----
'use strict'

@@ -19,6 +19,8 @@ In order to use scrolling, the initial search request should specify the scroll
parameter in the query string, which tells Elasticsearch how long it should keep
the “search context” alive.

NOTE: Did you know that we provide a helper for sending scroll requests? You can find it {jsclient}/client-helpers.html[here].

[source,js]
----
'use strict'
docs/helpers.asciidoc (new file, 302 lines)
@@ -0,0 +1,302 @@
[[client-helpers]]
== Client Helpers

The client comes with a handy collection of helpers to give you a more comfortable experience with some APIs.

CAUTION: The client helpers are experimental, and the API may change in the next minor releases.
If you are using the client with Node.js v8 you should run your code with the `--harmony-async-iteration` argument. +
e.g.: `node --harmony-async-iteration index.js`

=== Bulk Helper
Running Bulk requests can be complex due to the shape of the API; this helper aims to provide a nicer developer experience around the Bulk API.

==== Usage
[source,js]
----
const { createReadStream } = require('fs')
const split = require('split2')
const { Client } = require('@elastic/elasticsearch')

const client = new Client({ node: 'http://localhost:9200' })
const result = await client.helpers.bulk({
  datasource: createReadStream('./dataset.ndjson').pipe(split()),
  onDocument (doc) {
    return {
      index: { _index: 'my-index' }
    }
  }
})

console.log(result)
// {
//   total: number,
//   failed: number,
//   retry: number,
//   successful: number,
//   time: number,
//   bytes: number,
//   aborted: boolean
// }
----
To create a new instance of the Bulk helper, access it as shown in the example above. The configuration options are:
[cols=2*]
|===
|`datasource`
a|An array, async generator or a readable stream with the data you need to index/create/update/delete.
It can be an array of strings or objects, but also a stream of JSON strings or JavaScript objects. +
If it is a stream, we recommend using the https://www.npmjs.com/package/split2[`split2`] package, which will split the stream on newline delimiters. +
This parameter is mandatory.
[source,js]
----
const { createReadStream } = require('fs')
const split = require('split2')
const b = client.helpers.bulk({
  // if you just use split(), the data will be used as an array of strings
  // datasource: createReadStream('./dataset.ndjson').pipe(split()),
  // if you need to manipulate the data, you can pass JSON.parse to split
  datasource: createReadStream('./dataset.ndjson').pipe(split(JSON.parse))
})
----
|`onDocument`
a|A function that will be called for each document of the datasource. Inside this function you can manipulate the document and you must return the operation you want to execute with the document. Look at the link:{ref}/docs-bulk.html[Bulk API documentation] to see the supported operations. +
This parameter is mandatory.
[source,js]
----
const b = client.helpers.bulk({
  onDocument (doc) {
    return {
      index: { _index: 'my-index' }
    }
  }
})
----
|`onDrop`
a|A function that will be called every time a document can't be indexed and has reached the maximum number of retries.
[source,js]
----
const b = client.helpers.bulk({
  onDrop (doc) {
    console.log(doc)
  }
})
----

|`flushBytes`
a|The size of the bulk body, in bytes, to reach before sending it. Defaults to 5MB. +
_Default:_ `5000000`
[source,js]
----
const b = client.helpers.bulk({
  flushBytes: 1000000
})
----

|`concurrency`
a|How many requests will be executed at the same time. +
_Default:_ `5`
[source,js]
----
const b = client.helpers.bulk({
  concurrency: 10
})
----
|`retries`
a|How many times a document will be retried before calling the `onDrop` callback. +
_Default:_ Client max retries.
[source,js]
----
const b = client.helpers.bulk({
  retries: 3
})
----

|`wait`
a|How much time to wait before retrying, in milliseconds. +
_Default:_ `5000`
[source,js]
----
const b = client.helpers.bulk({
  wait: 3000
})
----

|`refreshOnCompletion`
a|If `true`, at the end of the bulk operation it will run a refresh on all indices, or on the specified indices. +
_Default:_ `false`
[source,js]
----
const b = client.helpers.bulk({
  refreshOnCompletion: true
  // or: refreshOnCompletion: 'index-name'
})
----

|===
==== Abort a bulk operation
If needed, you can abort a bulk operation at any time. The bulk helper returns a https://promisesaplus.com/[thenable], which has an `abort` method.

NOTE: The abort method will stop the execution of the bulk operation, but if you are using a concurrency higher than one, the operations that are already running will not be stopped.

[source,js]
----
const { createReadStream } = require('fs')
const split = require('split2')
const { Client } = require('@elastic/elasticsearch')

const client = new Client({ node: 'http://localhost:9200' })
const b = client.helpers.bulk({
  datasource: createReadStream('./dataset.ndjson').pipe(split()),
  onDocument (doc) {
    return {
      index: { _index: 'my-index' }
    }
  },
  onDrop (doc) {
    b.abort()
  }
})

console.log(await b)
----
==== Passing custom options to the Bulk API
You can pass any option supported by the link:{ref}/docs-bulk.html#docs-bulk-api-query-params[Bulk API] to the helper, and the helper will use those options in conjunction with the Bulk API call.

[source,js]
----
const result = await client.helpers.bulk({
  datasource: [...],
  onDocument (doc) {
    return {
      index: { _index: 'my-index' }
    }
  },
  pipeline: 'my-pipeline'
})
----
==== Usage with an async generator

[source,js]
----
const { Client } = require('@elastic/elasticsearch')

async function * generator () {
  const dataset = [
    { user: 'jon', age: 23 },
    { user: 'arya', age: 18 },
    { user: 'tyrion', age: 39 }
  ]
  for (const doc of dataset) {
    yield doc
  }
}

const client = new Client({ node: 'http://localhost:9200' })
const result = await client.helpers.bulk({
  datasource: generator(),
  onDocument (doc) {
    return {
      index: { _index: 'my-index' }
    }
  }
})

console.log(result)
----
=== Search Helper
A simple wrapper around the search API. Instead of returning the entire `result` object, it returns only the documents of the search result.

[source,js]
----
const documents = await client.helpers.search({
  index: 'stackoverflow',
  body: {
    query: {
      match: {
        title: 'javascript'
      }
    }
  }
})

for (const doc of documents) {
  console.log(doc)
}
----
=== Scroll Search Helper
This helper offers a simple and intuitive way to use the scroll search API. Once called, it returns an https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of[async iterator] which can be used in conjunction with a for-await...of loop. +
It automatically handles the `429` error and uses the client's `maxRetries` option.

[source,js]
----
const scrollSearch = await client.helpers.scrollSearch({
  index: 'stackoverflow',
  body: {
    query: {
      match: {
        title: 'javascript'
      }
    }
  }
})

for await (const result of scrollSearch) {
  console.log(result)
}
----
==== Clear a scroll search

If needed, you can clear a scroll search by calling `result.clear()`:

[source,js]
----
for await (const result of scrollSearch) {
  if (condition) {
    await result.clear()
  }
}
----

==== Quickly getting the documents

If you only need the documents from the result of a scroll search, you can access them via `result.documents`:

[source,js]
----
for await (const result of scrollSearch) {
  console.log(result.documents)
}
----
=== Scroll Documents Helper

It works in the same way as the scroll search helper, but it returns only the documents instead. Note that every loop cycle returns a single document, and you can't use the `clear` method.

[source,js]
----
const scrollSearch = await client.helpers.scrollDocuments({
  index: 'stackoverflow',
  body: {
    query: {
      match: {
        title: 'javascript'
      }
    }
  }
})

for await (const doc of scrollSearch) {
  console.log(doc)
}
----
@@ -4,6 +4,7 @@
include::{asciidoc-dir}/../../shared/attributes.asciidoc[]

include::introduction.asciidoc[]
include::changelog.asciidoc[]
include::usage.asciidoc[]
include::configuration.asciidoc[]
include::reference.asciidoc[]
@@ -12,6 +13,7 @@ include::authentication.asciidoc[]
include::observability.asciidoc[]
include::child.asciidoc[]
include::extend.asciidoc[]
include::helpers.asciidoc[]
include::typescript.asciidoc[]
include::testing.asciidoc[]
include::examples/index.asciidoc[]
File diff suppressed because it is too large
@@ -1,4 +1,5 @@
[[typescript]]

== TypeScript support

The client offers a first-class support for TypeScript, since it ships the type
@@ -7,69 +8,36 @@ definitions for every exposed API.

NOTE: If you are using TypeScript you will be required to use _snake_case_ style
to define the API parameters instead of _camelCase_.
-Other than the types for the surface API, the client offers the types for every
-request method, via the `RequestParams`, if you need the types for a search
-request for instance, you can access them via `RequestParams.Search`.
-Every API that supports a body, accepts a
-https://www.typescriptlang.org/docs/handbook/generics.html[generics] which
-represents the type of the request body, if you don't configure anything, it
-will default to `any`.
+By default every API uses https://www.typescriptlang.org/docs/handbook/generics.html[generics] to specify the request and response bodies and the `meta.context`. Currently we can't provide those definitions, but we are working to improve this situation.

-For example:
+You can find a partial definition of the request types by importing `RequestParams`, which is used by default in the client and accepts a body (when needed) as a generic to provide a better specification.
+
+The body defaults to `RequestBody` and `RequestNDBody`, which are defined as follows:

[source,ts]
----
-import { RequestParams } from '@elastic/elasticsearch'
+type RequestBody<T = Record<string, any>> = T | string | Buffer | ReadableStream
+type RequestNDBody<T = Record<string, any>[]> = T | string | string[] | Buffer | ReadableStream
+----

-interface SearchBody {
-  query: {
-    match: { foo: string }
-  }
-}
+You can specify the response and request body in each API as follows:

-const searchParams: RequestParams.Search<SearchBody> = {
+[source,ts]
+----
+const response = await client.search<ResponseBody, RequestBody, Context>({
  index: 'test',
  body: {
    query: {
      match: { foo: 'bar' }
    }
  }
-}
+})

-// This is valid as well
-const searchParams: RequestParams.Search = {
-  index: 'test',
-  body: {
-    query: {
-      match: { foo: 'bar' }
-    }
-  }
-}
+console.log(response.body)
----

-You can find the type definiton of a response in `ApiResponse`, which accepts a
-generics as well if you want to specify the body type, otherwise it defaults to
-`any`.
+You don't have to specify all the generics, but the order must be respected.

-[source,ts]
----
-interface SearchResponse<T> {
-  hits: {
-    hits: Array<{
-      _source: T;
-    }>
-  }
-}
-
-// Define the interface of the source object
-interface Source {
-  foo: string
-}
-
-client.search(searchParams)
-  .then((response: ApiResponse<SearchResponse<Source>>) => console.log(response))
-  .catch((err: Error) => {})
----

=== A complete example
@@ -137,19 +105,43 @@ interface Source {
  foo: string
}

-async function run (): Promise<void> {
-  // Define the search parameters
-  const searchParams: RequestParams.Search<SearchBody> = {
+async function run () {
+  // All of the examples below are valid code, by default,
+  // the request body will be `RequestBody` and response will be `Record<string, any>`.
+  const response = await client.search({
    index: 'test',
    body: {
      query: {
        match: { foo: 'bar' }
      }
    }
-  }
+  })
+  // body here is `ResponseBody`
+  console.log(response.body)

-  // Craft the final type definition
-  const response: ApiResponse<SearchResponse<Source>> = await client.search(searchParams)
+  // The first generic is the response body
+  const response = await client.search<SearchResponse<Source>>({
+    index: 'test',
+    // Here the body must follow the `RequestBody` interface
+    body: {
+      query: {
+        match: { foo: 'bar' }
+      }
+    }
+  })
+  // body here is `SearchResponse<Source>`
+  console.log(response.body)
+
+  const response = await client.search<SearchResponse<Source>, SearchBody>({
+    index: 'test',
+    // Here the body must follow the `SearchBody` interface
+    body: {
+      query: {
+        match: { foo: 'bar' }
+      }
+    }
+  })
+  // body here is `SearchResponse<Source>`
+  console.log(response.body)
}
@@ -67,45 +67,34 @@ client.search({

=== Aborting a request

-When using the callback style API, the function also returns an object that
-allows you to abort the API request.
+If needed, you can abort a running request by calling the `request.abort()` method returned by the API.
+
+CAUTION: If you abort a request, the request will fail with a `RequestAbortedError`.

[source,js]
----
+// callback API
const request = client.search({
  index: 'my-index',
  body: { foo: 'bar' }
}, {
  ignore: [404],
  maxRetries: 3
-}, (err, { body }) => {
-  if (err) console.log(err)
+}, (err, result) => {
+  if (err) {
+    console.log(err) // RequestAbortedError
+  } else {
+    console.log(result)
+  }
})

request.abort()
----

-Aborting a request with the promise style API is not supported, but you can
-achieve that with convenience wrapper.
+The same behavior is valid for the promise style API as well.

[source,js]
----
-function abortableRequest (params, options) {
-  var request = null
-  const promise = new Promise((resolve, reject) => {
-    request = client.search(params, options, (err, result) => {
-      err ? reject(err) : resolve(res)
-    })
-  })
-  return {
-    promise,
-    abort: () => request.abort()
-  }
-}
-
-const request = abortableRequest({
+const request = client.search({
  index: 'my-index',
  body: { foo: 'bar' }
}, {
@@ -113,8 +102,11 @@ const request = abortableRequest({
  maxRetries: 3
})

+request
+  .then(result => console.log(result))
+  .catch(err => console.log(err)) // RequestAbortedError
+
request.abort()
-// access the promise with `request.promise.[method]`
----
@@ -204,7 +196,7 @@ You can find the errors exported by the client in the table below.

[cols=2*]
|===
-|`ElasticsearchClientErrors`
+|`ElasticsearchClientError`
|Every error inherits from this class, it is the basic error generated by the client.

|`TimeoutError`
@@ -213,8 +205,11 @@ You can find the errors exported by the client in the table below.
|`ConnectionError`
|Generated when an error occurs during the request, it can be a connection error or a malformed stream of data.

+|`RequestAbortedError`
+|Generated if the user calls the `request.abort()` method.
+
|`NoLivingConnectionsError`
-|Generated in case of all connections present in the connection pool are dead.
+|Given the configuration, the ConnectionPool was not able to find a usable Connection for this request.

|`SerializationError`
|Generated if the serialization fails.