| Arg | Description | Type |
|-----|-------------|------|
| query | Source for rows to upload (required). | StoredQuery |
| threads | How many threads to use. | int64 |
| index | The name of the index to upload to. If not specified, ensure a column is named `_index`. | string |
| type | The type of the index to upload to. | string |
| chunk_size | The number of rows to send at a time. | int64 |
| addresses | A list of Elasticsearch nodes to use. | list of string |
| username | Username for HTTP Basic Authentication. | string |
| password | Password for HTTP Basic Authentication. | string |
| cloud_id | Endpoint for the Elastic Service (https://elastic.co/cloud). | string |
| api_key | Base64-encoded token for authorization; if set, overrides username and password. | string |
| wait_time | Batch uploads to Elastic over this interval (default 2 sec). | int64 |
| pipeline | Pipeline for uploads. | string |
| disable_ssl_security | Disable SSL certificate verification (deprecated in favor of skip_verify). | bool |
| skip_verify | Disable SSL certificate verification. | bool |
| root_ca | As a better alternative to disable_ssl_security, root CA certificates can be added here. | string |
| max_memory_buffer | How large the memory buffer is allowed to grow while trying to contact the Elastic server (default 100mb). | uint64 |
| action | Either index or create. For data streams this must be create. | string |
| secret | Alternatively use a secret from the secrets service. Secret must be of type 'AWS S3 Creds'. | string |

Required Permissions: COLLECT_SERVER


Upload rows to Elastic.

This uses the Elastic bulk upload API to push arbitrary rows to Elastic. The query specified in the query parameter will be run, and each row it emits will be uploaded as a separate event to Elastic.
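For example, a minimal invocation might look like the following sketch (the addresses, index name, and the inner query are illustrative, not prescribed by this page):

```vql
SELECT * FROM elastic_upload(
    query={ SELECT Name, Pid FROM pslist() },
    addresses=["http://127.0.0.1:9200"],
    index="velociraptor",
    action="index")
```

Each row produced by the inner query becomes one event in the target index.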

You can either specify the Elastic index explicitly using the index parameter, or provide an `_index` column in the query itself to send each row to a different index.
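For instance, to route rows to per-host indices, the query can compute an `_index` column itself. This sketch assumes a `Hostname` column exists in the source rows; the naming scheme is illustrative:

```vql
SELECT *, format(format="logs-%v", args=[Hostname]) AS _index
FROM source()
```

Because `_index` is present in each row, the index parameter of elastic_upload can be omitted.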