[DOCS] Update ES ingest pipeline refs #28239

Merged Oct 5, 2021 (10 commits)
Changes from 2 commits
2 changes: 1 addition & 1 deletion auditbeat/docs/fields.asciidoc
@@ -2358,7 +2358,7 @@ alias to: error.message
[float]
=== geoip

-The geoip fields are defined as a convenience in case you decide to enrich the data using a geoip filter in Logstash or Ingest Node.
+The geoip fields are defined as a convenience in case you decide to enrich the data using a geoip filter in Logstash or Elasticsearch geoip ingest processor.



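For context, this is what the Elasticsearch side of that enrichment looks like: a pipeline using the geoip processor. The pipeline body below is an illustrative sketch, not part of this PR; `source.ip` and `source.geo` are assumed field names.

```json
{
  "description": "Illustrative GeoIP enrichment pipeline (not part of this PR)",
  "processors": [
    {
      "geoip": {
        "field": "source.ip",
        "target_field": "source.geo",
        "ignore_missing": true
      }
    }
  ]
}
```

Such a pipeline would be registered with `PUT _ingest/pipeline/<name>` and referenced from the Beat's input or output `pipeline` setting.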
3 changes: 2 additions & 1 deletion auditbeat/module/auditd/_meta/fields.yml
@@ -858,7 +858,8 @@
type: group
description: >
The geoip fields are defined as a convenience in case you decide to
-      enrich the data using a geoip filter in Logstash or Ingest Node.
+      enrich the data using a geoip filter in Logstash or Elasticsearch geoip
+      ingest processor.
fields:
- name: continent_name
type: keyword
2 changes: 1 addition & 1 deletion auditbeat/module/auditd/fields.go

Some generated files are not rendered by default.

13 changes: 6 additions & 7 deletions docs/devguide/modules-dev-guide.asciidoc
@@ -304,10 +304,9 @@ variables to dynamically switch between configurations.
[float]
==== ingest/*.json

-The `ingest/` folder contains Elasticsearch
-{ref}/ingest.html[Ingest Node] pipeline configurations. The Ingest
-Node pipelines are responsible for parsing the log lines and doing other
-manipulations on the data.
+The `ingest/` folder contains {es} {ref}/ingest.html[ingest pipeline]
+configurations. Ingest pipelines are responsible for parsing the log lines and
+doing other manipulations on the data.

The files in this folder are JSON or YAML documents representing
{ref}/pipeline.html[pipeline definitions]. Just like with the `config/`
@@ -344,10 +343,10 @@ on_failure:
----

From here, you would typically add processors to the `processors` array to do
-the actual parsing. For details on how to use ingest node processors, see the
-{ref}/ingest-processors.html[ingest node documentation]. In
+the actual parsing. For information about available ingest processors, see the
+{ref}/processors.html[processor reference documentation]. In
particular, you will likely find the
-{ref}/grok-processor.html[Grok processor] to be useful for parsing.
+{ref}/grok-processor.html[grok processor] to be useful for parsing.
Here is an example for parsing the Nginx access logs.

[source,json]
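The Nginx example itself is truncated in this view. As a rough sketch of the shape such a grok-based pipeline takes, the fragment below parses a simplified access-log line; the pattern and target field names are illustrative, not the actual Filebeat Nginx pipeline.

```json
{
  "description": "Illustrative access-log parsing pipeline (not the shipped Nginx pipeline)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{IPORHOST:source.address} - %{DATA:user.name} \\[%{HTTPDATE:event.created}\\] \"%{WORD:http.request.method} %{DATA:url.original} HTTP/%{NUMBER:http.version}\" %{NUMBER:http.response.status_code:long} %{NUMBER:http.response.body.bytes:long}"
        ]
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}
```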
2 changes: 1 addition & 1 deletion filebeat/docs/filebeat-modules-options.asciidoc
@@ -13,7 +13,7 @@ a log type that isn't supported, or you want to use a different setup.

{beatname_uc} <<{beatname_lc}-modules,modules>> provide a quick way to
get started processing common log formats. They contain default configurations,
-{es} ingest node pipeline definitions, and {kib} dashboards to help you
+{es} ingest pipeline definitions, and {kib} dashboards to help you
implement and deploy a log monitoring solution.

You can configure modules in the `modules.d` directory (recommended), or in the
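A typical `modules.d` configuration looks like the fragment below — a sketch for the Nginx module, where the log path is an assumption for illustration:

```yaml
# modules.d/nginx.yml (enable with: filebeat modules enable nginx)
- module: nginx
  access:
    enabled: true
    # Override the default paths if your logs live elsewhere (illustrative path)
    var.paths: ["/var/log/nginx/access.log*"]
```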
4 changes: 2 additions & 2 deletions filebeat/docs/include/what-happens.asciidoc
@@ -5,8 +5,8 @@ defaults)

* Makes sure each multiline log event gets sent as a single event

-* Uses ingest node to parse and process the log lines, shaping the data into a structure suitable
-for visualizing in Kibana
+* Uses an {es} ingest pipeline to parse and process the log lines, shaping the
+data into a structure suitable for visualizing in Kibana

ifeval::["{has-dashboards}"=="true"]
* Deploys dashboards for visualizing the log data
2 changes: 1 addition & 1 deletion filebeat/docs/inputs/input-common-options.asciidoc
@@ -77,7 +77,7 @@ processors in your config.
[float]
===== `pipeline`

-The Ingest Node pipeline ID to set for the events generated by this input.
+The ingest pipeline ID to set for the events generated by this input.

NOTE: The pipeline ID can also be configured in the Elasticsearch output, but
this option usually results in simpler configuration files. If the pipeline is
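In a `filebeat.yml` input definition, the option sits alongside the other input settings — a sketch, where the paths and the pipeline ID `my-app-pipeline` are assumed names:

```yaml
filebeat.inputs:
- type: log
  paths:
    - /var/log/app/*.log
  # Route events from this input through a pre-created ingest pipeline
  pipeline: my-app-pipeline
```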
2 changes: 1 addition & 1 deletion filebeat/docs/modules-overview.asciidoc
@@ -13,7 +13,7 @@ the following:
The {beatname_uc} configuration is also responsible for stitching together
multiline events when needed.

-* {es} {ref}/ingest.html[Ingest Node] pipeline definition,
+* {es} {ref}/ingest.html[ingest pipeline] definition,
which is used to parse the log lines.

* Fields definitions, which are used to configure {es} with the
4 changes: 2 additions & 2 deletions filebeat/docs/modules/iptables.asciidoc
@@ -22,8 +22,8 @@ When you run the module, it performs a few tasks under the hood:
* Sets the default input to `syslog` and binds to `localhost` port `9001`
(but don’t worry, you can override the defaults).

-* Uses ingest node to parse and process the log lines, shaping the data into
-a structure suitable for visualizing in Kibana.
+* Uses an ingest pipeline to parse and process the log lines, shaping the data
+into a structure suitable for visualizing in Kibana.

* Deploys dashboards for visualizing the log data.

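Overriding those defaults happens in the module configuration — a hedged sketch, where the host and port values are the documented defaults and everything else is illustrative:

```yaml
# modules.d/iptables.yml
- module: iptables
  log:
    enabled: true
    var.input: syslog
    # Defaults shown; override to bind elsewhere
    var.syslog_host: localhost
    var.syslog_port: 9001
```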
2 changes: 1 addition & 1 deletion filebeat/docs/modules/netflow.asciidoc
@@ -18,7 +18,7 @@ NetFlow versions older than 9, fields are mapped automatically to NetFlow v9.

This module wraps the <<filebeat-input-netflow,netflow input>> to enrich the
flow records with geolocation information about the IP endpoints by using
-Elasticsearch Ingest Node.
+an {es} ingest pipeline.

include::../include/gs-link.asciidoc[]

2 changes: 1 addition & 1 deletion heartbeat/docs/monitors/monitor-common-options.asciidoc
@@ -147,7 +147,7 @@ processors in your config.
[[monitor-pipeline]]
===== `pipeline`

-The Ingest Node pipeline ID to set for the events generated by this input.
+The {es} ingest pipeline ID to set for the events generated by this input.

NOTE: The pipeline ID can also be configured in the Elasticsearch output, but
this option usually results in simpler configuration files. If the pipeline is
4 changes: 2 additions & 2 deletions libbeat/docs/monitoring/monitoring-metricbeat.asciidoc
@@ -259,8 +259,8 @@ If you configured the monitoring cluster to use encrypted communications, you
must access it via HTTPS. For example, use a `hosts` setting like
`https://es-mon-1:9200`.

-IMPORTANT: The {es} {monitor-features} use ingest pipelines, therefore the
-cluster that stores the monitoring data must have at least one ingest node.
+IMPORTANT: The {es} {monitor-features} use ingest pipelines. The cluster that
+stores the monitoring data must have at least one node with the `ingest` role.

If the {es} {security-features} are enabled on the monitoring cluster, you
must provide a valid user ID and password so that {metricbeat} can send metrics
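Granting the `ingest` role is an `elasticsearch.yml` setting — a sketch; the role list is illustrative, and note that a node with no explicit `node.roles` setting has all roles, including `ingest`, by default:

```yaml
# elasticsearch.yml on a node of the monitoring cluster (illustrative roles)
node.roles: [ data, ingest ]
```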