[DOCS] Add ECS fields to Logs monitoring guide (elastic#1203)
* Add ECS fields

* Fix indentation

* Update fields list

* Update following review
EamonnTP authored and Eamonn Smith committed Jun 29, 2020
1 parent 20473bf commit b1c4246
Showing 3 changed files with 160 additions and 10 deletions.
2 changes: 2 additions & 0 deletions docs/en/logs/index.asciidoc
@@ -23,3 +23,5 @@ include::using.asciidoc[]
include::log-rate.asciidoc[]

include::logs-alerting.asciidoc[]

include::logs-field-reference.asciidoc[]
142 changes: 142 additions & 0 deletions docs/en/logs/logs-field-reference.asciidoc
@@ -0,0 +1,142 @@
[[logs-fields-reference]]
[chapter, role="xpack"]
= Logs fields reference

This section lists the required fields the {logs-app} uses to display data.
Some of the fields listed are https://www.elastic.co/guide/en/ecs/current/ecs-reference.html#_what_is_ecs[ECS fields].

IMPORTANT: Beat modules (for example, {filebeat-ref}/filebeat-modules.html[{filebeat} modules])
are ECS-compliant, so manual field mapping is not required, and all {logs-app}
data is automatically populated. If you cannot use {beats}, map your data to
{ecs-ref}[ECS fields] (see {ecs-ref}/ecs-converting.html[how to map data to ECS]).
You can also try the experimental https://github.com/elastic/ecs-mapper[ECS Mapper] tool.
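For illustration only, here is a minimal sketch of what such a mapping can look like: a hypothetical source event with non-ECS field names, followed by the same event after its fields are renamed to the ECS fields the {logs-app} expects. Every field name and value in the "before" document is an assumption about your source data.

[source,yaml]
----
# Before: a hypothetical source event (field names are assumptions)
time: "2020-05-27T15:22:27.982Z"
hostname: "web-01"
msg: "Connection accepted"

# After: the same event mapped to the ECS fields the Logs app uses
"@timestamp": "2020-05-27T15:22:27.982Z"
host.name: "web-01"
message: "Connection accepted"
event.dataset: "myapp.log"   # assumption: a dataset name of your choosing
----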

`@timestamp`::

Date/time when the event originated.
+
This is the date/time extracted from the event, typically representing when the event was generated by the source.
If the event source has no original timestamp, this value is typically populated by the first time the event was received by the pipeline.
Required field for all events.
+
type: date
+
required: True
+
ECS field: True
+
example: `May 27, 2020 @ 15:22:27.982`


`_doc`::

This field is used to break ties between two entries with the same timestamp.
+
required: True
+
ECS field: False


`container.id`::

Unique container id.
+
type: keyword
+
required: True
+
ECS field: True
+
example: `data`


`event.dataset`::

Name of the dataset.
+
If an event source publishes more than one type of log or event (for example, access log and error log), the dataset is used to specify which one the event comes from.
+
It's recommended, but not required, to start the dataset name with the module name, followed by a dot, then the dataset name.
+
type: keyword
+
required: True, if you want to use the {ml-features}.
+
ECS field: True
+
example: `apache.access`


`host.hostname`::

Hostname of the host.
+
It normally contains what the `hostname` command returns on the host machine.
+
type: keyword
+
required: True, if you want to enable and use the *View in Context* feature.
+
ECS field: True
+
example: `Elastic.local`


`host.name`::

Name of the host.
+
It can contain what `hostname` returns on Unix systems, the fully qualified domain name, or a name specified by the user. The sender decides which value to use.
+
type: keyword
+
required: True
+
ECS field: True
+
example: `MacBook-Elastic.local`


`kubernetes.pod.uid`::

Kubernetes Pod UID.
+
type: keyword
+
required: True
+
ECS field: False
+
example: `8454328b-673d-11ea-7d80-21010a840123`


`log.file.path`::

Full path to the log file this event came from, including the file name. It should include the drive letter, when appropriate.
+
If the event wasn't read from a log file, do not populate this field.
+
type: keyword
+
required: True, if you want to use the *View in Context* feature.
+
ECS field: False
+
example: `/var/log/demo.log`


`message`::

For log events the message field contains the log message, optimized for viewing in a log viewer.
+
For structured logs without an original message field, other fields can be concatenated to form a human-readable summary of the event.
+
If multiple messages exist, they can be combined into one message.
+
type: text
+
required: True
+
ECS field: True
+
example: `Hello World`
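Taken together, a log event that the {logs-app} can fully display might carry a combination of fields like the following sketch. It simply reuses the example values listed above and is illustrative only, not a literal document format.

[source,yaml]
----
# Illustrative only: example values from the reference above, combined
"@timestamp": "2020-05-27T15:22:27.982Z"   # when the event originated
message: "Hello World"                     # the log line itself
host.name: "MacBook-Elastic.local"         # name of the host
host.hostname: "Elastic.local"             # needed for View in Context
event.dataset: "apache.access"             # needed for the machine learning features
log.file.path: "/var/log/demo.log"         # needed for View in Context
----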
26 changes: 16 additions & 10 deletions docs/en/logs/logs-installation.asciidoc
@@ -43,33 +43,39 @@ Install {kib}, start it up, and open up the web interface:
. {stack-gs}/get-started-elastic-stack.html#_launch_the_kibana_web_interface[Launch the Kibana Web Interface].

[[install-shippers]]
-=== Step 3: Install and enable {beats} shippers
+=== Step 3: Set up and run {filebeat}

-To start collecting logs data, you need to install and configure the {filebeat} {beats} shipper.
+IMPORTANT: This section describes using {filebeat} to ingest data. Other methods are available for ingesting data, such as {logstash-ref}/introduction.html[{ls}] or Fluentd.

-You can install and configure {beats} shippers for most kinds of data directly from {kib}, or you can install {beats} yourself.
+To start collecting logs data, you need to install {filebeat} and configure the {filebeat} modules directly from {kib}.

+Alternatively, you can install {filebeat} and configure the {filebeat} modules yourself.

[float]
-==== Install {beats} from {kib}
+==== Install {filebeat} from {kib}

+IMPORTANT: {filebeat-ref}/filebeat-modules.html[{filebeat} modules]
+are ECS-compliant, so manual <<logs-fields-reference, ECS field>> mapping is not required, and all {logs-app}
+data is automatically populated.

-To install {beats} from {kib}, on the machine where you want to collect the data, open a {kib} browser window.
+To install a {filebeat} module from {kib}, on the machine where you want to collect the data, open a {kib} browser window.
In the *Observability* section displayed on the home page of {kib}, click *Add log data*.
Now follow the instructions for the type of data you want to collect.
-The instructions include the steps required to download, install, and configure the appropriate Beats modules for your data.
+The instructions include how to install and configure {filebeat} and how to enable the appropriate {filebeat} module for your data.

[role="screenshot"]
image::images/add-data.png[Add log data]

[float]
-==== Install {beats} yourself
+==== Install {filebeat} yourself

-If your data source doesn't have a {beats} module, or if you want to install {beats} the old-fashioned way, follow the instructions in {filebeat-ref}/filebeat-modules-quickstart.html[{filebeat} modules quick start] and enable modules for the logs you want to collect.
+If your data source doesn't have a {filebeat} module, or if you want to install one the old-fashioned way, follow the instructions in {filebeat-ref}/filebeat-modules-quickstart.html[{filebeat} modules quick start] and enable modules for the logs you want to collect.
If there is no module for the logs you want to collect, see the {filebeat-ref}/filebeat-getting-started.html[{filebeat} getting started] to learn how to configure inputs.
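If you do configure inputs directly, a minimal `filebeat.yml` might look like the following sketch; the log path and output host are assumptions to replace with your own values.

[source,yaml]
----
filebeat.inputs:
  - type: log            # read plain log files line by line
    enabled: true
    paths:
      - /var/log/*.log   # assumption: point this at the logs you want to collect

output.elasticsearch:
  hosts: ["localhost:9200"]   # assumption: the address of your own Elasticsearch cluster
----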

[float]
-=== Enable modules
+=== Enable {filebeat} modules

-However you install {beats}, you need to enable the appropriate modules in {filebeat} to start collecting logs data.
+To start collecting logs data, you need to enable the appropriate modules in {filebeat}.
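As an illustration, module settings live in the `modules.d` directory of the {filebeat} installation. Once a module is enabled, its configuration file there might look like the following sketch of `modules.d/system.yml`, assuming the system module is the one you need and that the default log locations apply.

[source,yaml]
----
# modules.d/system.yml: a sketch for collecting host system logs
- module: system
  syslog:
    enabled: true        # the system log (for example, /var/log/syslog)
  auth:
    enabled: true        # the authentication log (for example, /var/log/auth.log)
----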

To collect logs from your host system, enable:

Expand Down
