diff --git a/docs/faq.rst b/docs/faq.rst
index 9a076e9f2e90..1280fedcfa3e 100644
--- a/docs/faq.rst
+++ b/docs/faq.rst
@@ -93,9 +93,19 @@ How do I shutdown a KSQL environment?
 
    .. code:: bash
 
-      $ jps | grep DataGen
+      jps | grep DataGen
+
+   Your output should resemble:
+
+   .. code:: text
+
       25379 DataGen
-      $ kill 25379
+
+   Stop the DataGen JVM by using the specified process ID:
+
+   .. code:: bash
+
+      kill 25379
 
 -  Exit KSQL.
 
@@ -108,13 +118,13 @@ How do I shutdown a KSQL environment?
    .. code:: bash
 
-      $ confluent stop
+      confluent stop
 
 -  To remove all data, topics, and streams:
 
    .. code:: bash
 
-      $ confluent destroy
+      confluent destroy
 
 ============================================
 How do I configure the target Kafka cluster?
 ============================================
@@ -284,7 +294,7 @@ In the KSQL CLI, use the SET statement to assign a value to ``ksql.streams.reten
 
 .. code:: bash
 
-    ksql> SET 'ksql.streams.retention.ms' = '86400000';
+    SET 'ksql.streams.retention.ms' = '86400000';
 
 Make the setting global by assigning ``ksql.streams.retention.ms`` in the KSQL server configuration file.
diff --git a/docs/includes/ksql-includes.rst b/docs/includes/ksql-includes.rst
index 72b4be9a5b59..0a417a3556a8 100644
--- a/docs/includes/ksql-includes.rst
+++ b/docs/includes/ksql-includes.rst
@@ -98,7 +98,7 @@ These examples query messages from Kafka topics called ``pageviews`` and ``users
 
    .. code:: sql
 
-      ksql> CREATE STREAM pageviews_original (viewtime bigint, userid varchar, pageid varchar) WITH \
+      CREATE STREAM pageviews_original (viewtime bigint, userid varchar, pageid varchar) WITH \
       (kafka_topic='pageviews', value_format='DELIMITED');
 
    Your output should resemble:
@@ -116,7 +116,7 @@ These examples query messages from Kafka topics called ``pageviews`` and ``users
 
    .. code:: sql
 
-      ksql> CREATE TABLE users_original (registertime BIGINT, gender VARCHAR, regionid VARCHAR, userid VARCHAR) WITH \
+      CREATE TABLE users_original (registertime BIGINT, gender VARCHAR, regionid VARCHAR, userid VARCHAR) WITH \
       (kafka_topic='users', value_format='JSON', key = 'userid');
 
    Your output should resemble:
@@ -160,7 +160,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> SELECT pageid FROM pageviews_original LIMIT 3;
+      SELECT pageid FROM pageviews_original LIMIT 3;
 
    Your output should resemble:
 
@@ -178,7 +178,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> CREATE STREAM pageviews_enriched AS SELECT users_original.userid AS userid, pageid, regionid, gender \
+      CREATE STREAM pageviews_enriched AS SELECT users_original.userid AS userid, pageid, regionid, gender \
       FROM pageviews_original LEFT JOIN users_original ON pageviews_original.userid = users_original.userid;
 
    Your output should resemble:
@@ -197,7 +197,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> SELECT * FROM pageviews_enriched;
+      SELECT * FROM pageviews_enriched;
 
    Your output should resemble:
 
@@ -213,7 +213,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> CREATE STREAM pageviews_female AS SELECT * FROM pageviews_enriched WHERE gender = 'FEMALE';
+      CREATE STREAM pageviews_female AS SELECT * FROM pageviews_enriched WHERE gender = 'FEMALE';
 
    Your output should resemble:
 
@@ -231,7 +231,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> CREATE STREAM pageviews_female_like_89 WITH (kafka_topic='pageviews_enriched_r8_r9', \
+      CREATE STREAM pageviews_female_like_89 WITH (kafka_topic='pageviews_enriched_r8_r9', \
       value_format='DELIMITED') AS SELECT * FROM pageviews_female WHERE regionid LIKE '%_8' OR regionid LIKE '%_9';
 
    Your output should resemble:
@@ -250,7 +250,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> CREATE TABLE pageviews_regions WITH (value_format='avro') AS SELECT gender, regionid , COUNT(*) AS numusers \
+      CREATE TABLE pageviews_regions WITH (value_format='avro') AS SELECT gender, regionid , COUNT(*) AS numusers \
       FROM pageviews_enriched WINDOW TUMBLING (size 30 second) GROUP BY gender, regionid HAVING COUNT(*) > 1;
 
    Your output should resemble:
@@ -268,7 +268,7 @@ the latest offset.
 
    .. code:: sql
 
-      ksql> SELECT gender, regionid, numusers FROM pageviews_regions LIMIT 5;
+      SELECT gender, regionid, numusers FROM pageviews_regions LIMIT 5;
 
    Your output should resemble:
 
@@ -287,7 +287,7 @@ the latest offset.
 
    ::
 
-      ksql> SHOW QUERIES;
+      SHOW QUERIES;
 
    Your output should resemble:
 
@@ -317,13 +317,14 @@ queries.
 
    .. code:: sql
 
-      ksql> TERMINATE CTAS_PAGEVIEWS_REGIONS;
+      TERMINATE CTAS_PAGEVIEWS_REGIONS;
 
-#. Run this command to exit the KSQL CLI.
+#. Run the ``exit`` command to leave the KSQL CLI.
 
    ::
 
       ksql> exit
+      Exiting KSQL.
 
 .. enable JMX metrics
 
diff --git a/docs/tutorials/basics-docker.rst b/docs/tutorials/basics-docker.rst
index 0fd796ad0723..5f514fc038f3 100644
--- a/docs/tutorials/basics-docker.rst
+++ b/docs/tutorials/basics-docker.rst
@@ -19,28 +19,28 @@ Download the Tutorial and Start KSQL
 
    .. code:: bash
 
-      $ git clone https://github.com/confluentinc/ksql.git
-      $ cd ksql
+      git clone https://github.com/confluentinc/ksql.git
+      cd ksql
 
 #. Switch to the correct Confluent Platform release branch:
 
    .. code:: bash
 
-      $ git checkout 4.1.0-post
+      git checkout 4.1.0-post
 
 #. Navigate to the KSQL repository ``docs/tutorials/`` directory and launch the
    tutorial in Docker. Depending on your network speed, this may take up to 5-10 minutes.
 
    .. code:: bash
 
-      $ cd docs/tutorials/
-      $ docker-compose up -d
+      cd docs/tutorials/
+      docker-compose up -d
 
 #. From the host machine, start KSQL CLI on the container.
 
    .. code:: bash
 
-      $ docker-compose exec ksql-cli ksql http://ksql-server:8088
+      docker-compose exec ksql-cli ksql http://ksql-server:8088
 
 .. include:: ../includes/ksql-includes.rst
    :start-line: 19
@@ -79,7 +79,7 @@ following methods.
 
    .. code:: bash
 
-      $ docker-compose exec kafka kafka-console-producer --topic t1 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
+      docker-compose exec kafka kafka-console-producer --topic t1 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
 
    Your data input should resemble this.
@@ -96,7 +96,7 @@ following methods.
 
    .. code:: bash
 
-      $ docker-compose exec kafka kafka-console-producer --topic t2 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
+      docker-compose exec kafka kafka-console-producer --topic t2 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
 
    Your data input should resemble this.
@@ -117,7 +117,7 @@ environment is properly setup.
 
    .. code:: bash
 
-      $ docker-compose ps
+      docker-compose ps
 
    Your output should resemble this. Take note of the ``Up`` state.
@@ -139,7 +139,7 @@ environment is properly setup.
 
    .. code:: bash
 
-      $ docker-compose exec kafka kafka-topics --zookeeper zookeeper:32181 --list
+      docker-compose exec kafka kafka-topics --zookeeper zookeeper:32181 --list
 
    Your output should resemble this.
@@ -157,7 +157,7 @@ environment is properly setup.
 
    .. code:: bash
 
-      $ docker-compose exec kafka kafka-console-consumer --topic pageviews --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
+      docker-compose exec kafka kafka-console-consumer --topic pageviews --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
 
    Your output should resemble this.
@@ -169,7 +169,7 @@ environment is properly setup.
 
    .. code:: bash
 
-      $ docker-compose exec kafka kafka-console-consumer --topic users --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
+      docker-compose exec kafka kafka-console-consumer --topic users --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
 
    Your output should resemble this.
diff --git a/docs/tutorials/clickstream-docker.rst b/docs/tutorials/clickstream-docker.rst
index cb16e546f186..45e4f1e1c9a6 100644
--- a/docs/tutorials/clickstream-docker.rst
+++ b/docs/tutorials/clickstream-docker.rst
@@ -19,7 +19,7 @@ your local host.
 
    .. code:: bash
 
-      $ docker run -p 33000:3000 -it confluentinc/ksql-clickstream-demo:4.1.0 bash
+      docker run -p 33000:3000 -it confluentinc/ksql-clickstream-demo:4.1.0 bash
 
    Your output should resemble:
@@ -59,7 +59,7 @@ Configure and Start Elastic, Grafana, and |cp|
 
    .. code:: bash
 
-      $ /etc/init.d/elasticsearch start
+      /etc/init.d/elasticsearch start
 
    Your output should resemble:
@@ -72,7 +72,7 @@ Configure and Start Elastic, Grafana, and |cp|
 
    .. code:: bash
 
-      $ /etc/init.d/grafana-server start
+      /etc/init.d/grafana-server start
 
    Your output should resemble:
@@ -84,7 +84,7 @@ Configure and Start Elastic, Grafana, and |cp|
 
    .. code:: bash
 
-      $ confluent start
+      confluent start
 
    Your output should resemble:
@@ -116,7 +116,7 @@ Create the Clickstream Data
 
    .. code:: bash
 
-      $ ksql-datagen -daemon quickstart=clickstream format=json topic=clickstream maxInterval=100 iterations=500000
+      ksql-datagen -daemon quickstart=clickstream format=json topic=clickstream maxInterval=100 iterations=500000
 
    Your output should resemble:
@@ -128,7 +128,7 @@ Create the Clickstream Data
 
    .. code:: bash
 
-      $ ksql-datagen quickstart=clickstream_codes format=json topic=clickstream_codes maxInterval=20 iterations=100
+      ksql-datagen quickstart=clickstream_codes format=json topic=clickstream_codes maxInterval=20 iterations=100
 
    Your output should resemble:
@@ -144,7 +144,7 @@ Create the Clickstream Data
 
    .. code:: bash
 
-      $ ksql-datagen quickstart=clickstream_users format=json topic=clickstream_users maxInterval=10 iterations=1000
+      ksql-datagen quickstart=clickstream_users format=json topic=clickstream_users maxInterval=10 iterations=1000
 
    Your output should resemble:
@@ -177,9 +177,9 @@ Load the Streaming Data to KSQL
    ksql-datagen utility to create the clickstream data, status codes, and set of users.
 
-   .. code:: bash
+   .. code:: sql
 
-      ksql> RUN SCRIPT '/usr/share/doc/ksql-clickstream-demo/clickstream-schema.sql';
+      RUN SCRIPT '/usr/share/doc/ksql-clickstream-demo/clickstream-schema.sql';
 
    The output should resemble:
 
@@ -200,7 +200,7 @@ Verify the data
 
    .. code:: bash
 
-      ksql> list TABLES;
+      list TABLES;
 
    Your output should resemble:
 
@@ -223,7 +223,7 @@ Verify the data
 
    .. code:: bash
 
-      ksql> list STREAMS;
+      list STREAMS;
 
    Your output should resemble:
 
@@ -241,9 +241,9 @@ Verify the data
 
    **View clickstream data**
 
-   .. code:: bash
+   .. code:: sql
 
-      ksql> SELECT * FROM CLICKSTREAM LIMIT 5;
+      SELECT * FROM CLICKSTREAM LIMIT 5;
 
    Your output should resemble:
@@ -259,9 +259,9 @@ Verify the data
 
    **View the events per minute**
 
-   .. code:: bash
+   .. code:: sql
 
-      ksql> SELECT * FROM EVENTS_PER_MIN LIMIT 5;
+      SELECT * FROM EVENTS_PER_MIN LIMIT 5;
 
    Your output should resemble:
@@ -278,9 +278,9 @@ Verify the data
 
    **View pages per minute**
 
-   .. code:: bash
+   .. code:: sql
 
-      ksql> SELECT * FROM PAGES_PER_MIN LIMIT 5;
+      SELECT * FROM PAGES_PER_MIN LIMIT 5;
 
    Your output should resemble:
@@ -303,7 +303,7 @@ Send the KSQL tables to Elasticsearch and Grafana.
 
 1. Exit the KSQL CLI with ``CTRL+D``.
 
-   .. code:: bash
+   .. code:: text
 
       ksql>
       Exiting KSQL.
@@ -312,14 +312,14 @@ Send the KSQL tables to Elasticsearch and Grafana.
 
    .. code:: bash
 
-      $ cd /usr/share/doc/ksql-clickstream-demo/
+      cd /usr/share/doc/ksql-clickstream-demo/
 
 3. Run this command to send the KSQL tables to Elasticsearch and Grafana:
 
    .. code:: bash
 
-      $ ./ksql-tables-to-grafana.sh
+      ./ksql-tables-to-grafana.sh
 
    Your output should resemble:
 
@@ -342,7 +342,7 @@ Send the KSQL tables to Elasticsearch and Grafana.
 
    .. code:: bash
 
-      $ ./clickstream-analysis-dashboard.sh
+      ./clickstream-analysis-dashboard.sh
 
    Your output should resemble:
diff --git a/docs/tutorials/examples.rst b/docs/tutorials/examples.rst
index 64c31f41a345..03cce0cd12af 100644
--- a/docs/tutorials/examples.rst
+++ b/docs/tutorials/examples.rst
@@ -299,13 +299,13 @@ The following examples show common usage:
 
    .. code:: bash
 
-      $ echo -e "SHOW TOPICS;\nexit" | ksql
+      echo -e "SHOW TOPICS;\nexit" | ksql
 
 -  This example uses the Bash `here document `__ (``<<``) to run KSQL CLI commands.
 
    .. code:: bash
 
-      $ ksql <<EOF
+      ksql <<EOF
       > SHOW TOPICS;
      > SHOW STREAMS;
       > exit
@@ -316,7 +316,7 @@ The following examples show common usage:
 
    .. code:: bash
 
-      $ ksql http://localhost:8088 <<< "SHOW TOPICS;
+      ksql http://localhost:8088 <<< "SHOW TOPICS;
       SHOW STREAMS;
       exit"
@@ -325,12 +325,12 @@ The following examples show common usage:
 
    .. code:: bash
 
-      $ cat /path/to/local/application.sql
+      cat /path/to/local/application.sql
       CREATE STREAM pageviews_copy AS SELECT * FROM pageviews;
 
    .. code:: bash
 
-      $ ksql http://localhost:8088 <<EOF
+      ksql http://localhost:8088 <<EOF
       > RUN SCRIPT '/path/to/local/application.sql';
       > exit
       > EOF