KSQL-1864: Remove $ chars prompts for example commands #2152

Merged: 1 commit, Nov 14, 2018
20 changes: 15 additions & 5 deletions docs/faq.rst
@@ -93,9 +93,19 @@ How do I shutdown a KSQL environment?

.. code:: bash

$ jps | grep DataGen
jps | grep DataGen

Your output should resemble:

.. code:: text

25379 DataGen
$ kill 25379

Stop the DataGen JVM by using the specified process ID:

.. code:: bash

kill 25379
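
As an aside, the two steps above can be collapsed into a single command, assuming ``awk`` is available on the host (a sketch for reference only; the documented steps use the explicit process ID):

.. code:: bash

# find the DataGen process ID and stop that JVM in one step
kill "$(jps | awk '/DataGen/ {print $1}')"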

- Exit KSQL.

@@ -108,13 +118,13 @@ How do I shutdown a KSQL environment?

.. code:: bash

$ confluent stop
confluent stop

- To remove all data, topics, and streams:

.. code:: bash

$ confluent destroy
confluent destroy

============================================
How do I configure the target Kafka cluster?
@@ -284,7 +294,7 @@ In the KSQL CLI, use the SET statement to assign a value to ``ksql.streams.reten

.. code:: bash

ksql> SET 'ksql.streams.retention.ms' = '86400000';
SET 'ksql.streams.retention.ms' = '86400000';

Make the setting global by assigning ``ksql.streams.retention.ms`` in the KSQL
server configuration file.
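
For reference, the global setting would be a single ``key=value`` entry in the server configuration, like the sketch below (the file name in the comment is an assumption about a typical installation):

.. code:: text

# assumed to go in the KSQL server properties file (for example, ksql-server.properties)
ksql.streams.retention.ms=86400000
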
25 changes: 13 additions & 12 deletions docs/includes/ksql-includes.rst
@@ -98,7 +98,7 @@ These examples query messages from Kafka topics called ``pageviews`` and ``users

.. code:: sql

ksql> CREATE STREAM pageviews_original (viewtime bigint, userid varchar, pageid varchar) WITH \
CREATE STREAM pageviews_original (viewtime bigint, userid varchar, pageid varchar) WITH \
(kafka_topic='pageviews', value_format='DELIMITED');

Your output should resemble:
@@ -116,7 +116,7 @@ These examples query messages from Kafka topics called ``pageviews`` and ``users

.. code:: sql

ksql> CREATE TABLE users_original (registertime BIGINT, gender VARCHAR, regionid VARCHAR, userid VARCHAR) WITH \
CREATE TABLE users_original (registertime BIGINT, gender VARCHAR, regionid VARCHAR, userid VARCHAR) WITH \
(kafka_topic='users', value_format='JSON', key = 'userid');

Your output should resemble:
@@ -160,7 +160,7 @@ the latest offset.

.. code:: sql

ksql> SELECT pageid FROM pageviews_original LIMIT 3;
SELECT pageid FROM pageviews_original LIMIT 3;

Your output should resemble:

@@ -178,7 +178,7 @@ the latest offset.

.. code:: sql

ksql> CREATE STREAM pageviews_enriched AS SELECT users_original.userid AS userid, pageid, regionid, gender \
CREATE STREAM pageviews_enriched AS SELECT users_original.userid AS userid, pageid, regionid, gender \
FROM pageviews_original LEFT JOIN users_original ON pageviews_original.userid = users_original.userid;

Your output should resemble:
@@ -197,7 +197,7 @@ the latest offset.

.. code:: sql

ksql> SELECT * FROM pageviews_enriched;
SELECT * FROM pageviews_enriched;

Your output should resemble:

@@ -213,7 +213,7 @@ the latest offset.

.. code:: sql

ksql> CREATE STREAM pageviews_female AS SELECT * FROM pageviews_enriched WHERE gender = 'FEMALE';
CREATE STREAM pageviews_female AS SELECT * FROM pageviews_enriched WHERE gender = 'FEMALE';

Your output should resemble:

@@ -231,7 +231,7 @@ the latest offset.

.. code:: sql

ksql> CREATE STREAM pageviews_female_like_89 WITH (kafka_topic='pageviews_enriched_r8_r9', \
CREATE STREAM pageviews_female_like_89 WITH (kafka_topic='pageviews_enriched_r8_r9', \
value_format='DELIMITED') AS SELECT * FROM pageviews_female WHERE regionid LIKE '%_8' OR regionid LIKE '%_9';

Your output should resemble:
@@ -250,7 +250,7 @@ the latest offset.

.. code:: sql

ksql> CREATE TABLE pageviews_regions WITH (value_format='avro') AS SELECT gender, regionid , COUNT(*) AS numusers \
CREATE TABLE pageviews_regions WITH (value_format='avro') AS SELECT gender, regionid , COUNT(*) AS numusers \
FROM pageviews_enriched WINDOW TUMBLING (size 30 second) GROUP BY gender, regionid HAVING COUNT(*) > 1;

Your output should resemble:
@@ -268,7 +268,7 @@ the latest offset.

.. code:: sql

ksql> SELECT gender, regionid, numusers FROM pageviews_regions LIMIT 5;
SELECT gender, regionid, numusers FROM pageviews_regions LIMIT 5;

Your output should resemble:

@@ -287,7 +287,7 @@ the latest offset.

::

ksql> SHOW QUERIES;
SHOW QUERIES;

Your output should resemble:

@@ -317,13 +317,14 @@ queries.

.. code:: sql

ksql> TERMINATE CTAS_PAGEVIEWS_REGIONS;
TERMINATE CTAS_PAGEVIEWS_REGIONS;

#. Run this command to exit the KSQL CLI.
#. Run the ``exit`` command to leave the KSQL CLI.

::

ksql> exit
Exiting KSQL.

.. enable JMX metrics

24 changes: 12 additions & 12 deletions docs/tutorials/basics-docker.rst
@@ -19,28 +19,28 @@ Download the Tutorial and Start KSQL

.. code:: bash

$ git clone https://github.com/confluentinc/ksql.git
$ cd ksql
git clone https://github.com/confluentinc/ksql.git
cd ksql

#. Switch to the correct Confluent Platform release branch:

.. code:: bash

$ git checkout 4.1.0-post
git checkout 4.1.0-post

#. Navigate to the KSQL repository ``docs/tutorials/`` directory and launch the tutorial in
Docker. Depending on your network speed, this may take up to 5-10 minutes.

.. code:: bash

$ cd docs/tutorials/
$ docker-compose up -d
cd docs/tutorials/
docker-compose up -d

#. From the host machine, start KSQL CLI on the container.

.. code:: bash

$ docker-compose exec ksql-cli ksql http://ksql-server:8088
docker-compose exec ksql-cli ksql http://ksql-server:8088

.. include:: ../includes/ksql-includes.rst
:start-line: 19
@@ -79,7 +79,7 @@ following methods.

.. code:: bash

$ docker-compose exec kafka kafka-console-producer --topic t1 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
docker-compose exec kafka kafka-console-producer --topic t1 --broker-list kafka:29092 --property parse.key=true --property key.separator=:

Your data input should resemble this.

@@ -96,7 +96,7 @@ following methods.

.. code:: bash

$ docker-compose exec kafka kafka-console-producer --topic t2 --broker-list kafka:29092 --property parse.key=true --property key.separator=:
docker-compose exec kafka kafka-console-producer --topic t2 --broker-list kafka:29092 --property parse.key=true --property key.separator=:

Your data input should resemble this.

@@ -117,7 +117,7 @@ environment is properly setup.

.. code:: bash

$ docker-compose ps
docker-compose ps

Your output should resemble this. Take note of the ``Up`` state.

@@ -139,7 +139,7 @@ environment is properly setup.

.. code:: bash

$ docker-compose exec kafka kafka-topics --zookeeper zookeeper:32181 --list
docker-compose exec kafka kafka-topics --zookeeper zookeeper:32181 --list

Your output should resemble this.

@@ -157,7 +157,7 @@ environment is properly setup.

.. code:: bash

$ docker-compose exec kafka kafka-console-consumer --topic pageviews --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
docker-compose exec kafka kafka-console-consumer --topic pageviews --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true

Your output should resemble this.

@@ -169,7 +169,7 @@ environment is properly setup.

.. code:: bash

$ docker-compose exec kafka kafka-console-consumer --topic users --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true
docker-compose exec kafka kafka-console-consumer --topic users --bootstrap-server kafka:29092 --from-beginning --max-messages 3 --property print.key=true

Your output should resemble this.
