From 20efe4a3a5ffbceedac7bf775466b7a8cde5044f Mon Sep 17 00:00:00 2001
From: Saif Addin
@@ ... @@ 1. DocumentAssembler: Getting
which may be used by annotators down the road
Example:
- import
-com.johnsnowlabs.nlp._import
-com.johnsnowlabs.nlp.annotators._import
-org.apache.spark.ml.Pipelineval
+ from sparknlp.annotator import *
+from sparknlp.common import *
+from sparknlp.base import *
+from pyspark.ml import Pipeline
documentAssembler = DocumentAssembler() \
.setInputCol("text") \
.setOutputCol("document")
@@ -146,10 +146,9 @@ 1. DocumentAssembler: Getting
which may be used by annotators down the road
Example:
- import
-com.johnsnowlabs.nlp._import
-com.johnsnowlabs.nlp.annotators._import
-org.apache.spark.ml.Pipelineval
+ import com.johnsnowlabs.nlp._
+import com.johnsnowlabs.nlp.annotators._
+import org.apache.spark.ml.Pipeline
val documentAssembler = new DocumentAssembler()
.setInputCol("text")
.setOutputCol("document")
diff --git a/docs/quickstart.html b/docs/quickstart.html
index 128c16ffc39d59..54fa296fa10aff 100644
--- a/docs/quickstart.html
+++ b/docs/quickstart.html
@@ -92,12 +92,12 @@ Requirements
get started with Spark.
- We are working on updating this page since we published the library on public
- repos to make it easier to use and with spark-packages. This will allow to use pyspark library as well.
- Stay tuned for updates in this section.
+ To use the most recent version, just add --packages JohnSnowLabs:spark-nlp:1.2.3 to your spark command:
+
spark-shell --packages JohnSnowLabs:spark-nlp:1.2.3
+ pyspark --packages JohnSnowLabs:spark-nlp:1.2.3
+ spark-submit --packages JohnSnowLabs:spark-nlp:1.2.3
- You can follow up-to-date instructions in our README page in GitHub.
Another way is to include the downloadable snapshot jar in the Spark classpath;
it can be downloaded
here
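
For pyspark in particular, the same package can also be pulled in programmatically when building the session. This is a hedged sketch, not part of the patch, assuming the JohnSnowLabs:spark-nlp coordinate resolves from the default spark-packages repository and that the config is set before the session starts; the app name is a placeholder.

    from pyspark.sql import SparkSession

    # Equivalent of `pyspark --packages JohnSnowLabs:spark-nlp:1.2.3`;
    # spark.jars.packages accepts the same coordinates as --packages
    spark = SparkSession.builder \
        .appName("spark-nlp-quickstart") \
        .config("spark.jars.packages", "JohnSnowLabs:spark-nlp:1.2.3") \
        .getOrCreate()

    import sparknlp.base  # should succeed if the package loaded correctly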
@@ -105,9 +105,7 @@ Requirements
/path/to/spark-nlp.jar to use the library in scala spark.
- To use pyspark now, you may have to clone the repo, and stand inside the python
- folder to make sparknlp module avaiable, while also adding the jar to pyspark with
- --jars as above
+ For further alternatives and documentation, check out our README page on GitHub.
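
When the jar route is used with pyspark, the classpath addition alone does not expose the sparknlp Python module; the removed text above described cloning the repo and using its python folder for that. A hedged sketch of one way to wire both up, with placeholder paths that must be adapted locally:

    import sys
    from pyspark.sql import SparkSession

    # Placeholder path: point this at a local checkout's python folder
    # so the sparknlp module is importable
    sys.path.append("/path/to/spark-nlp/python")

    # Equivalent of passing --jars /path/to/spark-nlp.jar on the command line
    spark = SparkSession.builder \
        .config("spark.jars", "/path/to/spark-nlp.jar") \
        .getOrCreate()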