Model 1 and Model 2 ParamMaps Missing

Error in PySpark example code:
https://github.com/apache/spark/blob/master/examples/src/main/python/ml/estimator_transformer_param_example.py

The original Scala code says:
println("Model 2 was fit using parameters: " + model2.parent.extractParamMap)

The parent is lr.

There is no method in PySpark for accessing the parent estimator in this way, as is done in Scala.

This code has been tested in Python and returns values consistent with Scala.
marktab authored Sep 7, 2017
1 parent 49968de commit a2ccb8a
Showing 1 changed file with 2 additions and 2 deletions.
examples/src/main/python/ml/estimator_transformer_param_example.py
@@ -53,7 +53,7 @@
 # This prints the parameter (name: value) pairs, where names are unique IDs for this
 # LogisticRegression instance.
 print("Model 1 was fit using parameters: ")
-print(model1.extractParamMap())
+print(lr.extractParamMap())
 
 # We may alternatively specify parameters using a Python dictionary as a paramMap
 paramMap = {lr.maxIter: 20}
@@ -69,7 +69,7 @@
 # paramMapCombined overrides all parameters set earlier via lr.set* methods.
 model2 = lr.fit(training, paramMapCombined)
 print("Model 2 was fit using parameters: ")
-print(model2.extractParamMap())
+print(lr.extractParamMap(extra=paramMapCombined))
 
 # Prepare test data
 test = spark.createDataFrame([
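
For reference, a minimal end-to-end sketch of the corrected calls. It assumes a local SparkSession and toy training data shaped like the official example; the app name and the specific ParamMap overrides below are illustrative choices, not part of this commit.

from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("ParamMapSketch").getOrCreate()

# Toy training data in the same (label, features) shape as the official example.
training = spark.createDataFrame([
    (1.0, Vectors.dense([0.0, 1.1, 0.1])),
    (0.0, Vectors.dense([2.0, 1.0, -1.0])),
    (0.0, Vectors.dense([2.0, 1.3, 1.0])),
    (1.0, Vectors.dense([0.0, 1.2, -0.5]))], ["label", "features"])

lr = LogisticRegression(maxIter=10, regParam=0.01)

# Model 1 is fit with the parameters currently stored in lr, so lr's own
# ParamMap describes the fit.
model1 = lr.fit(training)
print("Model 1 was fit using parameters: ")
print(lr.extractParamMap())

# Model 2 is fit with explicit overrides; extractParamMap(extra=...) merges
# them with lr's settings, reproducing the values the model was actually fit with.
# These override values are illustrative.
paramMapCombined = {lr.maxIter: 30, lr.regParam: 0.1, lr.probabilityCol: "myProbability"}
model2 = lr.fit(training, paramMapCombined)
print("Model 2 was fit using parameters: ")
print(lr.extractParamMap(extra=paramMapCombined))

spark.stop()

Both print calls read the ParamMap from lr (merging in the explicit overrides for Model 2) rather than from model1/model2, which is the substance of the diff above.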