[SPARK-11658] simplify documentation for PySpark combineByKey
Author: Chris Snow <chsnow123@gmail.com>

Closes apache#9640 from snowch/patch-3.
snowch authored and dskrvk committed Nov 13, 2015
1 parent 62a7a68 commit e769b1f
Showing 1 changed file with 0 additions and 1 deletion.
python/pyspark/rdd.py: 1 change (0 additions & 1 deletion)
@@ -1760,7 +1760,6 @@ def combineByKey(self, createCombiner, mergeValue, mergeCombiners,
 In addition, users can control the partitioning of the output RDD.
 >>> x = sc.parallelize([("a", 1), ("b", 1), ("a", 1)])
->>> def f(x): return x
 >>> def add(a, b): return a + str(b)
 >>> sorted(x.combineByKey(str, add, add).collect())
 [('a', '11'), ('b', '1')]
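For context, a minimal sketch of how the simplified doctest example reads as plain Python, assuming a live SparkContext bound to sc (as in the doctest environment). The numPartitions argument is not part of the doctest above; it is added here only to illustrate the partitioning control mentioned in the surrounding docstring.

x = sc.parallelize([("a", 1), ("b", 1), ("a", 1)])

def add(a, b):
    # Concatenate the accumulated string with the next value.
    return a + str(b)

# str acts as createCombiner; add serves as both mergeValue and mergeCombiners.
# numPartitions is optional and controls the partitioning of the output RDD.
result = sorted(x.combineByKey(str, add, add, numPartitions=2).collect())
print(result)  # [('a', '11'), ('b', '1')]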
