In Spark, I want to create a class that extends RDD, but have its compute method return a tuple rather than an Iterator subclass
I want to create a class that extends Spark's RDD. However, RDD's compute method is defined as `def compute(split: Partition, context: TaskContext): Iterator[T]`, and I would like my subclass to return a tuple instead. I do not want to change RDD itself very much. Likewise, replacing Iterator with an Either type would require too many code changes. Does anyone have a better solution? Thanks.
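One observation worth noting: because compute's return type is `Iterator[T]`, you can keep the signature unchanged and simply pick a tuple type for `T`, so the iterator yields tuples. Below is a minimal, self-contained sketch of that idea in plain Scala. `MiniRDD` and `TupleRDD` are hypothetical stand-ins for illustration; the real class is `org.apache.spark.rdd.RDD`, whose compute also takes a `Partition` and `TaskContext`, omitted here so the sketch runs without a Spark dependency.

```scala
// Stand-in for Spark's RDD contract: compute returns Iterator[T].
// (Hypothetical minimal trait; the real class is org.apache.spark.rdd.RDD.)
trait MiniRDD[T] {
  def compute(): Iterator[T]
}

// Choosing T = (Int, String) means compute yields tuples
// without changing the Iterator-based signature at all.
class TupleRDD(data: Seq[(Int, String)]) extends MiniRDD[(Int, String)] {
  override def compute(): Iterator[(Int, String)] = data.iterator
}

val rdd = new TupleRDD(Seq((1, "a"), (2, "b")))
println(rdd.compute().next())
```

With the real API the subclass would be declared as, e.g., `class MyRDD(...) extends RDD[(Int, String)](sc, Nil)`, and its overridden compute would return `Iterator[(Int, String)]` — no change to RDD itself is needed.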
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 (attribution required).
