PySpark str.title(): 'Column' object is not callable
I want the first letter in all rows of a particular column to be capitalized.
df['Name'] = df.Name.str.title()
But I keep getting the error 'Column' object is not callable. What am I missing?
Solution 1:[1]
You are mixing up the pandas and PySpark APIs: PySpark columns have no `.str` accessor, so use the built-in `initcap` function instead. Do the following:
from pyspark.sql.functions import initcap, col
df = df.withColumn('Name', initcap(col('Name')))
Reference: https://www.datasciencemadesimple.com/convert-to-upper-case-lower-case-and-title-case-in-pyspark/
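If you want to see what `initcap` does without spinning up Spark, here is a minimal plain-Python sketch of the same transformation (the function name `initcap_like` is my own; the assumption is that Spark's `initcap` capitalizes the first letter of each space-separated word and lowercases the rest):

```python
# Hypothetical plain-Python equivalent of Spark's initcap, for illustration:
# capitalize the first letter of each space-separated word, lowercase the rest.
def initcap_like(s):
    return " ".join(w[:1].upper() + w[1:].lower() for w in s.split(" "))

print(initcap_like("john DOE"))  # John Doe
```

Note that `withColumn` returns a new DataFrame rather than modifying `df` in place, which is why the result is assigned back to `df` above.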
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Benny Elgazar |
