pyspark.sql.functions.to_unix_timestamp

pyspark.sql.functions.to_unix_timestamp(timestamp: ColumnOrName, format: Optional[ColumnOrName] = None) → pyspark.sql.column.Column

Returns the UNIX timestamp of the given time.

New in version 3.5.0.

Parameters
timestamp : Column or str

Input column or column name containing the timestamp values.

format : Column or str, optional

Format to use to convert the input values to UNIX timestamps.
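
Note that format is itself a column expression (ColumnOrName), so a plain Python string passed there is treated as a column name rather than as a pattern; this is why the examples below wrap the literal pattern in lit(). A minimal sketch of the intended usage, assuming an active SparkSession named spark:

>>> from pyspark.sql.functions import to_unix_timestamp, lit
>>> df = spark.createDataFrame([("2016-04-08",)], ["e"])
>>> # Wrap the literal pattern in lit(); a bare "yyyy-MM-dd" would be looked up as a column.
>>> df.select(to_unix_timestamp("e", lit("yyyy-MM-dd")).alias("r"))
DataFrame[r: bigint]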

Examples

>>> from pyspark.sql.functions import to_unix_timestamp, lit
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([("2016-04-08",)], ["e"])
>>> df.select(to_unix_timestamp(df.e, lit("yyyy-MM-dd")).alias('r')).collect()
[Row(r=1460098800)]
>>> spark.conf.unset("spark.sql.session.timeZone")

>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([("2016-04-08",)], ["e"])
>>> df.select(to_unix_timestamp(df.e).alias('r')).collect()  # "2016-04-08" does not match the default pattern
[Row(r=None)]
>>> spark.conf.unset("spark.sql.session.timeZone")
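
In the second example, format is omitted, so the value is parsed with the function's default pattern, yyyy-MM-dd HH:mm:ss, which the date-only string does not match, and the result is None. A minimal sketch of the same call with an input that does match the default pattern, under the same session time zone as above:

>>> from pyspark.sql.functions import to_unix_timestamp
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([("2016-04-08 00:00:00",)], ["e"])
>>> df.select(to_unix_timestamp(df.e).alias('r')).collect()
[Row(r=1460098800)]
>>> spark.conf.unset("spark.sql.session.timeZone")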