pyspark.sql.functions.weekday

pyspark.sql.functions.weekday(col)[source]

Returns the day of the week for date/timestamp (0 = Monday, 1 = Tuesday, …, 6 = Sunday).

New in version 3.5.0.

Parameters
col : Column or column name

target date/timestamp column to work on.

Returns
Column

the day of the week for date/timestamp (0 = Monday, 1 = Tuesday, …, 6 = Sunday).

Examples

Example 1: Extract the day of the week from a string column representing dates

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08',), ('2024-10-31',)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.weekday('dt')).show()
+----------+----------+-----------+
|        dt|typeof(dt)|weekday(dt)|
+----------+----------+-----------+
|2015-04-08|    string|          2|
|2024-10-31|    string|          3|
+----------+----------+-----------+

Example 2: Extract the day of the week from a string column representing timestamps

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([('2015-04-08 13:08:15',), ('2024-10-31 10:09:16',)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.weekday('ts')).show()
+-------------------+----------+-----------+
|                 ts|typeof(ts)|weekday(ts)|
+-------------------+----------+-----------+
|2015-04-08 13:08:15|    string|          2|
|2024-10-31 10:09:16|    string|          3|
+-------------------+----------+-----------+

Example 3: Extract the day of the week from a date column

>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.date(2015, 4, 8),),
...     (datetime.date(2024, 10, 31),)], ['dt'])
>>> df.select("*", sf.typeof('dt'), sf.weekday('dt')).show()
+----------+----------+-----------+
|        dt|typeof(dt)|weekday(dt)|
+----------+----------+-----------+
|2015-04-08|      date|          2|
|2024-10-31|      date|          3|
+----------+----------+-----------+

Example 4: Extract the day of the week from a timestamp column

>>> import datetime
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([
...     (datetime.datetime(2015, 4, 8, 13, 8, 15),),
...     (datetime.datetime(2024, 10, 31, 10, 9, 16),)], ['ts'])
>>> df.select("*", sf.typeof('ts'), sf.weekday('ts')).show()
+-------------------+----------+-----------+
|                 ts|typeof(ts)|weekday(ts)|
+-------------------+----------+-----------+
|2015-04-08 13:08:15| timestamp|          2|
|2024-10-31 10:09:16| timestamp|          3|
+-------------------+----------+-----------+
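As a cross-check, the 0 = Monday, ..., 6 = Sunday numbering used by this function matches Python's own datetime.date.weekday(), so the expected values in the examples above can be verified locally without a Spark session:

```python
import datetime

# Spark's weekday() and Python's date.weekday() share the same
# convention: 0 = Monday, 1 = Tuesday, ..., 6 = Sunday.
dates = [datetime.date(2015, 4, 8), datetime.date(2024, 10, 31)]
print([d.weekday() for d in dates])  # [2, 3] -- Wednesday and Thursday
```

Note that this differs from pyspark.sql.functions.dayofweek, which numbers days 1 = Sunday through 7 = Saturday.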