Forums

Help Converting SQL to PySpark

I have tried converting the following T-SQL expression:

select      CAST(LAST_MOD_DATE AS VARCHAR(8)) + ' ' + CONVERT(VARCHAR(8), DATEADD(SECOND, LAST_MOD_TIME % 100 + LAST_MOD_TIME / 100 % 100 * 60 + LAST_MOD_TIME / 10000 * 3600, 0), 8) AS expr1

to Spark SQL in Databricks as follows:

test = sql("""select CAST(LAST_MOD_DATE) + ' ' + DATE_ADD(SECOND, LAST_MOD_TIME % 100 + LAST_MOD_TIME / 100 % 100 * 60 + LAST_MOD_TIME / 10000 * 3600, 0), 8) AS expr1 from DmWo""")

But I keep getting the error:

mismatched input ')' expecting {<EOF>, ';'}(line 1, pos 142)

Does anyone have any thoughts on the error, please?
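For context, the stray `, 8)` at the end is left over from T-SQL's `CONVERT(..., 8)` style argument, `CAST(LAST_MOD_DATE)` is missing a target type, and Spark SQL has no `DATEADD(SECOND, n, 0)` overload like SQL Server's. Below is a minimal sketch of the intended logic, first in plain Python and then as one possible Spark SQL rewrite, assuming `LAST_MOD_DATE` is a YYYYMMDD integer and `LAST_MOD_TIME` an HHMMSS integer (both assumptions; check the actual schema):

```python
# Sketch only: reproduces the T-SQL expression's apparent intent.
# Assumes LAST_MOD_DATE is a YYYYMMDD integer and LAST_MOD_TIME an
# HHMMSS integer -- hypothetical column encodings, not confirmed here.

def mod_timestamp(last_mod_date: int, last_mod_time: int) -> str:
    """Format e.g. (20240131, 93205) as '20240131 09:32:05'."""
    hours = last_mod_time // 10000
    minutes = last_mod_time // 100 % 100
    seconds = last_mod_time % 100
    return f"{last_mod_date} {hours:02d}:{minutes:02d}:{seconds:02d}"

# The same logic as a Spark SQL string. Note `div` for integer
# division: Spark's `/` returns a double, unlike T-SQL's integer `/`,
# which is one reason a literal translation misbehaves.
SPARK_QUERY = """
SELECT concat(
         CAST(LAST_MOD_DATE AS STRING), ' ',
         lpad(CAST(LAST_MOD_TIME div 10000      AS STRING), 2, '0'), ':',
         lpad(CAST(LAST_MOD_TIME div 100 % 100  AS STRING), 2, '0'), ':',
         lpad(CAST(LAST_MOD_TIME % 100          AS STRING), 2, '0')
       ) AS expr1
FROM DmWo
"""
```

This avoids `DATEADD` entirely by padding the hour/minute/second fields directly, which sidesteps the seconds-since-midnight round trip in the original expression.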

We can help you here with PythonAnywhere services, but for questions like that you need to ask on other forums.