I have some Python code that runs as part of an Azure Synapse Analytics Apache Spark Notebook (or Synapse Notebook) and would like to add effective error handling. The code simply executes a given SQL script against the database. The code runs, but I sometimes see errors like "Attempt to use a closed connection". I would like to do the following:
- Improve the code that I wrote, through peer review.
- Improve the error handling, e.g. pseudo-code: `if connection still open: close connection`.
- Authenticate as the Managed Identity instead of with SQL auth. The code using SQL auth works; I've tried using the object ID of the MI in the connection string with `Authentication=ActiveDirectoryMsi`, but it didn't work.
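For reference, the Managed Identity attempt looked roughly like this (a sketch of what I tried, not working code; the server, database, and object ID are placeholders):

```python
import pyodbc

# What I tried for Managed Identity auth -- this did NOT work for me.
# The UID is a placeholder for the object ID of the Managed Identity.
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=someServer.sql.azuresynapse.net;'
    'DATABASE=someDatabase;'
    'Authentication=ActiveDirectoryMsi;'
    'UID=00000000-0000-0000-0000-000000000000;',
    autocommit=True)
```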
Cell 1 - parameters
pAccountName = 'someStorageAccount'
pContainerName = 'someContainer'
pRelativePath = '/raw/sql/some_sql_files/'
pFileName = 'someSQLscript.sql'
Cell 2 - main
import pyodbc
from pyspark import SparkFiles

try:
    # Add the file to the cluster so we can view it
    sqlFilepath = f"abfss://{pContainerName}@{pAccountName}.dfs.core.windows.net{pRelativePath}{pFileName}"
    sc.addFile(sqlFilepath, False)

    # Open the file for reading
    with open(SparkFiles.getRootDirectory() + f'/{pFileName}', 'r') as f:
        lines = f.read()  # read all text from the file into a string

    # Open the database connection
    conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                          'SERVER=someServer.sql.azuresynapse.net;'
                          'DATABASE=someDatabase;UID=batchOnlyUser;'
                          'PWD=youWish;', autocommit=True)

    # Split the script into batches separated by "GO"
    for batch in lines.split("GO"):
        conn.execute(batch)  # execute the SQL batch
except:
    raise
finally:
    # Tidy up
    conn.close()
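By the pseudo-code above ("if connection still open, close connection") I mean roughly the following sketch, assuming the connection string and the script text (`lines`) from Cell 2:

```python
import pyodbc

lines = "select @@version"  # stand-in for the script text read in Cell 2
conn = None  # make sure the name exists even if connect() raises

try:
    conn = pyodbc.connect(
        'DRIVER={ODBC Driver 17 for SQL Server};'
        'SERVER=someServer.sql.azuresynapse.net;'
        'DATABASE=someDatabase;UID=batchOnlyUser;PWD=youWish;',
        autocommit=True)
    # Split the script into batches separated by "GO"
    for batch in lines.split("GO"):
        conn.execute(batch)
finally:
    # Tidy up: only close the connection if it was actually opened
    if conn is not None:
        conn.close()
```

Is this the right pattern, or is there a more idiomatic way (e.g. a context manager) to guarantee the connection is closed?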
(For context: the SQL scripts range from a simple `select @@version` up to big `create table` and `insert` statements.)