
I have written the following Python code to generate a SAS token for a file placed in an Azure file share directory:

import datetime
import os

from azure.storage.fileshare import generate_file_sas, AccountSasPermissions

def getParcelFilePath(webcountyval):
    filePath = rf"/plan/IN/{webcountyval}%20parcels.dbf"
    file_sas_token = generate_file_sas(
        account_name=os.environ["AZURE_STORAGE_ACCOUNT_NAME"],
        account_key=os.environ["AZURE_STORAGE_ACCOUNT_KEY"],
        share_name="dev",
        file_path=filePath,
        permission=AccountSasPermissions(read=True),
        expiry=datetime.datetime.now() + datetime.timedelta(hours=8),
        start=datetime.datetime.now()
    )

    filePathWithToken = os.environ["AZURE_STORAGE_SHARE_URL"] + filePath + "?" + file_sas_token
    return filePathWithToken

I live in the Central time zone, so to make sure authentication does not fail because of the offset from UTC, I have passed 8 hours to the timedelta() call for the expiry.
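If the extra hours are only there to compensate for local time, one alternative (just a sketch on my part, reusing the question's environment variables and calling convention for generate_file_sas) is to pass timezone-aware UTC timestamps, since SAS start and expiry times are interpreted as UTC:

import datetime
import os

from azure.storage.fileshare import generate_file_sas, AccountSasPermissions

def getParcelFileSas(webcountyval):
    # Timezone-aware UTC timestamps, so no local-time padding is needed
    nowUtc = datetime.datetime.now(datetime.timezone.utc)
    return generate_file_sas(
        account_name=os.environ["AZURE_STORAGE_ACCOUNT_NAME"],
        account_key=os.environ["AZURE_STORAGE_ACCOUNT_KEY"],
        share_name="dev",
        file_path=rf"/plan/IN/{webcountyval}%20parcels.dbf",
        permission=AccountSasPermissions(read=True),
        start=nowUtc,
        expiry=nowUtc + datetime.timedelta(hours=1),
    )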

The following URL is generated for the file: https://crestlinecapitalstorage.file.core.windows.net/dev/plan/IN/elkhart%20parcels.dbf?st=2024-11-11T23%3A14%3A30Z&se=2024-11-12T07%3A14%3A30Z&sp=r&sv=2024-11-04&sr=f&sig=[redacted_sig]

When I paste this URL into the browser, it shows the following error: Signature did not match. String to sign used was r 2024-11-11T23:14:30Z 2024-11-12T07:14:30Z /file/crestlinecapitalstorage/dev/plan/IN/elkhart parcels.dbf 2024-11-04

What could be the reason for the error?

  • Are you sure the account name and key are correct? And what about the file path? Also, if I recall correctly, the start and expiry times should be in UTC without a timezone. Commented Nov 12, 2024 at 6:48
  • @LarsKakavandi-Nielsen - Yes, I have verified that the account name is correct. I copied the account key from the Azure portal. I even removed the time zone parameter from the datetime function call, and yet the error persists. This thing is a nightmare to get working. Commented Nov 12, 2024 at 16:31

1 Answer


It is quite possible that the file_path value "/plan/IN/{webcountyval}%20parcels.dbf" was incorrect: the leading '/' may not be needed, and the string to sign shown in the error contains the path with a literal space rather than %20. Anyway, instead of spending any more nightmarish moments trying to get the URL with a SAS token to work, I found a workaround that is easier to work with and maintain (see the AI Overview provided by Google search).
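For what it is worth, here is a minimal sketch of how the original SAS approach might look with the path signed unencoded (no leading '/', a literal space instead of %20) and percent-encoded only when the final URL is assembled. This is an assumption drawn from the string to sign in the error, not something verified against this storage account:

import datetime
import os
from urllib.parse import quote

from azure.storage.fileshare import generate_file_sas, AccountSasPermissions

def getParcelFileUrl(webcountyval):
    # Sign the raw path: no leading '/', literal space instead of %20
    filePath = f"plan/IN/{webcountyval} parcels.dbf"
    sasToken = generate_file_sas(
        account_name=os.environ["AZURE_STORAGE_ACCOUNT_NAME"],
        account_key=os.environ["AZURE_STORAGE_ACCOUNT_KEY"],
        share_name="dev",
        file_path=filePath,
        permission=AccountSasPermissions(read=True),
        expiry=datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(hours=1),
    )
    # Percent-encode the path only when building the URL, so the signed path
    # and the requested path refer to the same file
    return os.environ["AZURE_STORAGE_SHARE_URL"] + "/" + quote(filePath) + "?" + sasToken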

Here's the final code:

import logging
import os
import tempfile

from azure.storage.fileshare import ShareClient
# DBF (the .dbf reader) and the SUCCESS constant are imported/defined elsewhere in the project

def getFileClient(stateval, webcountyval):
    filePath = rf"plan/{stateval}/{webcountyval} parcels.dbf"
    accountKey = os.environ["AZURE_STORAGE_ACCOUNT_KEY"]
    shareName = "dev"

    # Create a ShareClient object for the share
    shareClient = ShareClient(
        account_url=os.environ["AZURE_STORAGE_ACCOUNT_URL"],
        credential=accountKey,
        share_name=shareName
    )

    # Get a reference to the file
    fileClient = shareClient.get_file_client(filePath)
    return fileClient

def parcelDBFDataSave(parcelFileClient):

    # Download the file and write it to a temporary file
    fileContent = parcelFileClient.download_file().readall()
    with tempfile.NamedTemporaryFile(delete=False) as tempParcelFile:
        tempParcelFile.write(fileContent)
        tempParcelFileName = tempParcelFile.name

    # Log the first five records
    with DBF(tempParcelFileName) as dbfRecords:
        i = 0
        for record in dbfRecords:
            logging.info(record)
            i += 1
            if i >= 5:
                break

    return SUCCESS

ccParcelFileClient = getFileClient(stateabbr, webCounty)
result = parcelDBFDataSave(ccParcelFileClient)
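If the parcel files are large, a variation worth considering (again just a sketch, assuming the downloader returned by download_file() exposes a chunks() iterator as in the other Azure storage SDKs) streams the content into the temporary file instead of holding it all in memory with readall():

import tempfile

def parcelDBFToTempFile(parcelFileClient):
    # Stream the download chunk by chunk into a temporary file and return its name
    downloader = parcelFileClient.download_file()
    with tempfile.NamedTemporaryFile(delete=False, suffix=".dbf") as tempParcelFile:
        for chunk in downloader.chunks():
            tempParcelFile.write(chunk)
        return tempParcelFile.name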