
I was testing a pipeline/data flow that queries an Azure Cosmos DB and writes some of that data into an Azure SQL Database table. Before it could run, I came across this validation error:

The following columns have a complex structure which can only be written to Snowflake, REST, ORC, JSON, AVRO and Azure Cosmos DB: (list of columns)

I don't care about these columns because they don't even exist in my SQL table, so they would never have been inserted anyway, but Azure Data Factory forces me to resolve this before the data flow will run. (Also, just to note, I need this data flow to be dynamic, so I can't simply pick the exact columns I want; it's being set up to handle different queries/columns.) My short-term workaround has been a rule-based mapping in my Select transformation, along the lines of name != 'column1' && name != 'column2', and so on (sketched below). But is there a way to always exclude a column when it is complex? I'd hate to have to do this for every complex column in every Cosmos query.
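For reference, this is roughly what that rule-based mapping condition looks like in the Select transformation; 'column1' and 'column2' are just placeholders for the complex Cosmos columns I want dropped before the Azure SQL sink:

    name != 'column1' && name != 'column2'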

  • This is not supported by mapping data flow. But you can set those columns to be ignored when you query the source: rather than putting a bunch of rules in the mapping data flow, create a data source, alter the queries in the different pipelines, and have those pipelines use your generic data flow (see the sketch below). Commented Aug 29 at 13:34
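To illustrate that suggestion: instead of selecting everything from Cosmos DB, each pipeline's source query can project only the scalar columns the SQL sink needs, so the complex columns never enter the data flow. A minimal sketch, assuming hypothetical column names (c.id, c.customerName, c.orderDate) that would vary per query:

    SELECT c.id, c.customerName, c.orderDate
    FROM c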
