
I am working on a sensitive migration. The scenario is as follows:

  • I have a new table that I need to populate with data
  • There is an existing table with a column (type = json) that contains an array of objects such as:
[
 {
    "id": 0,
    "name": "custom-field-0",
    "label": "When is the deadline for a response?",
    "type": "Date",
    "options": "",
    "value": "2020-10-02",
    "index": 1
  },
  {
    "id": 1,
    "name": "custom-field-1",
    "label": "What territory does this relate to?",
    "type": "Dropdown",
    "options": "UK, DE, SE, DK, BE, NL, IT, FR, ES, AT, CH, NO, US, SG, Other",
    "value": " DE",
    "index": 2
  }
]

Essentially, I need to map the values in this column to my new table. I have worked with JSON data in PostgreSQL before, but only with a single object per row, never with arrays of objects at this scale.

So, just to summarise: how do I iterate over every row, and over every object in each row's array, and insert that data into a new table?
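
For reference, a rough sketch of the two tables involved (column names taken from the queries further down; the types are simplified guesses on my part):

CREATE TABLE task (
    id             serial PRIMARY KEY,
    form           integer,   -- assumed type
    workspace      integer,   -- assumed type
    "customFields" json       -- the array of objects shown above
);

CREATE TABLE form_field_value (
    "name"      text,
    "label"     text,
    "inputType" text,
    "options"   text,
    "form"      integer,      -- assumed type
    "workspace" integer       -- assumed type
);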

EDIT

I have been experimenting with some functions and found a pair that seem promising: json_array_elements_text and json_array_elements, as these allowed me to add multiple rows to the new table from this array of objects.
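
For example, this standalone query (hard-coded sample data, not my real table) shows what json_array_elements gives back:

-- Each element of the array becomes its own row, and ->> pulls out text values.
SELECT elem ->> 'name'  AS "name",
       elem ->> 'label' AS "label"
FROM json_array_elements(
       '[{"name": "custom-field-0", "label": "When is the deadline for a response?"},
         {"name": "custom-field-1", "label": "What territory does this relate to?"}]'::json
     ) AS t(elem);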

However, my issue is that I need to map certain values to the new table.

INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT <<HERE IS WHERE I NEED TO EXTRACT VALUES FROM THE JSON ARRAY>>, task.form, task.workspace
FROM task;

EDIT 2

I have been playing around some more with the above functions, but have run into a slight issue.

INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT cf ->> 'name',
       (cf ->> 'label')
       ...
FROM jsonb_array_elements(task."customFields") AS t(cf);

My issue lies in the FROM clause: customFields is the array of objects, but I also need to get the form and workspace attributes from that table too. Plus I am pretty sure that the FROM clause would not work anyway, as it will probably complain about task."customFields" not being specified or something.

2 Answers


Here is a select statement that uses json_array_elements and a lateral join in the from clause to flatten the data:

select j ->> 'name' as "name", j ->> 'label' as "label", 
  j ->> 'type' as "inputType", j ->> 'options' as "options", form, workspace
from task 
cross join lateral json_array_elements("customFields") as l(j);

The from clause can be less verbose:

from task, json_array_elements("customFields") as l(j)
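
To wire this into the insert from the question, it could look something like this (assuming the target columns you listed):

INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT j ->> 'name',
       j ->> 'label',
       j ->> 'type',
       j ->> 'options',
       form,
       workspace
FROM task
CROSS JOIN LATERAL json_array_elements("customFields") AS l(j);

Each element of "customFields" becomes one row in form_field_value, with form and workspace repeated for every element.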

1 Comment

Thanks for this answer. This is a huge migration, and I still have a lot of data manipulation to work through, but this was a huge roadblock for me. Appreciate the help.

You can try to use json_to_recordset:

select * from json_to_recordset('
[
 {
    "id": 0,
    "name": "custom-field-0",
    "label": "When is the deadline for a response?",
    "type": "Date",
    "options": "",
    "value": "2020-10-02",
    "index": 1
  },
  {
    "id": 1,
    "name": "custom-field-1",
    "label": "What territory does this relate to?",
    "type": "Dropdown",
    "options": "UK, DE, SE, DK, BE, NL, IT, FR, ES, AT, CH, NO, US, SG, Other",
    "value": " DE",
    "index": 2
  }
]
') as x(id int, name text, label text, type text, options text, value text, index int)

For inserting the records you can use SQL like this:

INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
SELECT name, label, type, options, form, workspace
FROM
  task,
  json_to_recordset(task."customFields") AS
  x(id int, name text, label text, type text, options text, value text, index int)
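
The comma in the from clause is an implicit lateral join; if you prefer to spell it out, the same insert could be written like this (still assuming the json column is called "customFields"):

INSERT INTO form_field_value ("name", "label", "inputType", "options", "form", "workspace")
-- json_to_recordset matches object keys to the column definition list by name;
-- keys missing from an object come back as NULL.
SELECT x.name, x.label, x.type, x.options, task.form, task.workspace
FROM task
CROSS JOIN LATERAL json_to_recordset(task."customFields")
  AS x(id int, name text, label text, type text, options text, value text, index int);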

