SQL import command

I am having trouble writing an import query that splits the data across multiple tables. Here is my original data in a CSV file.

Original Table `survey.csv`

id   |date      |race   |age   |other data    
--------------------------------------------
001  |2022-02-05|White  |32    |data123
002  |2022-01-16|Asian  |28    |data234
003  |2022-01-25|Asian  |24    |data462
004  |2022-01-25|Black  |28    |data346
005  |2022-03-25|Other  |28    |data346

I know the import script for the table is

LOAD DATA INFILE '~/Documents/survey.csv' 
INTO TABLE survey 
FIELDS TERMINATED BY ',' 
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

but I am redesigning the table into following three

---race--- 

id   |short_name
--------------------
1    |White
2    |Black
3    |Asian
4    |Other

---imported `survey` table---

id   |date      |race_id|age   
-------------------------------
001  |2022-02-05|1      |32    
002  |2022-01-16|3      |28    
003  |2022-01-25|3      |24    
004  |2022-01-25|2      |28    
005  |2022-03-25|4      |28    


---data---

id   |other data 
-----------------
001  |data123
002  |data234
003  |data462
004  |data346
005  |data346

What is a good script to import the data the proper way? I tried several ways and wasn't successful.



Solution 1:[1]

The LOAD DATA INFILE command only imports into a single table per statement. You can't split the data into multiple tables in a single call to that command.

One solution is to import the data into a single staging table, then run a series of SQL statements to refactor the data into the three target tables.
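A minimal sketch of that approach, assuming MySQL, a staging table named `survey_raw` (the name and column definitions are illustrative, not from the question), and that `race.id` is an AUTO_INCREMENT primary key:

```sql
-- Staging table matching the CSV layout (assumed column types).
CREATE TABLE survey_raw (
  id         INT,
  date       DATE,
  race       VARCHAR(20),
  age        INT,
  other_data VARCHAR(100)
);

-- Load the whole file into the staging table first.
LOAD DATA INFILE '~/Documents/survey.csv'
INTO TABLE survey_raw
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;

-- Build the lookup table from the distinct race values.
INSERT INTO race (short_name)
SELECT DISTINCT race FROM survey_raw;

-- Fill the normalized survey table, joining to resolve each race_id.
INSERT INTO survey (id, date, race_id, age)
SELECT s.id, s.date, r.id, s.age
FROM survey_raw AS s
JOIN race AS r ON r.short_name = s.race;

-- Move the remaining columns into the data table.
INSERT INTO data (id, other_data)
SELECT id, other_data FROM survey_raw;

-- The staging table is no longer needed.
DROP TABLE survey_raw;
```

Note that `LOAD DATA INFILE` reads the file on the server and does not expand `~`; you may need an absolute path, or `LOAD DATA LOCAL INFILE` to read from the client.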

A different solution is to split the file into three separate CSV files, then import each one into its respective table.
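With that approach you would run one `LOAD DATA INFILE` per file; the file names here are assumptions about how you might split the original CSV:

```sql
-- One import statement per pre-split file.
LOAD DATA INFILE '/path/to/race.csv'
INTO TABLE race
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 ROWS;

LOAD DATA INFILE '/path/to/survey.csv'
INTO TABLE survey
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 ROWS;

LOAD DATA INFILE '/path/to/data.csv'
INTO TABLE data
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n' IGNORE 1 ROWS;
```

The downside is that you have to do the splitting yourself, including assigning consistent `race_id` values, before anything reaches the database.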

A third solution would be to write a custom program that imports the data row by row, splitting fields across tables as needed and re-using entries created by earlier rows. As far as I know, there is no tool that does this automatically; you would have to write it yourself. Pretty much every programming language has a library for reading a CSV file and looping over its rows. Then it's up to you to write the logic that routes each field to the right table.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Bill Karwin