I am writing a script in PHP to import data from a JSON file directly into a MySQL database. My sample data is pretty small at the moment. I have a few questions about it. I would like to do this without external packages. If you know of or have a code sample I could look at, that would be very helpful.
- How can I make it work with very large data sets? Break it up into chunks? Import it line by line? Use fopen?
- If the script gets interrupted for some reason during its run, is there a way to make sure it continues where it left off? How would you do this?
- If I want to remove duplicates with the same email, should I do it during the import or maybe afterwards with a SQL query?
- Would this work with XML and CSV as well?
DATA
[ { "id":1, "name":"Mike", "email":"mike@businessweek.com" }, { "id":2, "name":"Nick", "email":"nick@businessweek.com" } ]
```php
$jsondata = file_get_contents('data.json');
$data = json_decode($jsondata, true);

foreach ($data as $row) {
    $query = 'INSERT INTO `member` (`name`, `email`) VALUES (?, ?)';
    $statement = $pdo->prepare($query);
    $statement->execute([$row['name'], $row['email']]);
}
```
Answer
This will probably be closed, but to give you some information:
- The only way to break it up is if you had individual JSON objects per line, or otherwise grabbed every X lines (5 in your example). Then you could just `fopen` the file and read it line by line. You could also scan from `{` to `}` yourself, but that is not recommended. There is a sketch of the line-by-line approach after this list.
- The only way that I can think of is to use transactions, and/or track the total number of rows and decrement it on each insert in a separate DB table; see the checkpoint sketch below.
- If you import the entire file you can do it then, maybe with `array_column` to reindex on `email`, which will remove duplicates. If not, then it would have to be done afterwards with SQL. There is a short dedup sketch below.
- Yes, you would do it the same way; only parsing the data is different. CSV would be `fgetcsv` or `str_getcsv`, and XML would be SimpleXML or something similar. Once the data is in an array of arrays, the inserting is the same. A parsing sketch is at the end.
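A minimal sketch of the line-by-line approach, assuming you first convert the file to one JSON object per line (NDJSON); the file name `data.ndjson` and the connection details are placeholders:

```php
<?php
// Sketch: assumes data.ndjson holds one JSON object per line, e.g.
// {"id":1,"name":"Mike","email":"mike@businessweek.com"}
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$statement = $pdo->prepare('INSERT INTO `member` (`name`, `email`) VALUES (?, ?)');

$handle = fopen('data.ndjson', 'r');
while (($line = fgets($handle)) !== false) {
    $row = json_decode($line, true);
    if ($row === null) {
        continue; // skip blank or malformed lines
    }
    $statement->execute([$row['name'], $row['email']]);
}
fclose($handle);
```

Because only one line is in memory at a time, the file size no longer matters.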
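For resuming after an interruption, here is one possible sketch using batched transactions and a separate progress table; instead of a countdown it records the last imported line number, and the `import_progress` table, batch size, and connection details are all assumptions for illustration:

```php
<?php
// Hypothetical progress table:
//   CREATE TABLE import_progress (id INT PRIMARY KEY, last_line INT NOT NULL DEFAULT 0);
//   INSERT INTO import_progress (id, last_line) VALUES (1, 0);
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Where did the previous run stop?
$lastLine = (int) $pdo->query('SELECT last_line FROM import_progress WHERE id = 1')->fetchColumn();

$insert       = $pdo->prepare('INSERT INTO `member` (`name`, `email`) VALUES (?, ?)');
$saveProgress = $pdo->prepare('UPDATE import_progress SET last_line = ? WHERE id = 1');

$handle = fopen('data.ndjson', 'r');
$lineNo = 0;
$batch  = 0;

$pdo->beginTransaction();
while (($line = fgets($handle)) !== false) {
    $lineNo++;
    if ($lineNo <= $lastLine) {
        continue; // already imported by an earlier, interrupted run
    }
    $row = json_decode($line, true);
    if ($row !== null) {
        $insert->execute([$row['name'], $row['email']]);
    }
    if (++$batch >= 500) {
        // Commit the rows and the checkpoint together, so they can never disagree.
        $saveProgress->execute([$lineNo]);
        $pdo->commit();
        $pdo->beginTransaction();
        $batch = 0;
    }
}
$saveProgress->execute([$lineNo]);
$pdo->commit();
fclose($handle);
```

If the script dies mid-batch, the open transaction is rolled back, so the next run picks up exactly after the last committed checkpoint.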
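For deduplication during the import, `array_column` can reindex the decoded array on `email`, so later rows simply overwrite earlier ones with the same address:

```php
<?php
$data = json_decode(file_get_contents('data.json'), true);

// Reindex by email: rows that appear later overwrite earlier ones with the same email.
$deduped = array_column($data, null, 'email');

foreach ($deduped as $row) {
    // ...same prepared INSERT as in the question...
}
```

The SQL alternative would be a unique index on `email` (e.g. `ALTER TABLE member ADD UNIQUE (email)`) combined with `INSERT IGNORE` or `ON DUPLICATE KEY UPDATE`, so MySQL discards or merges duplicates at insert time.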
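And a rough sketch of getting CSV and XML into the same array-of-arrays shape; the file names, the column order, and the XML element names are assumptions:

```php
<?php
$rows = [];

// CSV: assumes no header row and the column order id,name,email
$handle = fopen('data.csv', 'r');
while (($fields = fgetcsv($handle)) !== false) {
    $rows[] = ['id' => $fields[0], 'name' => $fields[1], 'email' => $fields[2]];
}
fclose($handle);

// XML: assumes <members><member><id>..</id><name>..</name><email>..</email></member>...</members>
$xml = simplexml_load_file('data.xml');
foreach ($xml->member as $member) {
    $rows[] = [
        'id'    => (int) $member->id,
        'name'  => (string) $member->name,
        'email' => (string) $member->email,
    ];
}

// $rows is now the same array-of-arrays shape as the decoded JSON,
// so the prepared-statement insert loop stays unchanged.
```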