Is it possible to validate a text file before I dump its data into a MySQL database?
I want to check if it contains, say, 5 columns (of data). If so, then I go ahead with the following query:
LOAD DATA CONCURRENT INFILE 'c:/test/test.txt' INTO TABLE DUMP_TABLE FIELDS TERMINATED BY '\t' ENCLOSED BY '' LINES TERMINATED BY '\n' IGNORE 1 LINES;
If not, I remove the entire row. I repeat this process for all rows in the txt file.
The text file contains data of the format:
id col2 col3 2012-07-27-19:27:06 col5
id col2 col3 2012-07-25-09:58:50 col5
id col2 col3 2012-07-23-10:14:13 col5
Answer
EDIT: After reading your comments, here’s the code for doing the same on tab separated data:
$handler = fopen("myfile.txt", "r");
$max_cols = 5; // maximum number of columns allowed per line
$error = false;
while (!feof($handler)) {
    $linetocheck = fgets($handler);
    $cols = explode(chr(9), $linetocheck); // chr(9) is the tab character
    // edit: using http://es.php.net/manual/en/function.fgetcsv.php you can get
    // the same result as with fgets+explode
    if (count($cols) > $max_cols) {
        $error = true;
        break;
    }
}
fclose($handler);
if (!$error) {
    //...do stuff
}
The code below reads a file, let's say "myfile.txt", line by line, and sets the variable $error to true if any of the lines is longer than $max_cols characters. (My apologies if that's not what you're asking; your question is not entirely clear to me.)
$handler = fopen("myfile.txt", "r");
// $max_cols must be set beforehand: here it is the maximum allowed line length in characters
$error = false;
while (!feof($handler)) {
    $linetocheck = fgets($handler);
    if (strlen($linetocheck) > $max_cols) {
        $error = true;
        break;
    }
}
fclose($handler);
if (!$error) {
    //...do stuff
}
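Following the fgetcsv suggestion in the comment above, here is a minimal sketch of the same validation done with fgetcsv directly, checking for exactly 5 columns per row as the question asks. The sample file contents written at the top are hypothetical, just so the snippet runs on its own; in practice you would point it at your real tab-separated file.

```php
<?php
// Hypothetical sample data (tab-separated, 5 columns per row), written out
// only to make this sketch self-contained.
file_put_contents("myfile.txt",
    "id\tcol2\tcol3\t2012-07-27-19:27:06\tcol5\n" .
    "id\tcol2\tcol3\t2012-07-25-09:58:50\tcol5\n");

$expected_cols = 5; // the column count the question requires
$error = false;

$handler = fopen("myfile.txt", "r");
// fgetcsv splits each line on the given separator (tab here) and
// returns false at end of file.
while (($cols = fgetcsv($handler, 0, "\t")) !== false) {
    if (count($cols) !== $expected_cols) {
        $error = true;
        break;
    }
}
fclose($handler);

if (!$error) {
    // ...safe to run the LOAD DATA CONCURRENT INFILE query here
}
```

Note this rejects rows with too few columns as well as too many, which the count($cols) > $max_cols version does not.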