
UPDATE

The second after I posted this question, thanks to the syntax highlighting of the resulting queries, I saw what went wrong: the $pk string didn't open with the backtick needed to close off the preceding field name. I changed it to:

$pk = ',`%1$sUSER`,`%1$sTYPE`,`%1$sCODE`,`%1$sVALUE`,`%1$s';//WRONG
$pk = '`,`%1$sUSER`,`%1$sTYPE`,`%1$sCODE`,`%1$sVALUE`,`%1$s';//OK
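
To see the difference, here is a quick sketch (using a hypothetical prefix fld2_) of what the two versions expand to:

$pkA     = ' (`%1$sNAME`,`%1$sAGE';
$pkWrong = ',`%1$sUSER`,`%1$sTYPE`,`%1$sCODE`,`%1$sVALUE`,`%1$s';
$pkOk    = '`,`%1$sUSER`,`%1$sTYPE`,`%1$sCODE`,`%1$sVALUE`,`%1$s';

//without the opening backtick, `fld2_AGE is never closed and runs into the next name
echo sprintf($pkA.$pkWrong, 'fld2_')."\n"; // (`fld2_NAME`,`fld2_AGE,`fld2_USER`,...,`fld2_
//with it, every identifier is quoted properly
echo sprintf($pkA.$pkOk, 'fld2_')."\n";    // (`fld2_NAME`,`fld2_AGE`,`fld2_USER`,...,`fld2_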

However, this poses a new question: why didn't the PDO object spew errors back at me for this? Executing the query manually would surely return an error saying there is no field called fld2_AGE, (with the comma at the end). Why didn't I get any errors? Any ideas?

PS: any ideas how to upvote SO's syntax highlighting for solving my problem? :-)

I'm leaving the original question below as a reference/example, though it doesn't need solving anymore.


OK, I've been using a script I wrote a while back to generate several CSV files from a huge file, to quickly import the data into several MySQL tables. This script has worked fairly well in the past (I think), up until I wrote a second script in which I accidentally truncated my tables (stupid, I know). 'No biggie', I thought; since I still had the script, it would be a matter of seconds to restore my data. Unfortunately, I found that now only one file is being imported, and no errors are showing. Below I have pasted the entire DB section of the script. When I execute this code, all I get is the output 'files imported successfully', which is the very last line of code...

I know it's a fairly large block of code, with a lot of string formatting, which doesn't improve readability, so I've also provided the resulting query strings below. As far as I can tell, they look well formed, as do the files (I checked). Can anyone tell me where else I'm supposed to look for errors? It would be a great help... Thanks!

<?php
$files = array_fill_keys(array('filename1','filename2','filename3','filename4'),'');
//in the generating part of the script, $files holds the CSV file handles, keyed by file name:
//fputcsv($files['filename1'],array('values','from','other','files'),';');
$tbls = array_combine(array_keys($files),array('tblname1','tblname2','tblname3','tblname4'));
$path = dirname(__FILE__).'/';
$qf = 'LOAD DATA LOCAL INFILE \'%s%s.csv\' INTO TABLE my_db.tbl_prefix_%s FIELDS TERMINATED BY \';\' OPTIONALLY ENCLOSED BY \'"\' LINES TERMINATED BY \'\n\'';
$pref = array_combine(array_keys($files),array('fld1_','fld2_','fld3_','fld4_'));
$pkA = ' (`%1$sNAME`,`%1$sAGE';
$pk = '`,`%1$sUSER`,`%1$sTYPE`,`%1$sCODE`,`%1$sVALUE`,`%1$s';//the corrected version
try
{
    $db = new PDO('mysql:host=mysqlhostn','user','pass');
    foreach(array_keys($files) as $f)
    {
        $db->beginTransaction();
        $db->exec(sprintf('TRUNCATE TABLE my_db.tbl_prefix_%s',$tbls[$f]));
        $db->commit();
    }
}
catch(PDOException $e)
{
    if (isset($db))
    {
        $db->rollBack();
        $db = null;
    }
    die('DB connection/truncate failed: '.$e->getMessage()."\n");
}
try
{
    foreach(array_keys($files) as $f)
    {
        $db->beginTransaction();
        $q = sprintf($qf,$path,$f,$tbls[$f]).sprintf($pkA.($f !== 'filename1' ? $pk : ''),$pref[$f]);
        switch($f)
        {
            case 'filename3':
                $q .= 'tbl3_specific_field';
            break;
            case 'filename2':
                $q .= sprintf('tbl2_specific_field`,`%1$stbl2_specific_field2',$pref[$f]);
            break;
            case 'filename4':
                $q .= sprintf('tbl4_specific_field`,`%1$stbl4_specific_field2`,`%1$stbl4_specific_field3`,`%1$stbl4_specific_field4',$pref[$f]);
            break;
        }
        $stmt = $db->prepare($q.'`)');
        $stmt->execute();
        $db->commit();
    }
}
catch(PDOException $e)
{
    $db->rollBack();
    $msg = 'CSV import failed: '.$e->getMessage();
    $db = null;
    die($msg."\n");
}
$db = null;
exit('files imported successfully'."n");
?>
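
The query strings listed below can be captured by simply echoing each generated statement before preparing it, for example with a small (hypothetical) helper along these lines:

//hypothetical helper: print each generated query before executing it,
//so the exact SQL sent to MySQL can be inspected (and syntax-highlighted)
function run_logged($db, $query)
{
    echo $query."\n"; //or error_log($query), or write it to a file
    $stmt = $db->prepare($query);
    return $stmt->execute();
}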

Generated queries - execution output:

TRUNCATE TABLE my_db.tbl_prefix_tblname1
TRUNCATE TABLE my_db.tbl_prefix_tblname2
TRUNCATE TABLE my_db.tbl_prefix_tblname3
TRUNCATE TABLE my_db.tbl_prefix_tblname4

LOAD DATA LOCAL INFILE '/local/path/to/files/filename1.csv' INTO TABLE my_db.tbl_prefix_tblname1 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' (`fld1_NAME`,`fld1_AGE`)
LOAD DATA LOCAL INFILE '/local/path/to/files/filename2.csv' INTO TABLE my_db.tbl_prefix_tblname2 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' (`fld2_NAME`,`fld2_AGE,`fld2_USER`,`fld2_TYPE`,`fld2_CODE`,`fld2_VALUE`,`fld2_tbl2_specific_field`,`fld2_tbl2_specific_field2`)
LOAD DATA LOCAL INFILE '/local/path/to/files/filename3.csv' INTO TABLE my_db.tbl_prefix_tblname3 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' (`fld3_NAME`,`fld3_AGE,`fld3_USER`,`fld3_TYPE`,`fld3_CODE`,`fld3_VALUE`,`fld3_tbl3_specific_field`)
LOAD DATA LOCAL INFILE '/local/path/to/files/filename4.csv' INTO TABLE my_db.tbl_prefix_tblname4 FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' (`fld4_NAME`,`fld4_AGE,`fld4_USER`,`fld4_TYPE`,`fld4_CODE`,`fld4_VALUE`,`fld4_tbl4_specific_field`,`fld4_tbl4_specific_field2`,`fld4_tbl4_specific_field3`,`fld4_tbl4_specific_field4`)

files imported successfully

File 1 is getting imported exactly as I need it to be. An example line from the first file:

11;9

The content of file 2 (which isn't getting imported), on the other hand, looks like this:

11;9;25;5;FOOBAR;Z;333;321;123

Both first fields contain the same data, as they should; both tables have the same field definitions, the same storage engine (InnoDB), and the same collation (UTF-8)... I have no idea what's causing the problem, so any advice would be greatly appreciated.

Answers

why didn't the PDO object spew errors back at me for this?

Because MySQL executed your query without any errors. Just because the query isn't the one you intended doesn't mean it is so wrong that MySQL will refuse to accept it.

Whenever you generate SQL queries programmatically, verify (by debugging or, even better, unit tests) that the generated query actually does what you want.

If you want an exception to be thrown each time an error occurs, enable exception mode:

$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
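
Applied to the connection from the question, that could look something like the sketch below; with exception mode enabled, any statement MySQL actually rejects throws a PDOException and lands in the existing catch blocks instead of failing silently.

$db = new PDO('mysql:host=mysqlhostn','user','pass');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
//from here on, a failing exec()/prepare()/execute() throws a PDOException
//instead of just returning false
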
answered by Gersom

It seems that PDO::MYSQL_ATTR_FOUND_ROWS is a MySQL connection-level option, so it can only be set as a PDO connection option as well. Set it up this way:

$opt  = array(
    PDO::MYSQL_ATTR_FOUND_ROWS   => TRUE,
    // you may wish to set other options as well
    PDO::ATTR_ERRMODE            => PDO::ERRMODE_EXCEPTION,
    PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_ASSOC,
);
$this->_db = new PDO($dsn,DB_USER,DB_PASS,$opt);
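
A minimal sketch (hypothetical table t and column val, reusing the $dsn and $opt from above) of what the option changes: with MYSQL_ATTR_FOUND_ROWS enabled, rowCount() reports the rows an UPDATE matched rather than only the rows it actually changed.

$pdo = new PDO($dsn, DB_USER, DB_PASS, $opt);
$stmt = $pdo->prepare('UPDATE t SET val = :val WHERE id = :id');
$stmt->execute(array('val' => 'same value as before', 'id' => 1));
//1 with MYSQL_ATTR_FOUND_ROWS set (the row was matched),
//0 without it (nothing was actually changed)
echo $stmt->rowCount();
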
answered by shivam

This should do it:

file_names <- dir("~/Desktop/data") 
df <- do.call(rbind, lapply(file_names, function(x) cbind(read.csv(x), name = strsplit(x, '\\.')[[1]][1])))

answered by Isky

Your PDO is configured to emulate prepared queries, whereas mysqli is using true prepared queries.

The prepared query binds the string "'1'" (quote characters included) as an integer parameter value. PHP coerces it to an integer using something like intval(). Any string with non-numeric leading characters is interpreted as 0 by PHP, so the parameter value sent after prepare is the value 0.

The emulated prepared query uses string interpolation (instead of binding) to put the string "'1'" into the SQL query before MySQL parses it. But the result is similar, because SQL also treats a string with non-numeric leading characters as the value 0 in an integer context.

The only difference is what ends up in the general query log when the parameter is bound before prepare versus after prepare.

You can also make PDO use real prepared queries, so it should act just like mysqli in this case:

$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, false);
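
For illustration, a minimal sketch of the behaviour described above, with hypothetical table and column names; the comments restate the explanation rather than output from a real run.

$dbh = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$dbh->setAttribute(PDO::ATTR_EMULATE_PREPARES, false); //true prepared statements
$stmt = $dbh->prepare('SELECT * FROM users WHERE id = ?');
//"'1'" starts with a quote character, so bound as an integer it becomes 0
//and the query effectively runs as ... WHERE id = 0
$stmt->bindValue(1, "'1'", PDO::PARAM_INT);
$stmt->execute();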

PS: This may demonstrate a good reason why it's customary to start id values at 1 instead of 0.

answered by derobert

We can use map:

library(purrr)
library(readr)
out_list <- map(data, read_csv)
names(out_list) <- sub("\\.csv", "", data)
list2env(out_list, .GlobalEnv) # not recommended

Or, using a for loop: create a list of NULLs to store the output of the read data, then loop over 'data', read each file, and assign it to the corresponding element of the output list.

out_list <- vector('list', length(data))
names(out_list) <- data
for(file in data) out_list[[file]] <- read_csv(file)

If we want to create multiple new objects instead (not recommended):

for(file in data) {
    assign(sub("\\.csv", "", file), read_csv(file))
}

answered by Andrew