I've got a huge XML file, a 442 MB dump, that I want to export into a MySQL database. What I'm wondering is: does PHP open the whole file when reading it, or does it only read what it needs as it goes? I want to export it, but I'm afraid my computer wouldn't be able to open a file that size. Time doesn't matter; I can leave my computer on. So is this the right way to do it, or is there a better way? Also, I don't want to export all of the XML, only parts of it, and I want to be able to edit what goes into the MySQL database. Thanks.
I suspect that the tree-based XML functions (SimpleXML, DOMDocument) are going to end up loading everything into memory.
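One built-in exception worth mentioning is XMLReader, which is a pull parser: it walks the file node by node instead of loading the whole tree. A minimal sketch, assuming the dump wraps each row in a <record> element (all of the element names here are made up, adjust them to the real dump):

// Minimal sketch using XMLReader, PHP's built-in streaming (pull) parser.
// <record>, <tag1>, and <tag2> are assumed element names.
$reader = new XMLReader();
$reader->open('xmlfile') or die('XMLReader::open failed');

while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'record') {
        // Expand only this one record into a DOM fragment; the rest of
        // the 442 MB file is never held in memory at once.
        $doc = new DOMDocument();
        $record = simplexml_import_dom($doc->importNode($reader->expand(), true));
        $tag1 = (string) $record->tag1;
        $tag2 = (string) $record->tag2;
        // ...filter/edit the values and run the INSERT here...
    }
}
$reader->close();

If you'd rather not depend on that, the line-by-line approach below works too, as long as the dump puts each element on its own line.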
You might be able to do something like use fopen() and fgets() to read one line at a time. You then need logic to parse each line, save the values to be used in the SQL, and execute the current query whenever you find an end-of-record element; roughly something like:
$fh = fopen('xmlfile', 'r') or die('fopen error');
$queryData = array();
while (($line = fgets($fh)) !== FALSE) {
    if (strpos($line, '</record>') !== FALSE) { // end-of-record element; adjust the tag name
        // build query from data saved in $queryData, then execute it
        $queryData = array(); // reinitialize for the next record
    } elseif (preg_match('#<tag1>([^<]*)</tag1>#', $line, $matches)) {
        $queryData['tag1'] = $matches[1];
    } elseif (preg_match('#<tag2>([^<]*)</tag2>#', $line, $matches)) {
        $queryData['tag2'] = $matches[1];
    }
    // etc. for each data element
}
fclose($fh);
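For the "build query from $queryData, then execute it" part, a prepared statement is the easy way to handle escaping, and it gives you a natural place to filter or edit each value before it goes into the database. A minimal sketch using PDO, where the DSN, credentials, table name, and column names are all placeholders:

// Minimal sketch of the "execute the current query" step using PDO.
// 'mydb', 'items', 'user', 'pass', and the column names are placeholders;
// swap in your real schema and credentials.
$pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO items (tag1, tag2) VALUES (?, ?)');

// Run this each time the end-of-record element is reached:
$stmt->execute(array(
    isset($queryData['tag1']) ? $queryData['tag1'] : null,
    isset($queryData['tag2']) ? $queryData['tag2'] : null,
));

Prepare the statement once before the read loop and just call execute() per record; on a 442 MB import that saves MySQL from re-parsing the same INSERT millions of times.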