Stuck at processing large amount of data

aeon
Member | 6

Hello everyone!

Before we start, sorry for my English…

Okay, we have a table with a large amount of data stored in the database (>15,000 rows).
We don't have much memory, so we can't fetch all 15,000 rows at once. The solution is to retrieve a chunk of the data (for example 1,000 rows), process it, and then continue with the next 1,000 rows.

The question is: how do we free the memory used by the previous 1,000 rows once they have been processed?

Example of code:

$offset = 0;
while (true) {
	$data = $this->context->createUnderpants()->limit(1000, $offset);
	if (!$data->count()) {
		break;
	}
	$result += processUnderpants($data);
	unset($data); // does Nette have an equivalent method/function for this?
	$offset += 1000;
}
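For the plain-PHP side of the question: `unset()` (or reassigning the variable) is enough, as long as nothing else holds a reference to the batch. A minimal runnable sketch of the pattern, with `fetchBatch()` as a hypothetical stand-in for the real `LIMIT`/`OFFSET` query and `array_sum()` standing in for `processUnderpants()`:

```php
<?php
// Minimal sketch (plain PHP, no database): process rows in fixed-size
// batches so only one batch is held in memory at a time.
// fetchBatch() is a stand-in for the real query with LIMIT/OFFSET.
function fetchBatch(int $offset, int $limit): array
{
    $total = 15000;                        // pretend table size
    if ($offset >= $total) {
        return [];
    }
    return range($offset, min($offset + $limit, $total) - 1);
}

$offset = 0;
$result = 0;
while (true) {
    $batch = fetchBatch($offset, 1000);
    if (count($batch) === 0) {
        break;
    }
    $result += array_sum($batch);          // stand-in for processUnderpants()
    unset($batch);                         // drop the only reference; PHP frees it
    $offset += 1000;
}
```

After the loop, `$result` holds the sum of 0…14999, and peak memory is bounded by one 1,000-element batch rather than the whole table.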

Technical information:

  • Nette 2.0.10 released on 2013–03–08.
  • MySQL 5.5.29
  • PHP 5.3.6

Last edited by aeon (2013-09-10 22:24)

mkoubik
Member | 728

Are you doing this from the command line? If so, I would use a separate process for each batch.
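The idea being that when a child process exits, all memory it used goes back to the OS, so no leak in one batch can accumulate. A hedged sketch of that approach (the inline `php -r` command is a stand-in for a real worker script that would run the query with the given offset; the row count of 5,000 is assumed for the demo):

```php
<?php
// Sketch: parent loops over offsets, running each batch in a fresh PHP
// process. Each child's memory is fully released when it exits.
$total  = 0;
$offset = 0;
$limit  = 1000;
while ($offset < 5000) {                   // assumed row count for the demo
    // Stand-in worker: sums the batch's range and prints the result.
    $cmd = sprintf(
        'php -r %s',
        escapeshellarg("echo array_sum(range($offset, $offset + $limit - 1));")
    );
    $out = shell_exec($cmd);               // one child process per batch
    if ($out === null || $out === '') {
        break;                             // child failed; stop
    }
    $total  += (int) $out;
    $offset += $limit;
}
```

In a real setup the child would be a dedicated script receiving the offset via `argv`, and the parent could also be a shell loop or a cron job instead of PHP.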

aeon
Member | 6

Unfortunately, no, because the query has to be modified.

Jan Tvrdík
Nette guru | 2595

@aeon: This is probably caused by a known PDO bug. Fixing it, however, required significant changes to Nette\Database with BC breaks, so the fix was not backported to 2.0.x. Try using the latest development version and it should be fine.
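For context on the buffering issue Jan refers to: with plain PDO, the pdo_mysql driver buffers the entire result set client-side by default, so even a row-by-row loop holds all rows in PHP memory. A hedged illustration with buffering turned off so rows are streamed instead (the DSN, credentials, and table name are placeholders):

```php
<?php
// Illustration only: disable pdo_mysql's client-side result buffering
// so rows are streamed from the server one at a time.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'password');
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT * FROM underpants');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process one row; only this row is in PHP memory
}
$stmt->closeCursor(); // must finish/close before running another query on this connection
```

The trade-off is that the connection is tied up until the result set is fully read or the cursor is closed, which is why buffered mode is the default.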