Stuck at processing large amount of data

1. 7 months ago

aeon
Member
Registered: 2012-05-03
Posts: 6

Stuck at processing large amount of data

Hello everyone!

Before we start, sorry for my English…

Okay, we have a table with a large amount of data stored in the database (>15,000 rows).
We don't have much memory ⇒ we can't fetch all 15,000 rows at once. The solution is to retrieve part of the data (for example 1,000 rows), process it, and then move on to the next 1,000 rows.

The question is: how do I free the memory used by the previous 1,000 rows once they have been processed?

Example of code:

$offset = 0;
$result = 0; // initialize the accumulator before the loop
while (true) {
    // fetch the next batch of 1,000 rows
    $data = $this->context->createUnderpants()->limit(1000, $offset);
    if (!$data->count()) {
        break; // no rows left
    }
    $result += processUnderpants($data);
    unset($data); // maybe Nette has an equivalent method/function/whatever?
    $offset += 1000;
}
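As far as I know there is no Nette-specific "free" call; in plain PHP, dropping the last reference with `unset()` (plus an explicit GC cycle for reference cycles) is what releases a batch. A minimal, database-free sketch that makes this measurable with `memory_get_usage()` (the `$batch` array is just a stand-in for fetched rows):

```php
<?php
// Minimal sketch, plain PHP, no Nette API involved: unset() drops the
// only reference to the batch, gc_collect_cycles() sweeps any cycles,
// and memory_get_usage() lets us verify the memory actually came back.
$before = memory_get_usage();

$batch = range(1, 100000);   // stand-in for a batch of fetched rows
$peak  = memory_get_usage(); // usage while the batch is alive

unset($batch);               // drop the last reference
gc_collect_cycles();         // collect possible reference cycles

$after = memory_get_usage(); // should be back near $before
```

Note that if anything else still references the result set (a cached property, a logged query object), `unset($data)` alone will not free it.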

Technical information:

  • Nette 2.0.10 (released 2013-03-08)
  • MySQL 5.5.29
  • PHP 5.3.6

Last edited by aeon (2013-09-10 22:24)

 

2. 7 months ago

mkoubik
Member
Registered: 2009-04-28
Location: Praha
Posts: 559

Re: Stuck at processing large amount of data

Are you doing this from command line? If so I would use a separate process for each batch.
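To illustrate the idea: a parent script can spawn a fresh PHP process per batch, so each batch's memory is returned to the OS when its process exits. This is only a sketch — the inline `php -r` worker here just echoes its offset and is a stand-in for a real batch script that would accept offset/limit arguments and query the database itself:

```php
<?php
// Hedged sketch of the one-process-per-batch approach. Each batch runs
// in its own PHP process; the parent only aggregates the small results,
// so the parent's memory stays flat regardless of batch size.
$total = 0;
for ($offset = 0; $offset < 3000; $offset += 1000) {
    // In real code this would be e.g. `php batch.php --offset=$offset`;
    // here the worker simply prints the offset it was given.
    $cmd = sprintf('php -r %s', escapeshellarg("echo $offset;"));
    $out = shell_exec($cmd);
    $total += (int) $out;    // aggregate per-batch results in the parent
}
```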

 

3. 7 months ago

aeon
Member
Registered: 2012-05-03
Posts: 6

Re: Stuck at processing large amount of data

Unfortunately not, because the query must be modified between batches.

 

4. 7 months ago

Jan Tvrdík
Nette guru
Registered: 2008-04-13
Location: Prostějov
Posts: 1883

Re: Stuck at processing large amount of data

@aeon: This is probably caused by a known PDO bug, fixing it however required significant changes with BC breaks to Nette\Database, therefore the fix was not ported to 2.0.x. Try using the latest development version and it should be fine.
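For context on why statements can pin memory even after you are "done" with them: in plain PDO, a statement that has not been fully fetched keeps its result resources alive until `closeCursor()` is called or the statement is destroyed. A runnable sketch (SQLite in-memory is used here only so it runs anywhere; the thread's setup is MySQL, where buffered results make the effect much more visible):

```php
<?php
// Plain-PDO illustration, not the Nette fix itself: releasing a
// partially-read result set explicitly with closeCursor().
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE t (id INTEGER)');
for ($i = 1; $i <= 5; $i++) {
    $pdo->exec("INSERT INTO t VALUES ($i)");
}

$stmt  = $pdo->query('SELECT id FROM t');
$first = $stmt->fetch(PDO::FETCH_ASSOC); // read only part of the result
$stmt->closeCursor();                    // release the rest explicitly
```

With the MySQL driver specifically, `PDO::MYSQL_ATTR_USE_BUFFERED_QUERY` controls whether the whole result set is copied into PHP memory up front; that is the buffering behaviour the bug report revolves around.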


 
