
Data Model Questions

DataObject::write() optimize memory usage



6 Posts   2308 Views

imsas

15 October 2009 at 8:03am Community Member, 22 Posts

The write() function doesn't free all of its memory when inserting large amounts of data into the database... Has anyone tried to solve this optimisation problem?

imsas

15 October 2009 at 9:43am Community Member, 22 Posts

I'm using a cURL connection to insert data into a catalogue, but I've found a problem in a CatalogData() function I wrote. When I insert ProductGroup() and Product() objects, memory usage looks fine (I can do up to ~2,000 inserts with memory usage around 32MB), but the trouble starts when I try to write ProductSpecification(), which Product has a has_many relation to. Each cycle of that takes ~3MB and I can't work out how to free the memory; it happens when I call the write function...

imsas

15 October 2009 at 10:25pm Community Member, 22 Posts

OK, I'm testing the object insertion. Here is the memory usage when executing only the controller and writing 726 ProductGroup() objects:

-[ 21.763416MB iteration start
-[ 71.006488MB iteration end

-[ 71.0268MB iteration start
-[ 77.80532MB iteration end

-[ 77.826208MB iteration start
-[ 79.442648MB iteration end

-[ 79.4634MB iteration start
-[ 93.414464MB iteration end

-[ 93.434848MB iteration start
-[ 95.415392MB iteration end

-[ 95.436168MB iteration start
-[ 95.559544MB iteration end

-[ 95.58028MB iteration start
-[ 95.604808MB iteration end

-[ 95.625592MB iteration start
-[ 96.061632MB iteration end

Does anyone have ideas on how to free the used memory?
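
For reference, a per-iteration log like the one above can be produced with PHP's memory_get_usage(); this is only a sketch, assuming the same $data array and ProductGroup class used elsewhere in this thread, and the loop body stands in for the real writes:

foreach ($data as $item) {
    // memory_get_usage() reports the script's current memory in bytes; convert to MB.
    echo '-[ ' . (memory_get_usage() / 1048576) . "MB iteration start\n";

    $catalog = new ProductGroup();
    $catalog->Title = $item['title'];
    $catalog->write();

    echo '-[ ' . (memory_get_usage() / 1048576) . "MB iteration end\n";
}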

dalesaurus

16 October 2009 at 5:26pm Community Member, 283 Posts

I think the easiest solution would be to break up your inserts (or just use plain SQL and mysql from the CLI).

If you have 5000 entries and you can enter 1000 at a time, break the data up into 5 pieces.
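
A minimal sketch of that batching idea, assuming the import rows are already in a PHP array shaped like the $data used elsewhere in this thread; the batch size of 1000 is arbitrary:

// Process the import in fixed-size batches rather than one huge run.
$batches = array_chunk($data, 1000);

foreach ($batches as $index => $batch) {
    foreach ($batch as $row) {
        $catalog = new ProductGroup();      // DataObject from the ecommerce module
        $catalog->Title = $row['title'];
        $catalog->write();
    }
    // Report memory after each batch so any growth is visible per chunk.
    echo 'Finished batch ' . $index . ', memory: ' . round(memory_get_usage() / 1048576, 2) . "MB\n";
}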

Ingo

17 October 2009 at 12:47pm Forum Moderator, 801 Posts

It's hard to give you any advice without reproducible code - I don't know what your CatalogData() method does, so I can't say anything about its memory usage.

imsas

23 October 2009 at 3:25am Community Member, 22 Posts

The problem is here:
$data contains ~10,000 entries. I'm using the simple ecommerce module. I think the write method is checking whether fields have changed, or something like that... When I debug, I can see memory usage increasing with each loop iteration...

foreach ($data as $item) {
    // Create and publish one ProductGroup per catalogue entry
    $catalog = new ProductGroup();
    $catalog->Title = $item['title'];
    ...
    $catalog->writeToStage('Stage');
    $catalog->publish('Stage', 'Live');

    // Create and publish each Product belonging to the group
    foreach ($item['Products'] as $elem) {
        $product = new Product();
        $product->Title = $elem['title'];   // product title comes from $elem, not $item
        ...
        $product->writeToStage('Stage');
        $product->publish('Stage', 'Live');
    }
}
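
One thing that might help is releasing each object explicitly once it has been published. This is only a sketch of the inner loop, assuming destroy() is available on DataObject in your SilverStripe version and that gc_collect_cycles() exists (PHP 5.3+):

foreach ($item['Products'] as $elem) {
    $product = new Product();
    $product->Title = $elem['title'];
    $product->writeToStage('Stage');
    $product->publish('Stage', 'Live');

    // Release the object once it has been published.
    $product->destroy();   // assumed to be available on DataObject in this version
    unset($product);
}

// Ask PHP to collect any remaining cyclic references (PHP 5.3+ only).
if (function_exists('gc_collect_cycles')) {
    gc_collect_cycles();
}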

DB::manipulate() is one of the fastest ways to insert data into the database. Maybe there's something I don't know...
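
For comparison, here is a minimal sketch of bypassing the ORM entirely and inserting rows with raw SQL via DB::query() (rather than building a manipulation array); the Product table and Title column are assumptions about the ecommerce module's schema:

foreach ($data as $item) {
    // Escape the value before interpolating it into the SQL statement.
    $title = Convert::raw2sql($item['title']);
    // Raw insert: skips Versioned staging, onBeforeWrite() hooks and the ORM cache.
    DB::query("INSERT INTO Product (Title) VALUES ('$title')");
}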