Hi.
I'm trying to benchmark how OFBiz handles large volumes of data. I created a Purchase Order with 1,000 products in it, and processing it took about 30 minutes. Can anyone suggest a way to shorten the time needed to process about a thousand items? Thanks.
- ian
On Tuesday 10 November 2009 at 19:58 +0800, ian tabangay wrote:
> I'm trying to benchmark how OFBiz handles large volumes of data. I created a Purchase Order with 1,000 products in it, and processing it took about 30 minutes. Can anyone suggest a way to shorten the time needed to process about a thousand items?

Hi,

Which OFBiz version/screen are you using?

I've been fighting with LookUpBulkAddSupplierProducts.groovy, used for purchase orders, for a few days, because it hasn't worked in trunk since r831676. Actually, I think it needs a complete rewrite, but that's quite difficult using the Delegator and XML forms. The best approach would be SQL row_number, but even if we could use it there would be some pagination problems, because the initial number of products is not the same as the final one… I'm working on it, but I won't be able to provide something quickly, so anybody else is welcome to give it a try.

Regards,
Matthieu.
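Matthieu's row_number idea can be sketched as a query wrapper. This is a hypothetical illustration, not OFBiz code: the class, method, table, and column names are all placeholders, and `ROW_NUMBER()` requires a database with window-function support.

```java
// Hypothetical sketch of ROW_NUMBER()-based pagination: wrap the base query
// in a window-function subselect so any row range can be addressed directly.
// Nothing here is actual OFBiz API; the SQL assumes window-function support.
public class RowNumberPager {
    public static String page(String baseSql, String orderByCol,
                              int firstRow, int lastRow) {
        return "SELECT * FROM ("
             + "SELECT t.*, ROW_NUMBER() OVER (ORDER BY " + orderByCol + ") AS rn"
             + " FROM (" + baseSql + ") t"
             + ") numbered WHERE rn BETWEEN " + firstRow + " AND " + lastRow;
    }
}
```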
Oh, actually I'm talking about processing (as in completing) an order with 1,000 items. I'm looking for ways to tweak OFBiz to make it faster. Has anyone had the same experience, maybe with a different document (Shipment, Returns, etc.)? I'd like to get some input. Currently I'm testing OFBiz with about 500 facilities, 1,500 suppliers, 18,000 products and 300 product categories. It's quite a load, so I'm checking whether OFBiz can handle it. Thanks for any input anyone can provide.

---
Ian Tabangay

On Tue, Nov 10, 2009 at 11:00 PM, Matthieu Bollot <[hidden email]> wrote:
> Hi,
> which ofbiz version/screen are you using ?
> [...]
Hi Ian

My guess is that because most people don't order 1,000 different products in a single purchase order, the code has never been optimized to deal with that scenario. The framework itself can certainly handle it, so it's really just a matter of finding the bottlenecks and improving the offending code. If you can locate the portions of code that are slowing things down, it will be easier to offer suggestions on how to improve them.

Regards
Scott

HotWax Media
http://www.hotwaxmedia.com

On 11/11/2009, at 5:16 PM, ian tabangay wrote:
> oh actually im talking about processing (as in completing) a order with 1000 items. Im looking for ways to tweek ofbiz to make it faster. [...]
Hi Scott,

No, it's not customers who are ordering; these are purchase orders from the 500 facilities (or stores). I think it's mostly the sheer number of inserts/updates required to complete the process. I wrote a similar application that performs the inserts/updates on the same entities directly against the database to complete a purchase order, and compared with how OFBiz does it, it wasn't significantly faster. Right now I'm finding ways to divide the work and/or move this load off the main OFBiz server to give way to the other processes that OFBiz will maintain.

What I want to try is to use OFBiz as a tool to manage inventories and sales for multiple stores: about 500 stores selling about 18,000 products, each doing at least 1 sales transaction (average of 4 line items) per minute. Stores make purchase orders every day for about 800 products each. You mentioned that OFBiz wasn't intended for this kind of use? What would you say could be manageable by OFBiz OOTB (or with minor changes)?

---
Ian Tabangay

On Wed, Nov 11, 2009 at 12:39 PM, Scott Gray <[hidden email]> wrote:
> My guess is that because most people don't order 1000 different products in a single purchase order the code has never been optimized to deal with that scenario. [...]
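Since the cost Ian describes is dominated by many small per-line writes, one generic mitigation is JDBC batching, so 1,000 order lines cost roughly 10 database round trips instead of 1,000. This is a hedged sketch, not how OFBiz's entity engine actually writes rows, and the table and column names are illustrative rather than the real OFBiz schema.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

// Sketch: batch the per-line INSERTs for a large purchase order so the
// database sees one round trip per batch instead of one per row.
// Table and column names are illustrative, not the actual OFBiz schema.
public class OrderItemBatchInsert {
    static final int BATCH_SIZE = 100;

    public static void insertItems(Connection conn, String orderId,
                                   List<String> productIds) throws SQLException {
        String sql = "INSERT INTO ORDER_ITEM (ORDER_ID, PRODUCT_ID) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String productId : productIds) {
                ps.setString(1, orderId);
                ps.setString(2, productId);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch(); // flush a full batch
                }
            }
            ps.executeBatch(); // flush the remainder
        }
    }
}
```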
Hi,

C-libre http://www.c-libre.net/control/main faced the same issue some years ago. Their clients (mid-sized agro-food enterprises) had such requirements for sales as well. They told me they rewrote this part and also refactored the promotions and price rules to speed them up. But unfortunately it's based on Neogia, so, at least for now (the Neogia team works on so-called OFBiz add-ons), no synergies are possible.

Jacques

From: "ian tabangay" <[hidden email]>
> No its not the customers who are ordering. These are purchase orders of the 500 facilities (or stores). [...]
Hi Ian,

I didn't say OFBiz wasn't intended to handle this sort of load, I just said that the purchase order code may not have been optimized for it. I have no doubt that OFBiz can handle what you're describing, but you may find portions of the code could be improved to speed things up a bit: using the cache where appropriate, using the entity iterator rather than retrieving full lists for large result sets, improving queries to make better use of indexes, etc. After that it really just comes down to your hardware.

Regards
Scott

On 11/11/2009, at 8:26 PM, ian tabangay wrote:
> No its not the customers who are ordering. These are purchase orders of the 500 facilities (or stores). [...]
Hi Jacques,

I think we're working along that path now. We already have some of the purchase orders being processed outside of OFBiz. Because of the significant size of the data, most of the orders are generated automatically anyway, so most of what OFBiz does for a user is provide a view of their data. Data is inserted using RPC calls to OFBiz instead of being created manually on the OFBiz web pages. The same is done for inserting sales from each store. While everything could work according to plan, my concern as a developer is to make updating OFBiz as seamless as possible.

---
Ian Tabangay

On Wed, Nov 11, 2009 at 3:59 PM, Jacques Le Roux <[hidden email]> wrote:
> C-libre http://www.c-libre.net/control/main faced the same issue some years ago. [...]
Hi Scott,

Actually, my main concern is more about the processing of a document, which usually involves a lot of inserts and updates at the very least. I think the bottleneck is mostly the database inserts/updates. Indeed, scaling up our hardware (plus a lot of database load balancing/clustering and indexes) could help speed up these processes.

As for querying large result sets, we had problems with the current implementation of the entity list iterator: the time taken to request a small batch of results grew linearly as the total number of rows grew into the hundreds of thousands. We modified the SQL processor to use LIMIT and OFFSET, making request time constant regardless of the number of rows in the entity. The limitation of using a "non-standard" SQL command was acceptable for us, since we wouldn't be changing our databases anyway.

---
Ian Tabangay

On Wed, Nov 11, 2009 at 4:02 PM, Scott Gray <[hidden email]> wrote:
> I didn't say OFBiz wasn't intended to handle this sort of load, I just said that the purchase order code may not have been optimized for it. [...]
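The LIMIT/OFFSET change Ian describes boils down to something like the following query rewrite. This is a hypothetical sketch (the actual modification was to OFBiz's SQL processor, which isn't shown here); LIMIT/OFFSET is MySQL/PostgreSQL-style syntax, not standard SQL.

```java
// Hypothetical sketch of LIMIT/OFFSET pagination: fetching page N costs
// roughly the same regardless of the table's total row count (index
// permitting). Not the actual change made to the OFBiz SQL processor.
public class PagedQueryBuilder {
    public static String withPage(String baseSql, int pageIndex, int pageSize) {
        if (pageIndex < 0 || pageSize <= 0) {
            throw new IllegalArgumentException("bad page arguments");
        }
        long offset = (long) pageIndex * pageSize;
        // LIMIT/OFFSET is non-standard but supported by MySQL and PostgreSQL.
        return baseSql + " LIMIT " + pageSize + " OFFSET " + offset;
    }
}
```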
Usually the reason for the slow query time when using the iterator is that your database isn't actually capable of providing cursors, so the entire result set is being loaded. For example, MySQL will only use a cursor if you use a forward-only result set and set your fetch size to Integer.MIN_VALUE (or whatever that constant is called). In cases where you don't have a cursor, using the entity iterator isn't much different from not using it.

Setting max rows is the main way to get around these sorts of problems. Aside from the non-standardness of using an offset, the only time I've really seen a good potential use for it is in list pagination, but even then, if a user is attempting to view page 1000 then I think it's more of a UI issue than anything else. IMO, if the user can quickly view any of the first 50-odd pages then that is perfectly fine.

Regards
Scott

On 11/11/2009, at 9:41 PM, ian tabangay wrote:
> Actually, my main concern is more on the processing of a document. This usually involves alot of inserts and updates at the very least. [...]
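Scott's MySQL point (forward-only, read-only result set plus an Integer.MIN_VALUE fetch size) looks roughly like this in plain JDBC. This is a sketch of MySQL Connector/J's documented streaming behavior, not OFBiz code, and the streaming hint is MySQL-specific; other drivers interpret fetch size differently.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Sketch: ask MySQL Connector/J to stream rows one at a time instead of
// buffering the whole result set in memory. The TYPE_FORWARD_ONLY /
// CONCUR_READ_ONLY / Integer.MIN_VALUE combination is MySQL-specific.
public class StreamingStatements {
    public static Statement streaming(Connection conn) throws SQLException {
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        stmt.setFetchSize(Integer.MIN_VALUE); // MySQL's row-by-row streaming hint
        return stmt;
    }
}
```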
Hi Scott,

Thanks, I'll look into those as well. And I agree with your second point. I don't show the whole list, since that would easily result in an OutOfMemoryException for our customized UI; I usually show items in batches of 50. Thanks for all your input.

---
Ian Tabangay

On Wed, Nov 11, 2009 at 4:55 PM, Scott Gray <[hidden email]> wrote:
> Usually the reason for the slow query time when using the iterator is because your database isn't actually capable of providing cursors and it is actually the entire result set being loaded. [...]