Team,

We ran some performance tests. The aim was to compare an existing app built on our own framework with the same app deployed on the OFBiz framework. Our results are below. Do you think we are close to a benchmark result, or is there room for improvement?

What we need in OFBiz is a published "benchmark" performance figure, e.g. "on an XXX-type machine with this configuration (YY CPUs and ZZZ memory), expect performance of ABC transactions per second." It would help with internal selling. Just a thought.

Dear Uche,

Can we share our testing experience with each other in a way that benefits everyone? Could we add your code to the OFBiz tests in the sandbox?

Chand

===

We ported our app's C2C transfer service onto the OFBiz framework for comparison. We ran these tests on two production-grade HP servers (one database, one application server), running the same service on our existing framework and on the OFBiz framework on the same hardware, one after the other.

Attached is the performance test analysis, including the configuration and framework code changes made to achieve this.

----- Original Message -----
From: "Akotaobi, Uche" <[hidden email]>
To: <[hidden email]>
Sent: Wednesday, February 07, 2007 9:02 PM
Subject: OFBiz, CruiseControl, and JUnit reports

> Hi, everyone. I'd like to start off by saying that we've been very
> impressed with the OFBiz product in the brief time we've been using it.
>
> Our team is big on continuous integration and unit testing, so it was
> natural for us to write JUnit TestCases for most of the major
> functionality of our OFBiz hot-deploy component, and then add those
> TestCases directly to our <test-suite/> XML file. We could then run our
> unit tests by hand with "ant run-tests" from the ofbiz/ directory.
> Good, great.
>
> The problem we had was making sure that running the unit tests would
> generate XML report files suitable for merging via Ant's <junitreport/>
> task. It expected XML, and run-tests couldn't give us that XML, so the
> CruiseControl server couldn't really display a report of which of our
> tests passed or failed.
>
> Mind you, we solved the problem eventually, but our solution seems a
> tad... inelegant:
>
> - Drop Ant 1.7.0's ant-junit.jar directly inside framework/base/lib
>   and hope for the best.
> - Declare an
>   org.apache.tools.ant.taskdefs.optional.junit.JUnitXMLFormatter
>   directly inside TestRunContainer.java. Create JUnitTest objects for
>   each class being tested [1], runner.startTestSuite() on each of the
>   JUnitTests, and set the output to go to a FileOutputStream. Add each
>   formatter as a listener to the TestResult.
> - After the TestSuite run()s, tell each runner to endTestSuite() on
>   those JUnitTest objects in order to write the XML output files.
> - Add an extra rule to the main OFBiz build file to turn all this
>   stuff on.
>
> So now our CruiseControl runs "ant run-tests-xml" to give us the
> output we need, suppressing the normal OFBiz test result output in the
> process. It's rough around the edges, but it works.
>
> My question is just this: isn't there a better way? Surely I'm not the
> first person to want to build and test an OFBiz component with a
> continuous integration server? Those of you who have done this, how do
> you manage without XML output? Maybe there's some patch sitting in the
> bowels of Jira that solves this problem already?
>
> Thanks in advance for your responses!
>
> [1] This required further modifications to ModelTestSuite.java, since
> it's the last part of the test workflow that has direct access to
> those class names before they are converted to Test objects. Maybe I'm
> wrong, though.
>
> --
> Uche O. Akotaobi
> Workflow Engineer
> Xerox Corporation
> 701 South Aviation Blvd., ESAE-116
> El Segundo, CA 90245
> Phone (310) 333-2403  Internal 8*823-2403
> Fax (310) 333-8419
> [hidden email]
>
> XEROX
> Technology. Document Management. Consulting Services
>
> www.xerox.com
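[Editorial note on the question above: when tests run under Ant's own <junit> task rather than a custom runner, the XML reports come for free. A minimal sketch follows; the target, directory, and classpath names are invented for illustration, and the stock OFBiz run-tests target did not work this way at the time.]

```xml
<!-- Illustrative only: target, path, and classpath names are hypothetical. -->
<target name="run-tests-xml">
    <mkdir dir="build/test-reports"/>
    <junit printsummary="on" fork="yes" haltonfailure="no">
        <classpath refid="test.classpath"/>
        <!-- Emit one TEST-*.xml file per test class -->
        <formatter type="xml"/>
        <batchtest todir="build/test-reports">
            <fileset dir="build/classes" includes="**/*Test.class"/>
        </batchtest>
    </junit>
    <!-- Merge the TEST-*.xml files into one browsable report -->
    <junitreport todir="build/test-reports">
        <fileset dir="build/test-reports" includes="TEST-*.xml"/>
        <report format="frames" todir="build/test-reports/html"/>
    </junitreport>
</target>
```

CruiseControl can then merge the same TEST-*.xml files into its own build log.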
Team,

Resending, because the mailing list did not allow Word attachments.

Performance Test Report
=======================

OFBiz framework code changes
1. Commented out the LRU map entry in the service dispatcher (logService() method).
2. Changed synchronization in the GenericEntity class from method level to block level, to narrow its scope.
3. Commented out the entry written to the Visits table in the request handler class.
4. Changed the following classes to pass a connection from the service down to SQLProcessor:
   a. GenericDelegator
   b. GenericHelperDAO
   c. GenericHelper
   (This gets a connection from the pool and uses it throughout the request, rather than acquiring one each time a database transaction is made. It reduced the synchronized pool accesses from 58 per request -- there were 58 database interactions in the process -- to only 3.)

Configuration changes
1. Service input and output validation set to false in the service definition files.
2. All listeners (login event and control events) in web.xml commented out.
3. All pre- and post-processors in controller.xml commented out.
4. Cache expiry time in cache.properties set to infinite (=0).
5. Server stats disabled (serverstats.properties).
6. Polling of the job-scheduler thread pool disabled in the service engine configuration (poll-enabled="false").

Other changes
1. Decreased the number of services from 15 to 6.

Test 1 -- before the above changes
(load: 40 TPS, 10,000 requests, debug OFF on both)

                              Our original app    OFBiz
Application server CPU        40%                 95-100%
Database server CPU           30%                 80-90%
Database pool size            20                  50
Success rate                  100%                see remark
Minimum request time          152 ms              --
Maximum request time          726 ms              --

Remark: under OFBiz the behavior became unpredictable -- CPU was
consistently 95-100% utilized, request-processing time exceeded 30
seconds, and requests failed due to database transaction timeouts.
(Request-time buckets recorded: <300 ms, 300-500 ms, 500-800 ms.)

Test 2 -- after the above changes
(load: 40 TPS, 10,000 requests, debug OFF on both)

                              Our original app    OFBiz
Application server CPU        40%                 50%
Database server CPU           30%                 60%
Database pool size            20                  50
Success rate                  100%                100%
Minimum request time          152 ms              186 ms
Maximum request time          726 ms              893 ms

Remarks: the request count was also increased to 250,000 with the same
performance and results; 99% of transactions completed within 338 ms.

Test 3 -- after the above changes, at higher load
(load: 50 TPS, 10,000 requests, debug OFF on both)

                              Our original app    OFBiz
Application server CPU        50%                 97%
Database server CPU           40%                 85%
Database pool size            20                  50
Success rate                  100%                100%
Minimum request time          181 ms              210 ms
Maximum request time          1721 ms             more than 20 sec

Remark: the OFBiz run was stopped after 2,000 requests because CPU
utilization reached 97%.

----- Original Message -----
From: "Chandresh Turakhia" <[hidden email]>
To: <[hidden email]>; <[hidden email]>
Cc: <[hidden email]>
Sent: Wednesday, February 07, 2007 9:26 PM
Subject: Re: OFBiz, CruiseControl, and JUnit reports
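[Editorial note: the block-level synchronization change described in code change 2 of the report can be sketched as follows. The class and fields here are hypothetical stand-ins, not OFBiz's actual GenericEntity.]

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the GenericEntity change: the "before"
// getter serializes everything it does, while the "after" getter
// synchronizes only the block that actually touches shared state.
public class FieldStore {
    private final Map<String, Object> fields = new HashMap<>();

    // Before: method-level synchronization. The null check sits inside
    // the critical section even though it needs no lock.
    public synchronized Object getBefore(String name) {
        if (name == null) throw new IllegalArgumentException("name is null");
        return fields.get(name);
    }

    // After: block-level synchronization. Only the map read holds the
    // lock, shrinking the window in which other threads must wait.
    public Object getAfter(String name) {
        if (name == null) throw new IllegalArgumentException("name is null");
        synchronized (fields) {
            return fields.get(name);
        }
    }

    public void put(String name, Object value) {
        synchronized (fields) {
            fields.put(name, value);
        }
    }
    // Note: the two getters lock different monitors (this vs. fields);
    // they are shown side by side for comparison only. A real class
    // would pick a single locking scheme.
}
```

The payoff grows with the amount of non-shared work the method does outside the lock.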
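[Editorial note: code change 4 of the report, passing one pooled connection through the whole request, can be sketched as below. The pool and request classes are made up; the report's figure of 3 remaining synchronized accesses presumably covers code paths outside the main service, while this sketch only shows the per-query versus per-request difference.]

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of the delegator change: check one connection
// out of the pool per request and hand it to every query, instead of
// re-entering the synchronized pool checkout for each query.
public class RequestScopedConnection {

    // Stand-in for a connection pool whose checkout is synchronized.
    public static class Pool {
        public final AtomicInteger checkouts = new AtomicInteger();
        public synchronized Object getConnection() {
            checkouts.incrementAndGet();
            return new Object(); // stand-in for a java.sql.Connection
        }
    }

    // Before: every query acquires its own connection, so a request
    // with 58 queries enters the synchronized checkout 58 times.
    public static void runQueriesPerCall(Pool pool, int queries) {
        for (int i = 0; i < queries; i++) {
            Object conn = pool.getConnection();
            // ... execute one statement on conn, then return it ...
        }
    }

    // After: one checkout serves the whole request.
    public static void runQueriesShared(Pool pool, int queries) {
        Object conn = pool.getConnection();
        for (int i = 0; i < queries; i++) {
            // ... execute one statement on the shared conn ...
        }
    }

    public static void main(String[] args) {
        Pool before = new Pool();
        runQueriesPerCall(before, 58);
        Pool after = new Pool();
        runQueriesShared(after, 58);
        System.out.println(before.checkouts.get() + " vs " + after.checkouts.get()); // prints "58 vs 1"
    }
}
```

The trade-off is that the request holds one pooled connection for its whole lifetime, so the pool must be sized for concurrent requests rather than concurrent queries.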
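[Editorial note: the configuration changes in the report map onto settings like the following. These fragments are illustrative only -- the service name is invented, and attribute and property names should be checked against the OFBiz release in use.]

```
<!-- Service definition: skip IN/OUT parameter validation
     (service name "c2cTransfer" is hypothetical) -->
<service name="c2cTransfer" engine="java" validate="false"
         location="org.example.C2CServices" invoke="c2cTransfer">
    ...
</service>

<!-- Service engine thread pool: disable job-scheduler polling -->
<thread-pool ... poll-enabled="false" ... />

# cache.properties: never expire cached entries
# (the exact property keys vary by cache name)
default.expireTime=0
```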