Enable JMeter batch runs (#196)

* Fix arg name in jmeter build.gradle

* Run ssh tunnels in daemon threads

* Tweak jmeter properties to ensure batch runs always exit

* Testplan for 40,000 test issuances plus a README explanation of the non-GUI run.
Christian Sailer 2017-12-14 17:01:57 +00:00 committed by GitHub
parent 158993edb1
commit 2f246afc28
5 changed files with 203 additions and 3 deletions


@@ -1,9 +1,12 @@
+## JMeter for controlling CORDA performance runs
 This module contains gradle tasks to make running the JMeter (http://jmeter.apache.org)
 load generation tool against Corda nodes much easier and more useful. It does this by
 providing a simple way to launch JMeter with the actual JMeter install coming
 from downloaded dependencies, and by providing some Samplers that interact with
 the Corda node via RPC.
+### Running via the interactive GUI
 To run up the JMeter UI, using the jmeter.properties in the resources folder,
 type the following:
@@ -32,6 +35,8 @@ Embedded in the JAR is all of the corda code for flows and RPC, as well as the j
 JAR will also include a properties file based on the hostname in the JMeter configuration,
 so we allocate different SSH tunneled port numbers this way.
+#### SSH Tunnels
 To launch JMeter with the tunnels automatically created:
 `./gradlew tools:jmeter:run -PjmeterHosts="['hostname1', 'hostname2']"`
@@ -56,6 +61,8 @@ can be used to set this, or in the gradle call:
 `./gradlew tools:jmeter:runSsh -PjmeterHosts="['hostname1', 'hostname2']" -PsshUser="'username'"`
+#### Running locally with driver
 To run up 3 nodes (2 nodes, 1 non-validating notary) locally for testing anything in the `perftestcordapp` (e.g. samplers,
 custom flows), you can use gradle to run:
@@ -64,3 +71,18 @@ custom flows), you can use gradle to run:
 This uses the driver test infrastructure to fire up the nodes. See `StartLocalPerfCorDapp` for X500 names of nodes,
 RPC user logins etc. The RPC port of Bank A is typically 10004, but they are all reporting in the console output. A
 sample JMeter config for this setup has been included as `LocalIssueAndPay Request.jmx` under resources.
+### Running in non-interactive test/batch mode
+To run JMeter in performance test mode, we want to run a predefined test plan without starting the UI and record the
+results in a CSV file. To do this, additional arguments need to be passed to JMeter. Using gradle, the command line
+would look something like this:
+```./gradlew tools:jmeter:run -PjmeterArgs="['-n', '-t', 'build/resources/main/Testplans/CashIssuance_40k.jmx', '-l', 'CashIssuance_40k.jtl', '-R', '127.0.0.1:20100']" -PjmeterHosts="['perf-node-4.corda.r3cev.com']"```
+The interesting part here is the `jmeterArgs` list:
+- `-n` tells JMeter to run non-interactively
+- `-t <filename>` loads the test plan to run
+- `-l <filename>` specifies the results file to write to (if it already exists, results will be appended)
+- `-R <hostname:port>` specifies the remote host to run against. Note that this is localhost here because we are using
+  SSH tunnels to reach the test nodes.
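
The `-l` results file is plain CSV, so it is easy to post-process. The sketch below is an editorial aside, not part of the change above: it summarises such a file in Kotlin, assuming JMeter's default CSV save format (header row, `elapsed` and `success` columns, no embedded commas in the fields read) and the `CashIssuance_40k.jtl` file name from the example command.

```kotlin
import java.io.File

// Minimal sketch: summarise a JMeter .jtl results file written with "-l".
// Assumes the default CSV save format: comma separated, header row containing
// "elapsed" and "success" columns, no embedded commas in the fields we read.
fun main() {
    val lines = File("CashIssuance_40k.jtl").readLines().filter { it.isNotBlank() }
    val header = lines.first().split(",")
    val elapsedIdx = header.indexOf("elapsed")   // sample duration in ms
    val successIdx = header.indexOf("success")   // "true" / "false"

    val samples = lines.drop(1).map { it.split(",") }
    val elapsed = samples.map { it[elapsedIdx].toLong() }
    val failures = samples.count { it[successIdx] != "true" }

    println("samples:      ${samples.size}")
    println("failures:     $failures")
    println("mean elapsed: ${"%.1f".format(elapsed.average())} ms")
    println("max elapsed:  ${elapsed.maxOrNull()} ms")
}
```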


@@ -100,7 +100,7 @@ run {
     args+= [ "-p", sourceSets.main.resources.getSrcDirs().first().getPath()+"/jmeter.properties",
              "-d", sourceSets.main.resources.getSrcDirs().first().getPath() ]
     if ( project.hasProperty("jmeterArgs") ) {
-        args+= Eval.me(jmeterHosts)
+        args+= Eval.me(jmeterArgs)
     }
     if ( project.hasProperty("sshUser") ){
         args+= "-XsshUser"


@@ -113,6 +113,7 @@ class Ssh {
         val session = jSch.getSession(remoteUserName, remoteHost, 22)
         // We don't check the host fingerprints because they may change often
         session.setConfig("StrictHostKeyChecking", "no")
+        session.setDaemonThread(true)
         log.info("Connecting to $remoteHost...")
         session.connect()
         log.info("Connected to $remoteHost!")


@@ -0,0 +1,177 @@
<?xml version="1.0" encoding="UTF-8"?>
<jmeterTestPlan version="1.2" properties="3.2" jmeter="3.3 r1808647">
<hashTree>
<TestPlan guiclass="TestPlanGui" testclass="TestPlan" testname="Test Plan" enabled="true">
<stringProp name="TestPlan.comments"></stringProp>
<boolProp name="TestPlan.functional_mode">false</boolProp>
<boolProp name="TestPlan.serialize_threadgroups">false</boolProp>
<elementProp name="TestPlan.user_defined_variables" elementType="Arguments" guiclass="ArgumentsPanel" testclass="Arguments" testname="User Defined Variables" enabled="true">
<collectionProp name="Arguments.arguments"/>
</elementProp>
<stringProp name="TestPlan.user_define_classpath"></stringProp>
</TestPlan>
<hashTree>
<ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="Thread Group" enabled="true">
<stringProp name="ThreadGroup.on_sample_error">continue</stringProp>
<elementProp name="ThreadGroup.main_controller" elementType="LoopController" guiclass="LoopControlPanel" testclass="LoopController" testname="Loop Controller" enabled="true">
<boolProp name="LoopController.continue_forever">false</boolProp>
<stringProp name="LoopController.loops">40000</stringProp>
</elementProp>
<stringProp name="ThreadGroup.num_threads">3</stringProp>
<stringProp name="ThreadGroup.ramp_time"></stringProp>
<longProp name="ThreadGroup.start_time">1509455820000</longProp>
<longProp name="ThreadGroup.end_time">1509455820000</longProp>
<boolProp name="ThreadGroup.scheduler">false</boolProp>
<stringProp name="ThreadGroup.duration"></stringProp>
<stringProp name="ThreadGroup.delay"></stringProp>
</ThreadGroup>
<hashTree>
<JavaSampler guiclass="JavaTestSamplerGui" testclass="JavaSampler" testname="Cash Issue Request" enabled="true">
<elementProp name="arguments" elementType="Arguments" guiclass="ArgumentsPanel" testclass="Arguments" enabled="true">
<collectionProp name="Arguments.arguments">
<elementProp name="host" elementType="Argument">
<stringProp name="Argument.name">host</stringProp>
<stringProp name="Argument.value">localhost</stringProp>
<stringProp name="Argument.metadata">=</stringProp>
</elementProp>
<elementProp name="port" elementType="Argument">
<stringProp name="Argument.name">port</stringProp>
<stringProp name="Argument.value">10003</stringProp>
<stringProp name="Argument.metadata">=</stringProp>
</elementProp>
<elementProp name="username" elementType="Argument">
<stringProp name="Argument.name">username</stringProp>
<stringProp name="Argument.value">corda</stringProp>
<stringProp name="Argument.metadata">=</stringProp>
</elementProp>
<elementProp name="password" elementType="Argument">
<stringProp name="Argument.name">password</stringProp>
<stringProp name="Argument.value">corda_is_awesome</stringProp>
<stringProp name="Argument.metadata">=</stringProp>
</elementProp>
<elementProp name="notaryName" elementType="Argument">
<stringProp name="Argument.name">notaryName</stringProp>
<stringProp name="Argument.value">O=Perf-10.155.0.8,OU=Corda,L=London,C=GB,CN=corda.notary.simple</stringProp>
<stringProp name="Argument.metadata">=</stringProp>
</elementProp>
</collectionProp>
</elementProp>
<stringProp name="classname">com.r3.corda.jmeter.CashIssueSampler</stringProp>
</JavaSampler>
<hashTree/>
</hashTree>
<ResultCollector guiclass="StatGraphVisualizer" testclass="ResultCollector" testname="Aggregate Graph" enabled="true">
<boolProp name="ResultCollector.error_logging">false</boolProp>
<objProp>
<name>saveConfig</name>
<value class="SampleSaveConfiguration">
<time>true</time>
<latency>true</latency>
<timestamp>true</timestamp>
<success>true</success>
<label>true</label>
<code>true</code>
<message>true</message>
<threadName>true</threadName>
<dataType>true</dataType>
<encoding>false</encoding>
<assertions>true</assertions>
<subresults>true</subresults>
<responseData>false</responseData>
<samplerData>false</samplerData>
<xml>false</xml>
<fieldNames>true</fieldNames>
<responseHeaders>false</responseHeaders>
<requestHeaders>false</requestHeaders>
<responseDataOnError>false</responseDataOnError>
<saveAssertionResultsFailureMessage>true</saveAssertionResultsFailureMessage>
<assertionsResultsToSave>0</assertionsResultsToSave>
<bytes>true</bytes>
<sentBytes>true</sentBytes>
<threadCounts>true</threadCounts>
<idleTime>true</idleTime>
<connectTime>true</connectTime>
</value>
</objProp>
<stringProp name="filename"></stringProp>
</ResultCollector>
<hashTree/>
<ResultCollector guiclass="GraphVisualizer" testclass="ResultCollector" testname="Graph Results" enabled="true">
<boolProp name="ResultCollector.error_logging">false</boolProp>
<objProp>
<name>saveConfig</name>
<value class="SampleSaveConfiguration">
<time>true</time>
<latency>true</latency>
<timestamp>true</timestamp>
<success>true</success>
<label>true</label>
<code>true</code>
<message>true</message>
<threadName>true</threadName>
<dataType>true</dataType>
<encoding>false</encoding>
<assertions>true</assertions>
<subresults>true</subresults>
<responseData>false</responseData>
<samplerData>false</samplerData>
<xml>false</xml>
<fieldNames>true</fieldNames>
<responseHeaders>false</responseHeaders>
<requestHeaders>false</requestHeaders>
<responseDataOnError>false</responseDataOnError>
<saveAssertionResultsFailureMessage>true</saveAssertionResultsFailureMessage>
<assertionsResultsToSave>0</assertionsResultsToSave>
<bytes>true</bytes>
<sentBytes>true</sentBytes>
<threadCounts>true</threadCounts>
<idleTime>true</idleTime>
<connectTime>true</connectTime>
</value>
</objProp>
<stringProp name="filename"></stringProp>
</ResultCollector>
<hashTree/>
<ResultCollector guiclass="TableVisualizer" testclass="ResultCollector" testname="View Results in Table" enabled="true">
<boolProp name="ResultCollector.error_logging">false</boolProp>
<objProp>
<name>saveConfig</name>
<value class="SampleSaveConfiguration">
<time>true</time>
<latency>true</latency>
<timestamp>true</timestamp>
<success>true</success>
<label>true</label>
<code>true</code>
<message>true</message>
<threadName>true</threadName>
<dataType>true</dataType>
<encoding>false</encoding>
<assertions>true</assertions>
<subresults>true</subresults>
<responseData>false</responseData>
<samplerData>false</samplerData>
<xml>false</xml>
<fieldNames>true</fieldNames>
<responseHeaders>false</responseHeaders>
<requestHeaders>false</requestHeaders>
<responseDataOnError>false</responseDataOnError>
<saveAssertionResultsFailureMessage>true</saveAssertionResultsFailureMessage>
<assertionsResultsToSave>0</assertionsResultsToSave>
<bytes>true</bytes>
<sentBytes>true</sentBytes>
<threadCounts>true</threadCounts>
<idleTime>true</idleTime>
<connectTime>true</connectTime>
</value>
</objProp>
<stringProp name="filename"></stringProp>
</ResultCollector>
<hashTree/>
</hashTree>
<WorkBench guiclass="WorkBenchGui" testclass="WorkBench" testname="WorkBench" enabled="true">
<boolProp name="WorkBench.save">true</boolProp>
</WorkBench>
<hashTree/>
</hashTree>
</jmeterTestPlan>
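
The thread group above drives 3 threads through 40,000 loops of a `JavaSampler` whose `classname` points at `com.r3.corda.jmeter.CashIssueSampler`, passing it the `host`, `port`, `username`, `password` and `notaryName` arguments. That sampler's source is not part of this diff; the sketch below only illustrates, generically, how a JMeter Java sampler receives such arguments through the standard `AbstractJavaSamplerClient` API. The class name and the placeholder RPC call are illustrative, not the real implementation.

```kotlin
import org.apache.jmeter.config.Arguments
import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext
import org.apache.jmeter.samplers.SampleResult

// Generic illustration of how a JMeter Java sampler consumes the arguments
// defined in the test plan above. This is NOT the real CashIssueSampler,
// just the shape of the AbstractJavaSamplerClient contract it plugs into.
class ExampleRpcSampler : AbstractJavaSamplerClient() {
    // Defaults shown in the JMeter GUI; overridden by the values in the .jmx file.
    override fun getDefaultParameters() = Arguments().apply {
        addArgument("host", "localhost")
        addArgument("port", "10003")
        addArgument("username", "corda")
        addArgument("password", "corda_is_awesome")
        addArgument("notaryName", "")
    }

    override fun runTest(context: JavaSamplerContext): SampleResult {
        val host = context.getParameter("host")
        val port = context.getIntParameter("port")
        val result = SampleResult()
        result.sampleLabel = "issue via $host:$port"
        result.sampleStart()
        try {
            // Placeholder for the real work, e.g. starting a flow over Corda RPC.
            result.isSuccessful = true
        } catch (e: Exception) {
            result.responseMessage = e.toString()
            result.isSuccessful = false
        } finally {
            result.sampleEnd()
        }
        return result
    }
}
```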


@@ -1035,13 +1035,13 @@ cookies=cookies
 # Whether to call System.exit(1) on failure to stop threads in non-GUI mode.
 # This only takes effect if the test was explicitly requested to stop.
 # If this is disabled, it may be necessary to kill the JVM externally
-#jmeterengine.stopfail.system.exit=true
+jmeterengine.stopfail.system.exit=true
 
 # Whether to force call System.exit(0) at end of test in non-GUI mode, even if
 # there were no failures and the test was not explicitly asked to stop.
 # Without this, the JVM may never exit if there are other threads spawned by
 # the test which never exit.
-#jmeterengine.force.system.exit=false
+jmeterengine.force.system.exit=true
 
 # How long to pause (in ms) in the daemon thread before reporting that the JVM has failed to exit.
 # If the value is <= 0, the JMeter does not start the daemon thread
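
These two property changes are what let non-GUI runs terminate reliably: the first exits with an error if test threads cannot be stopped, the second forces a clean `System.exit(0)` even if some thread spawned during the test is still running. The toy Kotlin program below is purely illustrative, not from this commit; it shows the underlying JVM behaviour that motivates both this change and the daemon-thread SSH sessions above.

```kotlin
import kotlin.concurrent.thread

// A single non-daemon thread (think: an SSH tunnel that was not marked as a
// daemon) keeps the JVM alive after main() returns, so a "finished" batch run
// would hang forever without a forced exit.
fun main() {
    thread(isDaemon = false, name = "lingering-tunnel") {
        while (true) Thread.sleep(1_000)
    }
    println("test finished; without a forced exit the JVM would now hang")
    // This is effectively what jmeterengine.force.system.exit=true asks JMeter to do:
    System.exit(0)
}
```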