TM-51 Read and write test results to artifactory. (#5597)

* TM-51  Prep for reading and writing test results to artifactory.

* TM-51  Tests from target branch if no tests for current branch

* TM-51  Placeholder for test averaging over runs.

* TM-51  Replace slashes in branch names used as tags.
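The commit above only says that slashes are replaced; a minimal sketch of that kind of sanitisation might look like the following (the class name, method name and replacement character are illustrative, not the actual implementation). Slashes act as path separators in an Artifactory URL, so a branch name such as `feature/TM-51` cannot be used as a tag verbatim:

```java
// Hypothetical helper: make a git branch name safe for use as an
// Artifactory tag by replacing path separators with underscores.
class BranchNames {
    static String toValidTag(String branchName) {
        // "feature/TM-51" -> "feature_TM-51"
        return branchName.replace('/', '_');
    }
}
```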

* TM-51  More placeholder work for the mean duration work.

* TM-51  Write out average test results as a csv.

The csv file should grow and be updated on each run.  This includes whether or not we are running unit tests, integration tests and so on.

* TM-51  Comment out old junit test archiving, add more comments.

* TM-51  Zip task needs to depend on a csv creation task.

If there isn't a csv file present, then the zip task doesn't run due to 'NO-SOURCE'

* TM-51  Zip task should ignore empty dirs

* TM-51  Fix up loading of test results.

We were looking for the wrong artifact name.
Add a bit more logging.

* TM-51  Fix up possible problem with allocating by class distribution.

If we encounter a class we haven't seen before, there won't be any tests.
This means we should give it some weight.  '1' is far too small.

* TM-51  Test that we are definitely incrementing the run count.

Tracking down whether the zipped csv file should have incremented.

* TM-51  Better default value for missing test/class names.

Begin by using the mean unit test duration, but we have the option to bump
that to the mean class unit test duration.
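The fallback described here can be sketched as follows, assuming a simple map of known durations (the names and structure are illustrative; the real code keeps durations in its own `Tests` container):

```java
import java.util.Map;

// Hypothetical sketch: when a test (or class) name has no recorded
// duration, fall back to the mean duration of all known tests rather
// than a tiny constant like 1, which would under-weight new tests
// when allocating them to forks.
class MeanDuration {
    static long meanDurationNanos(Map<String, Long> knownDurations, long defaultNanos) {
        if (knownDurations.isEmpty()) {
            return defaultNanos;
        }
        long total = 0;
        for (long d : knownDurations.values()) {
            total += d;
        }
        return total / knownDurations.size();
    }

    static long durationOf(String testName, Map<String, Long> knownDurations, long defaultNanos) {
        // Known test: use its recorded time; unknown test: use the mean.
        return knownDurations.getOrDefault(testName,
                meanDurationNanos(knownDurations, defaultNanos));
    }
}
```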

* TM-51  More debug information around csv writing.

We should be incrementing the tests.

* TM-51  Reload the csv before updating it.

* TM-51  Reduce verbosity of logging.

* TM-51  Reinstate unit tests.  Remove logging verbosity.

* TM-51  Load tests from artifactory in memory and avoid interim file.

* TM-51  Better handling of zero duration tests.

Ensure we return zero times from junit artifacts, which may either be zero or have no recorded time.  Before writing the test duration csv file, store the tests with a known time, and then store those with zero using the average time.
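The two-pass approach described above can be sketched like this (hypothetical class and method names; the real code works through its own artifact/csv classes):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the zero-duration handling: compute the mean over tests
// with a known (non-zero) time, then assign that mean to any test
// that junit reported as zero or with no time at all.
class ZeroDurations {
    static Map<String, Long> fillZeroDurations(Map<String, Long> recordedNanos) {
        long total = 0;
        long count = 0;
        for (long nanos : recordedNanos.values()) {
            if (nanos > 0) {
                total += nanos;
                count++;
            }
        }
        // If every recorded time is zero, fall back to 1 ns so that
        // allocation weights stay positive.
        long mean = count > 0 ? total / count : 1L;

        Map<String, Long> result = new LinkedHashMap<>();
        for (Map.Entry<String, Long> e : recordedNanos.entrySet()) {
            result.put(e.getKey(), e.getValue() > 0 ? e.getValue() : mean);
        }
        return result;
    }
}
```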

* TM-51  Log whether we have recorded a test.

Tracking down the curious case where we seem to not be rerunning the
same set of tests on the second run.

* TM-51  Capture junit files as well.

Trying to track down whether some tests are intermittently run.

* TM-51  Change task dependencies to ensure ziptask is triggered.

* TM-51  Remove test assertion, and trigger build

* TM-51  Add corda/enterprise to artifactory tag name.

Moved properties to own file.

* TM-51  Remove unnecessary mean class-based duration.

* TM-51  Add more BucketingAllocator tests.

We need these to nail down its behaviour some more.

* TM-51  Further log information.

We don't seem to be finding the tests in the 'production' runs which is odd.

* TM-51  corda type double set?

* TM-51  do not set the project type in the properties.

SRP and all that.

* TM-51  better plan reporting

* TM-51  duration may be zero

Another runtime problem that doesn't show in tests.

* TM-51  better plan reporting

* fix missing space after image id

* fix merge issue in DistributedTesting

* TM-51  remove unused code when GET/PUT-ting to Artifactory.

* TM-51  put tasks in gradle group and tidy up zip task creation

* TM-51 Fix the junit XML path.

* TM-51 Fix the task graph

* TM-51 Less logging
Barry 2019-11-02 09:07:53 +00:00 committed by Stefano Franz
parent cf849fbdbd
commit 91e6c9783f
15 changed files with 1628 additions and 3302 deletions


@@ -32,6 +32,9 @@
     <option name="IF_RPAREN_ON_NEW_LINE" value="false" />
     <option name="CODE_STYLE_DEFAULTS" value="KOTLIN_OFFICIAL" />
   </JetCodeStyleSettings>
+  <MarkdownNavigatorCodeStyleSettings>
+    <option name="RIGHT_MARGIN" value="72" />
+  </MarkdownNavigatorCodeStyleSettings>
   <editorconfig>
     <option name="ENABLED" value="false" />
   </editorconfig>

Jenkinsfile (vendored), 7 lines changed

@@ -12,6 +12,7 @@ pipeline {
         DOCKER_TAG_TO_USE = "${env.GIT_COMMIT.subSequence(0, 8)}"
         EXECUTOR_NUMBER = "${env.EXECUTOR_NUMBER}"
         BUILD_ID = "${env.BUILD_ID}-${env.JOB_NAME}"
+        ARTIFACTORY_CREDENTIALS = credentials('artifactory-credentials')
     }
     stages {
@@ -36,7 +37,11 @@ pipeline {
                 sh "./gradlew " +
                         "-DbuildId=\"\${BUILD_ID}\" " +
                         "-Dkubenetize=true " +
-                        "-Ddocker.run.tag=\"\${DOCKER_TAG_TO_USE}\"" +
+                        "-Ddocker.run.tag=\"\${DOCKER_TAG_TO_USE}\" " +
+                        "-Dartifactory.username=\"\${ARTIFACTORY_CREDENTIALS_USR}\" " +
+                        "-Dartifactory.password=\"\${ARTIFACTORY_CREDENTIALS_PSW}\" " +
+                        "-Dgit.branch=\"\${GIT_BRANCH}\" " +
+                        "-Dgit.target.branch=\"\${CHANGE_TARGET}\" " +
                         " deAllocateForAllParallelIntegrationTest allParallelIntegrationTest --stacktrace"
             }
         }


@@ -0,0 +1,146 @@
package net.corda.testing;

import okhttp3.*;
import org.apache.commons.compress.utils.IOUtils;
import org.jetbrains.annotations.NotNull;
import org.jetbrains.annotations.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import java.io.*;

/**
 * Used by TestArtifacts
 */
public class Artifactory {
    //<editor-fold desc="Statics">
    private static final Logger LOG = LoggerFactory.getLogger(Artifactory.class);

    private static String authorization() {
        return Credentials.basic(Properties.getUsername(), Properties.getPassword());
    }

    /**
     * Construct the URL in a style that Artifactory prefers.
     *
     * @param baseUrl   e.g. https://software.r3.com/artifactory/corda-releases/net/corda/corda/
     * @param theTag    e.g. 4.3-RC0
     * @param artifact  e.g. corda
     * @param extension e.g. jar
     * @return full URL to artifact.
     */
    private static String getFullUrl(@NotNull final String baseUrl,
                                     @NotNull final String theTag,
                                     @NotNull final String artifact,
                                     @NotNull final String extension) {
        return baseUrl + "/" + theTag + "/" + getFileName(artifact, extension, theTag);
    }

    /**
     * @param artifact  e.g. corda
     * @param extension e.g. jar
     * @param theTag    e.g. 4.3
     * @return e.g. corda-4.3.jar
     */
    static String getFileName(@NotNull final String artifact,
                              @NotNull final String extension,
                              @Nullable final String theTag) {
        StringBuilder sb = new StringBuilder().append(artifact);
        if (theTag != null) {
            sb.append("-").append(theTag);
        }
        sb.append(".").append(extension);
        return sb.toString();
    }
    //</editor-fold>

    /**
     * Get the unit tests, synchronous get.
     * See https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API#ArtifactoryRESTAPI-RetrieveLatestArtifact
     *
     * @return true if successful, false otherwise.
     */
    boolean get(@NotNull final String baseUrl,
                @NotNull final String theTag,
                @NotNull final String artifact,
                @NotNull final String extension,
                @NotNull final OutputStream outputStream) {
        final String url = getFullUrl(baseUrl, theTag, artifact, extension);
        final Request request = new Request.Builder()
                .addHeader("Authorization", authorization())
                .url(url)
                .build();
        final OkHttpClient client = new OkHttpClient();

        try (Response response = client.newCall(request).execute()) {
            handleResponse(response);
            if (response.body() != null) {
                outputStream.write(response.body().bytes());
            } else {
                LOG.warn("Response body was empty");
            }
        } catch (IOException e) {
            LOG.warn("Unable to execute GET via REST: ", e);
            return false;
        }
        LOG.warn("Ok.  REST GET successful");
        return true;
    }

    /**
     * Post an artifact, synchronous PUT
     * See https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API#ArtifactoryRESTAPI-DeployArtifact
     *
     * @return true if successful
     */
    boolean put(@NotNull final String baseUrl,
                @NotNull final String theTag,
                @NotNull final String artifact,
                @NotNull final String extension,
                @NotNull final InputStream inputStream) {
        final MediaType contentType = MediaType.parse("application/zip, application/octet-stream");
        final String url = getFullUrl(baseUrl, theTag, artifact, extension);
        final OkHttpClient client = new OkHttpClient();

        byte[] bytes;
        try {
            bytes = IOUtils.toByteArray(inputStream);
        } catch (IOException e) {
            LOG.warn("Unable to execute PUT tests via REST: ", e);
            return false;
        }

        final Request request = new Request.Builder()
                .addHeader("Authorization", authorization())
                .url(url)
                .put(RequestBody.create(contentType, bytes))
                .build();
        try (Response response = client.newCall(request).execute()) {
            handleResponse(response);
        } catch (IOException e) {
            LOG.warn("Unable to execute PUT via REST: ", e);
            return false;
        }
        return true;
    }

    private void handleResponse(@NotNull final Response response) throws IOException {
        if (response.isSuccessful()) return;
        LOG.warn("Bad response from server: {}", response.toString());
        LOG.warn(response.toString());
        if (response.code() == 401) {
            throw new IOException("Not authorized - incorrect credentials?");
        }
        throw new IOException(response.message());
    }
}


@@ -5,20 +5,29 @@ package net.corda.testing;
 import groovy.lang.Tuple2;
 import org.gradle.api.tasks.TaskAction;
 import org.jetbrains.annotations.NotNull;
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
-import java.util.*;
+import java.util.ArrayList;
+import java.util.Collection;
+import java.util.Collections;
+import java.util.Comparator;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.concurrent.TimeUnit;
 import java.util.function.Supplier;
 import java.util.stream.Collectors;
 import java.util.stream.IntStream;

 public class BucketingAllocator {
+    private static final Logger LOG = LoggerFactory.getLogger(BucketingAllocator.class);
+    private List<Tuple2<TestLister, Object>> sources = new ArrayList<>();
     private final List<TestsForForkContainer> forkContainers;
-    private final Supplier<List<Tuple2<String, Double>>> timedTestsProvider;
-    private List<Tuple2<TestLister, Object>> sources = new ArrayList<>();
+    private final Supplier<Tests> timedTestsProvider;

-    public BucketingAllocator(Integer forkCount, Supplier<List<Tuple2<String, Double>>> timedTestsProvider) {
+    public BucketingAllocator(Integer forkCount, Supplier<Tests> timedTestsProvider) {
         this.forkContainers = IntStream.range(0, forkCount).mapToObj(TestsForForkContainer::new).collect(Collectors.toList());
         this.timedTestsProvider = timedTestsProvider;
     }
@@ -33,9 +42,9 @@ public class BucketingAllocator {
     @TaskAction
     public void generateTestPlan() {
-        List<Tuple2<String, Double>> allTestsFromCSV = timedTestsProvider.get();
+        Tests allTestsFromFile = timedTestsProvider.get();
         List<Tuple2<String, Object>> allDiscoveredTests = getTestsOnClasspathOfTestingTasks();
-        List<TestBucket> matchedTests = matchClasspathTestsToCSV(allTestsFromCSV, allDiscoveredTests);
+        List<TestBucket> matchedTests = matchClasspathTestsToFile(allTestsFromFile, allDiscoveredTests);

         //use greedy algo - for each testbucket find the currently smallest container and add to it
         allocateTestsToForks(matchedTests);
@@ -44,15 +53,31 @@ public class BucketingAllocator {
         printSummary();
     }

+    static String getDuration(long nanos) {
+        long t = TimeUnit.NANOSECONDS.toMinutes(nanos);
+        if (t > 0) {
+            return t + " mins";
+        }
+        t = TimeUnit.NANOSECONDS.toSeconds(nanos);
+        if (t > 0) {
+            return t + " secs";
+        }
+        t = TimeUnit.NANOSECONDS.toMillis(nanos);
+        if (t > 0) {
+            return t + " ms";
+        }
+        return nanos + " ns";
+    }

     private void printSummary() {
         forkContainers.forEach(container -> {
             System.out.println("####### TEST PLAN SUMMARY ( " + container.forkIdx + " ) #######");
-            System.out.println("Duration: " + container.getCurrentDuration());
+            System.out.println("Duration: " + getDuration(container.getCurrentDuration()));
             System.out.println("Number of tests: " + container.testsForFork.stream().mapToInt(b -> b.foundTests.size()).sum());
             System.out.println("Tests to Run: ");
             container.testsForFork.forEach(tb -> {
                 System.out.println(tb.testName);
-                tb.foundTests.forEach(ft -> System.out.println("\t" + ft.getFirst() + ", " + ft.getSecond()));
+                tb.foundTests.forEach(ft -> System.out.println("\t" + ft.getFirst() + ", " + getDuration(ft.getSecond())));
             });
         });
     }
@@ -64,12 +89,23 @@ public class BucketingAllocator {
         });
     }

-    private List<TestBucket> matchClasspathTestsToCSV(List<Tuple2<String, Double>> allTestsFromCSV, @NotNull List<Tuple2<String, Object>> allDiscoveredTests) {
+    List<TestsForForkContainer> getForkContainers() {
+        return forkContainers;
+    }
+
+    private List<TestBucket> matchClasspathTestsToFile(@NotNull final Tests tests,
+                                                       @NotNull final List<Tuple2<String, Object>> allDiscoveredTests) {
+        // Note that this does not preserve the order of tests with known and unknown durations, as we
+        // always return a duration from 'tests.startsWith'.
         return allDiscoveredTests.stream().map(tuple -> {
-            String testName = tuple.getFirst();
-            Object task = tuple.getSecond();
-            //2DO [can this filtering algorithm be improved - the test names are sorted, it should be possible to do something using binary search]
-            List<Tuple2<String, Double>> matchingTests = allTestsFromCSV.stream().filter(testFromCSV -> testFromCSV.getFirst().startsWith(testName)).collect(Collectors.toList());
+            final String testName = tuple.getFirst();
+            final Object task = tuple.getSecond();
+            // If the gradle task is distributing by class rather than method, then 'testName' will be the className
+            // and not className.testName
+            // No matter which it is, we return the mean test duration as the duration value if not found.
+            final List<Tuple2<String, Long>> matchingTests = tests.startsWith(testName);
             return new TestBucket(task, testName, matchingTests);
         }).sorted(Comparator.comparing(TestBucket::getDuration).reversed()).collect(Collectors.toList());
     }
@@ -85,18 +121,20 @@ public class BucketingAllocator {
     public static class TestBucket {
         final Object testTask;
         final String testName;
-        final List<Tuple2<String, Double>> foundTests;
-        final Double duration;
+        final List<Tuple2<String, Long>> foundTests;
+        final long durationNanos;

-        public TestBucket(Object testTask, String testName, List<Tuple2<String, Double>> foundTests) {
+        public TestBucket(@NotNull final Object testTask,
+                          @NotNull final String testName,
+                          @NotNull final List<Tuple2<String, Long>> foundTests) {
             this.testTask = testTask;
             this.testName = testName;
             this.foundTests = foundTests;
-            duration = Math.max(foundTests.stream().mapToDouble(tp -> Math.max(tp.getSecond(), 1)).sum(), 1);
+            this.durationNanos = foundTests.stream().mapToLong(tp -> Math.max(tp.getSecond(), 1)).sum();
         }

-        public Double getDuration() {
-            return duration;
+        public long getDuration() {
+            return durationNanos;
         }

         @Override
@@ -105,17 +143,16 @@ public class BucketingAllocator {
                     "testTask=" + testTask +
                     ", nameWithAsterix='" + testName + '\'' +
                     ", foundTests=" + foundTests +
-                    ", duration=" + duration +
+                    ", durationNanos=" + durationNanos +
                     '}';
         }
     }

     public static class TestsForForkContainer {
-        private Double runningDuration = 0.0;
         private final Integer forkIdx;
         private final List<TestBucket> testsForFork = Collections.synchronizedList(new ArrayList<>());
         private final Map<Object, List<TestBucket>> frozenTests = new HashMap<>();
+        private long runningDuration = 0L;

         public TestsForForkContainer(Integer forkIdx) {
             this.forkIdx = forkIdx;
@@ -123,10 +160,10 @@ public class BucketingAllocator {
         public void addBucket(TestBucket tb) {
             this.testsForFork.add(tb);
-            this.runningDuration = runningDuration + tb.duration;
+            this.runningDuration = this.runningDuration + tb.durationNanos;
         }

-        public Double getCurrentDuration() {
+        public Long getCurrentDuration() {
             return runningDuration;
         }
@@ -154,6 +191,4 @@ public class BucketingAllocator {
                     '}';
         }
     }
 }


@@ -1,39 +1,19 @@
 package net.corda.testing;

-import groovy.lang.Tuple2;
-import org.apache.commons.csv.CSVFormat;
-import org.apache.commons.csv.CSVRecord;
 import org.gradle.api.DefaultTask;
 import org.gradle.api.tasks.TaskAction;
 import org.gradle.api.tasks.testing.Test;

 import javax.inject.Inject;
-import java.io.File;
-import java.io.FileReader;
-import java.io.IOException;
-import java.io.Reader;
-import java.util.Collections;
-import java.util.Comparator;
 import java.util.List;
-import java.util.Objects;
-import java.util.function.Supplier;
 import java.util.stream.Collectors;

 public class BucketingAllocatorTask extends DefaultTask {
-    private static final String DEFAULT_TESTING_TEST_TIMES_CSV = "testing/test-times.csv";
     private final BucketingAllocator allocator;

     @Inject
     public BucketingAllocatorTask(Integer forkCount) {
-        Supplier<List<Tuple2<String, Double>>> defaultTestCSV = () -> {
-            try {
-                FileReader csvSource = new FileReader(new File(BucketingAllocatorTask.this.getProject().getRootDir(), DEFAULT_TESTING_TEST_TIMES_CSV));
-                return fromCSV(csvSource);
-            } catch (IOException e) {
-                return Collections.emptyList();
-            }
-        };
-        this.allocator = new BucketingAllocator(forkCount, defaultTestCSV);
+        this.allocator = new BucketingAllocator(forkCount, TestDurationArtifacts.getTestsSupplier());
     }

     public void addSource(TestLister source, Test testTask) {
@@ -49,21 +29,4 @@ public class BucketingAllocatorTask extends DefaultTask {
     public void allocate() {
         allocator.generateTestPlan();
     }
-
-    public static List<Tuple2<String, Double>> fromCSV(Reader reader) throws IOException {
-        String name = "Test Name";
-        String duration = "Duration(ms)";
-        List<CSVRecord> records = CSVFormat.DEFAULT.withHeader().parse(reader).getRecords();
-        return records.stream().map(record -> {
-            try {
-                String testName = record.get(name);
-                String testDuration = record.get(duration);
-                return new Tuple2<>(testName, Math.max(Double.parseDouble(testDuration), 1));
-            } catch (IllegalArgumentException | IllegalStateException e) {
-                return null;
-            }
-        }).filter(Objects::nonNull).sorted(Comparator.comparing(Tuple2::getFirst)).collect(Collectors.toList());
-    }
 }


@@ -13,6 +13,8 @@ import org.gradle.api.tasks.testing.Test
  */
 class DistributedTesting implements Plugin<Project> {

+    public static final String GRADLE_GROUP = "Distributed Testing";
+
     static def getPropertyAsInt(Project proj, String property, Integer defaultValue) {
         return proj.hasProperty(property) ? Integer.parseInt(proj.property(property).toString()) : defaultValue
     }
@@ -20,6 +22,7 @@ class DistributedTesting implements Plugin<Project> {
     @Override
     void apply(Project project) {
         if (System.getProperty("kubenetize") != null) {
+            Properties.setRootProjectType(project.rootProject.name)

             Integer forks = getPropertyAsInt(project, "dockerForks", 1)
@@ -30,6 +33,9 @@ class DistributedTesting implements Plugin<Project> {
             String tagToUseForRunningTests = System.getProperty(ImageBuilding.PROVIDE_TAG_FOR_RUNNING_PROPERTY)
             String tagToUseForBuilding = System.getProperty(ImageBuilding.PROVIDE_TAG_FOR_RUNNING_PROPERTY)

             BucketingAllocatorTask globalAllocator = project.tasks.create("bucketingAllocator", BucketingAllocatorTask, forks)
+            globalAllocator.group = GRADLE_GROUP
+            globalAllocator.description = "Allocates tests to buckets"

             Set<String> requestedTaskNames = project.gradle.startParameter.taskNames.toSet()
             def requestedTasks = requestedTaskNames.collect { project.tasks.findByPath(it) }
@@ -41,14 +47,14 @@ class DistributedTesting implements Plugin<Project> {
             //4. after each completed test write its name to a file to keep track of what finished for restart purposes
             project.subprojects { Project subProject ->
                 subProject.tasks.withType(Test) { Test task ->
-                    println "Evaluating ${task.getPath()}"
+                    project.logger.info("Evaluating ${task.getPath()}")
                     if (task in requestedTasks && !task.hasProperty("ignoreForDistribution")) {
-                        println "Modifying ${task.getPath()}"
+                        project.logger.info "Modifying ${task.getPath()}"
                         ListTests testListerTask = createTestListingTasks(task, subProject)
                         globalAllocator.addSource(testListerTask, task)
                         Test modifiedTestTask = modifyTestTaskForParallelExecution(subProject, task, globalAllocator)
                     } else {
-                        println "Skipping modification of ${task.getPath()} as it's not scheduled for execution"
+                        project.logger.info "Skipping modification of ${task.getPath()} as it's not scheduled for execution"
                     }
                     if (!task.hasProperty("ignoreForDistribution")) {
                         //this is what enables execution of a single test suite - for example node:parallelTest would execute all unit tests in node, node:parallelIntegrationTest would do the same for integration tests
@@ -73,10 +79,12 @@ class DistributedTesting implements Plugin<Project> {
             userGroups.forEach { testGrouping ->
                 //for each "group" (ie: test, integrationTest) within the grouping find all the Test tasks which have the same name.
-                List<Test> groups = ((ParallelTestGroup) testGrouping).groups.collect { allTestTasksGroupedByType.get(it) }.flatten()
+                List<Test> testTasksToRunInGroup = ((ParallelTestGroup) testGrouping).groups.collect {
+                    allTestTasksGroupedByType.get(it)
+                }.flatten()

                 //join up these test tasks into a single set of tasks to invoke (node:test, node:integrationTest...)
-                String superListOfTasks = groups.collect { it.path }.join(" ")
+                String superListOfTasks = testTasksToRunInGroup.collect { it.path }.join(" ")

                 //generate a preAllocate / deAllocate task which allows you to "pre-book" a node during the image building phase
                 //this prevents time lost to cloud provider node spin up time (assuming image build time > provider spin up time)
@@ -88,6 +96,8 @@ class DistributedTesting implements Plugin<Project> {
                 }

                 def userDefinedParallelTask = project.rootProject.tasks.create("userDefined" + testGrouping.name.capitalize(), KubesTest) {
+                    group = GRADLE_GROUP
                     if (!tagToUseForRunningTests) {
                         dependsOn imagePushTask
                     }
@@ -108,6 +118,7 @@ class DistributedTesting implements Plugin<Project> {
                     }
                 }
                 def reportOnAllTask = project.rootProject.tasks.create("userDefinedReports${testGrouping.name.capitalize()}", KubesReporting) {
+                    group = GRADLE_GROUP
                     dependsOn userDefinedParallelTask
                     destinationDir new File(project.rootProject.getBuildDir(), "userDefinedReports${testGrouping.name.capitalize()}")
                     doFirst {
@@ -117,15 +128,25 @@ class DistributedTesting implements Plugin<Project> {
                         reportOn(userDefinedParallelTask.testOutput)
                     }
                 }

+                // Task to zip up test results, and upload them to somewhere (Artifactory).
+                def zipTask = TestDurationArtifacts.createZipTask(project.rootProject, testGrouping.name, userDefinedParallelTask)
+
                 userDefinedParallelTask.finalizedBy(reportOnAllTask)
-                testGrouping.dependsOn(userDefinedParallelTask)
+                zipTask.dependsOn(userDefinedParallelTask)
+                testGrouping.dependsOn(zipTask)
             }
         }
+
+        // Added only so that we can manually run zipTask on the command line as a test.
+        TestDurationArtifacts.createZipTask(project.rootProject, "zipTask", null)
+                .setDescription("Zip task that can be run locally for testing");
     }

     private List<Task> generatePreAllocateAndDeAllocateTasksForGrouping(Project project, ParallelTestGroup testGrouping) {
         PodAllocator allocator = new PodAllocator(project.getLogger())
         Task preAllocateTask = project.rootProject.tasks.create("preAllocateFor" + testGrouping.name.capitalize()) {
+            group = GRADLE_GROUP
             doFirst {
                 String dockerTag = System.getProperty(ImageBuilding.PROVIDE_TAG_FOR_BUILDING_PROPERTY)
                 if (dockerTag == null) {
@@ -142,6 +163,7 @@ class DistributedTesting implements Plugin<Project> {
         }
         Task deAllocateTask = project.rootProject.tasks.create("deAllocateFor" + testGrouping.name.capitalize()) {
+            group = GRADLE_GROUP
             doFirst {
                 String dockerTag = System.getProperty(ImageBuilding.PROVIDE_TAG_FOR_RUNNING_PROPERTY)
                 if (dockerTag == null) {
@@ -160,6 +182,7 @@ class DistributedTesting implements Plugin<Project> {
         def capitalizedTaskName = task.getName().capitalize()
         KubesTest createdParallelTestTask = projectContainingTask.tasks.create("parallel" + capitalizedTaskName, KubesTest) {
+            group = GRADLE_GROUP + " Parallel Test Tasks"
             if (!providedTag) {
                 dependsOn imageBuildingTask
             }
@@ -232,6 +255,7 @@ class DistributedTesting implements Plugin<Project> {
         //determine all the tests which are present in this test task.
         //this list will then be shared between the various worker forks
         def createdListTask = subProject.tasks.create("listTestsFor" + capitalizedTaskName, ListTests) {
+            group = GRADLE_GROUP
             //the convention is that a testing task is backed by a sourceSet with the same name
             dependsOn subProject.getTasks().getByName("${taskName}Classes")
             doFirst {
@@ -242,6 +266,7 @@ class DistributedTesting implements Plugin<Project> {
         //convenience task to utilize the output of the test listing task to display to local console, useful for debugging missing tests
         def createdPrintTask = subProject.tasks.create("printTestsFor" + capitalizedTaskName) {
+            group = GRADLE_GROUP
             dependsOn createdListTask
             doLast {
                 createdListTask.getTestsForFork(


@@ -405,10 +405,20 @@ public class KubesTest extends DefaultTask {
}
private String[] getBuildCommand(int numberOfPods, int podIdx) {
final String gitBranch = " -Dgit.branch=" + Properties.getGitBranch();
final String gitTargetBranch = " -Dgit.target.branch=" + Properties.getTargetGitBranch();
final String artifactoryUsername = " -Dartifactory.username=" + Properties.getUsername() + " ";
final String artifactoryPassword = " -Dartifactory.password=" + Properties.getPassword() + " ";
String shellScript = "(let x=1 ; while [ ${x} -ne 0 ] ; do echo \"Waiting for DNS\" ; curl services.gradle.org > /dev/null 2>&1 ; x=$? ; sleep 1 ; done ) && "
+ " cd /tmp/source && " +
"(let y=1 ; while [ ${y} -ne 0 ] ; do echo \"Preparing build directory\" ; ./gradlew testClasses integrationTestClasses --parallel 2>&1 ; y=$? ; sleep 1 ; done ) && " +
"(./gradlew -D" + ListTests.DISTRIBUTION_PROPERTY + "=" + distribution.name() +
gitBranch +
gitTargetBranch +
artifactoryUsername +
artifactoryPassword +
"-Dkubenetize -PdockerFork=" + podIdx + " -PdockerForks=" + numberOfPods + " " + fullTaskToExecutePath + " " + getLoggingLevel() + " 2>&1) ; " +
"let rs=$? ; sleep 10 ; exit ${rs}";
return new String[]{"bash", "-c", shellScript};
}
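The `-D` flags assembled above are read back inside the pod by `Properties.getProperty`, which trims the value and defaults to the empty string. A minimal sketch of that round trip, assuming the same property names; the class and method names here are illustrative, not part of the plugin:

```java
public class PropertyPlumbingSketch {
    // Build the flag fragment the shell command concatenates (illustrative helper).
    public static String flag(String key, String value) {
        return " -D" + key + "=" + value;
    }

    // Pod-side read: empty string if unset, trimmed otherwise, as in Properties.getProperty.
    public static String read(String key) {
        return System.getProperty(key, "").trim();
    }

    public static void main(String[] args) {
        System.setProperty("git.branch", " feature/TM-51 ");
        System.out.println(flag("git.branch", read("git.branch"))); // -> " -Dgit.branch=feature/TM-51"
    }
}
```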


@@ -0,0 +1,87 @@
package net.corda.testing;
import org.jetbrains.annotations.NotNull;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
* A single class to hold some of the properties we need to get from the command line
* in order to store test results in Artifactory.
*/
public class Properties {
private static final Logger LOG = LoggerFactory.getLogger(Properties.class);
private static String ROOT_PROJECT_TYPE = "corda"; // corda or enterprise
/**
* Get the Corda type. Used in the tag names when we store in Artifactory.
*
* @return either 'corda' or 'enterprise'
*/
static String getRootProjectType() {
return ROOT_PROJECT_TYPE;
}
/**
* Set the Corda (repo) type - either enterprise, or corda (open-source).
* Used in the tag names when we store in Artifactory.
*
* @param rootProjectType the corda repo type.
*/
static void setRootProjectType(@NotNull final String rootProjectType) {
ROOT_PROJECT_TYPE = rootProjectType;
}
/**
* Get property with logging
*
* @param key property to get
* @return empty string, or trimmed value
*/
@NotNull
static String getProperty(@NotNull final String key) {
final String value = System.getProperty(key, "").trim();
if (value.isEmpty()) {
LOG.debug("Property '{}' not set", key);
} else {
LOG.debug("Ok. Property '{}' is set", key);
}
return value;
}
/**
* Get Artifactory username
*
* @return the username
*/
static String getUsername() {
return getProperty("artifactory.username");
}
/**
* Get Artifactory password
*
* @return the password
*/
static String getPassword() {
return getProperty("artifactory.password");
}
/**
* The current branch/tag
*
* @return the current branch
*/
@NotNull
static String getGitBranch() {
return getProperty("git.branch").replace('/', '-');
}
/**
* @return the branch that this branch was likely checked out from.
*/
@NotNull
static String getTargetGitBranch() {
return getProperty("git.target.branch").replace('/', '-');
}
}
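The slash replacement in `getGitBranch`/`getTargetGitBranch` combines with the dot replacement in `TestDurationArtifacts.getBranchTag` to produce a tag safe to use in Artifactory paths. A standalone sketch of that combination; `branchToTag` is a hypothetical helper, not part of the plugin:

```java
public class TagSanitizerSketch {
    // Slashes are replaced when the branch property is read (Properties.getGitBranch),
    // dots when the tag is built (TestDurationArtifacts.getBranchTag).
    public static String branchToTag(String projectType, String rawBranch) {
        String branch = rawBranch.replace('/', '-');
        return (projectType + "-" + branch).replace('.', '-');
    }

    public static void main(String[] args) {
        System.out.println(branchToTag("corda", "feature/TM-51.fix")); // -> corda-feature-TM-51-fix
    }
}
```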


@@ -0,0 +1,429 @@
package net.corda.testing;
import groovy.lang.Tuple2;
import org.apache.commons.compress.archivers.ArchiveEntry;
import org.apache.commons.compress.archivers.ArchiveException;
import org.apache.commons.compress.archivers.ArchiveInputStream;
import org.apache.commons.compress.archivers.ArchiveStreamFactory;
import org.apache.commons.compress.utils.IOUtils;
import org.gradle.api.Project;
import org.gradle.api.Task;
import org.gradle.api.tasks.bundling.Zip;
import org.jetbrains.annotations.NotNull;
import org.jetbrains.annotations.Nullable;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NamedNodeMap;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathExpression;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;
import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileWriter;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.nio.file.FileSystems;
import java.nio.file.FileVisitResult;
import java.nio.file.FileVisitor;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.PathMatcher;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.function.BiFunction;
import java.util.function.Supplier;
/**
* Get or put test artifacts to/from a REST endpoint. The expected format is a zip file of junit XML files.
* See https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API
*/
public class TestDurationArtifacts {
private static final String EXTENSION = "zip";
private static final String BASE_URL = "https://software.r3.com/artifactory/corda-test-results/net/corda";
private static final Logger LOG = LoggerFactory.getLogger(TestDurationArtifacts.class);
private static final String ARTIFACT = "tests-durations";
// The one and only set of test results. We load these at the start of a build, and update them and save them at the end.
static Tests tests = new Tests();
// Artifactory API
private final Artifactory artifactory = new Artifactory();
/**
* Reload the tests from Artifactory, update them with the results of the latest run,
* and write out the test durations as a CSV file.
*
* @param project project that we are attaching the task to.
* @param name basename for the test.
* @return the csv task
*/
private static Task createCsvTask(@NotNull final Project project, @NotNull final String name) {
return project.getTasks().create("createCsvFromXmlFor" + capitalize(name), Task.class, t -> {
t.setGroup(DistributedTesting.GRADLE_GROUP);
t.setDescription("Create csv from all discovered junit xml files");
// Parse all the junit results and write them to a csv file.
t.doFirst(task -> {
project.getLogger().warn("About to create CSV file and zip it");
// Reload the test object from artifactory
loadTests();
// Get the list of junit xml artifacts
final List<Path> testXmlFiles = getTestXmlFiles(project.getBuildDir().getAbsoluteFile().toPath());
project.getLogger().warn("Found {} xml junit files", testXmlFiles.size());
// Read test xml files for tests and duration and add them to the `Tests` object
// This adjusts the runCount and over all average duration for existing tests.
for (Path testResult : testXmlFiles) {
try {
final List<Tuple2<String, Long>> unitTests = fromJunitXml(new FileInputStream(testResult.toFile()));
// Add the non-zero duration tests to build up an average.
unitTests.stream()
.filter(t2 -> t2.getSecond() > 0L)
.forEach(unitTest -> tests.addDuration(unitTest.getFirst(), unitTest.getSecond()));
final long meanDurationForTests = tests.getMeanDurationForTests();
// Add the zero duration tests using the mean value so they are fairly distributed over the pods in the next run.
// If we used 'zero' they would all be added to the smallest bucket.
unitTests.stream()
.filter(t2 -> t2.getSecond() <= 0L)
.forEach(unitTest -> tests.addDuration(unitTest.getFirst(), meanDurationForTests));
} catch (FileNotFoundException ignored) {
}
}
// Write the test file to disk.
try {
final FileWriter writer = new FileWriter(new File(project.getRootDir(), ARTIFACT + ".csv"));
tests.write(writer);
LOG.warn("Written tests csv file with {} tests", tests.size());
} catch (IOException ignored) {
}
});
});
}
@NotNull
static String capitalize(@NotNull final String str) {
return str.substring(0, 1).toUpperCase() + str.substring(1); // groovy has this as an extension method
}
/**
* Discover junit xml files, zip them, and upload to artifactory.
*
* @param project root project
* @param name task name that we're 'extending'
* @return gradle task
*/
@NotNull
private static Task createJunitZipTask(@NotNull final Project project, @NotNull final String name) {
return project.getTasks().create("zipJunitXmlFilesAndUploadFor" + capitalize(name), Zip.class, z -> {
z.setGroup(DistributedTesting.GRADLE_GROUP);
z.setDescription("Zip junit files and upload to artifactory");
z.getArchiveFileName().set(Artifactory.getFileName("junit", EXTENSION, getBranchTag()));
z.getDestinationDirectory().set(project.getRootDir());
z.setIncludeEmptyDirs(false);
z.from(project.getRootDir(), task -> task.include("**/build/test-results-xml/**/*.xml", "**/build/test-results/**/*.xml"));
z.doLast(task -> {
try (FileInputStream inputStream = new FileInputStream(new File(z.getArchiveFileName().get()))) {
new Artifactory().put(BASE_URL, getBranchTag(), "junit", EXTENSION, inputStream);
} catch (Exception ignored) {
}
});
});
}
/**
* Zip and upload test-duration csv files to artifactory
*
* @param project root project that we're attaching the task to
* @param name the task name we're 'extending'
* @return gradle task
*/
@NotNull
private static Task createCsvZipAndUploadTask(@NotNull final Project project, @NotNull final String name) {
return project.getTasks().create("zipCsvFilesAndUploadFor" + capitalize(name), Zip.class, z -> {
z.setGroup(DistributedTesting.GRADLE_GROUP);
z.setDescription("Zips test duration csv and uploads to artifactory");
z.getArchiveFileName().set(Artifactory.getFileName(ARTIFACT, EXTENSION, getBranchTag()));
z.getDestinationDirectory().set(project.getRootDir());
z.setIncludeEmptyDirs(false);
// There's only one csv, but glob it anyway.
z.from(project.getRootDir(), task -> task.include("**/" + ARTIFACT + ".csv"));
// ...base class method zips the CSV...
z.doLast(task -> {
// We've now created the one csv file containing the tests and their mean durations,
// this task has zipped it, so we now just upload it.
project.getLogger().warn("SAVING tests");
project.getLogger().warn("Attempting to upload {}", z.getArchiveFileName().get());
try (FileInputStream inputStream = new FileInputStream(new File(z.getArchiveFileName().get()))) {
if (!new TestDurationArtifacts().put(getBranchTag(), inputStream)) {
project.getLogger().warn("Could not upload zip of tests");
} else {
project.getLogger().warn("SAVED tests");
}
} catch (Exception e) {
project.getLogger().warn("Problem trying to upload: {} {}", z.getArchiveFileName().get(), e.toString());
}
});
});
}
/**
* Create the Gradle Zip task to gather test information
*
* @param project project to attach this task to
* @param name name of the task
* @param task a task that we depend on when creating the csv so Gradle produces the correct task graph.
* @return a reference to the created task.
*/
@NotNull
public static Task createZipTask(@NotNull final Project project, @NotNull final String name, @Nullable final Task task) {
final Task zipJunitTask = createJunitZipTask(project, name);
final Task csvTask = createCsvTask(project, name);
csvTask.dependsOn(zipJunitTask);
// For debugging - can be removed - this simply gathers junit xml and uploads them to artifactory
// so that we can inspect them.
if (task != null) {
csvTask.dependsOn(task);
}
final Task zipCsvTask = createCsvZipAndUploadTask(project, name);
zipCsvTask.dependsOn(csvTask); // we have to create the csv before we can zip it.
return zipCsvTask;
}
static List<Path> getTestXmlFiles(@NotNull final Path rootDir) {
List<Path> paths = new ArrayList<>();
List<PathMatcher> matchers = new ArrayList<>();
matchers.add(FileSystems.getDefault().getPathMatcher("glob:**/test-results-xml/**/*.xml"));
try {
Files.walkFileTree(rootDir, new FileVisitor<Path>() {
@Override
public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) {
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
for (PathMatcher matcher : matchers) {
if (matcher.matches(file)) {
paths.add(file);
break;
}
}
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult visitFileFailed(Path file, IOException exc) {
return FileVisitResult.CONTINUE;
}
@Override
public FileVisitResult postVisitDirectory(Path dir, IOException exc) {
return FileVisitResult.CONTINUE;
}
});
} catch (IOException e) {
LOG.warn("Could not walk tree and get all test xml files: {}", e.toString());
}
return paths;
}
/**
* Unzip test results in memory and add their test names and durations to {@code tests}.
* Assumes the input stream contains only csv files of the correct format.
*
* @param tests reference to the Tests object to be populated.
* @param zippedInputStream stream containing zipped result file(s)
*/
static void addTestsFromZippedCsv(@NotNull final Tests tests,
@NotNull final InputStream zippedInputStream) {
// We need this because ArchiveStream requires the `mark` functionality which is supported in buffered streams.
final BufferedInputStream bufferedInputStream = new BufferedInputStream(zippedInputStream);
try (ArchiveInputStream archiveInputStream = new ArchiveStreamFactory().createArchiveInputStream(bufferedInputStream)) {
ArchiveEntry e;
while ((e = archiveInputStream.getNextEntry()) != null) {
if (e.isDirectory()) continue;
// We seem to need to take a copy of the original input stream (as positioned by the ArchiveEntry), because
// the XML parsing closes the stream after it has finished. This has the side effect of only parsing the first
// entry in the archive.
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
IOUtils.copy(archiveInputStream, outputStream);
ByteArrayInputStream byteInputStream = new ByteArrayInputStream(outputStream.toByteArray());
// Read the tests from the (csv) stream
final InputStreamReader reader = new InputStreamReader(byteInputStream);
// Add the tests to the Tests object
tests.addTests(Tests.read(reader));
}
} catch (ArchiveException | IOException e) {
LOG.warn("Problem unzipping csv test results");
}
LOG.debug("Discovered {} tests", tests.size());
}
/**
* For a given stream, return the testcase names and durations.
* <p>
* NOTE: the input stream will be closed by this method.
*
* @param inputStream an InputStream, closed once parsed
* @return a list of test names and their durations in nanos.
*/
@NotNull
static List<Tuple2<String, Long>> fromJunitXml(@NotNull final InputStream inputStream) {
final DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
final List<Tuple2<String, Long>> results = new ArrayList<>();
try {
final DocumentBuilder builder = dbFactory.newDocumentBuilder();
final Document document = builder.parse(inputStream);
document.getDocumentElement().normalize();
final XPathFactory xPathfactory = XPathFactory.newInstance();
final XPath xpath = xPathfactory.newXPath();
final XPathExpression expression = xpath.compile("//testcase");
final NodeList nodeList = (NodeList) expression.evaluate(document, XPathConstants.NODESET);
final BiFunction<NamedNodeMap, String, String> get =
(a, k) -> a.getNamedItem(k) != null ? a.getNamedItem(k).getNodeValue() : "";
for (int i = 0; i < nodeList.getLength(); i++) {
final Node item = nodeList.item(i);
final NamedNodeMap attributes = item.getAttributes();
final String testName = get.apply(attributes, "name");
final String testDuration = get.apply(attributes, "time");
final String testClassName = get.apply(attributes, "classname");
// If the test doesn't have a duration (it should), we return zero.
if (!(testName.isEmpty() || testClassName.isEmpty())) {
final long nanos = !testDuration.isEmpty() ? (long) (Double.parseDouble(testDuration) * 1000000.0) : 0L;
results.add(new Tuple2<>(testClassName + "." + testName, nanos));
} else {
LOG.warn("Bad test in junit xml: name={} className={}", testName, testClassName);
}
}
} catch (ParserConfigurationException | IOException | XPathExpressionException | SAXException e) {
return Collections.emptyList();
}
return results;
}
/**
* A supplier of tests.
* <p>
* We get them from Artifactory and then parse the test xml files to get the duration.
*
* @return a supplier of test results
*/
@NotNull
static Supplier<Tests> getTestsSupplier() {
return TestDurationArtifacts::loadTests;
}
/**
* We need to prepend the project type so that we have a unique tag in Artifactory.
*
* @return the Artifactory tag for the current branch
*/
static String getBranchTag() {
return (Properties.getRootProjectType() + "-" + Properties.getGitBranch()).replace('.', '-');
}
/**
* We need to prepend the project type so that we have a unique tag in Artifactory.
*
* @return the Artifactory tag for the target branch
*/
static String getTargetBranchTag() {
return (Properties.getRootProjectType() + "-" + Properties.getTargetGitBranch()).replace('.', '-');
}
/**
* Load the tests from Artifactory, in-memory. No temp file used. Existing test data is cleared.
*
* @return a reference to the loaded tests.
*/
static Tests loadTests() {
LOG.warn("LOADING previous test runs from Artifactory");
tests.clear();
try {
final TestDurationArtifacts testArtifacts = new TestDurationArtifacts();
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
// Try getting artifacts for our branch, if not, try the target branch.
if (!testArtifacts.get(getBranchTag(), outputStream)) {
outputStream = new ByteArrayOutputStream();
LOG.warn("Could not get tests from Artifactory for tag {}, trying {}", getBranchTag(), getTargetBranchTag());
if (!testArtifacts.get(getTargetBranchTag(), outputStream)) {
LOG.warn("Could not get any tests from Artifactory");
return tests;
}
}
ByteArrayInputStream inputStream = new ByteArrayInputStream(outputStream.toByteArray());
addTestsFromZippedCsv(tests, inputStream);
LOG.warn("Got {} tests from Artifactory", tests.size());
return tests;
} catch (Exception e) { // was IOException
LOG.warn(e.toString());
LOG.warn("Could not get tests from Artifactory");
return tests;
}
}
/**
* Get tests for the specified tag in the outputStream
*
* @param theTag tag for tests
* @param outputStream stream of zipped csv files
* @return false if we fail to get the tests
*/
private boolean get(@NotNull final String theTag, @NotNull final OutputStream outputStream) {
return artifactory.get(BASE_URL, theTag, ARTIFACT, "zip", outputStream);
}
/**
* Upload the supplied tests
*
* @param theTag tag for tests
* @param inputStream stream of zipped csv files.
* @return true if we succeed
*/
private boolean put(@NotNull final String theTag, @NotNull final InputStream inputStream) {
return artifactory.put(BASE_URL, theTag, ARTIFACT, EXTENSION, inputStream);
}
}
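The two-pass handling in `createCsvTask` above (timed tests are folded in first to establish a mean, then zero-duration tests are recorded at that mean so the next run distributes them fairly rather than piling them all into the smallest bucket) can be sketched standalone. The class and method names here are illustrative, not part of the plugin:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ZeroDurationBackfillSketch {
    static final long DEFAULT_MEAN_NANOS = 1000L; // fallback when no test has a time, as in Tests

    // Pass 1: only tests with a recorded time contribute to the mean.
    // Pass 2: zero-duration tests are stored at that mean.
    public static Map<String, Long> backfill(Map<String, Long> parsed) {
        long total = 0, count = 0;
        for (long d : parsed.values()) {
            if (d > 0) { total += d; count++; }
        }
        final long mean = count > 0 ? total / count : DEFAULT_MEAN_NANOS;
        Map<String, Long> stored = new LinkedHashMap<>();
        parsed.forEach((name, d) -> stored.put(name, d > 0 ? d : mean));
        return stored;
    }

    public static void main(String[] args) {
        Map<String, Long> parsed = new LinkedHashMap<>();
        parsed.put("com.corda.TestA", 100L);
        parsed.put("com.corda.TestB", 200L);
        parsed.put("com.corda.TestC", 0L); // junit xml carried no time attribute
        System.out.println(backfill(parsed)); // TestC is stored at the mean of A and B (150)
    }
}
```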


@@ -0,0 +1,199 @@
package net.corda.testing;
import groovy.lang.Tuple2;
import groovy.lang.Tuple3;
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVPrinter;
import org.apache.commons.csv.CSVRecord;
import org.jetbrains.annotations.NotNull;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.util.Arrays;
import java.util.Collections;
import java.util.Comparator;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;
import java.util.stream.Collectors;
public class Tests {
final static String TEST_NAME = "Test Name";
final static String MEAN_DURATION_NANOS = "Mean Duration Nanos";
final static String NUMBER_OF_RUNS = "Number of runs";
private static final Logger LOG = LoggerFactory.getLogger(Tests.class);
// test name -> (mean duration, number of runs)
private final Map<String, Tuple2<Long, Long>> tests = new HashMap<>();
// If we don't have any tests from which to get a mean, use this.
static long DEFAULT_MEAN_NANOS = 1000L;
private static Tuple2<Long, Long> DEFAULT_MEAN_TUPLE = new Tuple2<>(DEFAULT_MEAN_NANOS, 0L);
// mean, count
private Tuple2<Long, Long> meanForTests = DEFAULT_MEAN_TUPLE;
/**
* Read tests, mean duration and runs from a csv file.
*
* @param reader a reader
* @return list of tests, or an empty list if none or we have a problem.
*/
public static List<Tuple3<String, Long, Long>> read(Reader reader) {
try {
List<CSVRecord> records = CSVFormat.DEFAULT.withHeader().parse(reader).getRecords();
return records.stream().map(record -> {
try {
final String testName = record.get(TEST_NAME);
final long testDuration = Long.parseLong(record.get(MEAN_DURATION_NANOS));
final long testRuns = Long.parseLong(record.get(NUMBER_OF_RUNS)); // guard below against zero; a recorded test should have run at least once.
return new Tuple3<>(testName, testDuration, Math.max(testRuns, 1));
} catch (IllegalArgumentException | IllegalStateException e) {
return null;
}
}).filter(Objects::nonNull).sorted(Comparator.comparing(Tuple3::getFirst)).collect(Collectors.toList());
} catch (IOException ignored) {
}
return Collections.emptyList();
}
private static Tuple2<Long, Long> recalculateMean(@NotNull final Tuple2<Long, Long> previous, long nanos) {
final long total = previous.getFirst() * previous.getSecond() + nanos;
final long count = previous.getSecond() + 1;
return new Tuple2<>(total / count, count);
}
/**
* Write a csv file of test name, duration, runs
*
* @param writer a writer
* @return true if no problems.
*/
public boolean write(@NotNull final Writer writer) {
boolean ok = true;
final CSVPrinter printer;
try {
printer = new CSVPrinter(writer,
CSVFormat.DEFAULT.withHeader(TEST_NAME, MEAN_DURATION_NANOS, NUMBER_OF_RUNS));
for (String key : tests.keySet()) {
printer.printRecord(key, tests.get(key).getFirst(), tests.get(key).getSecond());
}
printer.flush();
} catch (IOException e) {
ok = false;
}
return ok;
}
/**
* Add tests, and also (re)calculate the mean test duration.
* e.g. addTests(read(reader));
*
* @param testsCollection tests, typically from a csv file.
*/
public void addTests(@NotNull final List<Tuple3<String, Long, Long>> testsCollection) {
testsCollection.forEach(t -> this.tests.put(t.getFirst(), new Tuple2<>(t.getSecond(), t.getThird())));
// Calculate the mean test time.
if (tests.size() > 0) {
long total = 0;
for (String testName : this.tests.keySet()) total += tests.get(testName).getFirst();
meanForTests = new Tuple2<>(total / this.tests.size(), 1L);
}
}
/**
* Get the known mean duration of a test.
*
* @param testName the test name
* @return duration in nanos.
*/
public long getDuration(@NotNull final String testName) {
return tests.getOrDefault(testName, meanForTests).getFirst();
}
/**
* Add test information. Recalculates the mean test duration if the test already exists.
*
* @param testName name of the test
* @param durationNanos duration
*/
public void addDuration(@NotNull final String testName, long durationNanos) {
final Tuple2<Long, Long> current = tests.getOrDefault(testName, new Tuple2<>(0L, 0L));
tests.put(testName, recalculateMean(current, durationNanos));
LOG.debug("Recorded test '{}', mean={} ns, runs={}", testName, tests.get(testName).getFirst(), tests.get(testName).getSecond());
meanForTests = recalculateMean(meanForTests, durationNanos);
}
/**
* Do we have any test information?
*
* @return false if no tests info
*/
public boolean isEmpty() {
return tests.isEmpty();
}
/**
* How many tests do we have?
*
* @return the number of tests we have information for
*/
public int size() {
return tests.size();
}
/**
* Return all tests (and their durations) that begin with (or are equal to) `testPrefix`.
* If not present we just return the mean test duration so that the test is fairly distributed.
* @param testPrefix could be just the classname, or the entire classname + testname.
* @return list of matching tests
*/
@NotNull
List<Tuple2<String, Long>> startsWith(@NotNull final String testPrefix) {
List<Tuple2<String, Long>> results = this.tests.keySet().stream()
.filter(t -> t.startsWith(testPrefix))
.map(t -> new Tuple2<>(t, getDuration(t)))
.collect(Collectors.toList());
// We don't know if the testPrefix is a classname or classname.methodname (exact match).
if (results == null || results.isEmpty()) {
LOG.warn("In {} previously executed tests, could not find any starting with {}", tests.size(), testPrefix);
results = Arrays.asList(new Tuple2<>(testPrefix, getMeanDurationForTests()));
}
return results;
}
/**
* How many times has this test been run? Every call to addDuration increments the current value.
*
* @param testName the test name
* @return the number of times the test name has been run.
*/
public long getRunCount(@NotNull final String testName) {
return tests.getOrDefault(testName, new Tuple2<>(0L, 0L)).getSecond();
}
/**
* Return the mean duration for a unit test to run
*
* @return mean duration in nanos.
*/
public long getMeanDurationForTests() {
return meanForTests.getFirst();
}
/**
* Clear all tests
*/
void clear() {
tests.clear();
meanForTests = DEFAULT_MEAN_TUPLE;
}
}
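The incremental mean in `recalculateMean` above keeps only a `(mean, count)` pair per test and folds each new sample in as `(mean * count + sample) / (count + 1)`. A standalone sketch of that update; the class name is illustrative, not part of the plugin:

```java
public class RunningMeanSketch {
    // Mirror of Tests.recalculateMean: state is (mean, count), a new sample
    // updates it to ((mean * count + sample) / (count + 1), count + 1).
    public static long[] update(long mean, long count, long sampleNanos) {
        long newCount = count + 1;
        return new long[]{(mean * count + sampleNanos) / newCount, newCount};
    }

    public static void main(String[] args) {
        long[] state = {0L, 0L};
        for (long nanos : new long[]{55L, 33L}) {
            state = update(state[0], state[1], nanos);
        }
        // Matches the 44 ns mean asserted in TestDurationArtifactsTest.canCreateZipFile.
        System.out.println(state[0] + " ns over " + state[1] + " runs"); // -> 44 ns over 2 runs
    }
}
```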


@@ -0,0 +1,56 @@
package net.corda.testing;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
public class PropertiesTest {
private static String username = "me";
private static String password = "me";
private static String cordaType = "corda-project";
private static String branch = "mine";
private static String targetBranch = "master";
@Before
public void setUp() {
System.setProperty("git.branch", branch);
System.setProperty("git.target.branch", targetBranch);
System.setProperty("artifactory.username", username);
System.setProperty("artifactory.password", password);
}
@After
public void tearDown() {
System.setProperty("git.branch", "");
System.setProperty("git.target.branch", "");
System.setProperty("artifactory.username", "");
System.setProperty("artifactory.password", "");
}
@Test
public void cordaType() {
Properties.setRootProjectType(cordaType);
Assert.assertEquals(cordaType, Properties.getRootProjectType());
}
@Test
public void getUsername() {
Assert.assertEquals(username, Properties.getUsername());
}
@Test
public void getPassword() {
Assert.assertEquals(password, Properties.getPassword());
}
@Test
public void getGitBranch() {
Assert.assertEquals(branch, Properties.getGitBranch());
}
@Test
public void getTargetGitBranch() {
Assert.assertEquals(targetBranch, Properties.getTargetGitBranch());
}
}


@@ -0,0 +1,323 @@
package net.corda.testing;
import groovy.lang.Tuple2;
import org.apache.commons.compress.archivers.ArchiveException;
import org.apache.commons.compress.archivers.ArchiveOutputStream;
import org.apache.commons.compress.archivers.ArchiveStreamFactory;
import org.apache.commons.compress.archivers.zip.ZipArchiveEntry;
import org.jetbrains.annotations.NotNull;
import org.junit.Assert;
import org.junit.Test;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.StringWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
public class TestDurationArtifactsTest {
final static String CLASSNAME = "FAKE";
String getXml(List<Tuple2<String, Long>> tests) {
StringBuilder sb = new StringBuilder();
sb.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" +
"<testsuites disabled=\"\" errors=\"\" failures=\"\" name=\"\" tests=\"\" time=\"\">\n" +
" <testsuite disabled=\"\" errors=\"\" failures=\"\" hostname=\"\" id=\"\"\n" +
" name=\"\" package=\"\" skipped=\"\" tests=\"\" time=\"\" timestamp=\"\">\n" +
" <properties>\n" +
" <property name=\"\" value=\"\"/>\n" +
" </properties>\n");
for (Tuple2<String, Long> test : tests) {
Double d = ((double) test.getSecond()) / 1_000_000;
sb.append(" <testcase assertions=\"\" classname=\"" + CLASSNAME + "\" name=\""
+ test.getFirst() + "\" status=\"\" time=\"" + d.toString() + "\">\n" +
" <skipped/>\n" +
" <error message=\"\" type=\"\"/>\n" +
" <failure message=\"\" type=\"\"/>\n" +
" <system-out/>\n" +
" <system-err/>\n" +
" </testcase>\n");
}
sb.append(" <system-out/>\n" +
" <system-err/>\n" +
" </testsuite>\n" +
"</testsuites>");
return sb.toString();
}
String getXmlWithNoTime(List<Tuple2<String, Long>> tests) {
StringBuilder sb = new StringBuilder();
sb.append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n" +
"<testsuites disabled=\"\" errors=\"\" failures=\"\" name=\"\" tests=\"\" time=\"\">\n" +
" <testsuite disabled=\"\" errors=\"\" failures=\"\" hostname=\"\" id=\"\"\n" +
" name=\"\" package=\"\" skipped=\"\" tests=\"\" time=\"\" timestamp=\"\">\n" +
" <properties>\n" +
" <property name=\"\" value=\"\"/>\n" +
" </properties>\n");
for (Tuple2<String, Long> test : tests) {
Double d = ((double) test.getSecond()) / 1_000_000;
sb.append(" <testcase assertions=\"\" classname=\"" + CLASSNAME + "\" name=\""
+ test.getFirst() + "\" status=\"\" time=\"\">\n" +
" <skipped/>\n" +
" <error message=\"\" type=\"\"/>\n" +
" <failure message=\"\" type=\"\"/>\n" +
" <system-out/>\n" +
" <system-err/>\n" +
" </testcase>\n");
}
sb.append(" <system-out/>\n" +
" <system-err/>\n" +
" </testsuite>\n" +
"</testsuites>");
return sb.toString();
}
@Test
public void fromJunitXml() {
List<Tuple2<String, Long>> tests = new ArrayList<>();
tests.add(new Tuple2<>("TEST-A", 111_000_000L));
tests.add(new Tuple2<>("TEST-B", 222_200_000L));
final String xml = getXml(tests);
List<Tuple2<String, Long>> results
= TestDurationArtifacts.fromJunitXml(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
Assert.assertNotNull(results);
Assert.assertFalse("Should have results", results.isEmpty());
Assert.assertEquals(results.size(), 2);
Assert.assertEquals(CLASSNAME + "." + "TEST-A", results.get(0).getFirst());
Assert.assertEquals(111_000_000L, results.get(0).getSecond().longValue());
Assert.assertEquals(CLASSNAME + "." + "TEST-B", results.get(1).getFirst());
Assert.assertEquals(222_200_000L, results.get(1).getSecond().longValue());
}
@Test
public void fromJunitXmlWithZeroDuration() {
// We do return zero values.
List<Tuple2<String, Long>> tests = new ArrayList<>();
tests.add(new Tuple2<>("TEST-A", 0L));
tests.add(new Tuple2<>("TEST-B", 0L));
final String xml = getXml(tests);
List<Tuple2<String, Long>> results
= TestDurationArtifacts.fromJunitXml(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
Assert.assertNotNull(results);
Assert.assertFalse("Should have results", results.isEmpty());
Assert.assertEquals(results.size(), 2);
Assert.assertEquals(CLASSNAME + "." + "TEST-A", results.get(0).getFirst());
Assert.assertEquals(0L, results.get(0).getSecond().longValue());
Assert.assertEquals(CLASSNAME + "." + "TEST-B", results.get(1).getFirst());
Assert.assertEquals(0L, results.get(1).getSecond().longValue());
}
@Test
public void fromJunitXmlWithNoDuration() {
// We do return zero values.
List<Tuple2<String, Long>> tests = new ArrayList<>();
tests.add(new Tuple2<>("TEST-A", 0L));
tests.add(new Tuple2<>("TEST-B", 0L));
final String xml = getXmlWithNoTime(tests);
List<Tuple2<String, Long>> results
= TestDurationArtifacts.fromJunitXml(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
Assert.assertNotNull(results);
Assert.assertFalse("Should have results", results.isEmpty());
Assert.assertEquals(2, results.size());
Assert.assertEquals(CLASSNAME + "." + "TEST-A", results.get(0).getFirst());
Assert.assertEquals(0L, results.get(0).getSecond().longValue());
Assert.assertEquals(CLASSNAME + "." + "TEST-B", results.get(1).getFirst());
Assert.assertEquals(0L, results.get(1).getSecond().longValue());
}
@Test
public void canCreateZipFile() throws IOException {
Tests outputTests = new Tests();
final String testA = "com.corda.testA";
final String testB = "com.corda.testB";
outputTests.addDuration(testA, 55L);
outputTests.addDuration(testB, 33L);
StringWriter writer = new StringWriter();
outputTests.write(writer);
String csv = writer.toString();
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
try (ZipOutputStream outputStream = new ZipOutputStream(byteStream, StandardCharsets.UTF_8)) {
ZipEntry entry = new ZipEntry("tests.csv");
outputStream.putNextEntry(entry);
outputStream.write(csv.getBytes(StandardCharsets.UTF_8));
outputStream.closeEntry();
}
Assert.assertNotEquals(0, byteStream.toByteArray().length);
ByteArrayInputStream inputStream = new ByteArrayInputStream(byteStream.toByteArray());
Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
TestDurationArtifacts.addTestsFromZippedCsv(tests, inputStream);
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(2, tests.size());
Assert.assertEquals(55L, tests.getDuration(testA));
Assert.assertEquals(33L, tests.getDuration(testB));
Assert.assertEquals(44L, tests.getMeanDurationForTests());
}
void putIntoArchive(@NotNull final ArchiveOutputStream outputStream,
@NotNull final String fileName,
@NotNull final String content) throws IOException {
ZipArchiveEntry entry = new ZipArchiveEntry(fileName);
outputStream.putArchiveEntry(entry);
outputStream.write(content.getBytes(StandardCharsets.UTF_8));
outputStream.closeArchiveEntry();
}
String write(@NotNull final Tests tests) {
StringWriter writer = new StringWriter();
tests.write(writer);
return writer.toString();
}
@Test
public void canCreateZipFileContainingMultipleFiles() throws IOException, ArchiveException {
// Currently we don't have two csvs in the zip file, but test anyway.
Tests outputTests = new Tests();
final String testA = "com.corda.testA";
final String testB = "com.corda.testB";
final String testC = "com.corda.testC";
outputTests.addDuration(testA, 55L);
outputTests.addDuration(testB, 33L);
String csv = write(outputTests);
Tests otherTests = new Tests();
otherTests.addDuration(testA, 55L);
otherTests.addDuration(testB, 33L);
otherTests.addDuration(testC, 22L);
String otherCsv = write(otherTests);
ByteArrayOutputStream byteStream = new ByteArrayOutputStream();
try (ArchiveOutputStream outputStream =
new ArchiveStreamFactory("UTF-8").createArchiveOutputStream(ArchiveStreamFactory.ZIP, byteStream)) {
putIntoArchive(outputStream, "tests1.csv", csv);
putIntoArchive(outputStream, "tests2.csv", otherCsv);
outputStream.flush();
}
Assert.assertNotEquals(0, byteStream.toByteArray().length);
ByteArrayInputStream inputStream = new ByteArrayInputStream(byteStream.toByteArray());
Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
TestDurationArtifacts.addTestsFromZippedCsv(tests, inputStream);
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(3, tests.size());
Assert.assertEquals((55 + 33 + 22) / 3, tests.getMeanDurationForTests());
}
// Uncomment to test a file.
// Run a build to generate some test files, then create a zip:
// zip ~/tests.zip $(find . -name "*.xml" -type f | grep test-results)
// @Test
// public void testZipFile() throws FileNotFoundException {
// File f = new File(System.getProperty("tests.zip", "/tests.zip"));
// List<Tuple2<String, Long>> results = BucketingAllocatorTask.fromZippedXml(new BufferedInputStream(new FileInputStream(f)));
// Assert.assertFalse("Should have results", results.isEmpty());
// System.out.println("Results = " + results.size());
// System.out.println(results);
// }
@Test
public void branchNamesDoNotHaveDirectoryDelimiters() {
// We use the branch name in file and artifact tagging, so a '/' would confuse things;
// make sure we strip delimiters out when we retrieve the property.
final String expected = "release/os/4.3";
final String key = "git.branch";
final String cordaType = "corda";
Properties.setRootProjectType(cordaType);
System.setProperty(key, expected);
Assert.assertEquals(expected, System.getProperty(key));
Assert.assertNotEquals(expected, Properties.getGitBranch());
Assert.assertEquals("release-os-4.3", Properties.getGitBranch());
}
@Test
public void getTestsFromArtifactory() {
String artifactory_password = System.getenv("ARTIFACTORY_PASSWORD");
String artifactory_username = System.getenv("ARTIFACTORY_USERNAME");
String git_branch = System.getenv("CORDA_GIT_BRANCH");
String git_target_branch = System.getenv("CORDA_GIT_TARGET_BRANCH");
if (artifactory_password == null ||
artifactory_username == null ||
git_branch == null ||
git_target_branch == null
) {
System.out.println("Skipping test - set env vars to run this test");
return;
}
System.setProperty("git.branch", git_branch);
System.setProperty("git.target.branch", git_target_branch);
System.setProperty("artifactory.password", artifactory_password);
System.setProperty("artifactory.username", artifactory_username);
Assert.assertTrue(TestDurationArtifacts.tests.isEmpty());
TestDurationArtifacts.loadTests();
Assert.assertFalse(TestDurationArtifacts.tests.isEmpty());
}
@Test
public void tryAndWalkForTestXmlFiles() {
final String xmlRoot = System.getenv("JUNIT_XML_ROOT");
if (xmlRoot == null) {
System.out.println("Set JUNIT_XML_ROOT to run this test");
return;
}
List<Path> testXmlFiles = TestDurationArtifacts.getTestXmlFiles(Paths.get(xmlRoot));
Assert.assertFalse(testXmlFiles.isEmpty());
System.out.println("\n\nTESTS\n\n");
for (Path testResult : testXmlFiles) {
try {
final List<Tuple2<String, Long>> unitTests = TestDurationArtifacts.fromJunitXml(new FileInputStream(testResult.toFile()));
for (Tuple2<String, Long> unitTest : unitTests) {
System.out.println(unitTest.getFirst() + " --> " + BucketingAllocator.getDuration(unitTest.getSecond()));
}
} catch (FileNotFoundException e) {
e.printStackTrace();
}
}
}
}
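The branchNamesDoNotHaveDirectoryDelimiters test above asserts the sanitised result ("release/os/4.3" becomes "release-os-4.3") without showing the implementation. A minimal sketch of the substitution it implies; the class name BranchNameSketch and the sanitize helper are illustrative, not the actual Properties code:

```java
// Illustrative only: Properties.getGitBranch presumably applies a substitution like
// this before the branch name is used in file names and artifact tags.
public class BranchNameSketch {
    static String sanitize(String branch) {
        // Replace path delimiters, which would otherwise create
        // unintended directory levels in tags and file names.
        return branch.replaceAll("[/\\\\]", "-");
    }

    public static void main(String[] args) {
        System.out.println(sanitize("release/os/4.3")); // prints "release-os-4.3"
    }
}
```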

@@ -0,0 +1,145 @@
package net.corda.testing;
import org.junit.Assert;
import org.junit.Test;
import java.io.StringReader;
import java.io.StringWriter;
public class TestsTest {
@Test
public void read() {
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(100L, tests.getDuration("hello"));
}
@Test
public void write() {
final StringWriter writer = new StringWriter();
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
tests.addDuration("hello", 100);
tests.write(writer);
Assert.assertFalse(tests.isEmpty());
final StringReader reader = new StringReader(writer.toString());
final Tests otherTests = new Tests();
otherTests.addTests(Tests.read(reader));
Assert.assertFalse(tests.isEmpty());
Assert.assertFalse(otherTests.isEmpty());
Assert.assertEquals(tests.size(), otherTests.size());
Assert.assertEquals(tests.getDuration("hello"), otherTests.getDuration("hello"));
}
@Test
public void addingTestChangesMeanDuration() {
final Tests tests = new Tests();
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
// 400ns in total over 4 runs.
Assert.assertEquals(100L, tests.getDuration("hello"));
// Adding a 600ns run gives 1000ns over 5 runs, i.e. a 200ns mean.
tests.addDuration("hello", 600);
Assert.assertEquals(200L, tests.getDuration("hello"));
}
@Test
public void addTests() {
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n"
+ "goodbye,200,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(2, tests.size());
}
@Test
public void getDuration() {
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n"
+ "goodbye,200,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(2, tests.size());
Assert.assertEquals(100L, tests.getDuration("hello"));
Assert.assertEquals(200L, tests.getDuration("goodbye"));
}
@Test
public void addTestInfo() {
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n"
+ "goodbye,200,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(2, tests.size());
tests.addDuration("foo", 55);
tests.addDuration("bar", 33);
Assert.assertEquals(4, tests.size());
tests.addDuration("bar", 56);
Assert.assertEquals(4, tests.size());
}
@Test
public void addingNewDurationUpdatesRunCount() {
final Tests tests = new Tests();
Assert.assertTrue(tests.isEmpty());
final String s = Tests.TEST_NAME + "," + Tests.MEAN_DURATION_NANOS + "," + Tests.NUMBER_OF_RUNS + '\n'
+ "hello,100,4\n"
+ "goodbye,200,4\n";
tests.addTests(Tests.read(new StringReader(s)));
Assert.assertFalse(tests.isEmpty());
Assert.assertEquals(2, tests.size());
tests.addDuration("foo", 55);
Assert.assertEquals(0, tests.getRunCount("bar"));
tests.addDuration("bar", 33);
Assert.assertEquals(4, tests.size());
tests.addDuration("bar", 56);
Assert.assertEquals(2, tests.getRunCount("bar"));
Assert.assertEquals(4, tests.size());
tests.addDuration("bar", 56);
tests.addDuration("bar", 56);
Assert.assertEquals(4, tests.getRunCount("bar"));
Assert.assertEquals(4, tests.getRunCount("hello"));
tests.addDuration("hello", 22);
tests.addDuration("hello", 22);
tests.addDuration("hello", 22);
Assert.assertEquals(7, tests.getRunCount("hello"));
}
}
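The csv columns these tests pin down (TEST_NAME, MEAN_DURATION_NANOS, NUMBER_OF_RUNS) imply an incremental running-mean update. A sketch of that arithmetic, assuming a helper like updatedMean rather than the actual Tests internals:

```java
// Sketch of the running-mean update implied by storing a mean and a run count per test.
public class MeanDurationSketch {
    // new mean = (old mean * runs + new duration) / (runs + 1)
    static long updatedMean(long meanNanos, long runs, long newDurationNanos) {
        return (meanNanos * runs + newDurationNanos) / (runs + 1);
    }

    public static void main(String[] args) {
        // "hello" was read from csv with a 100ns mean over 4 runs; a new 600ns run
        // arrives: (100 * 4 + 600) / 5 = 200, matching addingTestChangesMeanDuration.
        System.out.println(updatedMean(100, 4, 600)); // prints "200"
    }
}
```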

@@ -2,12 +2,13 @@ package net.corda.testing;
import org.hamcrest.collection.IsIterableContainingInAnyOrder;
import org.junit.Assert;
import org.junit.Test;
import java.util.Arrays;
import java.util.Collection;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;
@@ -17,8 +18,8 @@ public class BucketingAllocatorTest {
@Test
public void shouldAlwaysBucketTestsEvenIfNotInTimedFile() {
Tests tests = new Tests();
BucketingAllocator bucketingAllocator = new BucketingAllocator(1, () -> tests);
Object task = new Object();
bucketingAllocator.addSource(() -> Arrays.asList("SomeTestingClass", "AnotherTestingClass"), task);
@@ -28,13 +29,41 @@ public class BucketingAllocatorTest {
Assert.assertThat(testsForForkAndTestTask, IsIterableContainingInAnyOrder.containsInAnyOrder("SomeTestingClass", "AnotherTestingClass"));
List<BucketingAllocator.TestsForForkContainer> forkContainers = bucketingAllocator.getForkContainers();
Assert.assertEquals(1, forkContainers.size());
// There aren't any known tests, so it will use the default instead.
Assert.assertEquals(Tests.DEFAULT_MEAN_NANOS, tests.getMeanDurationForTests());
Assert.assertEquals(2 * tests.getMeanDurationForTests(), forkContainers.get(0).getCurrentDuration().longValue());
}
@Test
public void shouldAlwaysBucketTestsEvenIfNotInTimedFileAndUseMeanValue() {
final Tests tests = new Tests();
tests.addDuration("someRandomTestNameToForceMeanValue", 1_000_000_000);
BucketingAllocator bucketingAllocator = new BucketingAllocator(1, () -> tests);
Object task = new Object();
List<String> testNames = Arrays.asList("SomeTestingClass", "AnotherTestingClass");
bucketingAllocator.addSource(() -> testNames, task);
bucketingAllocator.generateTestPlan();
List<String> testsForForkAndTestTask = bucketingAllocator.getTestsForForkAndTestTask(0, task);
Assert.assertThat(testsForForkAndTestTask, IsIterableContainingInAnyOrder.containsInAnyOrder(testNames.toArray()));
List<BucketingAllocator.TestsForForkContainer> forkContainers = bucketingAllocator.getForkContainers();
Assert.assertEquals(1, forkContainers.size());
Assert.assertEquals(testNames.size() * tests.getMeanDurationForTests(), forkContainers.get(0).getCurrentDuration().longValue());
}
@Test
public void shouldAllocateTestsAcrossForksEvenIfNoMatchingTestsFound() {
Tests tests = new Tests();
tests.addDuration("SomeTestingClass", 1_000_000_000);
tests.addDuration("AnotherTestingClass", 2222);
BucketingAllocator bucketingAllocator = new BucketingAllocator(2, () -> tests);
Object task = new Object();
bucketingAllocator.addSource(() -> Arrays.asList("SomeTestingClass", "AnotherTestingClass"), task);
@@ -49,6 +78,101 @@ public class BucketingAllocatorTest {
List<String> allTests = Stream.of(testsForForkOneAndTestTask, testsForForkTwoAndTestTask).flatMap(Collection::stream).collect(Collectors.toList());
Assert.assertThat(allTests, IsIterableContainingInAnyOrder.containsInAnyOrder("SomeTestingClass", "AnotherTestingClass"));
}
@Test
public void shouldAllocateTestsAcrossForksEvenIfNoMatchingTestsFoundAndUseExistingValues() {
Tests tests = new Tests();
tests.addDuration("SomeTestingClass", 1_000_000_000L);
tests.addDuration("AnotherTestingClass", 3_000_000_000L);
BucketingAllocator bucketingAllocator = new BucketingAllocator(2, () -> tests);
Object task = new Object();
bucketingAllocator.addSource(() -> Arrays.asList("YetAnotherTestingClass", "SomeTestingClass", "AnotherTestingClass"), task);
bucketingAllocator.generateTestPlan();
List<String> testsForForkOneAndTestTask = bucketingAllocator.getTestsForForkAndTestTask(0, task);
List<String> testsForForkTwoAndTestTask = bucketingAllocator.getTestsForForkAndTestTask(1, task);
Assert.assertThat(testsForForkOneAndTestTask.size(), is(1));
Assert.assertThat(testsForForkTwoAndTestTask.size(), is(2));
List<String> allTests = Stream.of(testsForForkOneAndTestTask, testsForForkTwoAndTestTask).flatMap(Collection::stream).collect(Collectors.toList());
Assert.assertThat(allTests, IsIterableContainingInAnyOrder.containsInAnyOrder("YetAnotherTestingClass", "SomeTestingClass", "AnotherTestingClass"));
List<BucketingAllocator.TestsForForkContainer> forkContainers = bucketingAllocator.getForkContainers();
Assert.assertEquals(2, forkContainers.size());
// Internally, we should have sorted the tests by decreasing size, so the largest would be added to the first bucket.
Assert.assertEquals(TimeUnit.SECONDS.toNanos(3), forkContainers.get(0).getCurrentDuration().longValue());
// At this point, the second bucket is empty. We also know that the test average is 2s (1+3)/2.
// So we should put SomeTestingClass (1s) into this bucket, AND then put the 'unknown' test 'YetAnotherTestingClass'
// into this bucket, using the mean duration = 2s, resulting in 3s.
Assert.assertEquals(TimeUnit.SECONDS.toNanos(3), forkContainers.get(1).getCurrentDuration().longValue());
}
@Test
public void testBucketAllocationForSeveralTestsDistributedByClassName() {
Tests tests = new Tests();
tests.addDuration("SmallTestingClass", 1_000_000_000L);
tests.addDuration("LargeTestingClass", 3_000_000_000L);
tests.addDuration("MediumTestingClass", 2_000_000_000L);
// Gives a nice mean of 2s.
Assert.assertEquals(TimeUnit.SECONDS.toNanos(2), tests.getMeanDurationForTests());
BucketingAllocator bucketingAllocator = new BucketingAllocator(4, () -> tests);
List<String> testNames = Arrays.asList(
"EvenMoreTestingClass",
"YetAnotherTestingClass",
"AndYetAnotherTestingClass",
"OhYesAnotherTestingClass",
"MediumTestingClass",
"SmallTestingClass",
"LargeTestingClass");
Object task = new Object();
bucketingAllocator.addSource(() -> testNames, task);
// Note: generateTestPlan() does not preserve the relative order of known and unknown tests.
bucketingAllocator.generateTestPlan();
List<String> testsForFork0 = bucketingAllocator.getTestsForForkAndTestTask(0, task);
List<String> testsForFork1 = bucketingAllocator.getTestsForForkAndTestTask(1, task);
List<String> testsForFork2 = bucketingAllocator.getTestsForForkAndTestTask(2, task);
List<String> testsForFork3 = bucketingAllocator.getTestsForForkAndTestTask(3, task);
Assert.assertThat(testsForFork0.size(), is(1));
Assert.assertThat(testsForFork1.size(), is(2));
Assert.assertThat(testsForFork2.size(), is(2));
Assert.assertThat(testsForFork3.size(), is(2));
// This must be true as it is the largest value.
Assert.assertTrue(testsForFork0.contains("LargeTestingClass"));
List<String> allTests = Stream.of(testsForFork0, testsForFork1, testsForFork2, testsForFork3)
.flatMap(Collection::stream).collect(Collectors.toList());
Assert.assertThat(allTests, IsIterableContainingInAnyOrder.containsInAnyOrder(testNames.toArray()));
List<BucketingAllocator.TestsForForkContainer> forkContainers = bucketingAllocator.getForkContainers();
Assert.assertEquals(4, forkContainers.size());
long totalDuration = forkContainers.stream().mapToLong(c -> c.getCurrentDuration()).sum();
Assert.assertEquals(tests.getMeanDurationForTests() * testNames.size(), totalDuration);
Assert.assertEquals(TimeUnit.SECONDS.toNanos(3), forkContainers.get(0).getCurrentDuration().longValue());
Assert.assertEquals(TimeUnit.SECONDS.toNanos(4), forkContainers.get(1).getCurrentDuration().longValue());
Assert.assertEquals(TimeUnit.SECONDS.toNanos(4), forkContainers.get(2).getCurrentDuration().longValue());
Assert.assertEquals(TimeUnit.SECONDS.toNanos(3), forkContainers.get(3).getCurrentDuration().longValue());
}
@Test
public void durationToString() {
Assert.assertEquals("1 mins", BucketingAllocator.getDuration(60_000_000_000L));
Assert.assertEquals("4 secs", BucketingAllocator.getDuration(4_000_000_000L));
Assert.assertEquals("400 ms", BucketingAllocator.getDuration(400_000_000L));
Assert.assertEquals("400000 ns", BucketingAllocator.getDuration(400_000L));
}
}
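The comments in shouldAllocateTestsAcrossForksEvenIfNoMatchingTestsFoundAndUseExistingValues describe the allocation scheme: sort classes by decreasing duration, substitute the mean for unknown classes, and drop each class onto the currently lightest fork. A hedged sketch of that greedy idea under those assumptions; GreedyBucketsSketch is illustrative, not the real BucketingAllocator:

```java
import java.util.Arrays;

// Sketch of greedy longest-processing-time bucketing: largest class first,
// always onto the fork with the smallest accumulated duration.
public class GreedyBucketsSketch {
    static long[] allocate(long[] durationsNanos, int forks) {
        long[] load = new long[forks];
        long[] sorted = durationsNanos.clone();
        Arrays.sort(sorted); // ascending; we iterate from the back for descending order
        for (int i = sorted.length - 1; i >= 0; i--) {
            int lightest = 0;
            for (int f = 1; f < forks; f++) {
                if (load[f] < load[lightest]) lightest = f;
            }
            load[lightest] += sorted[i];
        }
        return load;
    }

    public static void main(String[] args) {
        // A 3s class, a 1s class, and one unknown class taken at the 2s mean, over 2 forks:
        // 3s fills fork 0; 2s then 1s land on fork 1, so both forks finish at 3s,
        // matching the fork durations asserted in the test above.
        long[] load = allocate(new long[]{3_000_000_000L, 1_000_000_000L, 2_000_000_000L}, 2);
        System.out.println(Arrays.toString(load)); // prints "[3000000000, 3000000000]"
    }
}
```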
