Upload JUnit XML reports API

Upload API helps users import JUnit test reports in XML format into Test Observability. You can include this API in the CI workflow as a step, for a seamless integration with Test Observability.

Note: While most test frameworks have a built-in JUnit XML export feature, these reports can hold only a limited amount of information. Certain Test Observability features, such as test execution video, are therefore not available when JUnit XML reports are used as the data source.

API details

POST https://upload-observability.browserstack.com/upload

Request parameters

Request (macOS/Linux)

curl -u "YOUR_USERNAME:YOUR_ACCESS_KEY" -vvv \
        -X POST \
        -F "data=@/Users/testUser/junit_report.xml" \
        -F "projectName=Junit report uploads" \
        -F "buildName=Observability Sample" \
        -F "buildIdentifier=Your Build Identifier" \
        -F "tags=junit_upload, regression" \
        -F "ci=http://localhost:8080/" \
        -F "frameworkVersion=mocha, 10.0.0" \
    https://upload-observability.browserstack.com/upload

Request (Windows)

curl -u "YOUR_USERNAME:YOUR_ACCESS_KEY" -vvv \
        -X POST \
        -F 'data=@"/Users/testUser/junit_report.xml"' \
        -F "projectName=Junit report uploads" \
        -F "buildName=Observability Sample" \
        -F "buildIdentifier=Your Build Identifier" \
        -F "tags=junit_upload, regression" \
        -F "ci=http://localhost:8080/" \
        -F "frameworkVersion=mocha, 10.0.0" \
    https://upload-observability.browserstack.com/upload
  • data* File

    JUnit reports to be uploaded: either a single XML file, or a ZIP archive containing multiple XML files and, optionally, screenshots.

  • projectName* String

    Name of the project to which the build details should be added.

  • buildName* String

    Name of the build whose run/execution is being uploaded.

  • format String

    Type of the test reports being uploaded. Set to junit by default.

  • buildIdentifier String

    Unique identifier of the build run, used to differentiate between multiple build runs. When not passed, the upload is treated as a new run. Use the same identifier across multiple split-run or re-run uploads to map them under the same build run.

  • tags String

    Comma-separated strings users can pass to tag the build run appropriately (can be used for tagging the type of testing, release, etc.)

  • ci String

    CI link to the job if the CI system is integrated.

  • frameworkVersion String

    Framework name and version, passed as a comma-separated pair (e.g., “mocha, 10.0.0”).
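
When multiple XML files and screenshots need to go into a single `data` upload, they must be packaged as a ZIP archive. A minimal stdlib sketch of that packaging step (the `package_reports` helper and file layout are hypothetical, not part of the API): it keeps paths relative to the report directory, so screenshots remain addressable from `attachment` properties inside the XML.

```python
import zipfile
from pathlib import Path

def package_reports(report_dir, archive_path):
    """Zip all JUnit XML files and screenshots under report_dir for upload.

    Paths inside the archive stay relative to report_dir, so screenshots can
    be referenced from <property name="attachment" value="..."/> entries.
    Returns the list of archived member names.
    """
    root = Path(report_dir)
    added = []
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(root.rglob("*")):
            # Only XML reports and the image formats the API accepts.
            if path.suffix.lower() in {".xml", ".png", ".apng",
                                       ".jpg", ".jpeg", ".bmp", ".bmpf"}:
                arcname = path.relative_to(root)
                zf.write(path, arcname)
                added.append(str(arcname))
    return added
```

The resulting archive is what you would pass to the `data` form field in the curl request above.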

Response attributes (200 OK, JSON)

Response

{
    "status": "success",
    "message": "Request to import JUnit reports received. Visit https://observability.browserstack.com/builds/egx8vyi2sytcwdonwpecyi61ogmewffarud76oes for more details."
}
  • status String

    Status of your upload request.

  • message String

    More details on the upload request.
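
The `message` field embeds the URL of the build in Test Observability. A small sketch that pulls that URL out of a success response, assuming the body matches the example above (the `extract_build_url` helper is illustrative, not part of any SDK):

```python
import json
import re

def extract_build_url(response_body):
    """Return the build URL from a success response body, or None."""
    payload = json.loads(response_body)
    if payload.get("status") != "success":
        return None
    # The URL is embedded in the human-readable message text.
    match = re.search(r"https://\S+", payload.get("message", ""))
    return match.group(0) if match else None
```

A CI step could log this URL so the build page is one click away from the job output.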

Supported XML schemas

Sample JUnit report in XML format

<?xml version="1.0" encoding="UTF-8"?>
<!-- <testsuites> Usually the root element of a JUnit XML file. Some tools leave out 
the <testsuites> element if there is only a single top-level <testsuite> element (which
is then used as the root element).
name        Name of the entire test run
tests       Total number of tests in this file
failures    Total number of failed tests in this file
errors      Total number of errored tests in this file
skipped     Total number of skipped tests in this file
assertions  Total number of assertions for all tests in this file
time        Aggregated time of all tests in this file in seconds
timestamp   Date and time of when the test run was executed. Add this after the time parameter in ISO 8601 format (sample: timestamp="2024-04-18T15:48:23").
-->
<testsuites name="Test run" tests="8" failures="1" errors="1" skipped="1" 
    assertions="20" time="16.082687">
    
    <!-- <testsuite> A test suite usually represents a class, folder or group of tests.
    There can be many test suites in an XML file.
    name        Name of the test suite (e.g. class name or folder name)
    tests       Total number of tests in this suite
    failures    Total number of failed tests in this suite
    errors      Total number of errored tests in this suite
    skipped     Total number of skipped tests in this suite
    assertions  Total number of assertions for all tests in this suite
    time        Aggregated time of all tests in this suite in seconds
    timestamp   Date and time of when the test run was executed. Add this after the time parameter in ISO 8601 format (sample: timestamp="2024-04-18T15:48:23").
    file        Source code file of this test suite
    hostname    Name of the host where this test suite was run
    -->
    <testsuite name="Tests.Registration" tests="8" failures="1" errors="1" skipped="1" 
        assertions="20" time="16.082687" 
        file="tests/registration.code" hostname="C02G7B7FML7H">
        <!-- <properties> Test suites (and test cases, see below) can have additional 
        properties such as environment variables or version numbers. -->
        <properties>
            <!-- <property> Each property has a name and value. -->
            <property name="os" value="Android" />
            <property name="os_version" value="10" />
            <property name="browser" value="Google Chrome" />
            <!-- one of 'device' or 'devicename' properties can be used to add device information. -->
            <property name="devicename" value="Samsung Galaxy S10 Plus" />
            <property name="device" value="Samsung Galaxy S10 Plus" />
            <!-- 'hostname' can alternatively be included here. -->
            <property name="hostname" value="C02G7B7FML7H" />
            <property name="ci" value="https://github.com/actions/runs/1234" />  
        </properties>
        <!-- <system-out> Optionally data written to standard out for the suite. 
        Also supported on a test case level, see below. -->
        <system-out>Data written to standard out.</system-out>
        <!-- <system-err> Optionally data written to standard error for the suite. 
        Also supported on a test case level, see below. -->
        <system-err>Data written to standard error.</system-err>
        <!-- <testcase> There are one or more test cases in a test suite. A test is considered passed
        if there isn't an additional result element (skipped, failure, error). 
        name        The name of this test case, often the method name
        classname   The name of the parent class/folder, often the same as the suite's name
        assertions  Number of assertions checked during test case execution
        time        Execution time of the test in seconds
        file        Source code file of this test case
        line        Source code line number of the start of this test case
        result      An optional attribute to indicate the test result
        -->
        <testcase name="testCase1" classname="Tests.Registration" assertions="2"
            time="2.436" file="tests/registration.code" line="24" />
        <testcase name="testCase2" classname="Tests.Registration" assertions="6"
            time="1.534" file="tests/registration.code" line="62" />
        <!-- Example of a test case with properties -->
        <testcase name="testCase3" classname="Tests.Registration" assertions="3"
            time="0.822" file="tests/registration.code" line="102" >
            <!-- <properties> Some tools also support properties for test cases. You can use these to pass more details around tests.-->
            <properties>
                <property name="browser" value="Google Chrome" />
                <property name="os" value="Android" />
                <property name="device" value="Samsung Galaxy S10 Plus" />
                <property name="author" value="Adrian" />
                <!-- Screenshots should be included in the archive that is uploaded. Provide the relative path of each screenshot in the value. -->
                <property name="attachment" value="screenshots/testCase3_login.png" />
                <property name="attachment" value="screenshots/testCase3_add_to_cart.png" />
                <property name="tags" value="p1, must_pass, sanity"/>
            </properties>
        </testcase>
        <!-- Example of a test case that was skipped -->
        <testcase name="testCase4" classname="Tests.Registration" assertions="0"
            time="0" file="tests/registration.code" line="164">
            <!-- <skipped> Indicates that the test was not executed. Can have an optional
            message describing why the test was skipped. -->
            <skipped message="Test was skipped." />
        </testcase>
        <!-- Example of a test case that failed. -->
        <testcase name="testCase5" classname="Tests.Registration" assertions="2"
            time="2.902412" file="tests/registration.code" line="202">
            <!-- <failure> The test failed because one of the assertions/checks failed.
            Can have a message and failure type, often the assertion type or class. The text
            content of the element often includes the failure description or stack trace. -->
            <failure message="Expected value did not match." type="AssertionError">
                <!-- Failure description or stack trace -->
            </failure>
        </testcase>
        <!-- Example of a test case that had errors. -->
        <testcase name="testCase6" classname="Tests.Registration" assertions="0"
            time="3.819" file="tests/registration.code" line="235">
            <!-- <error> The test had an unexpected error during execution. Can have a 
            message and error type, often the exception type or class. The text
            content of the element often includes the error description or stack trace. -->
            <error message="Division by zero." type="ArithmeticError">
                <!-- Error description or stack trace -->
            </error>
        </testcase>
        <!-- Example of a test case with outputs. -->
        <testcase name="testCase7" classname="Tests.Registration" assertions="3"
            time="2.944" file="tests/registration.code" line="287">
            <!-- <system-out> Optional data written to standard out for the test case. -->
            <system-out>Data written to standard out.</system-out>
            <!-- <system-err> Optional data written to standard error for the test case. -->
            <system-err>Data written to standard error.</system-err>
        </testcase>
        <testcase name="testCase8" classname="Tests.Registration" assertions="4"
            time="1.625275" file="tests/registration.code" line="302" />
    </testsuite>
</testsuites>

The tricky part of dealing with JUnit XML reports is that there is no official JUnit XML schema. Various tools generate and support different flavors of XML formats.

Test Observability supports all of the common JUnit XML schemas, such as Apache Ant, Jenkins, and Maven Surefire. The sample XML file above lists all the elements, attributes, and properties accepted by Test Observability.

You can find additional information on the JUnit XML format in this writeup by the Testmo team.
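
Because there is no official schema, it can be worth sanity-checking a report before uploading it, for example by confirming the root element and tallying actual test results. A stdlib sketch (the `summarize_report` helper is hypothetical, and handles both a `<testsuites>` root and a bare `<testsuite>` root as described above):

```python
import xml.etree.ElementTree as ET

def summarize_report(xml_text):
    """Count test results in a JUnit XML report.

    The root may be <testsuites> or, as some tools emit, a single
    top-level <testsuite>.
    """
    root = ET.fromstring(xml_text)
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    summary = {"tests": 0, "failures": 0, "errors": 0, "skipped": 0}
    for suite in suites:
        for case in suite.iter("testcase"):
            summary["tests"] += 1
            # A test case with no result child element is considered passed.
            if case.find("failure") is not None:
                summary["failures"] += 1
            elif case.find("error") is not None:
                summary["errors"] += 1
            elif case.find("skipped") is not None:
                summary["skipped"] += 1
    return summary
```

Comparing these counts against the `tests`/`failures`/`errors`/`skipped` attributes on `<testsuites>` is a quick way to spot a malformed export before it reaches the upload API.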

Limitations

  • The maximum supported input file size is 100MB.
  • You have to upload JUnit XML reports chronologically. Test Observability considers the timestamp attribute of the testsuite element in the XML file as the build start time. If the timestamp attribute is not present, Test Observability considers the upload time as the build end time and calculates the start time appropriately.
  • The buildIdentifier expires after six hours. Hence, new test results cannot be appended to an existing build run after six hours from the first API call.
  • XML report parsing is asynchronous. Therefore, it may take a few seconds for the uploaded tests to appear on Test Observability.
  • If you are uploading a zip file, you can add images along with the XML files. These image files should be in PNG, APNG, JPG/JPEG, or BMP/BMPF format.
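
The size and file-type limits above can be checked locally before calling the API. A sketch of such a pre-upload check (the `validate_upload` helper is illustrative; the limits are the ones listed in this section):

```python
import zipfile
from pathlib import Path

MAX_UPLOAD_BYTES = 100 * 1024 * 1024  # 100MB limit from the docs
ALLOWED_IMAGE_EXTS = {".png", ".apng", ".jpg", ".jpeg", ".bmp", ".bmpf"}

def validate_upload(path):
    """Return a list of problems found, empty if the file looks uploadable."""
    problems = []
    p = Path(path)
    if p.stat().st_size > MAX_UPLOAD_BYTES:
        problems.append("file exceeds the 100MB limit")
    suffix = p.suffix.lower()
    if suffix == ".zip":
        # A ZIP may contain XML reports plus screenshots in allowed formats.
        with zipfile.ZipFile(p) as zf:
            for name in zf.namelist():
                ext = Path(name).suffix.lower()
                if ext and ext not in ALLOWED_IMAGE_EXTS | {".xml"}:
                    problems.append(f"unsupported file in archive: {name}")
    elif suffix != ".xml":
        problems.append("upload must be an XML file or a ZIP archive")
    return problems
```

Running this as a CI step ahead of the curl call fails fast instead of waiting on an upload that the API would reject.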
