JUnit-XML based PyTest report
Using the terminal, you can create JUnit-XML based reports from PyTest and upload them to Test Management.
Executing a Test Case with PyTest
You can set up a PyTest project using this Git repository. Use the following sample code to execute a Test Case.
import pytest


def test_div_zero_exception(record_property):
    """
    pytest.raises can assert that exceptions are raised (catching them)
    """
    record_property("id", "TC-52628")
    with pytest.raises(ZeroDivisionError):
        x = 1 / 0


def test_keyerror_details(record_property):
    """
    The raised exception can be referenced, and further inspected (or asserted)
    """
    record_property("id", "TC-52629")
    my_map = {"foo": "bar"}
    with pytest.raises(KeyError) as ke:
        baz = my_map["baz"]
    # Our KeyError should reference the missing key, "baz"
    assert "baz" in str(ke)


def test_approximate_matches(record_property):
    """
    pytest.approx can be used to assert "approximate" numerical equality
    (compare to "assertAlmostEqual" in unittest.TestCase)
    """
    record_property("id", "TC-52630")
    assert 0.1 + 0.2 == pytest.approx(0.3)
Create JUnit-XML test report with PyTest
PyTest has built-in support for generating JUnit-XML reports through the --junitxml option. Run the following command to execute the test run and generate the JUnit-XML report.
python3 -m pytest --junitxml=./test.xml
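If you trigger test runs from a script rather than the shell, you can pass the same flag to pytest.main. The following is a minimal sketch; the report path is illustrative and matches the command above:

import sys

import pytest

# pytest.main accepts the same command-line flags as the pytest CLI
# and returns the exit code of the run.
exit_code = pytest.main(["--junitxml=./test.xml"])
sys.exit(exit_code)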
Here is a sample JUnit-XML report generated by PyTest. Note that each record_property call appears as a <property> element under its test case.
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
  <testsuite name="pytest" errors="0" failures="0" skipped="0" tests="7" time="0.017" timestamp="2023-04-18T18:10:07.535667" hostname="Sh*****">
    <testcase classname="00_empty_test" name="test_empty" time="0.000" />
    <testcase classname="01_basic_test" name="test_example" time="0.000" />
    <testcase classname="02_special_assertions_test" name="test_div_zero_exception" time="0.000">
      <properties>
        <property name="id" value="TC-52628" />
      </properties>
    </testcase>
    <testcase classname="02_special_assertions_test" name="test_keyerror_details" time="0.000">
      <properties>
        <property name="id" value="TC-52629" />
      </properties>
    </testcase>
    <testcase classname="02_special_assertions_test" name="test_approximate_matches" time="0.000">
      <properties>
        <property name="id" value="TC-52630" />
      </properties>
    </testcase>
    <testcase classname="03_simple_fixture_test" name="test_with_local_fixture" time="0.000" />
    <testcase classname="03_simple_fixture_test" name="test_with_global_fixture" time="0.000" />
  </testsuite>
</testsuites>
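Before uploading, you can sanity-check the generated report with a quick parse. The following is a minimal sketch using Python's standard library; it assumes the report was written to ./test.xml as in the command above:

import xml.etree.ElementTree as ET

tree = ET.parse("./test.xml")

# Walk every <testcase> element and print its name along with any
# recorded properties, such as the "id" set via record_property.
for testcase in tree.getroot().iter("testcase"):
    props = {
        prop.get("name"): prop.get("value")
        for prop in testcase.iter("property")
    }
    print(testcase.get("classname"), testcase.get("name"), props)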
Steps to upload the JUnit-XML test report
Open the project directory in the terminal and set the following environment variables.
export TEST_MANAGEMENT_API_TOKEN="*************28a42"
export TEST_MANAGEMENT_PROJECT_NAME="<Project Name>"
export JUNIT_XML_FILE_PATH="<Report Path>"
export TEST_RUN_NAME="<Test Run Name>"
You can obtain the TEST_MANAGEMENT_API_TOKEN from the Active API Key section in the Settings of BrowserStack Test Management.
Upload the JUnit-XML test report using the curl command.
curl -k -X POST https://test-management.browserstack.com/api/v1/import/results/xml/junit \
-u $TEST_MANAGEMENT_API_TOKEN \
-F project_name="$TEST_MANAGEMENT_PROJECT_NAME" \
-F "file_path=@$JUNIT_XML_FILE_PATH" \
-F test_run_name="$TEST_RUN_NAME"
Do not terminate the cURL command until you get the response back.
You can also upload the JUnit-XML test report with PyTest using CI/CD, as shown in the sketch below.
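For example, a CI step could replace the curl call with a short Python script. The following is a minimal sketch using the third-party requests package (assumed to be installed); the endpoint, form fields, and token authentication mirror the curl command above:

import os

import requests

API_URL = "https://test-management.browserstack.com/api/v1/import/results/xml/junit"

# Reuse the same environment variables exported earlier.
token = os.environ["TEST_MANAGEMENT_API_TOKEN"]
report_path = os.environ["JUNIT_XML_FILE_PATH"]

with open(report_path, "rb") as report:
    # curl's -u $TOKEN sends the token as the username with an empty
    # password. Note that requests verifies TLS certificates by default,
    # whereas the curl example passes -k to skip verification.
    response = requests.post(
        API_URL,
        auth=(token, ""),
        data={
            "project_name": os.environ["TEST_MANAGEMENT_PROJECT_NAME"],
            "test_run_name": os.environ["TEST_RUN_NAME"],
        },
        files={"file_path": report},
    )

response.raise_for_status()
# The JSON response includes a "url" field pointing at the new test run.
print(response.json()["url"])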
Access the test run report from the console-generated URL
After your report is successfully uploaded, you see a message similar to the following in your terminal. The response also contains a "url" field that you can use to access your automated test run.
{"message":"File uploaded successfully.","url":"https://test-management.browserstack.com/projects/<project id>/test-runs/<test run id>","success":true}
View test run report in Test Management
- Log in to the BrowserStack Test Management dashboard.
- Select the relevant project that has the test report uploaded.
- Click Test Runs.
- Open the Test Run generated from the automation test execution.
- You will find all the test cases with their respective results here.