# Unit Test Framework Package

## About

This package adds a unit test framework capable of building tests for multiple contexts, including
the UEFI shell environment and host-based environments. It allows unit test development to focus
on the tests themselves, leaving error logging, result formatting, context persistence, and test running to the framework.
The unit test framework works well for low-level unit tests as well as system-level tests and
fits easily into automation frameworks.

### UnitTestLib

The main "framework" library. The core of the framework is the Framework object, which can have any number
of test cases and test suites registered with it. The Framework object is also what drives test execution.

The Framework also provides helper macros and functions for checking test conditions and
reporting errors. Status and error info will be logged into the test context. There are a number
of Assert macros that make the unit test code friendly to view and easy to understand.

Finally, the Framework also supports logging strings during the test execution. This data is logged
to the test context and will be available in the test reporting phase. This should be used for
logging test details and helpful messages to resolve test failures.

### UnitTestPersistenceLib

The persistence library's main job is saving and restoring test context to a storage medium so that
state can be maintained across tests that require exiting and resuming the active process. This is critical
in supporting a system reboot in the middle of a test run.

### UnitTestResultReportLib

This library provides a function to run at the end of a framework test run that handles formatting the report.
This is a common customization point and allows the unit test framework to fit its output reports into
other test infrastructure. In this package, a simple library instance has been supplied to output test
results to the console as plain text.

## Samples

There is a sample unit test provided as an example of how to write a unit test and leverage
many of the features of the framework. This sample can be found in the `Test/UnitTest/Sample/SampleUnitTest`
directory.

The sample is provided in PEI, SMM, DXE, and UEFI App flavors. It also has a flavor for the HOST_APPLICATION
build type, which can be run on a host system without needing a target.

## Usage

This section is built a lot like a "Getting Started". We'll go through some of the components that are needed
when constructing a unit test and some of the decisions that are made by the test writer. We'll also describe
how to check for expected conditions in test cases and a bit of the logging characteristics.

Most of these examples will refer to the SampleUnitTestUefiShell app found in this package.

### Requirements - INF

In our INF file, we'll need to bring in the `UnitTestLib` library. Conveniently, the interface
header for the `UnitTestLib` is located in `MdePkg`, so you shouldn't need to depend on any other
packages. As long as your DSC file knows where to find the lib implementation that you want to use,
you should be good to go.

See this example in 'SampleUnitTestApp.inf'...

```
[Packages]
  MdePkg/MdePkg.dec

[LibraryClasses]
  UefiApplicationEntryPoint
  BaseLib
  DebugLib
  UnitTestLib
  PrintLib
```

### Requirements - Code

Not to state the obvious, but let's make sure we have the following include before getting too far along...

```c
#include <Library/UnitTestLib.h>
```

Now that we've got that squared away, let's look at our `Main()` routine (or `DriverEntryPoint()` or whatever).

### Configuring the Framework

Everything in the UnitTestPkg framework is built around an object called -- conveniently -- the Framework.
This Framework object will contain all the information about our test, the test suites and test cases associated
with it, the current location within the test pass, and any results that have been recorded so far.

To get started with a test, we must first create a Framework instance. The function for this is
`InitUnitTestFramework`. It takes in `CHAR8` strings for the long name, short name, and test version.
The long name and version strings are just for user presentation and relatively flexible. The short name
will be used to name any cache files and/or test results, so should be a name that makes sense in that context.
These strings are copied internally to the Framework, so using stack-allocated or literal strings is fine.

In the 'SampleUnitTestUefiShell' app, the module name is used as the short name, so the init looks like this.

```c
DEBUG(( DEBUG_INFO, "%a v%a\n", UNIT_TEST_APP_NAME, UNIT_TEST_APP_VERSION ));

//
// Start setting up the test framework for running the tests.
//
Status = InitUnitTestFramework( &Framework, UNIT_TEST_APP_NAME, gEfiCallerBaseName, UNIT_TEST_APP_VERSION );
```

The `&Framework` returned here is the handle to the Framework. If it's successfully returned, we can start adding
test suites and test cases.

Test suites exist purely to help organize test cases and to differentiate the results in reports. If you're writing
a small unit test, you can conceivably put all test cases into a single suite. However, if you end up with 20+ test
cases, it may be beneficial to organize them according to purpose. You _must_ have at least one test suite, even if
it's just a catch-all. The function to create a test suite is `CreateUnitTestSuite`. It takes in a handle to
the Framework object, a `CHAR8` string for the suite title and package name, and optional function pointers for
a setup function and a teardown function.

The suite title is for user presentation. The package name is for xUnit type reporting and uses a '.'-separated
hierarchical format (see 'SampleUnitTestApp' for example). If provided, the setup and teardown functions will be
called once at the start of the suite (before _any_ tests have run) and once at the end of the suite (after _all_
tests have run), respectively. If either or both of these are unneeded, pass `NULL`. The function prototypes are
`UNIT_TEST_SUITE_SETUP` and `UNIT_TEST_SUITE_TEARDOWN`.

Looking at 'SampleUnitTestUefiShell' app, you can see that the first test suite is created as below...

```c
//
// Populate the SimpleMathTests Unit Test Suite.
//
Status = CreateUnitTestSuite( &SimpleMathTests, Fw, "Simple Math Tests", "Sample.Math", NULL, NULL );
```

This test suite has no setup or teardown functions. The `&SimpleMathTests` returned here is a handle to the suite and
will be used when adding test cases.

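If a suite does need setup or teardown work, the call would instead pass function pointers in the last two
parameters. A hedged sketch (the `MySuiteSetup` and `MySuiteTeardown` names below are illustrative, not from
the sample app, and are assumed to match the `UNIT_TEST_SUITE_SETUP` and `UNIT_TEST_SUITE_TEARDOWN` prototypes):

```c
//
// Hypothetical suite with setup/teardown. MySuiteSetup runs once before any
// test in the suite; MySuiteTeardown runs once after all of them.
//
Status = CreateUnitTestSuite( &MySuite, Fw, "My Tests", "Sample.My", MySuiteSetup, MySuiteTeardown );
```
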
Great! Now we've finished some of the cruft, red tape, and busy work. We're ready to add some tests. Adding a test
to a test suite is accomplished with the -- you guessed it -- `AddTestCase` function. It takes in the suite handle;
a `CHAR8` string for the description and class name; a function pointer for the test case itself; additional, optional
function pointers for prerequisite check and cleanup routines; and an optional pointer to a context structure.

Okay, that's a lot. Let's take it one piece at a time. The description and class name strings are very similar in
usage to the suite title and package name strings in the test suites. The former is for user presentation and the
latter is for xUnit parsing. The test case function pointer is what is actually executed as the "test" and the
prototype should be `UNIT_TEST_FUNCTION`. The last three parameters require a little bit more explaining.

The prerequisite check function has a prototype of `UNIT_TEST_PREREQUISITE` and -- if provided -- will be called
immediately before the test case. If this function returns any error, the test case will not be run and will be
recorded as `UNIT_TEST_ERROR_PREREQUISITE_NOT_MET`. The cleanup function (prototype `UNIT_TEST_CLEANUP`) will be called
immediately after the test case to provide an opportunity to reset any global state that may have been changed in the
test case. In the event of a prerequisite failure, the cleanup function will also be skipped. If either of these
functions is not needed, pass `NULL`.

The context pointer is entirely case-specific. It will be passed to the test case upon execution. One of the purposes
of the context pointer is to allow test case reuse with different input data. (Another use is for testing that wraps
around a system reboot, but that's beyond the scope of this guide.) The test case must know how to interpret the context
pointer, so it could be a simple value, or it could be a complex structure. If unneeded, pass `NULL`.

In 'SampleUnitTestUefiShell' app, the first test case is added using the code below...

```c
AddTestCase( SimpleMathTests, "Adding 1 to 1 should produce 2", "Addition", OnePlusOneShouldEqualTwo, NULL, NULL, NULL );
```

This test case calls the function `OnePlusOneShouldEqualTwo` and has no prerequisite, cleanup, or context.

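For contrast, a case that uses all of the optional parameters might look like the following hedged sketch
(`MathTestPrereq`, `MathTestCleanup`, `AdditionWithContext`, and `mContext` are illustrative names, not from
the sample app; the function pointers are assumed to match the `UNIT_TEST_PREREQUISITE` and `UNIT_TEST_CLEANUP`
prototypes described above):

```c
//
// Hypothetical context value, interpreted only by the test case itself.
//
STATIC UINT64  mContext = 42;

AddTestCase( SimpleMathTests, "Adding A to B should produce C", "AdditionWithContext",
             AdditionWithContext, MathTestPrereq, MathTestCleanup, &mContext );
```
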
Once all the suites and cases are added, it's time to run the Framework.

```c
//
// Execute the tests.
//
Status = RunAllTestSuites( Framework );
```

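Putting the pieces together, the overall flow can be sketched as below. This is a condensed, hedged version of
the sample app's entry point (error handling simplified); it assumes your `UnitTestLib` instance provides
`FreeUnitTestFramework` for releasing the framework handle when finished.

```c
Status = InitUnitTestFramework( &Fw, UNIT_TEST_APP_NAME, gEfiCallerBaseName, UNIT_TEST_APP_VERSION );
if (EFI_ERROR( Status )) {
  goto EXIT;
}

Status = CreateUnitTestSuite( &SimpleMathTests, Fw, "Simple Math Tests", "Sample.Math", NULL, NULL );
if (EFI_ERROR( Status )) {
  goto EXIT;
}

AddTestCase( SimpleMathTests, "Adding 1 to 1 should produce 2", "Addition", OnePlusOneShouldEqualTwo, NULL, NULL, NULL );

//
// Run the suites, then release the framework and all recorded results.
//
Status = RunAllTestSuites( Fw );

EXIT:
if (Fw != NULL) {
  FreeUnitTestFramework( Fw );
}
```
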
### A Simple Test Case

We'll take a look at the below test case from 'SampleUnitTestApp'...

```c
UNIT_TEST_STATUS
EFIAPI
OnePlusOneShouldEqualTwo (
  IN UNIT_TEST_FRAMEWORK_HANDLE  Framework,
  IN UNIT_TEST_CONTEXT           Context
  )
{
  UINTN  A, B, C;

  A = 1;
  B = 1;
  C = A + B;

  UT_ASSERT_EQUAL(C, 2);
  return UNIT_TEST_PASSED;
} // OnePlusOneShouldEqualTwo()
```

The prototype for this function matches the `UNIT_TEST_FUNCTION` prototype. It takes in a handle to the Framework
itself and the context pointer. The context pointer could be cast and interpreted as anything within this test case,
which is why it's important to configure contexts carefully. The test case returns a value of `UNIT_TEST_STATUS`, which
will be recorded in the Framework and reported at the end of all suites.

In this test case, the `UT_ASSERT_EQUAL` assertion is being used to establish that the business logic has functioned
correctly. There are several assertion macros, and you are encouraged to use one that matches your intended test
criterion as closely as possible, because the logging is specific to the macro and more specific macros have more
detailed logs. When in doubt, there are always `UT_ASSERT_TRUE` and `UT_ASSERT_FALSE`. Assertion macros that fail their
test criterion will immediately return from the test case with `UNIT_TEST_ERROR_TEST_FAILED` and log an error string.
_Note_ that this early return can have implications for memory leaks.

At the end, if all test criteria pass, you should return `UNIT_TEST_PASSED`.

### More Complex Cases

To write more advanced tests, first take a look at all the Assertion and Logging macros provided in the framework.

Beyond that, if you're writing host-based tests and want to take a dependency on the UnitTestFrameworkPkg, you can
leverage the `cmocka.h` interface and write tests with all the features of the Cmocka framework.

Documentation for Cmocka can be found here:
https://api.cmocka.org/

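A minimal standalone cmocka test (independent of the UEFI framework) might look like the hedged sketch below;
consult the cmocka documentation above for the full API.

```c
// cmocka requires these headers to be included before cmocka.h.
#include <stdarg.h>
#include <stddef.h>
#include <setjmp.h>
#include <stdint.h>
#include <cmocka.h>

static void
test_one_plus_one (void **state)
{
  (void)state;
  assert_int_equal (1 + 1, 2);
}

int
main (void)
{
  const struct CMUnitTest tests[] = {
    cmocka_unit_test (test_one_plus_one),
  };

  // Returns the number of failed tests; 0 on success.
  return cmocka_run_group_tests (tests, NULL, NULL);
}
```
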
## Development

When using the EDK2 Pytools for CI testing, the host-based unit tests will be built and run on any build that includes the `NOOPT` build target.

If you are trying to iterate on a single test, a convenient pattern is to build only that test module. For example, the following command will build only the SafeIntLib host-based test from the MdePkg...

```bash
stuart_ci_build -c .pytool/CISettings.py TOOL_CHAIN_TAG=VS2017 -p MdePkg -t NOOPT BUILDMODULE=MdePkg/Test/UnitTest/Library/BaseSafeIntLib/TestBaseSafeIntLib.inf
```

## Known Limitations

### PEI, DXE, SMM

While sample tests have been provided for these execution environments, only cursory build validation
has been performed. Care has been taken while designing the frameworks to allow for execution during
boot phases, but only UEFI Shell and host-based tests have been thoroughly evaluated. Full support for
PEI, DXE, and SMM is forthcoming, but should be considered beta/staging for now.

### Host-Based Support vs Other Tests

The host-based test framework is powered internally by the Cmocka framework. As such, it has abilities
that the target-based tests don't (yet). It would be awesome if this meant that it was a superset of
the target-based tests, and it worked just like the target-based tests but with more features. Unfortunately,
this is not the case. While care has been taken to keep them as close as possible, there are a few known
inconsistencies that we're still ironing out. For example, the logging messages in the target-based tests
are cached internally and associated with the running test case. They can be saved later as part of the
reporting lib. This isn't currently possible with host-based tests; only the assertion failures are logged.

We will continue trying to make these as similar as possible.

## Copyright

Copyright (c) Microsoft Corporation.
SPDX-License-Identifier: BSD-2-Clause-Patent