# Introduction: Why Google C++ Testing Framework? #

_Google C++ Testing Framework_ helps you write better C++ tests.

No matter whether you work on Linux, Windows, or a Mac, if you write C++ code,
Google Test can help you.

So what makes a good test, and how does Google C++ Testing Framework fit in? We believe:
 1. Tests should be _independent_ and _repeatable_. It's a pain to debug a test that succeeds or fails as a result of other tests. Google C++ Testing Framework isolates the tests by running each of them on a different object. When a test fails, Google C++ Testing Framework allows you to run it in isolation for quick debugging.
 1. Tests should be well _organized_ and reflect the structure of the tested code. Google C++ Testing Framework groups related tests into test cases that can share data and subroutines. This common pattern is easy to recognize and makes tests easy to maintain. Such consistency is especially helpful when people switch projects and start to work on a new code base.
 1. Tests should be _portable_ and _reusable_. The open-source community has a lot of code that is platform-neutral; its tests should also be platform-neutral. Google C++ Testing Framework works on different OSes, with different compilers (gcc, MSVC, and others), with or without exceptions, so Google C++ Testing Framework tests can easily work with a variety of configurations. (Note that the current release only contains build scripts for Linux - we are actively working on scripts for other platforms.)
 1. When tests fail, they should provide as much _information_ about the problem as possible. Google C++ Testing Framework doesn't stop at the first test failure. Instead, it only stops the current test and continues with the next. You can also set up tests that report non-fatal failures after which the current test continues. Thus, you can detect and fix multiple bugs in a single run-edit-compile cycle.
 1. The testing framework should liberate test writers from housekeeping chores and let them focus on the test _content_. Google C++ Testing Framework automatically keeps track of all tests defined, and doesn't require the user to enumerate them in order to run them.
 1. Tests should be _fast_. With Google C++ Testing Framework, you can reuse shared resources across tests and pay for the set-up/tear-down only once, without making tests depend on each other.

Since Google C++ Testing Framework is based on the popular xUnit
architecture, you'll feel right at home if you've used JUnit or PyUnit before.
If not, it will take you about 10 minutes to learn the basics and get started.
So let's go!

_Note:_ We sometimes refer to Google C++ Testing Framework informally
as _Google Test_.

# Setting up a New Test Project #

To write a test program using Google Test, you need to compile Google
Test into a library and link your test with it. We provide build
files for some popular build systems (`msvc/` for Visual Studio,
`xcode/` for Mac Xcode, `make/` for GNU make, `codegear/` for Borland
C++ Builder, and the autotools script in the
Google Test root directory). If your build system is not on this
list, you can take a look at `make/Makefile` to learn how Google Test
should be compiled (basically you want to compile `src/gtest-all.cc`
with `GTEST_ROOT` and `GTEST_ROOT/include` in the header search path,
where `GTEST_ROOT` is the Google Test root directory).

Once you are able to compile the Google Test library, you should
create a project or build target for your test program. Make sure you
have `GTEST_ROOT/include` in the header search path so that the
compiler can find `<gtest/gtest.h>` when compiling your test. Set up
your test project to link with the Google Test library (for example,
in Visual Studio, this is done by adding a dependency on
`gtest.vcproj`).

If you still have questions, take a look at how Google Test's own
tests are built and use them as examples.

# Basic Concepts #

When using Google Test, you start by writing _assertions_, which are statements
that check whether a condition is true. An assertion's result can be _success_,
_nonfatal failure_, or _fatal failure_. If a fatal failure occurs, it aborts
the current function; otherwise the program continues normally.

_Tests_ use assertions to verify the tested code's behavior. If a test crashes
or has a failed assertion, then it _fails_; otherwise it _succeeds_.

A _test case_ contains one or more tests. You should group your tests into test
cases that reflect the structure of the tested code. When multiple tests in a
test case need to share common objects and subroutines, you can put them into a
_test fixture_ class.

A _test program_ can contain multiple test cases.

We'll now explain how to write a test program, starting at the individual
assertion level and building up to tests and test cases.

# Assertions #

Google Test assertions are macros that resemble function calls. You test a
class or function by making assertions about its behavior. When an assertion
fails, Google Test prints the assertion's source file and line number location,
along with a failure message. You may also supply a custom failure message
which will be appended to Google Test's message.

The assertions come in pairs that test the same thing but have different
effects on the current function. `ASSERT_*` versions generate fatal failures
when they fail, and **abort the current function**. `EXPECT_*` versions generate
nonfatal failures, which don't abort the current function. Usually `EXPECT_*`
are preferred, as they allow more than one failure to be reported in a test.
However, you should use `ASSERT_*` if it doesn't make sense to continue when
the assertion in question fails.

Since a failed `ASSERT_*` returns from the current function immediately,
possibly skipping clean-up code that comes after it, it may cause a space leak.
Depending on the nature of the leak, it may or may not be worth fixing - so
keep this in mind if you get a heap checker error in addition to assertion
errors.

To provide a custom failure message, simply stream it into the macro using the
`<<` operator, or a sequence of such operators. An example:
```
ASSERT_EQ(x.size(), y.size()) << "Vectors x and y are of unequal length";

for (int i = 0; i < x.size(); ++i) {
  EXPECT_EQ(x[i], y[i]) << "Vectors x and y differ at index " << i;
}
```

Anything that can be streamed to an `ostream` can be streamed to an assertion
macro--in particular, C strings and `string` objects. If a wide string
(`wchar_t*`, `TCHAR*` in `UNICODE` mode on Windows, or `std::wstring`) is
streamed to an assertion, it will be translated to UTF-8 when printed.

## Basic Assertions ##

These assertions do basic true/false condition testing.

| **Fatal assertion** | **Nonfatal assertion** | **Verifies** |
|:--------------------|:-----------------------|:-------------|
| `ASSERT_TRUE(`_condition_`)`; | `EXPECT_TRUE(`_condition_`)`; | _condition_ is true |
| `ASSERT_FALSE(`_condition_`)`; | `EXPECT_FALSE(`_condition_`)`; | _condition_ is false |

Remember, when they fail, `ASSERT_*` yields a fatal failure and
returns from the current function, while `EXPECT_*` yields a nonfatal
failure, allowing the function to continue running. In either case, an
assertion failure means its containing test fails.
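
For instance, here is a minimal sketch of how the two flavors read in practice, using the `TEST()` macro described under Simple Tests below. The `IsPrime()` helper is a hypothetical function you want to test, not part of Google Test:
```
// Hypothetical function under test -- not part of Google Test.
bool IsPrime(int n);

TEST(IsPrimeTest, HandlesSmallInputs) {
  EXPECT_TRUE(IsPrime(2));   // Nonfatal: on failure, the test keeps running.
  EXPECT_FALSE(IsPrime(4));  // Also nonfatal.
  ASSERT_TRUE(IsPrime(3));   // Fatal: on failure, aborts the current test function.
  EXPECT_FALSE(IsPrime(9));  // Only reached if the ASSERT above passed.
}
```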

_Availability_: Linux, Windows, Mac.

## Binary Comparison ##

This section describes assertions that compare two values.

| **Fatal assertion** | **Nonfatal assertion** | **Verifies** |
|:--------------------|:-----------------------|:-------------|
|`ASSERT_EQ(`_expected_`, `_actual_`);`|`EXPECT_EQ(`_expected_`, `_actual_`);`| _expected_ `==` _actual_ |
|`ASSERT_NE(`_val1_`, `_val2_`);` |`EXPECT_NE(`_val1_`, `_val2_`);` | _val1_ `!=` _val2_ |
|`ASSERT_LT(`_val1_`, `_val2_`);` |`EXPECT_LT(`_val1_`, `_val2_`);` | _val1_ `<` _val2_ |
|`ASSERT_LE(`_val1_`, `_val2_`);` |`EXPECT_LE(`_val1_`, `_val2_`);` | _val1_ `<=` _val2_ |
|`ASSERT_GT(`_val1_`, `_val2_`);` |`EXPECT_GT(`_val1_`, `_val2_`);` | _val1_ `>` _val2_ |
|`ASSERT_GE(`_val1_`, `_val2_`);` |`EXPECT_GE(`_val1_`, `_val2_`);` | _val1_ `>=` _val2_ |

In the event of a failure, Google Test prints both _val1_ and _val2_.
In `ASSERT_EQ*` and `EXPECT_EQ*` (and all other equality assertions
we'll introduce later), you should put the expression you want to test
in the position of _actual_, and put its expected value in _expected_,
as Google Test's failure messages are optimized for this convention.

Value arguments must be comparable by the assertion's comparison operator or
you'll get a compiler error. Values must also support the `<<` operator for
streaming to an `ostream`. All built-in types support this.

These assertions can work with a user-defined type, but only if you define the
corresponding comparison operator (e.g. `==`, `<`, etc.). If the corresponding
operator is defined, prefer using the `ASSERT_*()` macros because they will
print out not only the result of the comparison, but the two operands as well.
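
As an illustration, here is a sketch of what this looks like for a hypothetical `Point` type. The type and its operators are assumptions made up for this example, not something Google Test provides:
```
// Assumes <ostream> has been included.

// Hypothetical user-defined type.
struct Point {
  int x;
  int y;
};

// Needed so EXPECT_EQ/ASSERT_EQ can compare Points.
bool operator==(const Point& a, const Point& b) {
  return a.x == b.x && a.y == b.y;
}

// Needed so Google Test can print Points in failure messages.
std::ostream& operator<<(std::ostream& os, const Point& p) {
  return os << "(" << p.x << ", " << p.y << ")";
}

TEST(PointTest, Equality) {
  Point expected = {1, 2};
  Point actual = {1, 2};
  EXPECT_EQ(expected, actual);  // Expected value first, by convention.
}
```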

Arguments are always evaluated exactly once. Therefore, it's OK for the
arguments to have side effects. However, as with any ordinary C/C++ function,
the arguments' evaluation order is undefined (i.e. the compiler is free to
choose any order) and your code should not depend on any particular argument
evaluation order.

`ASSERT_EQ()` does pointer equality on pointers. If used on two C strings, it
tests if they are in the same memory location, not if they have the same value.
Therefore, if you want to compare C strings (e.g. `const char*`) by value, use
`ASSERT_STREQ()`, which will be described later on. In particular, to assert
that a C string is `NULL`, use `ASSERT_STREQ(NULL, c_string)`. However, to
compare two `string` objects, you should use `ASSERT_EQ`.
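
To make the distinction concrete, here is a small sketch; the test and variable names are made up for illustration:
```
// Assumes <string> has been included.

TEST(StringCompareTest, PointerVersusValue) {
  const char* c_str = "hello";
  std::string str1 = "hello";
  std::string str2 = "hello";

  // EXPECT_EQ on C strings compares the pointers, not the characters.
  // EXPECT_STREQ compares the characters, which is usually what you want:
  EXPECT_STREQ("hello", c_str);

  // For string objects, EXPECT_EQ compares values, so this works as expected.
  EXPECT_EQ(str1, str2);
}
```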

Macros in this section work with both narrow and wide string objects (`string`
and `wstring`).

_Availability_: Linux, Windows, Mac.

## String Comparison ##

The assertions in this group compare two **C strings**. If you want to compare
two `string` objects, use `EXPECT_EQ`, `EXPECT_NE`, etc. instead.

| **Fatal assertion** | **Nonfatal assertion** | **Verifies** |
|:--------------------|:-----------------------|:-------------|
| `ASSERT_STREQ(`_expected\_str_`, `_actual\_str_`);` | `EXPECT_STREQ(`_expected\_str_`, `_actual\_str_`);` | the two C strings have the same content |
| `ASSERT_STRNE(`_str1_`, `_str2_`);` | `EXPECT_STRNE(`_str1_`, `_str2_`);` | the two C strings have different content |
| `ASSERT_STRCASEEQ(`_expected\_str_`, `_actual\_str_`);`| `EXPECT_STRCASEEQ(`_expected\_str_`, `_actual\_str_`);` | the two C strings have the same content, ignoring case |
| `ASSERT_STRCASENE(`_str1_`, `_str2_`);`| `EXPECT_STRCASENE(`_str1_`, `_str2_`);` | the two C strings have different content, ignoring case |

Note that "CASE" in an assertion name means that case is ignored.

`*STREQ*` and `*STRNE*` also accept wide C strings (`wchar_t*`). If a
comparison of two wide strings fails, their values will be printed as UTF-8
narrow strings.

A `NULL` pointer and an empty string are considered _different_.
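
A short sketch of these assertions in use; the values are chosen only for illustration:
```
TEST(StringAssertionTest, Examples) {
  const char* greeting = "Hello";
  const char* null_str = NULL;

  EXPECT_STREQ("Hello", greeting);      // Same content: passes.
  EXPECT_STRCASEEQ("hello", greeting);  // Same content, ignoring case: passes.
  EXPECT_STRNE("Goodbye", greeting);    // Different content: passes.
  EXPECT_STRNE(null_str, "");           // NULL and "" are considered different.
}
```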

_Availability_: Linux, Windows, Mac.

See also: For more string comparison tricks (substring, prefix, suffix, and
regular expression matching, for example), see the
[Advanced Google Test Guide](V1_5_AdvancedGuide.md).

# Simple Tests #

To create a test:
 1. Use the `TEST()` macro to define and name a test function. These are ordinary C++ functions that don't return a value.
 1. In this function, along with any valid C++ statements you want to include, use the various Google Test assertions to check values.
 1. The test's result is determined by the assertions; if any assertion in the test fails (either fatally or non-fatally), or if the test crashes, the entire test fails. Otherwise, it succeeds.

```
TEST(test_case_name, test_name) {
  ... test body ...
}
```

`TEST()` arguments go from general to specific. The _first_ argument is the
name of the test case, and the _second_ argument is the test's name within the
test case. Remember that a test case can contain any number of individual
tests. A test's _full name_ consists of its containing test case and its
individual name. Tests from different test cases can have the same individual
name.

For example, let's take a simple integer function:
```
int Factorial(int n); // Returns the factorial of n
```

A test case for this function might look like:
```
// Tests factorial of 0.
TEST(FactorialTest, HandlesZeroInput) {
  EXPECT_EQ(1, Factorial(0));
}

// Tests factorial of positive numbers.
TEST(FactorialTest, HandlesPositiveInput) {
  EXPECT_EQ(1, Factorial(1));
  EXPECT_EQ(2, Factorial(2));
  EXPECT_EQ(6, Factorial(3));
  EXPECT_EQ(40320, Factorial(8));
}
```

Google Test groups the test results by test cases, so logically-related tests
should be in the same test case; in other words, the first argument to their
`TEST()` should be the same. In the above example, we have two tests,
`HandlesZeroInput` and `HandlesPositiveInput`, that belong to the same test
case `FactorialTest`.

_Availability_: Linux, Windows, Mac.

# Test Fixtures: Using the Same Data Configuration for Multiple Tests #

If you find yourself writing two or more tests that operate on similar data,
you can use a _test fixture_. It allows you to reuse the same configuration of
objects for several different tests.

To create a fixture, just:
 1. Derive a class from `::testing::Test`. Start its body with `protected:` or `public:`, as we'll want to access fixture members from sub-classes.
 1. Inside the class, declare any objects you plan to use.
 1. If necessary, write a default constructor or `SetUp()` function to prepare the objects for each test. A common mistake is to spell `SetUp()` as `Setup()` with a small `u` - don't let that happen to you.
 1. If necessary, write a destructor or `TearDown()` function to release any resources you allocated in `SetUp()`. To learn when you should use the constructor/destructor and when you should use `SetUp()/TearDown()`, read this [FAQ entry](V1_5_FAQ.md#should-i-use-the-constructordestructor-of-the-test-fixture-or-the-set-uptear-down-function).
 1. If needed, define subroutines for your tests to share.

When using a fixture, use `TEST_F()` instead of `TEST()` as it allows you to
access objects and subroutines in the test fixture:
```
TEST_F(test_case_name, test_name) {
  ... test body ...
}
```

Like `TEST()`, the first argument is the test case name, but for `TEST_F()`
this must be the name of the test fixture class. You've probably guessed: `_F`
is for fixture.

Unfortunately, the C++ macro system does not allow us to create a single macro
that can handle both types of tests. Using the wrong macro causes a compiler
error.

Also, you must first define a test fixture class before using it in a
`TEST_F()`, or you'll get the compiler error "`virtual outside class
declaration`".

For each test defined with `TEST_F()`, Google Test will:
 1. Create a _fresh_ test fixture at runtime,
 1. Immediately initialize it via `SetUp()`,
 1. Run the test,
 1. Clean up by calling `TearDown()`,
 1. Delete the test fixture. Note that different tests in the same test case have different test fixture objects, and Google Test always deletes a test fixture before it creates the next one. Google Test does not reuse the same test fixture for multiple tests. Any changes one test makes to the fixture do not affect other tests.

As an example, let's write tests for a FIFO queue class named `Queue`, which
has the following interface:
```
template <typename E> // E is the element type.
class Queue {
 public:
  Queue();
  void Enqueue(const E& element);
  E* Dequeue(); // Returns NULL if the queue is empty.
  size_t size() const;
  ...
};
```

First, define a fixture class. By convention, you should give it the name
`FooTest` where `Foo` is the class being tested.
```
class QueueTest : public ::testing::Test {
 protected:
  virtual void SetUp() {
    q1_.Enqueue(1);
    q2_.Enqueue(2);
    q2_.Enqueue(3);
  }

  // virtual void TearDown() {}

  Queue<int> q0_;
  Queue<int> q1_;
  Queue<int> q2_;
};
```

In this case, `TearDown()` is not needed since we don't have to clean up after
each test, other than what's already done by the destructor.

Now we'll write tests using `TEST_F()` and this fixture.
```
TEST_F(QueueTest, IsEmptyInitially) {
  EXPECT_EQ(0, q0_.size());
}

TEST_F(QueueTest, DequeueWorks) {
  int* n = q0_.Dequeue();
  EXPECT_EQ(NULL, n);

  n = q1_.Dequeue();
  ASSERT_TRUE(n != NULL);
  EXPECT_EQ(1, *n);
  EXPECT_EQ(0, q1_.size());
  delete n;

  n = q2_.Dequeue();
  ASSERT_TRUE(n != NULL);
  EXPECT_EQ(2, *n);
  EXPECT_EQ(1, q2_.size());
  delete n;
}
```

The above uses both `ASSERT_*` and `EXPECT_*` assertions. The rule of thumb is
to use `EXPECT_*` when you want the test to continue to reveal more errors
after the assertion failure, and use `ASSERT_*` when continuing after failure
doesn't make sense. For example, the second assertion in the `DequeueWorks`
test is `ASSERT_TRUE(n != NULL)`, as we need to dereference the pointer `n`
later, which would lead to a segfault when `n` is `NULL`.

When these tests run, the following happens:
 1. Google Test constructs a `QueueTest` object (let's call it `t1`).
 1. `t1.SetUp()` initializes `t1`.
 1. The first test (`IsEmptyInitially`) runs on `t1`.
 1. `t1.TearDown()` cleans up after the test finishes.
 1. `t1` is destructed.
 1. The above steps are repeated on another `QueueTest` object, this time running the `DequeueWorks` test.

_Availability_: Linux, Windows, Mac.

_Note_: Google Test automatically saves all _Google Test_ flags when a test
object is constructed, and restores them when it is destructed.

# Invoking the Tests #

`TEST()` and `TEST_F()` implicitly register their tests with Google Test. So, unlike with many other C++ testing frameworks, you don't have to re-list all your defined tests in order to run them.

After defining your tests, you can run them with `RUN_ALL_TESTS()`, which returns `0` if all the tests are successful, or `1` otherwise. Note that `RUN_ALL_TESTS()` runs _all tests_ in your link unit -- they can be from different test cases, or even different source files.

When invoked, the `RUN_ALL_TESTS()` macro:
 1. Saves the state of all Google Test flags.
 1. Creates a test fixture object for the first test.
 1. Initializes it via `SetUp()`.
 1. Runs the test on the fixture object.
 1. Cleans up the fixture via `TearDown()`.
 1. Deletes the fixture.
 1. Restores the state of all Google Test flags.
 1. Repeats the above steps for the next test, until all tests have run.

In addition, if the test fixture's constructor generates a fatal failure in
step 2, there is no point in running steps 3 - 5, so they are skipped.
Similarly, if step 3 generates a fatal failure, step 4 will be skipped.

_Important_: You must not ignore the return value of `RUN_ALL_TESTS()`, or `gcc`
will give you a compiler error. The rationale for this design is that the
automated testing service determines whether a test has passed based on its
exit code, not on its stdout/stderr output; thus your `main()` function must
return the value of `RUN_ALL_TESTS()`.

Also, you should call `RUN_ALL_TESTS()` only **once**. Calling it more than once
conflicts with some advanced Google Test features (e.g. thread-safe death
tests) and thus is not supported.

_Availability_: Linux, Windows, Mac.

# Writing the main() Function #

You can start from this boilerplate:
```
#include "this/package/foo.h"
#include <gtest/gtest.h>

namespace {

// The fixture for testing class Foo.
class FooTest : public ::testing::Test {
 protected:
  // You can remove any or all of the following functions if its body
  // is empty.

  FooTest() {
    // You can do set-up work for each test here.
  }

  virtual ~FooTest() {
    // You can do clean-up work that doesn't throw exceptions here.
  }

  // If the constructor and destructor are not enough for setting up
  // and cleaning up each test, you can define the following methods:

  virtual void SetUp() {
    // Code here will be called immediately after the constructor (right
    // before each test).
  }

  virtual void TearDown() {
    // Code here will be called immediately after each test (right
    // before the destructor).
  }

  // Objects declared here can be used by all tests in the test case for Foo.
};

// Tests that the Foo::Bar() method does Abc.
TEST_F(FooTest, MethodBarDoesAbc) {
  const string input_filepath = "this/package/testdata/myinputfile.dat";
  const string output_filepath = "this/package/testdata/myoutputfile.dat";
  Foo f;
  EXPECT_EQ(0, f.Bar(input_filepath, output_filepath));
}

// Tests that Foo does Xyz.
TEST_F(FooTest, DoesXyz) {
  // Exercises the Xyz feature of Foo.
}

}  // namespace

int main(int argc, char **argv) {
  ::testing::InitGoogleTest(&argc, argv);
  return RUN_ALL_TESTS();
}
```

The `::testing::InitGoogleTest()` function parses the command line for Google
Test flags, and removes all recognized flags. This allows the user to control a
test program's behavior via various flags, which we'll cover in the [AdvancedGuide](V1_5_AdvancedGuide.md).
You must call this function before calling `RUN_ALL_TESTS()`, or the flags
won't be properly initialized.

On Windows, `InitGoogleTest()` also works with wide strings, so it can be used
in programs compiled in `UNICODE` mode as well.

But maybe you think that writing all those `main()` functions is too much work? We agree with you completely, and that's why Google Test provides a basic implementation of `main()`. If it fits your needs, just link your test with the `gtest_main` library and you are good to go.
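
With `gtest_main`, the whole test file can be just the tests themselves; the library supplies a `main()` that calls `InitGoogleTest()` and `RUN_ALL_TESTS()` for you. A minimal sketch (the file name is arbitrary, and `Factorial()` is the hypothetical function from the earlier example):
```
// factorial_test.cc -- no main() needed when linking against gtest_main.
#include <gtest/gtest.h>

int Factorial(int n);  // Defined elsewhere; shown here only for illustration.

TEST(FactorialTest, HandlesZeroInput) {
  EXPECT_EQ(1, Factorial(0));
}
```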

## Important note for Visual C++ users ##

If you put your tests into a library and your `main()` function is in a different library or in your .exe file, those tests will not run. The reason is a [bug](https://connect.microsoft.com/feedback/viewfeedback.aspx?FeedbackID=244410&siteid=210) in Visual C++. When you define your tests, Google Test creates certain static objects to register them. These objects are not referenced from elsewhere, but their constructors are still supposed to run. When the Visual C++ linker sees that nothing in the library is referenced from other places, it throws the library out. You have to reference your library with tests from your main program to keep the linker from discarding it. Here is how to do it. Somewhere in your library code declare a function:
```
__declspec(dllexport) int PullInMyLibrary() { return 0; }
```
If you put your tests in a static library (not a DLL), then `__declspec(dllexport)` is not required. Now, in your main program, write code that invokes that function:
```
int PullInMyLibrary();
static int dummy = PullInMyLibrary();
```
This will keep your tests referenced and will make them register themselves at startup.

In addition, if you define your tests in a static library, add `/OPT:NOREF` to your main program linker options. If you use the MSVC++ IDE, go to your .exe project properties/Configuration Properties/Linker/Optimization and set the References setting to `Keep Unreferenced Data (/OPT:NOREF)`. This will keep the Visual C++ linker from discarding individual symbols generated by your tests from the final executable.

There is one more pitfall, though. If you use Google Test as a static library (that's how it is defined in `gtest.vcproj`), your tests must also reside in a static library. If you have to have them in a DLL, you _must_ change Google Test to build into a DLL as well. Otherwise your tests will not run correctly or will not run at all. The general conclusion here is: make your life easier - do not write your tests in libraries!

# Where to Go from Here #

Congratulations! You've learned the Google Test basics. You can start writing
and running Google Test tests, read some [samples](Samples.md), or continue with
[AdvancedGuide](V1_5_AdvancedGuide.md), which describes many more useful Google Test features.

# Known Limitations #

Google Test is designed to be thread-safe. The implementation is
thread-safe on systems where the `pthreads` library is available. It
is currently _unsafe_ to use Google Test assertions from two threads
concurrently on other systems (e.g. Windows). In most tests this is
not an issue as usually the assertions are done in the main thread. If
you want to help, you can volunteer to implement the necessary
synchronization primitives in `gtest-port.h` for your platform.