Apply suggestions from code review

Co-authored-by: Trevor Clinkenbeard <trevor.clinkenbeard@snowflake.com>
Co-authored-by: Bharadwaj V.R <bharadwaj.vr@snowflake.com>
Markus Pilman 2022-09-21 08:28:01 -06:00 committed by GitHub
parent 24aced6d4a
commit 90b48e862e
documentation/sphinx/source


@@ -17,15 +17,15 @@ A simple example of a code probe could look as follows:
.. code-block:: C++
-   CODE_PROBE(self->forceRecovery, "Resolver detects forced recovery", context::sim2);
+   CODE_PROBE(self->forceRecovery, "Resolver detects forced recovery", probe::context::sim2);
-On a very high level, the above code will indicate that whenever this line is executed and ``self->forceRecovery`` is ``true``, we ran into some interesting case. In addition this probe is also annotated with ``context::sim2``. This indicates that we expect this code to be eventually hit in simulation.
+On a very high level, the above code will indicate that whenever this line is executed and ``self->forceRecovery`` is ``true``, we ran into some interesting case. In addition this probe is also annotated with ``probe::context::sim2``. This indicates that we expect this code to be eventually hit in simulation.
By default, FDB will simply write a trace line when this code is hit and the condition is ``true``. If the code is never hit, the simulator will, at the end of the run, print the code probe with the ``covered`` field set to ``false``.
We expect that ALL code probes will be hit in a nightly run. In the future, this feature could potentially be used for other things (like instructing the simulator to start an extensive search when one of these probes is hit).
-In addition to ``context`` annotations, users can also define and pass assertsions. For example:
+In addition to ``context`` annotations, users can also define and pass assertions. For example:
.. code-block:: C++
@@ -39,7 +39,7 @@ Test Harness
TestHarness is our primary testing tool. It has multiple jobs:
* *Running*: It can run a test in Joshua.
-* *Statictics*: It will chose a test to run based on previously spent CPU time for all tests. It does that by writing statistics about the test at the end of each run.
+* *Statistics*: It will choose a test to run based on previously spent CPU time for all tests. It does that by writing statistics about the test at the end of each run.
* *Reporting*: After an ensemble has finished (or while it is running), ``TestHarness`` can be used to generate a report in ``xml`` or ``json``.
Test Harness can be found in the FDB source repository under ``contrib/TestHarness2``. It has a weak dependency on `joshua <https://github.com/foundationDB/fdb-joshua>`_ (if Test Harness can find joshua, it will report back on failed tests; otherwise it will just print out general statistics about the ensemble). Joshua will call Test Harness as follows:
@@ -48,7 +48,7 @@ Test Harness can be found in the FDB source repository under ``contrib/TestHarne
python3 -m test_harness.app -s ${JOSHUA_SEED} --old-binaries-path ${OLDBINDIR}
-Here the seed is a random number generated by joshua and ``OLDBINDIR`` is a directory path where the old fdb binaries can be found (this is needed for restart tests). If once wants to retry a test they can pass the previous joshua seed, a directory path that has *exactly* the same content as ``OLDBINARYDIR``, plus the reported statistics to the test harness app. This should then rerun the same code as before.
+Here the seed is a random number generated by joshua and ``OLDBINDIR`` is a directory path where the old fdb binaries can be found (this is needed for restart tests). If one wants to retry a test they can pass the previous joshua seed, a directory path that has *exactly* the same content as ``OLDBINARYDIR``, plus the reported statistics to the test harness app. This should then re-run the same code as before.
In order to figure out what command line arguments ``test_harness.app`` (and ``test_harness.results``) accepts, one can check the contents of ``contrib/TestHarness2/test_harness/config.py``.