## Contributing / Testing

When contributing new features by opening a pull request, testing is essential
to verify that the new features behave as intended and that there are no
unwanted side effects. Before a pull request is merged, it is expected that:

1. one or more unit tests are added which test the formatting of small Fortran
   code snippets, covering all relevant aspects of the added features.
2. if the changes lead to failures of existing tests, these failures are
   carefully examined. The expected test results may be updated only if the
   failures are due to intended changes of `fprettify` defaults, or to bug
   fixes.

### How to add a unit test

Can the new feature be reasonably covered by small code snippets (< 10 lines)?

- Yes: add a test by starting from the following skeleton, and by adding the
  code to the file `fprettify/tests/unittests.py` (a filled-in example follows
  after this list):

  ```python
  def test_something(self):
      """short description"""

      in_str = "Some Fortran code"
      out_str = "Same Fortran code after fprettify formatting"

      # selected fprettify command-line arguments, as documented in "fprettify.py -h":
      opt = ["arg 1", "value for arg 1", "arg 2", ...]

      # helper function checking that the fprettify output is equal to out_str:
      self.assert_fprettify_result(opt, in_str, out_str)
  ```

  Then run `./run_tests.py -s unittests` and check in the output that the
  newly added unit test passes.

- No: add a test by adding an example Fortran source file: add the Fortran
  file to `examples/in`, and the reformatted `fprettify` output to
  `examples/out`. If the test requires non-default `fprettify` options,
  specify these options as an annotation `! fprettify:` followed by the
  command-line arguments at the beginning of the Fortran file (see the sketch
  after this list). You need to manually remove
  `fortran_tests/test_code/examples` to make sure that the test configuration
  is updated with the changes from `examples`.

  Then run `./run_tests.py -s builtin` and check that the output mentions the
  newly added example with `checksum new ok`. Check that a new line containing
  the checksum for this example has been added to the file
  `fortran_tests/test_results/expected_results`, and commit this change along
  with your example. Rerun `./run_tests.py -s builtin` and check that the
  output mentions the newly added example with `checksum ok`.
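
For illustration, here is what a filled-in unit test might look like. This is
a minimal sketch: the test name and the Fortran snippet are made up for this
example, and the exact expected output depends on the current `fprettify`
defaults.

```python
def test_assignment_whitespace(self):
    """hypothetical example: spaces are added around an assignment"""

    in_str = "a=b\n"
    # hypothetical expected result under default settings:
    out_str = "a = b\n"

    # no extra command-line arguments: test the default formatting
    opt = []

    self.assert_fprettify_result(opt, in_str, out_str)
```

And here is a sketch of an example Fortran file carrying a `! fprettify:`
options annotation, as used for the `examples/in` tests. The file contents and
the chosen option are illustrative only (`--indent 2` selects a two-space
indentation width; see `fprettify.py -h` for the available options):

```fortran
! fprettify: --indent 2
program demo
  implicit none
  integer :: i

  do i = 1, 3
    print *, i
  end do
end program demo
```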
### How to debug test failures

`fprettify` comes with **unit tests**, which typically test the expected
formatting of small code snippets. These tests are entirely self-contained:
the Fortran code, the `fprettify` options, and the expected formatting results
are all set within the respective test method. `fprettify` also allows you to
configure **integration tests** that check the expected formatting of external
Fortran code. **Unit tests** are relevant when adding new features to
`fprettify` that can easily be tested with small code snippets. **Integration
tests** are relevant when an entire Fortran module or program is needed to
test a specific feature, or when an external repository relying on `fprettify`
should be checked regularly for invariance under `fprettify` formatting.

The testing mechanism allows you to easily test `fprettify` with any Fortran
project of your choice. Simply clone or copy your entire project into
`fortran_tests/before` and run `python setup.py test`. The directory
`fortran_tests/after` contains the test output (reformatted Fortran files). If
testing fails, please submit an issue!
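
As a sketch, testing `fprettify` against an external project could look like
the following; the repository URL and project name are placeholders:

```
# from the root of the fprettify repository
git clone https://github.com/some-org/some-fortran-project fortran_tests/before/some-fortran-project
python setup.py test

# inspect the reformatted files
ls fortran_tests/after
```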