Currently devolearn uses a relatively simple testing suite, i.e. python3 setup.py test. A much cleaner alternative would be to use pytest for testing and Codecov for coverage reports.
Why use coverage reports?
They show us how much of the code is actually exercised after each push, which makes it easy to spot pieces of code that are important but remain untested.
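A coverage workflow could be wired up roughly like this (a sketch only, assuming pytest-cov and the Codecov GitHub Action; the workflow name, Python version, and action versions are placeholders, not the project's actual config):

```yaml
# .github/workflows/tests.yml -- illustrative sketch, not devolearn's real workflow
name: tests
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.8"
      # pytest-cov produces the coverage.xml that Codecov consumes
      - run: pip install pytest pytest-cov
      - run: pytest --cov=devolearn --cov-report=xml
      # uploads the report so Codecov can comment on each push/PR
      - uses: codecov/codecov-action@v1
```

With this in place, every push gets a coverage report attached, so untested code shows up immediately in the PR.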
Why pytest?
In my experience, python3 setup.py test crashes without any error message when the code tries to access a CUDA device (GPU) on the GitHub Actions runner. This error would have been impossible to fix had I not moved to pytest, which surfaced the proper error message.
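The CUDA issue above can also be handled gracefully under pytest by skipping GPU tests on CPU-only runners. A minimal sketch (the file name, helper, and test bodies below are illustrative, not devolearn's actual test suite):

```python
# tests/test_device.py -- illustrative pytest sketch
import pytest


def has_cuda():
    """Return True only if torch is installed and reports a CUDA device."""
    try:
        import torch
    except ImportError:
        return False
    return torch.cuda.is_available()


@pytest.mark.skipif(not has_cuda(), reason="no CUDA device on this runner")
def test_gpu_inference():
    # GPU-dependent checks go here; this test is skipped cleanly on
    # CPU-only machines such as the default GitHub Actions runners,
    # instead of crashing the whole run with no message.
    import torch
    assert torch.zeros(1).to("cuda").is_cuda


def test_cpu_inference():
    # The CPU path always runs, and any failure is reported by pytest
    # with a full traceback rather than a silent crash.
    assert has_cuda() in (True, False)
```

Running `pytest tests/` then reports each test as passed, failed, or skipped, with full tracebacks on failure, which is exactly what setup.py test failed to provide.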