Using pytest-flask, we can write tests for the wikilabels Flask app.
Description
| Status | Subtype | Assigned | Task |
|---|---|---|---|
| Open | | None | T171080 The great test making of Wikilabels |
| Resolved | | Ladsgroup | T171082 pytests for wikilabels |
| Resolved | | nikitavbv | T179015 Introduce and create pytest for flask application of the wikilabels AI service |
Event Timeline
@Ladsgroup Can you elaborate on what the task for a possible GCI student would be? What should they create/change exactly, what do you want to see to accept this task :)
@Florian The acceptance criterion would be a passing CI test for at least one top-level route of wikilabels, based on the library I mentioned.
I just published a GitHub pull request.
I introduced pytest-flask and added some simple webserver route tests.
Note: I had to modify open_session in BeakerSessionInterface (in wikilabels/wsgi/sessions.py) so that it checks whether beaker.session actually exists. It looks like pytest-flask makes a request during test setup before the middleware is initialized. I am still not sure whether this is the best solution to the problem.
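The guard described above amounts to returning no session when the beaker middleware has not yet populated the WSGI environ. A hedged sketch of that idea (the real class in wikilabels/wsgi/sessions.py may differ in detail):

```python
from flask.sessions import SessionInterface


class BeakerSessionInterface(SessionInterface):
    """Sketch of the guard described above; the real implementation
    in wikilabels/wsgi/sessions.py may differ in detail."""

    def open_session(self, app, request):
        # During pytest-flask setup, a request can arrive before the
        # beaker middleware has populated the WSGI environ, so return
        # None instead of raising KeyError when the key is missing.
        if 'beaker.session' not in request.environ:
            return None
        return request.environ['beaker.session']

    def save_session(self, app, session, response):
        # Only persist when beaker actually handed us a session object
        if session is not None:
            session.save()
```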
Anyway, if anything needs fixing or improving, please tell me and I will do it.
However, the CI build is currently failing because some of the tests I included require a database connection in order to work.
So we either need to make PostgreSQL available during the CI build, or remove the tests that query the database.
What is your opinion on this?
For me, it's better to have them passing. I think setting up a database and a connection is not super hard in pytest, but if it turns out to be a lot of work, marking the database-related tests as skipped would suffice for now.
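Marking the database-dependent tests as skipped can be done with a reusable `pytest.mark.skipif` marker. A sketch, where the `db_available` helper, the connection string, and the test name are all hypothetical:

```python
import pytest


def db_available():
    # Hypothetical helper: probe for a local PostgreSQL test database
    # (the connection string is an assumption, not wikilabels' config)
    try:
        import psycopg2
        psycopg2.connect("dbname=wikilabels_test user=postgres").close()
        return True
    except Exception:
        return False


# Reusable marker: decorated tests are skipped, not failed, when the
# database cannot be reached
requires_db = pytest.mark.skipif(
    not db_available(),
    reason="PostgreSQL test database not available",
)


@requires_db
def test_campaigns_query():
    # Placeholder for a real database-backed test
    ...
```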
I edited the Travis build process a bit, so it now includes PostgreSQL and all the needed setup. The tests are no longer failing.
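Travis CI supports PostgreSQL as a built-in service, so the change described above typically looks something like the following `.travis.yml` fragment (the database name and commands here are illustrative, not the exact committed config):

```yaml
# Illustrative sketch of the Travis additions, not the committed file
services:
  - postgresql
before_script:
  - psql -c 'CREATE DATABASE wikilabels_test;' -U postgres
script:
  - pytest
```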