
AQS 2.0: create Device Analytics handler unit tests
Closed, Resolved (Public)

Description

Per T328969: AQS 2.0: Revisit in-service testing approach, we are refactoring in-service tests to separate "unit" tests, which have no external dependencies, from "integration" tests, which may depend on a datastore external to the service.

Create unit tests for the Device Analytics handler function (the service exposes only one endpoint, so there is only one relevant handler function) by mocking the logic layer.

It is not necessary to comprehensively test all expected values from the response. That will be done by the "integration" tests. For purposes of this new "unit" test, it is sufficient to confirm that the handler function performs properly when the logic layer responds with data of the expected schema/shape or, for a negative test, has the expected error behavior.
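
For illustration, a minimal sketch of the mocking pattern using plain Go and net/http/httptest. All type and function names here (logic, mockLogic, handler) are hypothetical stand-ins rather than the actual device-analytics code, and the URL and response shape are illustrative only:

package test

import (
	"encoding/json"
	"errors"
	"net/http"
	"net/http/httptest"
	"testing"
)

// logic is a stand-in for the service's logic layer interface; the real
// interface in device-analytics differs, but the mocking pattern is the same.
type logic interface {
	UniqueDevices(project, accessSite, granularity, start, end string) ([]map[string]any, error)
}

// mockLogic returns canned rows (or a canned error) instead of querying the datastore.
type mockLogic struct {
	rows []map[string]any
	err  error
}

func (m *mockLogic) UniqueDevices(_, _, _, _, _ string) ([]map[string]any, error) {
	return m.rows, m.err
}

// handler is a minimal stand-in for the real handler: it calls the logic layer
// and writes JSON on success, or a 500 if the logic layer fails.
func handler(l logic) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		rows, err := l.UniqueDevices("en.wikipedia.org", "all-sites", "daily", "20230101", "20230102")
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(map[string]any{"items": rows})
	}
}

func TestHandlerWithMockedLogic(t *testing.T) {
	// Positive case: the logic layer returns data of the expected shape.
	rows := []map[string]any{{"devices": 100, "offset": 10, "underestimate": 90, "timestamp": "20230101"}}
	req := httptest.NewRequest(http.MethodGet,
		"/metrics/unique-devices/en.wikipedia.org/all-sites/daily/20230101/20230102", nil)

	rec := httptest.NewRecorder()
	handler(&mockLogic{rows: rows}).ServeHTTP(rec, req)
	if rec.Code != http.StatusOK {
		t.Fatalf("expected 200, got %d", rec.Code)
	}

	// Negative case: the logic layer fails, the handler should respond with an error status.
	rec = httptest.NewRecorder()
	handler(&mockLogic{err: errors.New("datastore unavailable")}).ServeHTTP(rec, req)
	if rec.Code != http.StatusInternalServerError {
		t.Fatalf("expected 500, got %d", rec.Code)
	}
}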

The bulk of the handler code is parameter validation. The unit tests should confirm that this works as expected, by verifying that the handler responds with expected errors for disallowed parameters, and allows expected parameters.
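
A table-driven test is the usual Go idiom for covering the validation cases. A minimal sketch, again using a hypothetical stand-in (validateParams) rather than the real handler code, with illustrative validation rules only:

package test

import (
	"net/http"
	"testing"
)

// validateParams is a hypothetical stand-in for the handler's parameter checks;
// the real rules in device-analytics differ, these are illustrative only.
func validateParams(granularity, start, end string) int {
	if granularity != "daily" && granularity != "monthly" {
		return http.StatusBadRequest
	}
	if len(start) != 8 || len(end) != 8 {
		return http.StatusBadRequest // timestamps expected as YYYYMMDD
	}
	if start > end {
		return http.StatusBadRequest
	}
	return http.StatusOK
}

func TestParameterValidation(t *testing.T) {
	cases := []struct {
		name, granularity, start, end string
		want                          int
	}{
		{"allowed parameters", "daily", "20230101", "20230102", http.StatusOK},
		{"disallowed granularity", "hourly", "20230101", "20230102", http.StatusBadRequest},
		{"start after end", "daily", "20230301", "20230101", http.StatusBadRequest},
		{"invalid timestamp", "daily", "2023-01-01", "20230102", http.StatusBadRequest},
	}
	for _, c := range cases {
		t.Run(c.name, func(t *testing.T) {
			if got := validateParams(c.granularity, c.start, c.end); got != c.want {
				t.Errorf("expected %d, got %d", c.want, got)
			}
		})
	}
}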

Event Timeline

Atieno subscribed.
SGupta-WMF changed the task status from Open to In Progress. Mar 23 2023, 11:21 AM
SGupta-WMF claimed this task.

Change 907801 had a related patch set uploaded (by Sg912; author: Sg912):

[generated-data-platform/aqs/device-analytics@main] Handler unit test cases with structural changes

https://gerrit.wikimedia.org/r/907801

Change 907801 merged by BPirkle:

[generated-data-platform/aqs/device-analytics@main] Handler unit test cases with structural changes

https://gerrit.wikimedia.org/r/907801

QA: make sure unit tests pass, and that there are no regressions.

@BPirkle I believe the command make test runs the unit tests, correct?

@EChukwukere-WMF Yes:

make test runs the newly added unit tests.
make itest runs the integration tests that have been around for a while.

@SGupta-WMF @BPirkle I believe my repo is up to date, but when I run make itest I get the error below, unless I am missing something:

emekachukwukere@wmf3135 device-analytics % make itest 
go test ./itest
--- FAIL: TestUniqueDevices (0.04s)
    --- FAIL: TestUniqueDevices/should_return_200_for_expected_parameters (0.01s)
        unique_devices_test.go:66: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:66
            	Error:      	Not equal: 
            	            	expected: 200
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_200_for_expected_parameters
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_400_when_parameters_are_wrong (0.01s)
        unique_devices_test.go:75: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:75
            	Error:      	Not equal: 
            	            	expected: 400
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_400_when_parameters_are_wrong
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_400_when_start_is_after_end (0.00s)
        unique_devices_test.go:84: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:84
            	Error:      	Not equal: 
            	            	expected: 400
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_400_when_start_is_after_end
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_400_when_timestamp_is_invalid (0.00s)
        unique_devices_test.go:94: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:94
            	Error:      	Not equal: 
            	            	expected: 400
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_400_when_timestamp_is_invalid
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_the_same_data_when_using_timestamps_with_hours (0.00s)
        unique_devices_test.go:38: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:38
            	            				/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:109
            	Error:      	Not equal: 
            	            	expected: 200
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_the_same_data_when_using_timestamps_with_hours
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_include_offset_and_underestimate (0.00s)
        unique_devices_test.go:38: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:38
            	            				/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:129
            	Error:      	Not equal: 
            	            	expected: 200
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_include_offset_and_underestimate
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_numeric_values_as_integers (0.00s)
        unique_devices_test.go:38: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:38
            	            				/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:138
            	Error:      	Not equal: 
            	            	expected: 200
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_numeric_values_as_integers
            	Messages:   	Wrong status code
    --- FAIL: TestUniqueDevices/should_return_405_for_invalid_HTTP_verb (0.00s)
        unique_devices_test.go:155: 
            	Error Trace:	/Users/emekachukwukere/Documents/gerrit/device-analytics/itest/unique_devices_test.go:155
            	Error:      	Not equal: 
            	            	expected: 405
            	            	actual  : 404
            	Test:       	TestUniqueDevices/should_return_405_for_invalid_HTTP_verb
            	Messages:   	Wrong status code
FAIL
FAIL	device-analytics/itest	0.514s
FAIL
make: *** [itest] Error 1


Do you have the service and mock dataset running? Please see T330222: AQS 2.0: allow Device Analytics in-service unit and integration tests to be executed separately, where we talked about this before. To summarize:

make test has no dependencies. You can just run it.
make itest requires the service and the mock dataset (aka testing environment) to be running.

Yup, my Docker environment is running. I just restarted it and retried on port 8080 and am still seeing that failure.

Test status: QA PASS

emekachukwukere@wmf3135 device-analytics % go test -count=1 ./itest
ok  	device-analytics/itest	0.351s
emekachukwukere@wmf3135 device-analytics % go test -count=1 ./test 
ok  	device-analytics/test	0.322s

Test commands
go test -count=1 ./test (unit tests)
go test -count=1 ./itest (integration tests; the service and Docker containers have to be running. Change the port to the default of 8080 here: itest/unique_devices_test.go)
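
For reference, a hypothetical sketch of the kind of base-URL constant that note refers to, assuming the integration tests build their request URLs from a hard-coded host and port (the actual code in itest/unique_devices_test.go may look different):

// Hypothetical sketch of how the integration tests might point at the running
// service; the real constant and endpoint path may be named differently.
package itest

import (
	"fmt"
	"net/http"
	"testing"
)

const baseURL = "http://localhost:8080" // must match the port the service is listening on

func TestServiceIsReachable(t *testing.T) {
	// If the service or mock dataset is not running (or is on another port),
	// requests fail or return unexpected status codes such as 404.
	resp, err := http.Get(fmt.Sprintf(
		"%s/metrics/unique-devices/en.wikipedia.org/all-sites/daily/20230101/20230102", baseURL))
	if err != nil {
		t.Fatalf("service not reachable at %s: %v", baseURL, err)
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		t.Fatalf("expected 200, got %d", resp.StatusCode)
	}
}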