This task tracks the migration of EventLogging schemas & streams to Event Platform schemas and streams.
Tracking and planning of what schemas to migrate is being done in the [[ https://docs.google.com/spreadsheets/d/1WXbGPyuu2S6TYvrb-DvWWmrEx_K7TJ5rYPkjhvgWjoI/edit#gid=1715982822| EventLogging Schema Migration Audit spreadsheet ]].
Explanation of what this means for legacy EventLogging schema owners:
https://wikitech.wikimedia.org/wiki/Event_Platform/EventLogging_legacy
We will do our best to migrate schemas in groups associated with teams.
**Schemas produced by EventLogging extension**
[x] Community Tech
-- [[ https://gerrit.wikimedia.org/r/c/schemas/event/secondary/+/607308 | TemplateWizard ]]
[x] Anti-harassment
-- {T268517} (SpecialInvestigate, SpecialMuteSubmit)
[x] Growth
-- {T267333} (HelpPanel, HomepageModule, HomepageVisit, NewcomerTask)
[x] Product Infrastructure
-- {T267348}
[x] Structured Data
-- {T267351}
[x] Web
-- {T267347}
-- {T271164}
-- {T271165}
-- {T271166}
[x] Editing
-- {T267343}
-- {T267353}
[x] Language
-- ContentTranslationAbuseFilter
-- {T267352}
[x] Performance
-- {T271208}
- Other
-- [x] {T282140}
[x] WMDE Technical Wishes
-- {T275005}
-- {T275007}
-- {T275008}
-- {T275009}
-- {T275011}
-- {T275012}
-- {T275013}
-- {T275014}
-- {T275015}
[] WMDEBanner*
-- {T282562}
[] FR Tech
-- {T282855}
**Schemas produced by other software**
[] Inuka //in progress//
-- {T273219} - must be done before schema migrations can happen
-- {T267344}
-- {T267345}
-- {T267346}
[x] Research
-- {T271163}
[] FR Tech
-- {T271168}
[] Reader's Web
-- {T238138}
- Other
-- [] {T282012}
---
====== Migration plan for a schema:
**1. Pick a schema to migrate**
Schemas to migrate are listed in [[ https://docs.google.com/spreadsheets/d/1WXbGPyuu2S6TYvrb-DvWWmrEx_K7TJ5rYPkjhvgWjoI/edit#gid=1715982822| EventLogging Schema Migration Audit spreadsheet ]].
**2. Create a new task to track this schema's migration**
```lang=bash
# This should work on macOS to open a new Phab task form in a browser with some fields already filled out.
function new_el_migration_phab_task() {
schema_name="$1"
open "https://phabricator.wikimedia.org/maniphest/task/edit/form/1/?title=$schema_name Event Platform Migration&description=See: https://wikitech.wikimedia.org/wiki/Event_Platform/EventLogging_legacy
Unless otherwise notified, client IP and consequently geocoded data will no longer be collected for this event data after this migration. Please let us know if this should continue to be captured. See also T262626.& &parent=259163&tags=Event-Platform&subscribers=Ottomata,Mforns"
}
new_el_migration_phab_task SearchSatisfaction
```
Paste this checklist into the new task; it summarizes the instructions described here.
[x] 1. Pick a schema to migrate
[x] 2. Create a new task to track this schema's migration
[] 3. Create /analytics/legacy/<schemaname>/current.yaml schema
[] 4. Edit-protect the metawiki Schema page at https://meta.wikimedia.org/wiki/Schema:<SchemaName>
[] 5. Manually evolve the Hive table to use new schema
[] 6. Add entry to wgEventStreams, wgEventLoggingStreamNames and wgEventLoggingSchemas in operations/mediawiki-config
[] 7. Once the legacy stream's data is fully produced through EventGate, switch to using Refine job that uses schema repo instead of meta.wm.org
[] 8. Edit the producer extension.json and set EventLoggingSchemas to the new schema URI
[] 9. Once the producer extension.json is fully deployed, edit wgEventLoggingSchemas in operations/mediawiki-config InitialiseSettings.php and remove the schema's entry.
[] 10. Mark the schema as migrated in the EventLogging Schema Migration Audit spreadsheet
Link this task in the [[ https://docs.google.com/spreadsheets/d/1WXbGPyuu2S6TYvrb-DvWWmrEx_K7TJ5rYPkjhvgWjoI/edit#gid=1715982822| EventLogging Schema Migration Audit spreadsheet ]].
On the task, contact the owner of the schema and ask if they need client IP and/or geocoded data in the Hive table.
**3. Create /analytics/legacy/<schemaname>/current.yaml schema**
Using the [[ https://gerrit.wikimedia.org/r/plugins/gitiles/schemas/event/secondary/+/refs/heads/master/scripts/eventlogging_legacy_schema_convert.js | eventlogging_legacy_schema_convert script ]] in the [[ https://gerrit.wikimedia.org/r/admin/repos/schemas/event/secondary | schemas/event/secondary ]] repository:
```
old_schema_name=SearchSatisfaction
new_schema_name=$(echo $old_schema_name | tr '[:upper:]' '[:lower:]')
mkdir ./jsonschema/analytics/legacy/$new_schema_name
node ./scripts/eventlogging_legacy_schema_convert.js $old_schema_name > ./jsonschema/analytics/legacy/$new_schema_name/current.yaml
```
You'll need to edit at least the JSONSchema `examples` in current.yaml. The easiest approach is to get an event out of Kafka and use that as a starting point.
```
# Get the last event out of Kafka
kafkacat -C -b kafka-jumbo1001.eqiad.wmnet -o -1 -c 1 -t eventlogging_SearchSatisfaction
```
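A hedged sketch of what an edited `examples` entry might look like, built from an event pulled out of Kafka. The field names under `event` and the capsule fields shown here are illustrative assumptions; match them to your generated schema:

```lang=yaml
examples:
  - event:
      action: searchResultPage   # illustrative field taken from the Kafka event
    schema: SearchSatisfaction
    webHost: en.wikipedia.org
    wiki: enwiki
    meta:
      stream: eventlogging_SearchSatisfaction
      dt: '2020-06-01T00:00:00Z'
```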
If the schema owner indicated that they need client IP and/or geocoded data in Hive, you'll need to add a $ref to the fragment/http/client_ip schema. [[ https://schema.wikimedia.org/repositories//primary/jsonschema/fragment/w3c/reportingapi/report/current.yaml | Example here ]].
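A hedged sketch of what that `$ref` could look like in current.yaml. The fragment paths and versions below are assumptions; verify them against the schema repository and the linked example before use:

```lang=yaml
# Hypothetical: pull the client_ip/geocoded fields into the legacy schema
# alongside the legacy eventcapsule fragment. Verify fragment URIs and versions.
allOf:
  - $ref: /fragment/analytics/legacy/eventcapsule/1.0.0#
  - $ref: /fragment/http/client_ip/1.0.0#
```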
**(!)** Please make sure that `npm test` is not throwing any errors other than snake_case inconsistencies (those are allowed for legacy schemas).
To do that, first comment out L21 in `test/jsonschema/repository.test.js`, then run `npm test` so your new schema is checked.
When done, remember to revert `test/jsonschema/repository.test.js` to its original state before committing!
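The comment-out/restore dance can be scripted. A minimal sketch, demonstrated on a throwaway file rather than the real test file (in the repo, substitute `test/jsonschema/repository.test.js`, run `npm test` between the two steps, and note that the target line number, 21 at the time of writing, may drift as the test file changes):

```lang=bash
# Demonstrated on a temp file; substitute test/jsonschema/repository.test.js.
f=$(mktemp)
printf 'const a = 1;\nconst b = 2;\nconst c = 3;\n' > "$f"

sed -i.bak '2s|^|// |' "$f"   # comment out line 2 (line 21 in the real file)
grep -n '^//' "$f"            # confirm which line is disabled
# ... run `npm test` here ...
mv "$f.bak" "$f"              # restore the original before committing
```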
**4. Edit-protect the metawiki Schema page at https://meta.wikimedia.org/wiki/Schema:$old_schema_name **
Use this as the edit-protect log message:
```
This schema has been moved to https://schema.wikimedia.org/#!//secondary/jsonschema/analytics/legacy. See also https://wikitech.wikimedia.org/wiki/Event_Platform/EventLogging_legacy
```
**5. Manually evolve the Hive table to use new schema**
Once the above schema is merged (you might have to wait 10 minutes after merge for Spark to be able to retrieve it):
```lang=bash
old_schema_name=SearchSatisfaction
new_schema_name=$(echo $old_schema_name | tr '[:upper:]' '[:lower:]')
table="event.${new_schema_name}"
schema_uri="/analytics/legacy/${new_schema_name}/latest"
# First run in dry-run mode (the default) to see what EvolveHiveTable will do.
spark2-submit --conf spark.driver.extraClassPath=/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/srv/deployment/analytics/refinery/artifacts/hive-jdbc-1.1.0-cdh5.10.0.jar:/srv/deployment/analytics/refinery/artifacts/hive-service-1.1.0-cdh5.10.0.jar --class org.wikimedia.analytics.refinery.job.refine.tool.EvolveHiveTable /srv/deployment/analytics/refinery/artifacts/refinery-job.jar --table="${table}" --schema_uri="${schema_uri}"
# If that looks good, evolve the table:
spark2-submit --conf spark.driver.extraClassPath=/usr/lib/hadoop-mapreduce/hadoop-mapreduce-client-common.jar:/srv/deployment/analytics/refinery/artifacts/hive-jdbc-1.1.0-cdh5.10.0.jar:/srv/deployment/analytics/refinery/artifacts/hive-service-1.1.0-cdh5.10.0.jar --class org.wikimedia.analytics.refinery.job.refine.tool.EvolveHiveTable /srv/deployment/analytics/refinery/artifacts/refinery-job.jar --table="${table}" --schema_uri="${schema_uri}" --dry_run=false
```
**6. Add entry to `wgEventStreams`, `wgEventLoggingStreamNames` and `wgEventLoggingSchemas` in operations/mediawiki-config**
Rolling deploy changes to make EventLogging extension produce data to EventGate. Example: https://gerrit.wikimedia.org/r/c/operations/mediawiki-config/+/607333/2/wmf-config/InitialiseSettings.php
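A hedged sketch of the shape of these entries in wmf-config/InitialiseSettings.php. The stream settings, destination service, and schema version shown are illustrative assumptions; copy the real shape from the linked example change:

```lang=php
'wgEventStreams' => [
	'default' => [
		[
			'stream' => 'eventlogging_SearchSatisfaction',
			'schema_title' => 'analytics/legacy/searchsatisfaction',
			'destination_event_service' => 'eventgate-analytics-external',
		],
	],
],
'wgEventLoggingStreamNames' => [
	'default' => [ 'eventlogging_SearchSatisfaction' ],
],
'wgEventLoggingSchemas' => [
	'default' => [
		'SearchSatisfaction' => '/analytics/legacy/searchsatisfaction/1.0.0',
	],
],
```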
To test that events make it through the pipeline:
Check that events for your stream are still flowing through in [[ https://grafana.wikimedia.org/d/000000018/eventlogging-schema?orgId=1 | this Grafana dashboard ]], or by consuming the `eventlogging_$old_schema_name` topic from Kafka.
You can submit an event via a browser developer console:
```lang=javascript
var old_schema_name = 'Test';
var event = { OtherMessage: 'Hello from JS' }; // example event
mw.eventLog.logEvent( old_schema_name, event );
```
Or, for a server-side PHP EventLogging event, you can emit a test event with mwscript shell.php on deployment.eqiad.wmnet:
```lang=bash
cd /srv/mediawiki-staging
mwscript shell.php --wiki testwiki
>>> $old_schema_name = 'Test';
>>> $event = [ "OtherMessage" => "Hello from PHP" ];
>>> EventLogging::logEvent( $old_schema_name, -1, $event );
```
**7. Once the legacy stream's data is fully produced through EventGate, switch to using Refine job that uses schema repo instead of meta.wm.org**
Also add the schema name to the eventlogging-processor disabled schemas list in puppet, in modules/eventlogging/files/plugins.py.
Example: https://gerrit.wikimedia.org/r/c/operations/puppet/+/644259
This will prevent eventlogging-processor from producing what are now invalid legacy events from clients that are running old code.
Restart eventlogging-processor on eventlog1003:
```
sudo puppet agent -t # make sure the change has been applied
sudo service eventlogging-processor@client-side-* restart
```
**8. Edit the producer extension.json and set EventLoggingSchemas to the new schema URI**
Example: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/ContentTranslation/+/639578
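The shape of the change, sketched below; the schema version is an assumption, and the exact attribute layout should be copied from the linked example:

```lang=json
{
	"attributes": {
		"EventLogging": {
			"Schemas": {
				"SearchSatisfaction": "/analytics/legacy/searchsatisfaction/1.0.0"
			}
		}
	}
}
```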
**9. Once the producer extension.json is fully deployed, edit `wgEventLoggingSchemas` in operations/mediawiki-config InitialiseSettings.php and remove the schema's entry.**
Example: https://gerrit.wikimedia.org/r/c/operations/mediawiki-config/+/639579
**10. Mark the schema as migrated in the [[ https://docs.google.com/spreadsheets/d/1WXbGPyuu2S6TYvrb-DvWWmrEx_K7TJ5rYPkjhvgWjoI/edit#gid=1715982822| EventLogging Schema Migration Audit spreadsheet ]]**