I've been doing some testing in my Commons userspace, as embedded maps are enabled on Commons. Using OSM data directly via the geoline service seems to result in low accuracy at medium-to-high zoom levels, at least for longer roads such as Great Northern Highway – see the comparison at commons:User:Evad37/sandbox/maps § Test 2: Great Northern Highway between the top map (generated from the geoline service, the actual result) and the bottom map (generated from GeoJSON map data, but representing the expected result for the geoline service). Great Northern Highway isn't an isolated case; there are plenty of long roads that users will want to map when <mapframe> becomes available on Wikipedia, such as the U.S. interstate highways.
Description
Status | Subtype | Assigned | Task
---|---|---|---
Open | Feature | None | T155919 Kartographer geoline and geoshape services have low accuracy at medium-to-high zoom levels (for long or complicated features)
Open | | None | T303584 Investigate geoshape simplification
Event Timeline
It seems that some sort of generalization algorithm is applied to all maplink/mapframe external data on the Wikimedia side, including polygons of the geoshape type, and for longer features it's just easier to spot. For example, the border segment of this object (Q990542) matches the river's curvature, but maplink/mapframe shows a zig-zag line that departs from the river's curvature (figure 1), unlike WIWOSM (figure 2). I hope this line generalization is not needed for performance reasons and can be dropped easily.
Figure 1: maplink/mapframe. Figure 2: WIWOSM.
It looks particularly bad if I query neighbouring administrative boundaries and boundaries of different levels. The following example shows three lines that are boundaries for a municipality, a neighbouring municipality, and one top-level administrative entity, all generalized to different degrees. These three lines should be one and the same line, as here.
Another example from the above merged task:
Comparing my test maplink (geoshape for France, Germany, Luxembourg, Belgium) with the OSM equivalent (for France), it becomes obvious that the shape of France is not being rendered in the maplink as it should be. For countries with less complex borders than France this issue does not seem to exist, so this might be related to border complexity.
Test maplink:
<maplink text="Test" width=300 height=300 zoom=8 latitude=50.073 longitude=4.554>
{
  "type": "ExternalData",
  "service": "geoshape",
  "ids": "Q142,Q183,Q32,Q31",
  "properties": { "fill": "#90ee90" }
}
</maplink>
If I'm not mistaken, this was introduced in T138154. Based on that task, it's unclear to me whether, or to what extent, it was necessary to apply these generalization algorithms. Appropriate generalization largely depends on the scale and on how the data is used, and the current one doesn't look appropriate to me. Could someone please evaluate whether it can be dropped, or what better parameters for it would be? Maybe you could adjust it to apply less generalization at larger scales (higher zoom levels)? Or, if it relates to a significant performance issue, then maybe you could instead limit the number of objects or the amount of data that can be retrieved at a time?
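A zoom-dependent tolerance like the one suggested above could be sketched as follows. This is a hypothetical illustration, not kartotherian code: it assumes a Web Mercator tile scheme (256 px tiles), where ground resolution halves with every zoom step, and ties the simplification tolerance to a pixel budget at the requested zoom so that simplification error stays below roughly one pixel.

```javascript
// Hypothetical sketch: derive the simplification tolerance from the
// requested zoom level instead of using one fixed factor for all zooms.
// The constant and the one-pixel budget are assumptions for illustration.

const EQUATOR_M = 40075016.686; // Web Mercator equatorial circumference, metres

function toleranceForZoom(zoom, pixelBudget = 1) {
  // Metres per pixel at the equator for a 256 px tile pyramid;
  // halves with each zoom level.
  const metresPerPixel = EQUATOR_M / (256 * Math.pow(2, zoom));
  return pixelBudget * metresPerPixel;
}

console.log(toleranceForZoom(0));  // ~156543 m per pixel at zoom 0
console.log(toleranceForZoom(13)); // ~19 m, so fine detail would survive
```

Such a tolerance could then replace the fixed `$3 * sqrt(ST_Area(ST_Envelope(way)))` term, making the amount of generalization follow the zoom level rather than the feature size alone.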
Can someone fix this bug ASAP? The solution seems to be really trivial and nothing has happened in more than two years. WMF really fails miserably in such cases... WTF are the steadily rising financial assets of WMF good for when it cannot handle a bug as simple as this one??? This simply does not make sense to me...
@Kozuch: Please see https://www.mediawiki.org/wiki/Bug_management/Phabricator_etiquette for where to discuss meta-topics like "financial assets", as they are off-topic in this task. See https://www.mediawiki.org/wiki/Wikimedia_Maps for general information about Maps maintenance. Thanks!
If the solution seems to be really trivial, https://www.mediawiki.org/wiki/Gerrit/Tutorial covers how you (or anyone else) can propose a software change to get this fixed faster.
I've done some digging:
https://github.com/kartotherian/geoshapes/blame/master/geoshapes.js#L33
sql: "SELECT id, ST_AsGeoJSON(ST_Transform(ST_Simplify(way, $3*sqrt(ST_Area(ST_Envelope(way)))), 4326)) as data FROM " + subQuery,
params: [{ name: 'arg1', default: 0.001, regex: floatRe }]
The 'default' value is likely what determines the aggressiveness of the simplification (previously it was more aggressive, with a value of 0.01).
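To make the query's behaviour concrete: the tolerance passed to `ST_Simplify` is the parameter (default 0.001) multiplied by the square root of the bounding-box area, i.e. roughly 0.1% of the feature's extent. A small sketch of that arithmetic (the example bounding-box dimensions are made up; units are whatever projection the `way` column uses, typically Web Mercator metres):

```javascript
// Mirrors the tolerance term in the query:
//   $3 * sqrt(ST_Area(ST_Envelope(way)))
// so larger features get proportionally coarser simplification.
function effectiveTolerance(bboxWidth, bboxHeight, factor = 0.001) {
  return factor * Math.sqrt(bboxWidth * bboxHeight);
}

// Hypothetical long highway spanning a 2,000,000 m x 500,000 m bbox:
console.log(effectiveTolerance(2e6, 5e5));       // 1000 m tolerance
// With the old, more aggressive factor of 0.01:
console.log(effectiveTolerance(2e6, 5e5, 0.01)); // 10000 m tolerance
```

A 1000 m tolerance is invisible on a country-scale map but very visible at medium-to-high zoom, which matches the symptoms reported for long features like Great Northern Highway.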
However, the bigger problem is of course that this was done for a reason. There are some pretty huge geoshapes out there, reaching into the multi-megabyte range without simplification. That is a LOT for people to download on a page, let alone for the renderer to turn into an image server-side... And that is if it is the only shape on the page; often there are multiple shapes on one map.
Ideally, I guess you'd want to analyse the size of the feature and, depending on its complexity, increase or decrease the aggressiveness of the simplification algorithm. But I know nothing about the GIS side of things, so... :(
@TheDJ nice findings. I think it's worth mentioning that kartotherian became a monorepo and was moved to Gerrit, so it no longer reflects this GitHub repository.
For what you are looking for, you should consider these links (Gerrit/GitHub):