
Maps don’t show up because of the problem with layers: "HTTP 400 Bad Request"
Closed, ResolvedPublic

Description

There is a problem with the Wikidata integration in Wikimedia Maps and OSM layers (e.g. https://ru.wikipedia.org/wiki/Казань#/maplink/0) that blocks maps from being shown in articles.

Page: https://ru.wikipedia.org/wiki/Бретань

Steps to reproduce:

  1. Click on <maplink>
  2. The map overlay opens, but no map content shows up.
  3. In the developer console, observe that https://maps.wikimedia.org/geoline?getgeojson=1&ids=Q12130 returns 400 Bad Request for some reason, and that this causes a "TypeError: layer is undefined" error in the console. Compare with https://ru.wikivoyage.org/wiki/Бретань (which doesn’t have OSM layer support).

Event Timeline

stjn created this task. Apr 9 2017, 6:18 PM
Restricted Application added a subscriber: Aklapper. Apr 9 2017, 6:18 PM
Aklapper renamed this task from Maps don’t show up because of the problem with layers to Maps don’t show up because of the problem with layers: "HTTP 400 Bad Request". Apr 10 2017, 9:44 AM
Aklapper updated the task description.
MaxSem added a subscriber: MaxSem. Apr 11 2017, 5:55 PM

Something unhealthy is going on; I grabbed a similar query from the server logs:

gis=> explain analyse SELECT id, ST_Multi(ST_Union(way)) AS way
        FROM (
          SELECT tags->'wikidata' AS id, (ST_Dump(way)).geom AS way
          FROM "planet_osm_line"
          WHERE tags ? 'wikidata' AND tags->'wikidata' IN ('Q12130')
          ) combq
        GROUP BY id;

 HashAggregate  (cost=50959.93..50975.43 rows=200 width=64) (actual time=86060.888..86060.890 rows=1 loops=1)
   Group Key: (planet_osm_line.tags -> 'wikidata'::text)
   ->  Index Scan using planet_osm_line_wikidata on planet_osm_line  (cost=0.43..16914.93 rows=619000 width=304) (actual time=0.198..95.472 rows=888 loops=1)
         Index Cond: ((tags -> 'wikidata'::text) = 'Q12130'::text)
 Planning time: 0.235 ms
 Execution time: 86083.500 ms

This mysteriously omits where most of the query execution time actually gets spent.

https://maps.wikimedia.org/geoline?getgeojson=1&ids=Q12130 reports "canceling statement due to statement timeout"

So that's consistent with a slow query. Looking at the EXPLAIN ANALYZE output, the time is spent in the aggregate for the GROUP BY, and in this case I know it's in ST_Union.
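
One way to confirm that split is to time the two halves separately (a sketch using the query from the log above; \timing is psql's client-side timer):

  \timing on

  -- Inner select alone: fetching and dumping the rows is cheap
  -- (the index scan in the plan above took ~95 ms).
  SELECT count(*)
  FROM (
    SELECT tags->'wikidata' AS id, (ST_Dump(way)).geom AS way
    FROM planet_osm_line
    WHERE tags ? 'wikidata' AND tags->'wikidata' IN ('Q12130')
  ) combq;

  -- Full query with ST_Union: nearly all of the ~86 s is the merge itself.
  SELECT id, ST_Multi(ST_Union(way)) AS way
  FROM (
    SELECT tags->'wikidata' AS id, (ST_Dump(way)).geom AS way
    FROM planet_osm_line
    WHERE tags ? 'wikidata' AND tags->'wikidata' IN ('Q12130')
  ) combq
  GROUP BY id;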

The inner select returns 888 linestrings with 263k vertices, which ST_Union has to process and attempt to merge.
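
Those figures can be reproduced directly with ST_NPoints (a diagnostic sketch against the same table):

  -- Count the dumped linestrings and their total vertex count for Q12130.
  SELECT count(*) AS linestrings,
         sum(ST_NPoints(way)) AS total_vertices
  FROM (
    SELECT (ST_Dump(way)).geom AS way
    FROM planet_osm_line
    WHERE tags ? 'wikidata' AND tags->'wikidata' IN ('Q12130')
  ) combq;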

Depending on what the expected result of this query is, perhaps it should be using ST_Collect, not ST_Union.
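
For comparison, here is a sketch of the ST_Collect variant: it simply bundles the dumped linestrings into one multi-geometry instead of dissolving them node by node, which is the expensive part of ST_Union.

  -- ST_Collect gathers the pieces into a MultiLineString without merging.
  SELECT id, ST_Multi(ST_Collect(way)) AS way
  FROM (
    SELECT tags->'wikidata' AS id, (ST_Dump(way)).geom AS way
    FROM planet_osm_line
    WHERE tags ? 'wikidata' AND tags->'wikidata' IN ('Q12130')
  ) combq
  GROUP BY id;

Whether that is acceptable depends on whether consumers of the geoline endpoint need properly merged geometry or just all the segments in a single feature.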

stjn added a comment. Apr 12 2017, 4:06 PM

Maybe you could at least ignore the layer in JavaScript if there are errors loading it? Right now it breaks the whole map, and that’s not the best behaviour even if there are deeper underlying problems.

> Maybe you could at least ignore the layer in JavaScript if there are errors loading it? Right now it breaks the whole map, and that’s not the best behaviour even if there are deeper underlying problems.

There is a ticket for gracefully handling invalid data layers; see T148883. Please join the discussion there.

On this ticket, I suggest we focus on this specific bug.

debt added a subscriber: debt.

Based on the conversation in T162554, removing the sprint tag as we won't be working on this anytime soon.

Restricted Application added a project: Discovery. Jun 15 2017, 12:20 AM
Mholloway closed this task as Resolved. Aug 21 2018, 3:48 PM
Mholloway claimed this task.
Mholloway added a subscriber: Mholloway.

Closing, since the specific issue reported here appears to be resolved. We still need a general improvement in handling ExternalData failures (see T148883).