
Query service example "Law & Order episodes" fails with a stack overflow, stuck at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
Closed, Duplicate · Public

Description

#Law & Order episodes
# All Law & Order episodes on Wikidata.
# According to enwp, “[a] total of 456 original episodes… aired before cancellation” (https://en.wikipedia.org/wiki/List_of_Law_%26_Order_episodes).
# As of this writing, the query returns 451 results, so some episodes are missing (either without an item or lacking the statements needed to match this query).

SELECT (SAMPLE(?seasonNumber) AS ?seasonNumber) (SAMPLE(?episodeNumber) AS ?episodeNumber) (SAMPLE(?title) AS ?title) (MIN(?pubDate) AS ?pubDate) ?episode
{
  # All episodes should be instance of episode with series Law & Order.
  ?episode wdt:P31 wd:Q21191270;
           wdt:P179 wd:Q321423.
  # Many of them also have the season as series, so we can get episode and season number from qualifiers there.
  OPTIONAL {
    ?episode p:P179 [
      # the season also has series Law & Order
      ps:P179/p:P179 [
        ps:P179 wd:Q321423;
                pq:P1545 ?seasonNumber
      ] ;
      pq:P1545 ?episodeNumber
    ]
  }
  OPTIONAL { ?episode wdt:P1476 ?title. }
  OPTIONAL { ?episode wdt:P577 ?pubDate. }
}
GROUP BY ?episode # make sure we return each episode only once – a few have multiple publication dates, for example
ORDER BY IF(BOUND(?seasonNumber), xsd:integer(?seasonNumber), 1000) xsd:integer(?episodeNumber) ?title

Results in...

java.util.concurrent.ExecutionException: java.util.concurrent.ExecutionException: java.lang.StackOverflowError
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:292)
	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:678)
	at com.bigdata.rdf.sail.webapp.QueryServlet.doGet(QueryServlet.java:290)
	at com.bigdata.rdf.sail.webapp.RESTServlet.doGet(RESTServlet.java:240)
	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doGet(MultiTenancyServlet.java:273)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:865)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1655)
	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:320)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.throttling.SystemOverloadFilter.doFilter(SystemOverloadFilter.java:82)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:50)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.filters.QueryEventSenderFilter.doFilter(QueryEventSenderFilter.java:119)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.filters.JWTIdentityFilter.doFilter(JWTIdentityFilter.java:66)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.filters.RealAgentFilter.doFilter(RealAgentFilter.java:33)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1642)
	at org.wikidata.query.rdf.blazegraph.filters.RequestConcurrencyFilter.doFilter(RequestConcurrencyFilter.java:50)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1634)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1340)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1242)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
	at org.eclipse.jetty.server.Server.handle(Server.java:503)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.StackOverflowError
	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
	at java.util.concurrent.FutureTask.get(FutureTask.java:192)
	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:889)
	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:695)
	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	... 1 more
Caused by: java.lang.StackOverflowError
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.getVarsFromArguments(StaticAnalysis.java:2113)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2099)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)
	at com.bigdata.rdf.sparql.ast.StaticAnalysis.collectVarsFromExpressions(StaticAnalysis.java:2101)

etc.

Event Timeline

(SAMPLE(?seasonNumber) AS ?seasonNumber) (SAMPLE(?episodeNumber) AS ?episodeNumber) (SAMPLE(?title) AS ?title) (MIN(?pubDate) AS ?pubDate)

This is just T235540 (StackOverflowError when SPARQL query uses same variable name before and after aggregation) again. Rename the aggregate output variables and the query stops crashing; see the sketch below. (But the data model on the items seems to have changed in the meantime, so no season or episode numbers are found anymore.)
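
For reference, a minimal sketch of the renamed query: the ?seasonNumberSample, ?episodeNumberSample, ?titleSample and ?minPubDate aliases are arbitrary names chosen for illustration, not from the original report; any aliases that differ from the pre-aggregation variables should avoid the recursion in StaticAnalysis.

# Sketch of the workaround: aggregate outputs renamed so they no longer shadow the inner variables.
SELECT (SAMPLE(?seasonNumber) AS ?seasonNumberSample) (SAMPLE(?episodeNumber) AS ?episodeNumberSample) (SAMPLE(?title) AS ?titleSample) (MIN(?pubDate) AS ?minPubDate) ?episode
{
  # All episodes should be instance of episode with series Law & Order.
  ?episode wdt:P31 wd:Q21191270;
           wdt:P179 wd:Q321423.
  OPTIONAL {
    ?episode p:P179 [
      ps:P179/p:P179 [
        ps:P179 wd:Q321423;
                pq:P1545 ?seasonNumber
      ] ;
      pq:P1545 ?episodeNumber
    ]
  }
  OPTIONAL { ?episode wdt:P1476 ?title. }
  OPTIONAL { ?episode wdt:P577 ?pubDate. }
}
GROUP BY ?episode
# ORDER BY now refers to the renamed aliases, not the pre-aggregation variables.
ORDER BY IF(BOUND(?seasonNumberSample), xsd:integer(?seasonNumberSample), 1000) xsd:integer(?episodeNumberSample) ?titleSample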