
SPARQL service returns entities with broken URLs
Closed, ResolvedPublic

Event Timeline

Restricted Application added a subscriber: Aklapper.

Take a look at https://addshore.com/2018/04/wikibase-of-wikibases/#more-1316, which is an example setup.

For the query service and updater, you must set the Wikibase host.

For example:

WIKIBASE_HOST: wikibase-registry.wmflabs.org
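
In docker-compose.yml terms, that means setting it on the wdqs, wdqs-updater and wdqs-frontend services, roughly like this (a minimal sketch using the service names from the wikibase-docker example; substitute your own hostname):

wdqs:
  environment:
    - WIKIBASE_HOST=wikibase-registry.wmflabs.org
wdqs-updater:
  environment:
    - WIKIBASE_HOST=wikibase-registry.wmflabs.org
wdqs-frontend:
  environment:
    - WIKIBASE_HOST=wikibase-registry.wmflabs.org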

Do I need to repopulate the blazegraph?

You will need to, yes.
If this is a new instance, you should just be able to repopulate from recent changes. In that case you can simply destroy the wdqs container and create a new one, and it should automatically pull in the changes (as long as everything is still in recent changes).
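
Roughly something like this (a sketch; note the data actually lives in the query-service-data volume, so that needs removing too, and the volume name prefix here assumes a checkout directory called wikibase-docker):

# stop and remove the query service containers
docker-compose stop wdqs wdqs-updater
docker-compose rm -f wdqs wdqs-updater
# drop the Blazegraph journal so the store is rebuilt from scratch
docker volume rm wikibase-docker_query-service-data
# recreate them; the updater then pulls in whatever is still in recent changes
docker-compose up -d wdqs wdqs-updater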

Addshore triaged this task as Medium priority. Oct 16 2018, 9:59 AM

Maybe more detail:
I changed the three occurrences of WIKIBASE_HOST in docker-compose.yml to "mixnmatch.wmflabs.org", then did stop, then up -d. That rebuilt and started the wdqs containers, but wdqs-updater_1 keeps dying.

log files:

wdqs-updater_1     | Exception in thread "main" org.wikidata.query.rdf.tool.exception.ContainedException: Non-200 response from triple store:  HttpContentResponse[HTTP/1.1 500 Server Error - 6429 bytes] body=
wdqs-updater_1     | SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
wdqs-updater_1     | SELECT * WHERE { <http://mixnmatch.wmflabs.org> schema:dateModified ?date }
wdqs-updater_1     | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539696002531), readTime=readOnly(1539679638136)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:293)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:654)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doPost(QueryServlet.java:273)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:269)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:193)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1667)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:304)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:49)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
wdqs-updater_1     | 	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
wdqs-updater_1     | 	at org.eclipse.jetty.server.Server.handle(Server.java:497)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
wdqs-updater_1     | 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
wdqs-updater_1     | 	at java.lang.Thread.run(Thread.java:748)
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539696002531), readTime=readOnly(1539679638136)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.<init>(BigdataSail.java:2068)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailReadOnlyConnection.<init>(BigdataSail.java:5236)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail._getReadOnlyConnection(BigdataSail.java:1540)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail.getReadOnlyConnection(BigdataSail.java:1503)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSailRepository.getReadOnlyConnection(BigdataSailRepository.java:140)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:247)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:221)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:722)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:671)
wdqs-updater_1     | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs-updater_1     | 	... 1 more
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs-updater_1     | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs-updater_1     | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:807)
wdqs-updater_1     | 	... 17 more
wdqs-updater_1     | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2116)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2065)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs-updater_1     | 	... 27 more
wdqs-updater_1     |
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.execute(RdfRepository.java:726)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.query(RdfRepository.java:679)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.dateFromQuery(RdfRepository.java:750)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.fetchLeftOffTime(RdfRepository.java:636)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildRecentChangePollerChangeSource(Update.java:160)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildChangeSource(Update.java:141)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.main(Update.java:65)

Ah yes, the one other thing you'll need to do is add this to the wikibase service (for example):

networks:
  default:
    aliases:
     - wikibase.svc
     - wikibase-registry.wmflabs.org
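
To check that the alias is actually reachable from inside the Docker network, something like this should work (a sketch, assuming getent is available in the image, which it is on most Debian-based images):

docker-compose exec wdqs getent hosts wikibase-registry.wmflabs.org

It should resolve to the internal address of the wikibase container.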

Did that now. No change. wikibase-docker_wdqs-updater_1 still keeps dying.

I changed the configuration back to where it was, and restarted the docker containers. Still broken. Now the tool does not work anymore, and I can't fix it.

Magnus raised the priority of this task from Medium to Unbreak Now!. Oct 17 2018, 10:12 AM

So the tool still doesn't work, and I can't roll back the change...

> I changed the configuration back to where it was, and restarted the docker containers.

Restarting the docker containers won't update the configuration; you need to recreate them.
You should be able to do something like this:

docker-compose up -d --force-recreate <servicename>

so

docker-compose up -d --force-recreate wdqs-updater

If you set the env vars as described in T207133#4669746, add a network alias as in T207133#4670616, and recreate the containers as described above, it should work.

If it doesn't work, please send the logs from the container output again, as in T207133#4670492.

I set

services:
  wikibase:
    networks:
      default:
        aliases:
         - wikibase.svc
         - mixnmatch.wmflabs.org

Also,

wdqs-frontend:
  environment:
    - WIKIBASE_HOST=mixnmatch.wmflabs.org
    - WDQS_HOST=wdqs-proxy.svc

and

wdqs:
  environment:
    - WIKIBASE_HOST=mixnmatch.wmflabs.org
    - WDQS_HOST=wdqs.svc
    - WDQS_PORT=9999

and

wdqs-updater:
  environment:
   - WIKIBASE_HOST=mixnmatch.wmflabs.org
   - WDQS_HOST=wdqs.svc
   - WDQS_PORT=9999

Then I downed everything, and ran docker-compose up -d --force-recreate wdqs-updater. That only started some of the services, so I downed everything again, and did a normal docker-compose up -d.

No joy. Logfile via docker-compose logs wdqs-updater:

wdqs-updater_1     | wait-for-it.sh: waiting 120 seconds for mixnmatch.wmflabs.org:80
wdqs-updater_1     | wait-for-it.sh: mixnmatch.wmflabs.org:80 is available after 3 seconds
wdqs-updater_1     | wait-for-it.sh: waiting 120 seconds for wdqs.svc:9999
wdqs-updater_1     | wait-for-it.sh: wdqs.svc:9999 is available after 8 seconds
wdqs-updater_1     | Updating via http://wdqs.svc:9999/bigdata/namespace/wdq/sparql
wdqs-updater_1     | OpenJDK 64-Bit Server VM warning: Cannot open file /var/log/wdqs/wdqs-updater_jvm_gc.pid8.log due to No such file or directory
wdqs-updater_1     |
wdqs-updater_1     | I> No access restrictor found, access to any MBean is allowed
wdqs-updater_1     | Jolokia: Agent started with URL http://127.0.0.1:8778/jolokia/
wdqs-updater_1     | #logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
wdqs-updater_1     | 08:58:31.464 [main] INFO  org.wikidata.query.rdf.tool.Update - Checking where we left off
wdqs-updater_1     | 08:58:31.467 [main] INFO  o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the updater
wdqs-updater_1     | 08:58:31.624 [main] ERROR org.wikidata.query.rdf.tool.Update - Error during initialization.
wdqs-updater_1     | org.wikidata.query.rdf.tool.exception.ContainedException: Non-200 response from triple store:  HttpContentResponse[HTTP/1.1 500 Server Error - 6429 bytes] body=
wdqs-updater_1     | SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
wdqs-updater_1     | SELECT * WHERE { <http://mixnmatch.wmflabs.org> schema:dateModified ?date }
wdqs-updater_1     | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539853111593), readTime=readOnly(1539701290285)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:293)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:654)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doPost(QueryServlet.java:273)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:269)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:193)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1667)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:304)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:49)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
wdqs-updater_1     | 	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
wdqs-updater_1     | 	at org.eclipse.jetty.server.Server.handle(Server.java:497)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
wdqs-updater_1     | 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
wdqs-updater_1     | 	at java.lang.Thread.run(Thread.java:748)
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539853111593), readTime=readOnly(1539701290285)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.<init>(BigdataSail.java:2068)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailReadOnlyConnection.<init>(BigdataSail.java:5236)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail._getReadOnlyConnection(BigdataSail.java:1540)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail.getReadOnlyConnection(BigdataSail.java:1503)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSailRepository.getReadOnlyConnection(BigdataSailRepository.java:140)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:247)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:221)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:722)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:671)
wdqs-updater_1     | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs-updater_1     | 	... 1 more
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs-updater_1     | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs-updater_1     | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:807)
wdqs-updater_1     | 	... 17 more
wdqs-updater_1     | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2116)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2065)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs-updater_1     | 	... 27 more
wdqs-updater_1     |
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.execute(RdfRepository.java:726)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.query(RdfRepository.java:679)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.dateFromQuery(RdfRepository.java:750)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.fetchLeftOffTime(RdfRepository.java:636)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildRecentChangePollerChangeSource(Update.java:160)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildChangeSource(Update.java:141)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.main(Update.java:65)
wdqs-updater_1     | Exception in thread "main" org.wikidata.query.rdf.tool.exception.ContainedException: Non-200 response from triple store:  HttpContentResponse[HTTP/1.1 500 Server Error - 6429 bytes] body=
wdqs-updater_1     | SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
wdqs-updater_1     | SELECT * WHERE { <http://mixnmatch.wmflabs.org> schema:dateModified ?date }
wdqs-updater_1     | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539853111593), readTime=readOnly(1539701290285)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:293)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:654)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doPost(QueryServlet.java:273)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:269)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:193)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
wdqs-updater_1     | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1667)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:304)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:49)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
wdqs-updater_1     | 	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
wdqs-updater_1     | 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
wdqs-updater_1     | 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
wdqs-updater_1     | 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
wdqs-updater_1     | 	at org.eclipse.jetty.server.Server.handle(Server.java:497)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
wdqs-updater_1     | 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
wdqs-updater_1     | 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
wdqs-updater_1     | 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
wdqs-updater_1     | 	at java.lang.Thread.run(Thread.java:748)
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539853111593), readTime=readOnly(1539701290285)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.<init>(BigdataSail.java:2068)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailReadOnlyConnection.<init>(BigdataSail.java:5236)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail._getReadOnlyConnection(BigdataSail.java:1540)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSail.getReadOnlyConnection(BigdataSail.java:1503)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.BigdataSailRepository.getReadOnlyConnection(BigdataSailRepository.java:140)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:247)
wdqs-updater_1     | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:221)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:722)
wdqs-updater_1     | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:671)
wdqs-updater_1     | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs-updater_1     | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs-updater_1     | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs-updater_1     | 	... 1 more
wdqs-updater_1     | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs-updater_1     | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs-updater_1     | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs-updater_1     | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs-updater_1     | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs-updater_1     | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs-updater_1     | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:807)
wdqs-updater_1     | 	... 17 more
wdqs-updater_1     | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs-updater_1     | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2116)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2065)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
wdqs-updater_1     | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs-updater_1     | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs-updater_1     | 	... 27 more
wdqs-updater_1     |
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.execute(RdfRepository.java:726)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.query(RdfRepository.java:679)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.dateFromQuery(RdfRepository.java:750)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.rdf.RdfRepository.fetchLeftOffTime(RdfRepository.java:636)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildRecentChangePollerChangeSource(Update.java:160)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.buildChangeSource(Update.java:141)
wdqs-updater_1     | 	at org.wikidata.query.rdf.tool.Update.main(Update.java:65)

Should I force-recreate wdqs, wdqs-frontend, and wdqs-proxy as well?

> Non-200 response from triple store: HttpContentResponse[HTTP/1.1 500 Server Error - 6429 bytes]

It looks like the wdqs service is throwing errors.
It might be worth looking at the wdqs logs.
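
You could also fire the same query at Blazegraph directly, to see whether it fails outside the updater too. Something along these lines (a sketch, assuming curl is available inside the wdqs container):

docker-compose exec wdqs curl -s http://localhost:9999/bigdata/namespace/wdq/sparql \
  --data-urlencode 'query=PREFIX schema: <http://schema.org/> SELECT * WHERE { <http://mixnmatch.wmflabs.org> schema:dateModified ?date }'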

Could you provide your whole docker-compose with any private stuff removed?
I should be able to reproduce exactly what is happening then.

wdqs log (last "entry"):

wdqs_1             | 08:20:16.620 [qtp1747585824-18] ERROR c.b.r.sail.webapp.BigdataRDFServlet IP:wikibase-docker_wdqs-proxy_1.wikibase-docker_default UA:Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:63.0) Gecko/20100101 Firefox/63.0 - cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539937216616), readTime=readOnly(1539701290285), query=SPARQL-QUERY: queryStr=prefix schema: <http://schema.org/> SELECT * WHERE {<http://www.wikidata.org> schema:dateModified ?y}
wdqs_1             | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539937216616), readTime=readOnly(1539701290285)
wdqs_1             | 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
wdqs_1             | 	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:293)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:654)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doGet(QueryServlet.java:288)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.RESTServlet.doGet(RESTServlet.java:240)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doGet(MultiTenancyServlet.java:271)
wdqs_1             | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
wdqs_1             | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1667)
wdqs_1             | 	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:304)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:49)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
wdqs_1             | 	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
wdqs_1             | 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
wdqs_1             | 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
wdqs_1             | 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
wdqs_1             | 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
wdqs_1             | 	at org.eclipse.jetty.server.Server.handle(Server.java:497)
wdqs_1             | 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
wdqs_1             | 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
wdqs_1             | 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
wdqs_1             | 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
wdqs_1             | 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
wdqs_1             | 	at java.lang.Thread.run(Thread.java:748)
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1539937216616), readTime=readOnly(1539701290285)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.<init>(BigdataSail.java:2068)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailReadOnlyConnection.<init>(BigdataSail.java:5236)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail._getReadOnlyConnection(BigdataSail.java:1540)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail.getReadOnlyConnection(BigdataSail.java:1503)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSailRepository.getReadOnlyConnection(BigdataSailRepository.java:140)
wdqs_1             | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:247)
wdqs_1             | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:221)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:722)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:671)
wdqs_1             | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs_1             | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs_1             | 	... 1 common frames omitted
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs_1             | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs_1             | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:807)
wdqs_1             | 	... 17 common frames omitted
wdqs_1             | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs_1             | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2116)
wdqs_1             | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2065)
wdqs_1             | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
wdqs_1             | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs_1             | 	... 27 common frames omitted

docker-compose.yml:

# Wikibase with Query Service
#
# This docker-compose example can be used to pull the images from docker hub.
#
# Examples:
#
# Access Wikibase via "http://localhost:8181"
#   (or "http://$(docker-machine ip):8181" if using docker-machine)
#
# Access Query Service via "http://localhost:8282"
#   (or "http://$(docker-machine ip):8282" if using docker-machine)
version: '3'

services:
  wikibase:
    image: wikibase/wikibase:1.30-bundle
    links:
      - mysql
    ports:
    # CONFIG - Change the 8181 here to expose Wikibase & MediaWiki on a different port
     - "8181:80"
    volumes:
      - mediawiki-images-data:/var/www/html/images
      - ../mixnmatch_wb/interface:/var/www/html/interface
    depends_on:
    - mysql
    - elasticsearch
    networks:
      default:
        aliases:
         - wikibase.svc
         - mixnmatch.wmflabs.org
         # CONFIG - Add your real wikibase hostname here, for example wikibase-registry.wmflabs.org
    environment:
      - DB_SERVER=mysql.svc:3306
      - MW_ELASTIC_HOST=elasticsearch.svc
      - MW_ELASTIC_PORT=9200
      # CONFIG - Change the default values below
      - MW_ADMIN_NAME=admin
      - MW_ADMIN_PASS=*********
      - MW_WG_SECRET_KEY=*********
      # CONFIG - Change the default values below (should match mysql values in this file)
      - DB_USER=wikiuser
      - DB_PASS=*********
      - DB_NAME=my_wiki
  mysql:
    image: mariadb:10.3
    restart: always
    volumes:
      - mediawiki-mysql-data:/var/lib/mysql
    environment:
      MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
      # CONFIG - Change the default values below (should match values passed to wikibase)
      MYSQL_DATABASE: 'my_wiki'
      MYSQL_USER: 'wikiuser'
      MYSQL_PASSWORD: '*********'
    networks:
      default:
        aliases:
         - mysql.svc
  wdqs-frontend:
    image: wikibase/wdqs-frontend:latest
    ports:
    # CONFIG - Change the 8282 here to expose the Query Service UI on a different port
     - "8282:80"
    depends_on:
    - wdqs-proxy
    networks:
      default:
        aliases:
         - wdqs-frontend.svc
    environment:
      - WIKIBASE_HOST=mixnmatch.wmflabs.org
      - WDQS_HOST=wdqs-proxy.svc
  wdqs:
    image: wikibase/wdqs:0.3.0
    volumes:
      - query-service-data:/wdqs/data
    command: /runBlazegraph.sh
    networks:
      default:
        aliases:
         - wdqs.svc
    environment:
      - WIKIBASE_HOST=mixnmatch.wmflabs.org
      - WDQS_HOST=wdqs.svc
      - WDQS_PORT=9999
    expose:
      - 9999
  wdqs-proxy:
    image: wikibase/wdqs-proxy
    environment:
      - PROXY_PASS_HOST=wdqs.svc:9999
    ports:
     - "8989:80"
    depends_on:
    - wdqs
    networks:
      default:
        aliases:
         - wdqs-proxy.svc
  wdqs-updater:
    image: wikibase/wdqs:0.3.0
    command: /runUpdate.sh
    depends_on:
    - wdqs
    - wikibase
    networks:
      default:
        aliases:
         - wdqs-updater.svc
    environment:
     - WIKIBASE_HOST=mixnmatch.wmflabs.org
     - WDQS_HOST=wdqs.svc
     - WDQS_PORT=9999
  elasticsearch:
    image: elasticsearch@sha256:f1dbf2019dc9a4ca5dd458635bfb31f9a601e4905e1d6ca1d65a3958d428f497
    networks:
      default:
        aliases:
         - elasticsearch.svc
    environment:
      discovery.type: single-node
  # CONFIG - If you do not want to load quickstatements, remove this entire section
  quickstatements:
    image: wikibase/quickstatements:latest
    ports:
     - "9191:80"
    depends_on:
    - wikibase
    networks:
      default:
        aliases:
         - quickstatements.svc
    environment:
      # CONFIG, you have to config this if you want quickstatements
      - OAUTH_CONSUMER_KEY=*********
      - OAUTH_CONSUMER_SECRET=*********
      - QS_PUBLIC_SCHEME_HOST_AND_PORT=https://mixnmatch-qs.wmflabs.org
      - WB_PUBLIC_SCHEME_HOST_AND_PORT=https://mixnmatch.wmflabs.org
      - WIKIBASE_SCHEME_AND_HOST=http://wikibase.svc
      - WB_PROPERTY_NAMESPACE=122
      - "WB_PROPERTY_PREFIX=Property:"
      - WB_ITEM_NAMESPACE=120
      - "WB_ITEM_PREFIX=Item:"

volumes:
  mediawiki-mysql-data:
  mediawiki-images-data:
  query-service-data:

As a side note, this may also contain the solution to T207132...

Looks like I can't actually run that exact docker-compose file locally as I apparently don't have enough memory with everything else running on my laptop.

Let me try to grab a labs VM to test on...

I just tried running this (see below), which just has Elasticsearch commented out to free up some memory, and the updater appears to work fine:

version: '3'

services:
  wikibase:
    image: wikibase/wikibase:1.30-bundle
    links:
      - mysql
    ports:
    # CONFIG - Change the 8181 here to expose Wikibase & MediaWiki on a different port
     - "8181:80"
    volumes:
      - mediawiki-images-data:/var/www/html/images
      - ../mixnmatch_wb/interface:/var/www/html/interface
    depends_on:
    - mysql
#    - elasticsearch
    networks:
      default:
        aliases:
         - wikibase.svc
         - mixnmatch.wmflabs.org
         # CONFIG - Add your real wikibase hostname here, for example wikibase-registry.wmflabs.org
    environment:
      - DB_SERVER=mysql.svc:3306
      - MW_ELASTIC_HOST=elasticsearch.svc
      - MW_ELASTIC_PORT=9200
      # CONFIG - Change the default values below
      - MW_ADMIN_NAME=admin
      - MW_ADMIN_PASS=*********
      - MW_WG_SECRET_KEY=*********
      # CONFIG - Change the default values below (should match mysql values in this file)
      - DB_USER=wikiuser
      - DB_PASS=*********
      - DB_NAME=my_wiki
  mysql:
    image: mariadb:10.3
    restart: always
    volumes:
      - mediawiki-mysql-data:/var/lib/mysql
    environment:
      MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
      # CONFIG - Change the default values below (should match values passed to wikibase)
      MYSQL_DATABASE: 'my_wiki'
      MYSQL_USER: 'wikiuser'
      MYSQL_PASSWORD: '*********'
    networks:
      default:
        aliases:
         - mysql.svc
  wdqs-frontend:
    image: wikibase/wdqs-frontend:latest
    ports:
    # CONFIG - Change the 8282 here to expose the Query Service UI on a different port
     - "8282:80"
    depends_on:
    - wdqs-proxy
    networks:
      default:
        aliases:
         - wdqs-frontend.svc
    environment:
      - WIKIBASE_HOST=mixnmatch.wmflabs.org
      - WDQS_HOST=wdqs-proxy.svc
  wdqs:
    image: wikibase/wdqs:0.3.0
    volumes:
      - query-service-data:/wdqs/data
    command: /runBlazegraph.sh
    networks:
      default:
        aliases:
         - wdqs.svc
    environment:
      - WIKIBASE_HOST=mixnmatch.wmflabs.org
      - WDQS_HOST=wdqs.svc
      - WDQS_PORT=9999
    expose:
      - 9999
  wdqs-proxy:
    image: wikibase/wdqs-proxy
    environment:
      - PROXY_PASS_HOST=wdqs.svc:9999
    ports:
     - "8989:80"
    depends_on:
    - wdqs
    networks:
      default:
        aliases:
         - wdqs-proxy.svc
  wdqs-updater:
    image: wikibase/wdqs:0.3.0
    command: /runUpdate.sh
    depends_on:
    - wdqs
    - wikibase
    networks:
      default:
        aliases:
         - wdqs-updater.svc
    environment:
     - WIKIBASE_HOST=mixnmatch.wmflabs.org
     - WDQS_HOST=wdqs.svc
     - WDQS_PORT=9999
#  elasticsearch:
#    image: elasticsearch@sha256:f1dbf2019dc9a4ca5dd458635bfb31f9a601e4905e1d6ca1d65a3958d428f497
#    networks:
#      default:
#        aliases:
#         - elasticsearch.svc
#    environment:
#      discovery.type: single-node
#  # CONFIG - If you do not want to load quickstatements, remove this entire section
#  quickstatements:
#    image: wikibase/quickstatements:latest
#    ports:
#     - "9191:80"
#    depends_on:
#    - wikibase
#    networks:
#      default:
#        aliases:
#         - quickstatements.svc
#    environment:
#      # CONFIG, you have to config this if you want quickstatements
#      - OAUTH_CONSUMER_KEY=*********
#      - OAUTH_CONSUMER_SECRET=*********
#      - QS_PUBLIC_SCHEME_HOST_AND_PORT=https://mixnmatch-qs.wmflabs.org
#      - WB_PUBLIC_SCHEME_HOST_AND_PORT=https://mixnmatch.wmflabs.org
#      - WIKIBASE_SCHEME_AND_HOST=http://wikibase.svc
#      - WB_PROPERTY_NAMESPACE=122
#      - "WB_PROPERTY_PREFIX=Property:"
#      - WB_ITEM_NAMESPACE=120
#      - "WB_ITEM_PREFIX=Item:"

volumes:
  mediawiki-mysql-data:
  mediawiki-images-data:
  query-service-data:

Updater logs:

wdqs-updater_1   | wait-for-it.sh: mixnmatch.wmflabs.org:80 is available after 73 seconds
wdqs-updater_1   | wait-for-it.sh: waiting 120 seconds for wdqs.svc:9999
wdqs-updater_1   | wait-for-it.sh: wdqs.svc:9999 is available after 0 seconds
wdqs-updater_1   | Updating via http://wdqs.svc:9999/bigdata/namespace/wdq/sparql
wdqs-updater_1   | OpenJDK 64-Bit Server VM warning: Cannot open file /var/log/wdqs/wdqs-updater_jvm_gc.pid8.log due to No such file or directory
wdqs-updater_1   |
wdqs-updater_1   | I> No access restrictor found, access to any MBean is allowed
wdqs-updater_1   | Jolokia: Agent started with URL http://127.0.0.1:8778/jolokia/
wdqs-updater_1   | #logback.classic pattern: %d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
wdqs-updater_1   | 12:37:14.115 [main] INFO  org.wikidata.query.rdf.tool.Update - Checking where we left off
wdqs-updater_1   | 12:37:14.131 [main] INFO  o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the updater
wdqs-updater_1   | 12:37:15.449 [main] INFO  o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the dump
wdqs-updater_1   | 12:37:15.486 [main] INFO  org.wikidata.query.rdf.tool.Update - Defaulting start time to 90 days ago:  2018-07-21T12:37:15Z
wdqs-updater_1   | 12:37:16.617 [main] INFO  o.w.q.r.t.change.RecentChangesPoller - Got 98 changes, from Q2@2@20181009091428|2 to Q95@102@20181009124625|102
wdqs-updater_1   | 12:37:26.816 [main] INFO  org.wikidata.query.rdf.tool.Updater - Polled up to 2018-10-09T12:46:25Z (next: 20181009124625|103) at (0.0, 0.0, 0.0) updates per second and (0.0, 0.0, 0.0) milliseconds per second
wdqs-updater_1   | 12:37:27.197 [main] INFO  o.w.q.r.t.change.RecentChangesPoller - Got 100 changes, from Q96@103@20181009124625|103 to Q195@202@20181009125010|202
wdqs-updater_1   | 12:37:33.743 [main] INFO  org.wikidata.query.rdf.tool.Updater - Polled up to 2018-10-09T12:50:10Z (next: 20181009125012|203) at (1.6, 0.3, 0.1) updates per second and (110539388.6, 22850878.8, 7659315.0) milliseconds per second
wdqs-updater_1   | 12:37:34.114 [main] INFO  o.w.q.r.t.change.RecentChangesPoller - Got 100 changes, from Q196@203@20181009125012|203 to Q295@302@20181009125114|302

Can I give you (or can you just get) access to my VM? mixnmatch in the mix-n-match project.

We have been having the same issues on ScienceSource this week. We now have a working Query Service.
(We have definitely found we need a large VM instance to run the current stack if we want any memory left over even to run docker ps.)

To get rid of the com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException, I had to ensure that the Blazegraph data store was recreated.
I tried deleting data/data.jnl, and it successfully recreated the data, now including the correct URLs (not wikibase.svc).
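
Sketched against the compose file in this task (so using the query-service-data named volume, mounted at /wdqs/data; the wikibase-docker_ prefix depends on your project directory), that step looks roughly like:

# stop the query service so nothing holds data.jnl open
docker-compose stop wdqs wdqs-updater
# remove the journal from the named volume via a throwaway container
docker run --rm -v wikibase-docker_query-service-data:/wdqs/data alpine rm -f /wdqs/data/data.jnl
# start again; Blazegraph creates a fresh data.jnl on startup
docker-compose up -d wdqs wdqs-updater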

This is our docker-compose.yml -- most of the instances of 'wikibase.svc' have been replaced by the URL for our site:

https://github.com/ContentMine/wikibase-docker/blob/master/docker-compose.yml

Thanks @Jkbr, that seems to do the trick!

Tarrow subscribed.

Great! I'll mark this as resolved.

@Magnus, was this the solution?

> I tried deleting data/data.jnl and it successfully recreated the data, now including the correct URLs (not wikibase.svc).

If so, it is also worth mentioning that you can start totally afresh with the query service data by removing the Docker volume, with something like: docker volume rm wikibase-docker_query-service-data

You'll see the names of the volumes specified at the bottom of the .yml file.

@Tarrow True, although in our fork we persist the Wikibase and wdqs data using host volumes, so we have removed that final volumes section. Troubleshooting required examining how Blazegraph works under the hood, too!

I would like to revisit this topic since we got the very same exception trace in one of our instances.

java.lang.RuntimeException: off=0, len=702::namespace=wdq, timestamp=readOnly(1720714024138), readTime=readOnly(1720519309034)
[…]
Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null

We've seen this error in the past when the wdqs container was started with the wrong WIKIBASE_HOST variable, and were able to recover by simply force-recreating the container using the correct docker-compose configuration, but this is not the case this time.

Dropping the data volume and re-ingesting a complete dump of the MediaWiki RDF store would be the last resort for us; I'd much rather understand the source of the problem. Unfortunately, the Java stack traces do not help me in any way in understanding the underlying cause.

The VocabularyVersioningException class description states:

An instance of this class indicates a versioning problem with the declaration classes. If a vocabulary declaration class is modified after it has been used to instantiate a triple store then the mapping of URIs onto IVs might not be stable with the result that encode and decode of statements may be broken.

If the vocabulary declaration corresponds to the prefixes (/wdqs/prefixes.conf), then I would understand the exception as: "Your current prefixes are different from those that were used when the triple store was first initialized and when the data volume was created". I don't think that's the case here, but I don't know how to investigate.

I have also confirmed, by creating a docker-compose.override.yml that overrides the WIKIBASE_HOST variable and re-creating both wdqs-based containers, then deleting the override and re-creating them again, that the VocabularyVersioningException only occurs while the override file is in place and sets an environment variable that is different from the initial one. As I mentioned above, this has also worked out on production in the past.
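
Such an override can look roughly like this (a sketch; the hostname is just a placeholder):

# docker-compose.override.yml
version: '3'
services:
  wdqs:
    environment:
      - WIKIBASE_HOST=other-host.example.org
  wdqs-updater:
    environment:
      - WIKIBASE_HOST=other-host.example.org

docker-compose picks the override file up automatically, so docker-compose up -d --force-recreate wdqs wdqs-updater applies it, and deleting the file and force-recreating again reverts it.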

I've resigned myself to dropping the data volume and re-importing from an RDF dump at this point, since this is a problem environment that we need back up, and I'm out of ideas about what else to look into.

Update:

Just to confirm: recreating the data volume (with data.jnl) from a fresh RDF dump worked, and all services are healthy again. This does not help us discover the cause of the issue, but it confirms that none of the Docker configuration or environment variables were at fault.
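
For anyone hitting this later, the re-import is essentially the usual munge-and-load flow. A rough sketch only (the script locations and the dump path are assumptions about the wikibase and wdqs images, and dumpRdf.php options may need adjusting for your setup):

# dump RDF from the wiki using the Wikibase maintenance script
docker-compose exec wikibase php extensions/Wikibase/repo/maintenance/dumpRdf.php > dump.ttl
# make the dump visible to the wdqs container, then munge and load it
docker cp dump.ttl "$(docker-compose ps -q wdqs)":/tmp/dump.ttl
docker-compose exec wdqs /wdqs/munge.sh -f /tmp/dump.ttl -d /tmp/mungeOut
docker-compose exec wdqs /wdqs/loadData.sh -n wdq -d /tmp/mungeOut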