
Problem with docker-compose2 (behind proxy)
Closed, Resolved, Public

Description

Hi,

this is a second error I get when running docker-compose. The stack starts and the Wikibase itself works fine: I can insert, modify, and search for entities. But the SPARQL endpoint and the updater are not working. Here is the first error I get:

java.lang.RuntimeException: off=0, len=558
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs_1             | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs_1             | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs_1             | 	at com.bigdata.btree.UnisolatedReadWriteIndex.submit(UnisolatedReadWriteIndex.java:668)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:910)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail.exists(BigdataSail.java:924)
wdqs_1             | 	at com.bigdata.rdf.sail.CreateKBTask$1.call(CreateKBTask.java:244)
wdqs_1             | 	at com.bigdata.rdf.sail.CreateKBTask$1.call(CreateKBTask.java:241)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail.getUnisolatedConnectionLocksAndRunLambda(BigdataSail.java:1451)
wdqs_1             | 	at com.bigdata.rdf.sail.CreateKBTask.doRun(CreateKBTask.java:257)
wdqs_1             | 	at com.bigdata.rdf.sail.CreateKBTask.call(CreateKBTask.java:104)
wdqs_1             | 	at com.bigdata.rdf.sail.CreateKBTask.call(CreateKBTask.java:62)
wdqs_1             | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs_1             | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs_1             | 	at java.lang.Thread.run(Thread.java:748)
wdqs_1             | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs_1             | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2118)
wdqs_1             | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
wdqs_1             | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
wdqs_1             | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs_1             | 	... 26 common frames omitted


then: 

wdqs_1             | 15:36:52.825 [qtp1078694789-14] ERROR c.b.r.sail.webapp.BigdataRDFServlet IP:wikibase_wdqs-updater_1.wikibase_default UA:Jetty/9.2.z-SNAPSHOT - cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936), query=SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
wdqs_1             | SELECT * WHERE { <http://qanswer-svc1.univ-st-etienne.fr> schema:dateModified ?date }
wdqs_1             | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936)
wdqs_1             | 	at java.util.concurrent.FutureTask.report(FutureTask.java:122)
wdqs_1             | 	at java.util.concurrent.FutureTask.get(FutureTask.java:206)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.BigdataServlet.submitApiTask(BigdataServlet.java:293)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doSparqlQuery(QueryServlet.java:654)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet.doPost(QueryServlet.java:273)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.RESTServlet.doPost(RESTServlet.java:269)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.MultiTenancyServlet.doPost(MultiTenancyServlet.java:193)
wdqs_1             | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
wdqs_1             | 	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:769)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1667)
wdqs_1             | 	at org.wikidata.query.rdf.blazegraph.throttling.ThrottlingFilter.doFilter(ThrottlingFilter.java:304)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at ch.qos.logback.classic.helpers.MDCInsertingServletFilter.doFilter(MDCInsertingServletFilter.java:49)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at org.wikidata.query.rdf.blazegraph.filters.ClientIPFilter.doFilter(ClientIPFilter.java:43)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
wdqs_1             | 	at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
wdqs_1             | 	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1125)
wdqs_1             | 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
wdqs_1             | 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1059)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
wdqs_1             | 	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:215)
wdqs_1             | 	at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:110)
wdqs_1             | 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
wdqs_1             | 	at org.eclipse.jetty.server.Server.handle(Server.java:497)
wdqs_1             | 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
wdqs_1             | 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:248)
wdqs_1             | 	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
wdqs_1             | 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:610)
wdqs_1             | 	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:539)
wdqs_1             | 	at java.lang.Thread.run(Thread.java:748)
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResource(DefaultResourceLocator.java:586)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.cacheMiss(DefaultResourceLocator.java:395)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locate(DefaultResourceLocator.java:347)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailConnection.<init>(BigdataSail.java:2068)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail$BigdataSailReadOnlyConnection.<init>(BigdataSail.java:5236)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail._getReadOnlyConnection(BigdataSail.java:1540)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSail.getReadOnlyConnection(BigdataSail.java:1503)
wdqs_1             | 	at com.bigdata.rdf.sail.BigdataSailRepository.getReadOnlyConnection(BigdataSailRepository.java:140)
wdqs_1             | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:247)
wdqs_1             | 	at com.bigdata.rdf.task.AbstractApiTask.getQueryConnection(AbstractApiTask.java:221)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:722)
wdqs_1             | 	at com.bigdata.rdf.sail.webapp.QueryServlet$SparqlQueryTask.call(QueryServlet.java:671)
wdqs_1             | 	at com.bigdata.rdf.task.ApiTaskForIndexManager.call(ApiTaskForIndexManager.java:68)
wdqs_1             | 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
wdqs_1             | 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
wdqs_1             | 	... 1 common frames omitted
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:207)
wdqs_1             | 	at com.bigdata.sparse.ValueType.decode(ValueType.java:333)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:347)
wdqs_1             | 	at com.bigdata.sparse.AbstractAtomicRowReadOrWrite.atomicRead(AbstractAtomicRowReadOrWrite.java:157)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:98)
wdqs_1             | 	at com.bigdata.sparse.AtomicRowRead.apply(AtomicRowRead.java:36)
wdqs_1             | 	at com.bigdata.btree.AbstractBTree.submit(AbstractBTree.java:3263)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:537)
wdqs_1             | 	at com.bigdata.sparse.SparseRowStore.read(SparseRowStore.java:420)
wdqs_1             | 	at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:807)
wdqs_1             | 	... 17 common frames omitted
wdqs_1             | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
wdqs_1             | 	at com.bigdata.rdf.vocab.BaseVocabulary.readExternal(BaseVocabulary.java:458)
wdqs_1             | 	at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2118)
wdqs_1             | 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
wdqs_1             | 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
wdqs_1             | 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
wdqs_1             | 	at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:231)
wdqs_1             | 	... 27 common frames omitted

Any clue?

That the updater and the query service interface are not working is, I think, related to this.

Thank you
D063520

Event Timeline

I'm not really sure what is causing this.
It kind of looks like a misconfiguration to me?

wdqs_1             | 15:36:52.825 [qtp1078694789-14] ERROR c.b.r.sail.webapp.BigdataRDFServlet IP:wikibase_wdqs-updater_1.wikibase_default UA:Jetty/9.2.z-SNAPSHOT - cause=java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936), query=SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
wdqs_1             | SELECT * WHERE { <http://qanswer-svc1.univ-st-etienne.fr> schema:dateModified ?date }
wdqs_1             | java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936)
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1549640212804), readTime=readOnly(1549636656936)
wdqs_1             | Caused by: java.lang.RuntimeException: off=0, len=558
wdqs_1             | Caused by: com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null

Can we see a snippet of your configuration? If you're using docker-compose, then the docker-compose file?

Hi,

maybe it is related to T215870, so let's wait for that issue to be resolved.

Thank you!

Seems related to T286334 and T207133. I see this error too, with the following error message in the updater:

18:34:44.410 [main] INFO  org.wikidata.query.rdf.tool.Update - Starting Updater 0.3.10-SNAPSHOT (8bba8bfb0bbb12361c0e214fe482fbe15fcaa129)
18:34:45.335 [main] INFO  o.w.q.r.t.change.ChangeSourceContext - Checking where we left off
18:34:45.335 [main] INFO  o.w.query.rdf.tool.rdf.RdfRepository - Checking for left off time from the updater
18:34:45.517 [main] ERROR org.wikidata.query.rdf.tool.Update - Error during initialization.
org.wikidata.query.rdf.tool.exception.ContainedException: Non-200 response from triple store:  HttpContentResponse[HTTP/1.1 500 Server Error - 8081 bytes] body=
SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
SELECT * WHERE { <http://192.168.188.188:8181> schema:dateModified ?date }
java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1626719685498), readTime=readOnly(1599573101388)
...

and this message in the wdqs logs:

18:34:45.501 [qtp584634336-16] ERROR c.b.r.sail.webapp.BigdataRDFServlet IP:ees-wikibase-docker_wdqs-updater_1.ees-wikibase-docker_default UA:Jetty/9.4.z-SNAPSHOT - cause=java.util.concurrent.ExecutionExcept
ion: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1626719685498), readTime=readOnly(1599573101388), query=SPARQL-QUERY: queryStr=PREFIX schema: <http://schema.org/>
SELECT * WHERE { <http://192.168.188.188:8181> schema:dateModified ?date }
com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null
        at com.bigdata.rdf.vocab.BaseVocabulary.readVersion2(BaseVocabulary.java:680)
Wrapped by: java.lang.RuntimeException: off=0, len=558
        at com.bigdata.io.SerializerUtil.deserialize(SerializerUtil.java:239)
Wrapped by: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1626719685498), readTime=readOnly(1599573101388)
        at com.bigdata.relation.locator.DefaultResourceLocator.locateResourceOn(DefaultResourceLocator.java:817)
Wrapped by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: off=0, len=558::namespace=wdq, timestamp=readOnly(1626719685498), readTime=readOnly(1599573101388)
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)

And the docker-compose.yml (the matching .env is at the bottom):

# Wikibase with Query Service
#
# This docker-compose example can be used to pull the images from docker hub.
#
# Examples:
#
# Access Wikibase via "http://localhost:8181"
#   (or "http://$(docker-machine ip):8181" if using docker-machine)
#
# Access Query Service via "http://localhost:8282"
#   (or "http://$(docker-machine ip):8282" if using docker-machine)
version: '3'

services:
  wikibase:
    image: wikibase/wikibase:1.34-bundle
    links:
      - mysql
    ports:
    # CONFIG - Change the 8181 here to expose Wikibase & MediaWiki on a different port
     - "${EES_WIKIBASE_HOST_PORT}:${EES_WIKIBASE_CONTAINER_PORT}"
    volumes:
      - mediawiki-images-data:/var/www/html/images
      - quickstatements-data:/quickstatements/data
      - ./extensions/WikibaseQualityConstraints:/var/www/html/extensions/WikibaseQualityConstraints:rw
      # Note: enable the following line only after an initial docker-compose up -d
      - ./LocalSettings.php:/var/www/html/LocalSettings.php:ro
      - ./logo.png:/var/www/html/logo.png:ro
      # - ./LocalSettings.php.template:/LocalSettings.php.template
    depends_on:
    - mysql
    - elasticsearch
    restart: unless-stopped
    networks:
      default:
        aliases:
         - ${EES_WIKIBASE_FQDN}
         # CONFIG - Add your real wikibase hostname here, for example wikibase-registry.wmflabs.org
    environment:
      - DB_SERVER=mysql.svc:3306
      - MW_ELASTIC_HOST=elasticsearch.svc
      - MW_ELASTIC_PORT=9200
      # CONFIG - Change the default values below
      - MW_ADMIN_NAME=admin
      - MW_ADMIN_PASS=${EES_WIKIBASE_MW_ADMIN_PASS}
      - MW_ADMIN_EMAIL=redacted
      - MW_WG_SECRET_KEY=${EES_WIKIBASE_MW_WG_SECRET_KEY}
      # CONFIG - Change the default values below (should match mysql values in this file)
      - DB_USER=wikiuser
      - DB_PASS=sqlpass
      - DB_NAME=my_wiki
      - QS_PUBLIC_SCHEME_HOST_AND_PORT=http://localhost:9191
  mysql:
    image: mariadb:10.3
    restart: unless-stopped
    volumes:
      - mediawiki-mysql-data:/var/lib/mysql
    environment:
      MYSQL_RANDOM_ROOT_PASSWORD: 'yes'
      # CONFIG - Change the default values below (should match values passed to wikibase)
      MYSQL_DATABASE: 'my_wiki'
      MYSQL_USER: 'wikiuser'
      MYSQL_PASSWORD: 'sqlpass'
    networks:
      default:
        aliases:
         - mysql.svc
  wdqs-frontend:
    image: wikibase/wdqs-frontend:latest
    restart: unless-stopped
    ports:
    # CONFIG - Change the 8282 here to expose the Query Service UI on a different port
     - "8282:80"
    depends_on:
    - wdqs-proxy
    networks:
      default:
        aliases:
         - wdqs-frontend.svc
    environment:
      - WIKIBASE_HOST=${EES_WIKIBASE_FQDN}
      - WDQS_HOST=wdqs-proxy.svc
  wdqs:
    image: wikibase/wdqs:0.3.10
    restart: unless-stopped
    volumes:
      - query-service-data:/wdqs/data
    command: /runBlazegraph.sh
    networks:
      default:
        aliases:
         - wdqs.svc
    environment:
      - WIKIBASE_HOST=${EES_WIKIBASE_FQDN}
      - WDQS_HOST=wdqs.svc
      - WDQS_PORT=9999
    expose:
      - 9999
  wdqs-proxy:
    image: wikibase/wdqs-proxy
    restart: unless-stopped
    environment:
      - PROXY_PASS_HOST=wdqs.svc:9999
    ports:
     - "8989:80"
    depends_on:
    - wdqs
    networks:
      default:
        aliases:
         - wdqs-proxy.svc
  wdqs-updater:
    image: wikibase/wdqs:0.3.10
    restart: unless-stopped
    command: /runUpdate.sh
    restart: on-failure:5
    depends_on:
    - wdqs
    - wikibase
    networks:
      default:
        aliases:
         - wdqs-updater.svc
    environment:
     - WIKIBASE_HOST=${EES_WIKIBASE_FQDN}:${EES_WIKIBASE_HOST_PORT}
     - WDQS_HOST=wdqs.svc
     - WDQS_PORT=9999
     - WIKIBASE_MAX_DAYS_BACK=999999999
  elasticsearch:
    image: wikibase/elasticsearch:6.5.4-extra
    restart: unless-stopped
    networks:
      default:
        aliases:
         - elasticsearch.svc
    environment:
      discovery.type: single-node
      ES_JAVA_OPTS: "-Xms512m -Xmx512m"
  # CONFIG - if you do not want to load quickstatements, remove this entire section
  quickstatements:
    image: wikibase/quickstatements:latest
    ports:
     - "9191:80"
    depends_on:
     - wikibase
    volumes:
     - quickstatements-data:/quickstatements/data
    networks:
      default:
        aliases:
         - quickstatements.svc
    environment:
      - QS_PUBLIC_SCHEME_HOST_AND_PORT=http://localhost:9191
      - WB_PUBLIC_SCHEME_HOST_AND_PORT=http://localhost:80
      - WIKIBASE_SCHEME_AND_HOST=http://${EES_WIKIBASE_FQDN}
      - WB_PROPERTY_NAMESPACE=122
      - "WB_PROPERTY_PREFIX=Property:"
      - WB_ITEM_NAMESPACE=120
      - "WB_ITEM_PREFIX=Item:"

volumes:
  mediawiki-mysql-data:
  mediawiki-images-data:
  query-service-data:
  quickstatements-data:

with the .env file:

EES_WIKIBASE_FQDN=192.168.188.188
EES_WIKIBASE_HOST_PORT=8181
EES_WIKIBASE_CONTAINER_PORT=80
EES_WIKIBASE_MW_ADMIN_PASS=admin
EES_WIKIBASE_MW_WG_SECRET_KEY=secretkey

This SELECT using an IP address looks wrong; it should be using the concept base URI for Wikibase. Something feels wrong with this configuration.

SELECT * WHERE { <http://192.168.188.188:8181> schema:dateModified ?date }

However, that alone will not lead to the 500 error.
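
If you want to confirm that, you can re-run the updater's probe query by hand against Blazegraph. A minimal sketch, assuming curl is available in the wdqs container and that the endpoint is the wikibase/wdqs image default (port 9999, path /bigdata/namespace/wdq/sparql); if the vocabulary record in the journal is corrupt, you will get the same 500 no matter which URI is in the subject position:

# Re-run the updater's dateModified probe directly against Blazegraph from inside the compose network.
docker-compose exec wdqs curl -s \
  --data-urlencode 'query=PREFIX schema: <http://schema.org/> SELECT * WHERE { <http://192.168.188.188:8181> schema:dateModified ?date }' \
  http://localhost:9999/bigdata/namespace/wdq/sparql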


This does indeed look like your problem.

com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null

And this will likely be the cause of T286334: `wdqs-frontend` and `wdqs` containers are constantly restarting using the wikibase-release-pipeline docker-compose example

However, so as not to split the discussion, let's pick a single task to continue with!


wdqs:
  image: wikibase/wdqs:0.3.10

You are using a very old version of the wdqs docker image and service.
However in T286334 you are seemingly using wikibase/wdqs:0.3.40-wmde.1?

Thanks for checking back so quickly.

This select using an IP address looks wrong, this should be using the concept base URI for wikibase, something feels wrong with this configuration.

What is the concept base URI? What config could be causing this? Should this query go to the docker host, or directly to the docker-internal network?

You are using a very old version of the wdqs docker image and service.

Yes, this is a deployment where we try to avoid upgrades when we can, as we had some problems with them in the past.

However in T286334 you are seemingly using wikibase/wdqs:0.3.40-wmde.1?

I just thought the ticket might be related; I am not using 0.3.40. How can I debug this further to find the root cause of com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null?

This does indeed look like your problem.

com.bigdata.rdf.vocab.BaseVocabulary$VocabularyVersioningException: null

I think T286334 is not my problem as I do not see the mentioned:

wdqs_1             | /wdqs /wdqs
wdqs_1             | Running Blazegraph from /wdqs on :9999/bigdata
wdqs_1             | Error: Could not find or load main class #

in there. I want to try the solution in T207133#4681278, which seemed to work back then. I don't yet fully understand how all the pieces fit together, especially the part about changes not yet committed to the DB. In particular, deleting data/data.jnl as mentioned in T207133#4681278 would wipe the blazegraph triple store, wouldn't it?

Another thing I did not mention and was just told by a colleague: this instance was migrated (the docker volumes were copied over) from another instance that had an FQDN in our network DNS. In the deployment at hand we do not have an FQDN, only the IP that you said looked odd in the concept URI query.

In particular, deleting data/data.jnl as mentioned in T207133#4681278 would wipe the blazegraph triple store, wouldn't it?

Yes
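
If you do go that route with this compose file, a rough sketch of the reset is below (the /wdqs/data/data.jnl path is an assumption based on the query-service-data volume mount in your file, and the store will afterwards need to be repopulated by the updater or a reload):

# Stop the query service and the updater so the journal is not in use.
docker-compose stop wdqs wdqs-updater
# Remove the Blazegraph journal inside the wdqs data volume
# (path assumed from the query-service-data:/wdqs/data mount above).
docker-compose run --rm wdqs rm -f /wdqs/data/data.jnl
# Start again; Blazegraph will create a fresh, empty journal.
docker-compose up -d wdqs wdqs-updater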

This instance was migrated (docker volumes were copied over) from another instance that had a FQDN in our network DNS. In the deployment at hand, we do not have a FQDN, but only the IP you mentioned looked odd in the concept URI query.

So the concept URI is configured in two places: in the query service updater, as the concept base it expects, and in Wikibase, as the concept base it outputs.

In your docker-compose, for the query service this is EES_WIKIBASE_FQDN, which is currently the IP.

If you look in the docker-compose you'll see the comment CONFIG - Add your real wikibase hostname here, for example wikibase-registry.wmflabs.org.
So this should indeed not be the IP; it should be whatever you want to appear in the concept base URIs.
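
One quick way to see which concept base URI your Wikibase is currently emitting is to fetch the RDF for any existing entity and look at the prefixes. A sketch, where the hostname, port, item ID and article path are placeholders for your setup:

# Special:EntityData serves an entity's RDF; the wd:/wdt: prefixes in the output
# show the concept base URI the query service must be configured to expect.
curl -s "http://wikibase.example.com:8181/wiki/Special:EntityData/Q1.ttl" | grep '@prefix wd'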

Following your advice, we now use a proper FQDN instead of the IP.

Further, following your post at https://addshore.com/2019/11/changing-the-concept-uri-of-an-existing-wikibase-with-data/ (especially dumpRdf.php, munge.sh and loadData.sh with the correct concept URI), we managed to re-create the wdqs-updater, and the BaseVocabulary$VocabularyVersioningException: null error is gone. Maybe this also helps @DD063520. I consider this solved.
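
For anyone else hitting this, roughly the steps we ran are sketched below. Paths are those of the wikibase/wikibase and wikibase/wdqs images we use, and the exact munge option for the concept URI differs between wdqs versions, so check munge.sh's usage output before copying this:

# 1. Dump the wiki's RDF from the wikibase container (-T avoids TTY mangling of stdout).
docker-compose exec -T wikibase php /var/www/html/extensions/Wikibase/repo/maintenance/dumpRdf.php --format ttl > dump.ttl
# 2. Copy the dump into the wdqs container.
docker cp dump.ttl "$(docker-compose ps -q wdqs)":/tmp/dump.ttl
# 3. Munge it for the correct concept URI (add the concept-URI option your wdqs version supports),
#    then load the munged chunks into the wdq namespace.
docker-compose exec wdqs /wdqs/munge.sh -f /tmp/dump.ttl -d /tmp/munged
docker-compose exec wdqs /wdqs/loadData.sh -n wdq -d /tmp/munged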

Thanks for your support!

Addshore claimed this task.

Great!