Steps to replicate the issue (include links if applicable):
- Log out.
- Remove cookies on enwiki, plwiki and sister projects.
- Log in on the new auth.wiki (using a 2FA code, but that is probably not significant here). My account: https://meta.wikimedia.org/wiki/Special:CentralAuth/Nux
- Open devtools.
What happens?:
My console log is littered with hundreds of warnings and errors. Seriously, there are more than 200 bad cookies set. I thought SUL3 was supposed to fix this; it doesn't seem like it did.
There are hundreds of calls to set cookies like (a quick way to count them is sketched after this list):
- https://meta.wikimedia.org/wiki/Special:CentralAutoLogin/setCookies?type=1x1&from=plwiki&usesul3=1
- https://species.wikimedia.org/wiki/Special:CentralAutoLogin/setCookies?type=1x1&from=plwiki&usesul3=1
- https://incubator.wikimedia.org/wiki/Special:CentralAutoLogin/setCookies?type=1x1&from=plwiki&usesul3=1
- and many, many more...
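To reproduce the count yourself, something like this in the devtools console should work, assuming the 1x1 setCookies requests show up as resource timing entries on the page that triggered them (the buffer is capped by default, so the real number may be even higher):

```ts
// Run in the devtools console after logging in.
// Counts the Special:CentralAutoLogin/setCookies requests recorded
// by the Performance API and lists the hosts they were sent to.
const setCookieCalls = performance
  .getEntriesByType('resource')
  .filter((e) => e.name.includes('Special:CentralAutoLogin/setCookies'));
console.log(setCookieCalls.length, 'setCookies requests');
console.log(setCookieCalls.map((e) => new URL(e.name).host));
```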
And seeing incubator.wikimedia.org in there is just weird. I don't know if I've even been on that site this year; I have 0 edits there.
What should have happened instead?:
Well, this is not how normal SSO (single sign-on) works. Could you imagine visiting accounts.google.com and it pinging a thousand websites that happen to use Google authentication? I'm sorry, but the auth.wiki SSO implementation just seems absurd to me.
How standard SSO works (a sketch follows this list):
- You visit a website.
- You are prompted to log in, or are redirected to log in on the SSO site (that would be auth.wiki in this case).
- If you were already centrally logged in, you are immediately redirected back.
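To be concrete, here is a minimal sketch of that handshake. This is not the actual MediaWiki/CentralAuth code; every name in it (AUTH_HOST, local_session, central_session) is made up for illustration:

```ts
// Minimal sketch of a standard SSO handshake. Hypothetical names only.
const AUTH_HOST = 'https://auth.wikimedia.org';

type Req = { url: string; cookies: Record<string, string> };
type Res = { redirect?: string; serve?: string };

// On any content wiki: no local session means exactly one redirect
// to the auth host. No other site is contacted.
function contentWiki(req: Req): Res {
  if (!req.cookies['local_session']) {
    return { redirect: `${AUTH_HOST}/login?returnto=${encodeURIComponent(req.url)}` };
  }
  return { serve: req.url };
}

// On the auth host: central session present means an immediate
// redirect back, carrying a one-time token the wiki exchanges for a
// local session; otherwise show the login form.
function authHost(returnto: string, cookies: Record<string, string>): Res {
  if (cookies['central_session']) {
    const token = 'one-time-token'; // placeholder for a real issued token
    return { redirect: `${decodeURIComponent(returnto)}&token=${token}` };
  }
  return { serve: '/login-form' };
}
```

The key property is that only the site you actually visit and the auth host ever talk to your browser; no fan-out to 150 other domains.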
If you really must set a cookie on each website, please at least:
- Filter out sites to which I don't contribute (see the API sketch after this list). This should be relatively easy to figure out; all the data is already at: https://meta.wikimedia.org/wiki/Special:CentralAuth/Nux
- Set cookies on main domain:
- https://pl.wikisource.org/ -> .wikisource.org
- https://pl.wiktionary.org/ -> .wiktionary.org
- https://pl.wikinews.org/ -> .wikinews.org
- https://pl.wikibooks.org/ -> .wikibooks.org
- https://commons.wikimedia.org/ -> .wikimedia.org
- https://www.wikidata.org -> .wikidata.org
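The per-wiki data behind that CentralAuth page is available from the public API (meta=globaluserinfo), so a filter like this should be possible; the helper below is just my illustration, with "Nux" as the example account:

```ts
// Sketch: list the wikis where a global account actually has edits,
// using the CentralAuth globaluserinfo API on meta.wikimedia.org.
async function wikisWithEdits(user: string): Promise<string[]> {
  const api = 'https://meta.wikimedia.org/w/api.php'
    + '?action=query&meta=globaluserinfo&format=json&origin=*'
    + `&guiprop=merged&guiuser=${encodeURIComponent(user)}`;
  const data = await (await fetch(api)).json();
  const merged: { url: string; editcount: number }[] =
    data.query.globaluserinfo.merged;
  return merged.filter((w) => w.editcount > 0).map((w) => w.url);
}

wikisWithEdits('Nux').then((urls) => console.log(urls.length, urls));
```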
Setting cookies on those main domains would mean 3-10 cookies, and would avoid setting up separate sessions on 150 websites. Even if under the hood there are 150 MediaWiki installations, you can still set a cookie saying "User Nux is authenticated on auth.wiki" and another saying "User is using skin X" (sketched below). That would be enough to keep the experience smooth.
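Here is roughly what I have in mind. The cookie names (sul3_user, user_skin) are the hypothetical ones from my example; note that each cookie still has to be set by one response from inside its registrable domain, which is exactly why this is 3-10 calls rather than 150:

```ts
// Sketch: one pair of domain-wide hint cookies per shared second-level
// domain, instead of a full session per wiki. Hypothetical names.
const SHARED_DOMAINS = [
  '.wikipedia.org', '.wikimedia.org', '.wiktionary.org', '.wikisource.org',
  '.wikibooks.org', '.wikinews.org', '.wikidata.org',
];

function hintCookieHeaders(user: string, skin: string, domain: string): string[] {
  const attrs = `Domain=${domain}; Path=/; Secure; SameSite=Lax`;
  return [
    `Set-Cookie: sul3_user=${encodeURIComponent(user)}; ${attrs}`,
    `Set-Cookie: user_skin=${encodeURIComponent(skin)}; ${attrs}`,
  ];
}

// One request per shared domain instead of one per wiki:
for (const d of SHARED_DOMAINS) {
  console.log(hintCookieHeaders('Nux', 'monobook', d).join('\n'));
}
```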
A user option could additionally say whether they want a quick redirect to become fully authenticated. So a visit to a more exotic site after a normal login could look like this (see the sketch after the list):
- I start logging in on enwiki.
- I am redirected to auth.wiki (https://auth.wikimedia.org/...).
- I log in there.
- SUL sets up cookies on the main domains (up to 20 calls): sul3_user="Nux"; user_skin="monobook" (I'm actually using Vector, but for the sake of the example...).
- I visit https://www.wikifunctions.org/wiki/ and see it as if I had visited: https://www.wikifunctions.org/w/index.php?title=Wikifunctions:Main_Page&useskin=monobook
- I'm immediately redirected to auth.wiki.
- I'm back on wikifunctions, fully authenticated.
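A sketch of what the exotic wiki would do on that visit, reusing the hypothetical cookie names from above (the opt-in flag sul3_autologin is also made up):

```ts
// Sketch: a wiki the user never configured reads the domain-wide hint
// cookies and, if the user opted in, completes login with one redirect.
function onVisit(path: string, cookies: Record<string, string>):
    { redirect?: string; serve?: string; skin?: string } {
  if (cookies['local_session']) return { serve: path }; // already logged in here
  if (!cookies['sul3_user']) return { serve: path, skin: 'vector' }; // anonymous
  if (cookies['sul3_autologin'] === '1') {
    // One round trip to auth.wiki, then back fully authenticated.
    return { redirect: 'https://auth.wikimedia.org/autologin?returnto='
      + encodeURIComponent(path) };
  }
  // Not opted in: still honour the skin hint while locally logged out.
  return { serve: path, skin: cookies['user_skin'] ?? 'vector' };
}
```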
I want to stress that this is not just a nice theory. I know this works; it is almost exactly what we have at the moment for Polish and UK libraries. There are multiple host servers (separate, physical machines) and even more application servers (separate virtual machines). For the installations that share SSO, once you log in to https://mol-app0101.molnet.mol.pl/lms/ you are immediately logged in when visiting https://mol-app0701.molnet.mol.pl/lms/... Assuming you are a librarian, of course :)
Software version: SUL3, plwiki et al
Other information (browser name/version, screenshots, etc.):
Firefox 136, Windows 11.