I just tried on a new install of Ubuntu 12.04.5 Desktop, and apt-transport-https is installed out of the box.
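For anyone checking an older install, a quick sketch (assuming a Debian/Ubuntu host; the release name and mirror path are illustrative):

```shell
# Is the HTTPS transport available? On Ubuntu 12.04.5 it ships by default;
# on older releases it may need: apt-get install apt-transport-https
command -v dpkg >/dev/null && dpkg -s apt-transport-https 2>/dev/null | grep '^Status' || true

# A sources.list entry pointing at the HTTPS mirror would then look like:
cat <<'EOF'
deb https://ubuntu.wikimedia.org/ubuntu/ precise main restricted universe
EOF
```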
Feb 8 2023
Apr 8 2019
Apr 28 2016
In T132450#2244769, @faidon wrote: I'm not convinced https for that is a good idea. apt doesn't support it by default — apt-transport-https isn't installed out of the box even in Ubuntu AFAIK.
Apr 27 2016
@mark As ubuntu.wikimedia.org and mirrors.wikimedia.org now support HTTPS, could we update Wikimedia's Ubuntu mirror link to https://ubuntu.wikimedia.org/ubuntu/ on Ubuntu's website?
Apr 15 2016
I suggest we use Let's Encrypt. It can issue SAN certificates.
Apr 13 2016
Redirect to https should be fine, since we enabled HSTS for transparency.wikimedia.org in May 2015.[1] But was there any reason that the redirect was dropped?
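For reference, with Apache the redirect-plus-HSTS setup can be sketched with mod_alias and mod_headers (a config sketch; the max-age value is illustrative, not what transparency.wikimedia.org actually sends):

```
# In the port-80 vhost: send everything to HTTPS
Redirect permanent / https://transparency.wikimedia.org/

# In the port-443 vhost: ask browsers to remember HTTPS for a year
Header always set Strict-Transport-Security "max-age=31536000"
```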
In T132521#2202254, @BBlack wrote: As for the rest of the work, IMHO we should re-purpose the wiki tracking page at https://wikitech.wikimedia.org/wiki/HTTPS/domains to cover longer-term progress on the rest of these issues and leave this ticket open until we get them all resolved. We can remove the ones with standard cache termination (text, upload, misc, maps) because they're easily enumerated and dealt with elsewhere and will be enforced consistently by the code for the cache clusters, and expand that table with the new entries from the audit above, etc.
Apr 12 2016
In T111967#2198441, @BBlack wrote: With the impending removal of *.donate, we'll actually finally be able to HSTS wikimedia.org itself at the DNS level.
Mar 2 2016
Feb 23 2016
So is links.email.donate.wikimedia.org still in use? If not, can we remove it from the DNS record?
Jan 29 2016
Jan 7 2016
Dec 14 2015
@BBlack I suggest removing at least VeriSignClass3_G2 and VeriSignClass1 from our trust list. According to [1], Class3_G2 is a 1024-bit root, and Class1 was replaced by Class1_G3 in 2010.
Dec 4 2015
Let's Encrypt is now in Public Beta, so everyone can get free certificates from them.
Nov 18 2015
Nov 10 2015
We could start with one-off services that are more technical in nature, which normal users would rarely connect to and aren't critical to them, such as icinga.wikimedia.org.
I support this. There are many other such domains that I think we can turn to "mid" now, including gerrit, rt, wikitech, wikitech-static, ticket, librenms, and tendril. Note that https://lists.wikimedia.org already uses "mid" cipher suite.
Nov 6 2015
Are there any updates now?
Oct 20 2015
In T50501#1669896, @Chmarkine wrote: Let's Encrypt provides free trusted(*) DV non-wildcard certs. We have 31 domains listed here. If you think it's plausible, we can obtain 31 certs (one for each domain) from Let's Encrypt at zero cost.
(*) They will have their CA certificate cross-signed by IdenTrust next month, so the certs they issued won't be trusted until then.
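As a sketch of what issuance could look like once the certs are trusted (shown with the present-day certbot client for illustration; at the time the official client was still called letsencrypt-auto, and the domain and webroot path are placeholders):

```
certbot certonly --webroot -w /var/www/html -d icinga.wikimedia.org
```

Repeating that per domain (or scripting it over the list of 31) would cover each one with its own cert.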
Oct 1 2015
Did Microsoft fix this issue yet?
Sep 24 2015
In T50501#1670596, @Lixxx235 wrote: Chmarkine, there's always StartCom/StartSSL which has free certs, and they're already trusted by default in all major browsers.
Let's Encrypt provides free trusted(*) DV non-wildcard certs. We have 31 domains listed here. If you think it's plausible, we can obtain 31 certs (one for each domain) from Let's Encrypt at zero cost.
Sep 20 2015
I think this task can finally be closed as resolved, as there are no more domains that lack forward secrecy (FS). (T91504 is now about DNSSEC.)
Aug 27 2015
I copied the CC list of T107575 to this one.
Aug 26 2015
Confirmed that this issue was fixed.
Aug 25 2015
According to DNS, download.wikimedia.org and gerrit.wikimedia.org are not behind misc-web. Why are these two domains in misc.inc.vcl.erb?
Aug 21 2015
Resolved since HTTPS has been enforced for everyone.
This was resolved when the canonical URLs on all pages were changed to point to HTTPS. T53002
Aug 15 2015
How about mapping download.wikipedia.org to the text cluster, and then having it redirect to https://dumps.wikimedia.org?
Jul 31 2015
https://stats.wikipedia.org/ is broken. Error: 404, Domain not served here
Jul 30 2015
wikipedia.org is already on the preload list! Among Alexa Top 10 websites, Wikipedia is the only one that has all subdomains preloaded!
https://chromium.googlesource.com/chromium/src/+/master/net/http/transport_security_state_static.json#3319
https://twitter.com/konklone/status/626538394202570752
Jul 29 2015
Before wikimedia.org is ready to preload, how about emailing agl@chromium.org to request preloading some high traffic and sensitive subdomains of wikimedia.org, like commons, donate, payments, etc.?
Jul 28 2015
Jul 24 2015
Have these communities been notified yet?
In T106311#1476249, @valhallasw wrote: the question is: what should it be replaced with...?
Jul 21 2015
Why decline it? It has been resolved! Apache 2.2 now supports ECDHE. See T55259#1448222.
Jul 17 2015
How about doing "report-only" first with a longer max-age, like 7 days?
Jul 16 2015
Could you look at the referrers as well? Do most of the requests come from search engines?
Jul 14 2015
My thought is that we should keep supporting a cipher suite as long as someone is actively using it and it is not close to broken (such as RC4). So how about keeping AES256-SHA256 and cutting the other AES256 ciphers from the mid and compat lists? Also, why not remove dhe-rsa-camellia256-sha too? It has not been negotiated in 3 weeks.[1]
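To see exactly what a name like AES256-SHA256 expands to (key exchange, authentication, encryption, MAC), openssl's ciphers command is handy; a quick sketch:

```shell
# Show the full definition of the AES256-SHA256 suite; the -v output has
# Kx=, Au=, Enc= and Mac= columns describing each component.
openssl ciphers -v 'AES256-SHA256'
```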
Jul 11 2015
In T105455#1445242, @BBlack wrote: Ok after some debugging with @mark (who has an xbox 360!), we've found what the incompatibility is. It's the same incompatibility that breaks ancient Java6 with us now: The Xbox360's IE9 supports DHE-based ciphersuites, but is incompatible with DH parameters greater than 1024-bit prime size, and we're using a 2048-bit prime parameter. Unfortunately, to give Forward Secrecy to other clients (and a lot of them are other Microsoft clients), we have to keep those DHE suites high on our preference list.
The best recourse on Microsoft's end would be to upgrade the TLS library, if possible, to support 2048-bit (or even greater) DH parameters for DHE ciphers.
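On the server side, the 2048-bit parameter in question is just a DH parameter file; a sketch of generating and inspecting one (-dsaparam is used only to keep generation fast for the example, and the output path is arbitrary):

```shell
# Generate 2048-bit DH parameters and confirm the prime size.
openssl dhparam -dsaparam -out /tmp/dh2048.pem 2048 2>/dev/null
openssl dhparam -in /tmp/dh2048.pem -text -noout | head -n 1
```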
Jul 10 2015
In T102814#1431376, @Reedy wrote: It would seem arbcom-(de|nl|en) are the main ones to worry about notifying...
Jul 7 2015
How many requests to these domains are there in the log? *.wap was deprecated in early 2009, and *.mobile was deprecated in late 2011. Google has 39,300 results for site:*.mobile.wikipedia.org, fewer than the 96,700 results for site:www.*.wikipedia.org. I think it is fine to delete them from DNS.
Jul 6 2015
Actually http://www.email.donate.wikimedia.org/ can be removed too.
Jul 5 2015
Oh, actually http://www.donate.wikimediafoundation.org/ redirects to https://wikimediafoundation.org/wiki/Home, and http://www.donate.mediawiki.org/ shows an "unconfigured domain" error page. So they are broken already.
Once these two domain names, www.donate.wikimediafoundation.org and www.donate.mediawiki.org, are removed, wikimediafoundation.org and mediawiki.org can be preloaded. Fortunately, searching for them on Google returns no results: https://www.google.com/search?q=%22www.donate.wikimediafoundation.org%22 and https://www.google.com/search?q=%22www.donate.mediawiki.org%22. So I think it is safe to remove at least these two.
Jul 3 2015
In T55259#1423112, @BBlack wrote: If you switch a site to strong, it *will* become inaccessible to several insecure and/or legacy client platforms, including: Android 2.x, IE8/XP, Java6, and any automated client code / bots which indirectly use OpenSSL versions < 1.0 (older Linux distros such as Ubuntu Lucid).
Jul 2 2015
Jun 27 2015
stats.wikimedia.org doesn't redirect http to https. It has mixed content (T93702). Do we need to fix that first?
Jun 21 2015
In T73156#1383664, @Dzahn wrote: so all that is left here is OTRS, it seems
Jun 18 2015
In T102815#1377168, @BBlack wrote: Noted on irc, tons of results in:
01:40 < Mjbmr> https://www.google.com/search?q=site:www.en.wikipedia.org
I wonder why Google ignores the fact that those redirect and have rel=canonical in the content? Can we get Google to drop these from the index before we kill them?
Jun 5 2015
This has been fixed in T100825.
Jun 1 2015
git.wikimedia.org is behind misc-web. Is this cert still needed?
May 1 2015
Apr 29 2015
Why aren't the Vary and HSTS headers set for https://gdash.wikimedia.org, but are correctly set for http://gdash.wikimedia.org?
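A quick way to compare what the two endpoints actually send back (a command sketch; network-dependent, and the host may no longer resolve today):

```
# Headers served over HTTPS vs plain HTTP
curl -sI https://gdash.wikimedia.org/ | grep -iE 'vary|strict-transport-security'
curl -sI http://gdash.wikimedia.org/  | grep -iE 'vary|strict-transport-security'
```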
Apr 28 2015
In T49832#1240813, @BBlack wrote: As I've stated before, personally I'd prefer to do the hard redirects before the rel=canonical during the initial rollout process, simply because it's easier to take back in realtime if anything doesn't work out as planned in terms of load and capacity. We already have a process down for this stuff. It's not my place to speak to the rest, but I assure you people are aware and working on it.
Can we start forcing HTTPS for all users from the US soon? The latency impact should be low, since they are close to the datacenters. Do we have a timeline now? One benefit is that once we redirect US users to HTTPS, Google will update the indexed Wikipedia links to HTTPS as well.
Apr 20 2015
Mar 26 2015
Mar 25 2015
Great! https://en.m.wikipedia.org now works in China. According to the report on zhwiki and tests on greatfire.org, only http://zh.wikisource.org/, http://zh.wikinews.org/, and http://ug.wikipedia.org/ are still blocked, which I think are blocked by URL rather than by IP. All other Wikimedia sites (http and https) are no longer blocked. Thank you!
In T91504#1140486, @DaBPunkt wrote: Sure it does, but the webserver for our OTRS doesn’t use it. HSTS is a nice idea, yes
Now https://planet.wikimedia.org redirects to http://meta.wikimedia.org/wiki/Planet_Wikimedia again.