Dec 19 2019
Closing this ticket as "declined", as we have removed the need to add an SPDX ID for now.
Dec 13 2019
That's a good callout; I wasn't aware of other places that can reset the content license. I will be sure to address that if this ticket gets implemented. (Still waiting to see whether the license info is necessary in an open patch right now: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/555560)
Dec 3 2019
@Pchelolo Oh, I missed that. OK, thank you for clarifying.
@Pchelolo You seemed to touch on this earlier, but regarding content_model, I see a TODO comment in Revision.php to remove the getContentModel() function. It states:
Nov 26 2019
@apaskulin That all looks right! The only thing you could consider adding (for now) is the caveat for minor revisions: if the user requests a count of minor revisions, and that count is above 2000, we will return an error. This might be fixed with object caching in T237430; if it is, that check will be removed.
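For illustration, here is a minimal sketch of that caveat (the names are hypothetical, not the actual MediaWiki implementation): counting minor revisions fails once the count exceeds the 2000 cap.

```python
MINOR_COUNT_CAP = 2000  # cap described above; hypothetical constant name


def count_minor_revisions(revisions):
    """Count minor revisions, erroring out above the cap (sketch only)."""
    count = sum(1 for rev in revisions if rev.get("minor"))
    if count > MINOR_COUNT_CAP:
        # Mirrors the error response described above; this check may be
        # removed if object caching (T237430) lands.
        raise ValueError("minor revision count exceeds %d" % MINOR_COUNT_CAP)
    return count
```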
Nov 25 2019
Nov 21 2019
@eprodromou 2 comments:
Nov 20 2019
@daniel Thanks for your thoughts! I am wondering: if we put tests in Cirrus that hit our v1 /search endpoint, wouldn't that make it seem like Cirrus is responsible for supporting our v1 endpoint? (Correct me if I'm misunderstanding something you and @Clarakosi are suggesting.)
Nov 18 2019
On second thought, I might rethink the above ^^ @eprodromou. After talking with @Anomie, it might be better to just default to spitting back whatever the search engine gives us from "getTextSnippet()". So if the search terms exist in the text snippet, the snippet will be HTML. Otherwise, a plain string of the beginning snippet of the page will be returned.
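A rough sketch of that default behavior (hypothetical helper names, not the actual CirrusSearch or MediaWiki API): pass through the engine's snippet when it contains highlighted search terms, otherwise fall back to a plain-text excerpt from the beginning of the page.

```python
def build_snippet(engine_snippet_html, page_text, terms_highlighted, length=150):
    """Sketch of the snippet fallback described above (names hypothetical).

    If the search terms appear in the engine's text snippet, the snippet is
    HTML and is returned as-is; otherwise a plain string taken from the
    beginning of the page is returned.
    """
    if terms_highlighted:
        return engine_snippet_html
    return page_text[:length]
```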
I've been testing with CirrusSearch locally and have noticed different behavior in what is delivered for the text snippet between the default MediaWiki search engine and CirrusSearch.
Nov 15 2019
I did see that line in @Pchelolo's code and used it in my current implementation, but while writing the tests I was wondering about use cases where that would actually be possible. So if I understand correctly, this would only occur for completely private wikis, not for individual private pages within a wiki?
Nov 14 2019
Pages that are unreadable by the current user are not returned.
Nov 12 2019
Thanks @Dzahn !
Nov 7 2019
Thanks for clarifying @eprodromou !
Requesting approval for this request from my manager @Fjalapeno
Nov 6 2019
@eprodromou In the case of a match in both title and text on a single page, my assumption is we only return one page covering both matches. In that case, would we want to return only the title excerpt HTML, or both the title and text excerpt HTML?
sounds good thanks!
@eprodromou Should this endpoint search all namespaces (Main, Talk:, User:, User Talk:, etc.) by default? Or should it accept a param that indicates which namespaces to search (like the current "opensearch" endpoint does)?
Oct 30 2019
OK, cool. It looks like @holger.knust was the one who originally wrote the config logic, so I will keep it top-level as he originally had it!
Is the 'configs/' directory meant to be strictly for production config files, or do you mind if the local.json config file goes in there too (as opposed to putting local.json in the top-level directory)?
Oct 28 2019
Thanks all! Tests are passing now :)
Oct 25 2019
Brennen: We've updated the integration tests that were causing those errors you saw, and those will likely be out Monday (CI is failing to install some Packagist packages; someone in Release Engineering mentioned that happens intermittently). If you can push your changes back out anytime Monday, that would be awesome, as the tests will be fixed as soon as we merge our changes in. Thanks for all the help!
Thanks for the info, Brennen. It looks like something is causing our integration tests to fail. Will look into it and post an update. Thanks!
Oct 15 2019
I have a wikitech account, username is my full name: Nikki Nikkhoui.
Oct 10 2019
Ah, apologies! It should be the "wmf" LDAP group.