A top concern of Wikipedia's consumers is trust. There are worries that content is biased, inaccurate, absurd, outdated, or incomplete. This is a parent task to identify a means of giving users meaningful guidance that helps them evaluate how confident they can be that content is something they can act on, quote publicly, use to guide further study, and so on.
The most prominent idea here is a trust signal or score.
One proposal is a "trust score" for a content chunk, revision, or article, similar to or based on WikiTrust, which scores content through an implicit review process: content contributed by trustworthy editors is scored highly, while content contributed by less trustworthy editors is scored poorly at first, but its score grows as subsequent edits to the article (especially edits by trustworthy editors) leave it in place.
More on WikiTrust: it is not a machine learning approach but a form of content analysis, and it is extremely computationally intensive.
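To make the scoring dynamic concrete, here is a minimal Python sketch of the general idea only. It is not WikiTrust's actual algorithm; the constants, the reputation scale, and the function names are illustrative assumptions.

```python
# Hypothetical sketch of WikiTrust-style scoring, NOT the real algorithm.
# Idea: new text inherits some trust from its author's reputation, and
# gains more each time a revision by a reputable editor leaves it intact.

MAX_TRUST = 10.0  # assumed cap on the trust scale

def initial_score(author_reputation: float) -> float:
    """New content starts at a fraction of its author's reputation."""
    return 0.4 * author_reputation  # 0.4 is an illustrative constant

def survive_revision(score: float, reviewer_reputation: float) -> float:
    """Content that survives an edit gains trust, weighted by the
    reputation of the editor whose revision left it untouched."""
    return min(MAX_TRUST, score + 0.2 * reviewer_reputation)

# Example: a chunk added by a low-reputation editor (reputation 2.0)
# survives three revisions by a high-reputation editor (reputation 9.0),
# so its score climbs even though it started low.
score = initial_score(2.0)
for _ in range(3):
    score = survive_revision(score, 9.0)
print(f"trust score: {score:.1f}")  # 0.8 -> 6.2 after three revisions
```

The point of the dynamic is that trust is earned through implicit review: the score reflects how much editorial scrutiny the content has survived, not a direct judgment of whether it is true.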