This could be a Google Code / Outreachy (round 11) project.
== Description ==
Right now there is no simple way to detect syntax errors in wikicode (the text a user enters as the source of a wiki page). Such a tool would make it simple for tools and bots (such as ORES, Huggle, and AWB) to validate wikitext. It would also make new MediaWiki features possible, such as real-time syntax checking, so that users could be notified about syntax errors while typing the source code or upon saving the page.
== Designs ==
=== Library ===
The "linting framework" should probably be implemented as a library that is as portable as possible, so that tools can use it locally without having to contact an external web server. There could also be a web service, similar to ORES or Parsoid, that accepts wikitext and validates it using this library.
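As a rough illustration of what the portable core could look like, the sketch below checks only purely lexical rules (balanced template and link brackets), which is exactly the kind of validation that can run locally without contacting the target wiki. The `lint` function and its error format are hypothetical, not part of any existing library:

```python
def lint(wikitext):
    """Return a list of (position, message) tuples for detected errors.

    This is a minimal, wiki-agnostic check: it only verifies that the
    '{{' / '}}' and '[[' / ']]' delimiters are balanced and properly nested.
    """
    errors = []
    stack = []  # open delimiters as (opener, position)
    pairs = {'{{': '}}', '[[': ']]'}
    i = 0
    while i < len(wikitext):
        chunk = wikitext[i:i + 2]
        if chunk in pairs:
            stack.append((chunk, i))
            i += 2
        elif chunk in pairs.values():
            if stack and pairs[stack[-1][0]] == chunk:
                stack.pop()
            else:
                errors.append((i, f"unmatched closing '{chunk}'"))
            i += 2
        else:
            i += 1
    # anything still open at the end was never closed
    for opener, pos in stack:
        errors.append((pos, f"unclosed '{opener}'"))
    return errors
```

For example, `lint("{{cite|url=x")` reports an unclosed `{{` at position 0, while well-nested input such as `{{a|[[b]]}}` produces no errors.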
==== Potential issues with this design ====
If this service were truly external, i.e. a library running on a different system than the MediaWiki installation, it would probably be very complex to adjust it to all the customizations of the target wiki. The rules for wikitext syntax differ depending on the MediaWiki version, its configuration, and its installed parser extensions, so an external service would probably only be able to validate basic syntax.
Pros:
* Very fast (linting can be run locally, even for a large number of edits)
* More scalable and secure (linting is not performed on the same machine where the wiki is installed, so the wiki's server cannot be DDoSed through linting)
Cons:
* No way to check for broken links
* Does not work with the parser extensions of the target wiki
=== Integrate linting into MediaWiki ===
In this design there would be a linting interface / API inside MediaWiki. The wikicode would be evaluated by the wiki itself, so all extensions would be taken into account, and even problems such as broken links and non-existent targets (images or wiki pages) could be detected, since the linter would have direct access to the SQL database of the target wiki. This would make linting significantly more effective, but probably also somewhat harder to implement. Third-party tools could only use this service by contacting the web server.
Pros:
* Detects broken links
* Works with parser extensions
Cons:
* Requires third-party tools to access the service through web requests, which is slower
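To illustrate the access pattern this design imposes on third-party tools, here is a hypothetical client sketch. The `action=lint` API module and the response shape are assumptions (no such module exists yet); the point is only that every lint of every edit would go through a web request:

```python
import json
import urllib.parse
import urllib.request

def build_lint_request(api_url, wikitext):
    """Build a POST request asking the wiki to lint the given wikitext.

    'action=lint' is a hypothetical API module used here for illustration.
    """
    data = urllib.parse.urlencode({
        'action': 'lint',
        'format': 'json',
        'wikitext': wikitext,
    }).encode('utf-8')
    return urllib.request.Request(api_url, data=data)

def lint_remotely(api_url, wikitext):
    """Send the request and return the decoded error list (assumed format)."""
    with urllib.request.urlopen(build_lint_request(api_url, wikitext)) as resp:
        return json.load(resp).get('errors', [])
```

A bot reviewing hundreds of edits would pay one round-trip per lint, which is the latency cost listed in the cons above.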
=== Mix of library and MediaWiki integration ===
There is also the option that the code integrating linting into MediaWiki would itself use the library mentioned above, extending it with the wiki's parser extensions and with checks for missing targets. This is the most complex design, but it has pretty much all of the benefits:
Pros:
* Very fast (linting can be run locally, even for a large number of edits)
* Scalable
* Detects broken links (if used through MediaWiki)
* Works with all extensions
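The layering in this mixed design can be sketched as follows. Both `core_lint` (a stand-in for the shared portable library) and the `page_exists` callback are hypothetical names; in a real integration the callback would query the wiki's own database, which is what lets the MediaWiki side detect broken links that the library alone cannot:

```python
import re

def core_lint(wikitext):
    """Portable, wiki-agnostic checks (stand-in for the shared library)."""
    errors = []
    if wikitext.count('{{') != wikitext.count('}}'):
        errors.append('unbalanced template braces')
    return errors

def mediawiki_lint(wikitext, page_exists):
    """Core checks plus broken-link detection backed by wiki data.

    `page_exists` is a callback that answers whether a link target exists,
    e.g. by querying the wiki's SQL database.
    """
    errors = core_lint(wikitext)
    # extract [[Target]] / [[Target|label]] link targets
    for match in re.finditer(r'\[\[([^\]|#]+)', wikitext):
        target = match.group(1).strip()
        if not page_exists(target):
            errors.append(f'broken link: [[{target}]]')
    return errors
```

Third-party tools get the fast local path by calling `core_lint` directly, while MediaWiki itself calls `mediawiki_lint` to add the database-backed checks on top.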
This linter could be written in any language, but I strongly recommend PHP (or possibly JavaScript), just to stay consistent with MediaWiki itself.