Fairly self-explanatory (and weird).
Annotations are currently serialized in the order the user applied them; we should probably give links higher priority, so they become the outermost markup. Thus [[Foo|F'''''o''o''']] rather than [[Foo|F]]'''''[[Foo|o]]''[[Foo|o]]'''.
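To make the priority idea concrete, here is a minimal sketch (not VE's actual data model or serializer): each character carries a set of annotations, and on serialization we open markup in a fixed priority order, so higher-priority annotations (links) become the outermost wrappers. The link target is hardcoded for brevity.

```python
# Hypothetical sketch of priority-ordered annotation serialization.
OPEN = {"link": "[[Foo|", "bold": "'''", "italic": "''"}
CLOSE = {"link": "]]", "bold": "'''", "italic": "''"}
PRIORITY = ["link", "bold", "italic"]  # links open first, i.e. outermost

def serialize(chars):
    """chars: list of (char, set of annotation names)."""
    out, open_stack = [], []
    for ch, anns in chars + [("", set())]:  # sentinel flushes open markup
        # Close annotations that no longer apply, innermost first.
        while open_stack and not set(open_stack).issubset(anns):
            out.append(CLOSE[open_stack.pop()])
        # Open newly-applying annotations in priority order.
        for ann in PRIORITY:
            if ann in anns and ann not in open_stack:
                out.append(OPEN[ann])
                open_stack.append(ann)
        out.append(ch)
    return "".join(out)

# "Foo" linked throughout; first 'o' bold+italic, second 'o' bold only:
chars = [("F", {"link"}),
         ("o", {"link", "bold", "italic"}),
         ("o", {"link", "bold"})]
print(serialize(chars))  # → [[Foo|F'''''o''o''']]
```

Because the link is opened first, the bold/italic runs nest inside it and the link is never split into multiple `[[Foo|…]]` fragments.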
(In reply to comment #2)
Except you'd probably rather have ''[[Foo|Bar]]'' than [[Foo|''Bar'']]
Yeah; so it should only break out when it's not entirely nested? I spoke to Roan about this; he says it's a relatively major change in DM that he did "about a quarter" of the work for as part of DM rewrite 2 (or similar). Pull from release?
To re-visit this, some rules I think encapsulate what we want:
Does this achieve what we want? (Obviously some of this is already done by Parsoid.)
In the meantime, can we suggest workarounds to "get the code right"?
I.e., to avoid [[Foo|''Bar'']] you should italicize first, and only link after that. I've added this to the Italian user guide; are there other tips I might be missing? Thanks.
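To make the workaround concrete, the two editing orders produce different DOMs (hand-written illustration, not actual VE output, assuming annotations nest in the order they were applied):

```html
<!-- Link first, then italicize: the italics end up inside the link -->
<a href="./Foo" rel="mw:WikiLink"><i>Bar</i></a>   <!-- serializes as [[Foo|''Bar'']] -->

<!-- Italicize first, then link: the link nests inside the italics -->
<i><a href="./Foo" rel="mw:WikiLink">Bar</a></i>   <!-- serializes as ''[[Foo|Bar]]'' -->
```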
Now that T105239: Enable scrubWikitext=1 in VisualEditor's save route to Parsoid is done, this should only impact non-MW users in terms of save output. Still a blocker for sane RTC, but that's not a priority.
I haven't read all the comments above, but that scenario is currently not handled. https://www.mediawiki.org/wiki/Talk:Parsoid/Normalizations lists normalizations that are still on our plate.
https://www.mediawiki.org/wiki/Parsoid/Normalizations#Tag_minimization_.28.3Ci.3E.2F.3Cb.3E_tags.29 might handle some of the i/b scenarios that matter.
Is this still an issue? We've implemented tag minimization for <a> tags and it has been in production for a long time now. https://www.mediawiki.org/wiki/Parsoid/Normalizations#Tag_minimization_.28.3Ca.3E_tags.29
In the merged example, VE is generating <p><b><a href="Eat" rel="mw:WikiLink">Foo</a></b><a href="Eat" rel="mw:WikiLink">d</a></p> and Parsoid is turning it into '''[[Eat|Foo]]'''[[Eat|d]] which is a pretty faithful representation of VE's stupid DOM.
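A sketch of what sibling <a> minimization does, and why it cannot help here (this is a hypothetical illustration, not Parsoid's actual code): adjacent *sibling* links with the same href are merged, but in the DOM above the two links are not siblings, because one of them is wrapped in <b>.

```python
# Naive sibling-link minimization: merge adjacent <a> siblings with the
# same href. Assumes links are leaves carrying only text.
class Node:
    def __init__(self, tag, attrs=None, children=None, text=""):
        self.tag, self.attrs = tag, attrs or {}
        self.children, self.text = children or [], text

def minimize_links(node):
    merged = []
    for child in node.children:
        minimize_links(child)  # normalize subtrees first
        prev = merged[-1] if merged else None
        if (prev is not None and prev.tag == "a" and child.tag == "a"
                and prev.attrs.get("href") == child.attrs.get("href")):
            prev.children.extend(child.children)
            prev.text += child.text  # naive: assumes text-only links
        else:
            merged.append(child)
    node.children = merged
    return node

# Sibling links with the same href: these do merge.
p = Node("p", children=[
    Node("a", {"href": "Eat"}, text="Foo"),
    Node("a", {"href": "Eat"}, text="d"),
])
minimize_links(p)
print(len(p.children))  # → 1 (a single link with text "Food")

# The comment-14 shape: <b><a>Foo</a></b><a>d</a>. The siblings here are
# <b> and <a>, so the pass leaves both links alone.
q = Node("p", children=[
    Node("b", children=[Node("a", {"href": "Eat"}, text="Foo")]),
    Node("a", {"href": "Eat"}, text="d"),
])
minimize_links(q)
print(len(q.children))  # → 2 (still two separate links)
```

Handling the wrapped case would need a stronger normalization, e.g. hoisting the link outside the <b> (or the <b> inside the link) before merging, which is what a more complex algorithm would buy us.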
Ah, I see. At one point (in my very first set of commits to Parsoid, actually) I had implemented a complex minimization algorithm that would have handled this and other complex scenarios, but I removed it in favour of a simpler one: the complex algorithm couldn't keep up with all the DOM changes and other complexities that arrived over time, and kept breaking. I'll keep this in mind for future enhancements of our DOM normalization, unless VE gets there first.