Link to Mark Owen’s excellent QFP library for soft float on Cortex-M0+ (ARMv6-M) and Cortex-M3/M4 (ARMv7-M).
https://www.quinapalus.com/qfplib.html
Nice write-up here, too; I like the idea of a firm float.
Since the trick works on the mantissa only, without the hidden 1 included, I wonder whether that number of mantissa bits was chosen because of it.
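To make that concrete (a sketch of my own, and just a guess at the kind of trick the post means): the classic bits-as-integer log2 approximation only ever touches the 23 stored mantissa bits, because the hidden 1 never appears in the bit pattern.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>
    #include <math.h>

    /* Reinterpreting a float's bits as an integer gives a scaled log2:
       bits / 2^23 = biased_exponent + stored_mantissa / 2^23
                   ~ log2(x) + 127, since log2(1 + m) ~ m for 0 <= m < 1.
       Only the 23 stored mantissa bits take part; the hidden 1 is implicit. */
    static float approx_log2(float x) {
        uint32_t bits;
        memcpy(&bits, &x, sizeof bits);
        return (float)bits / (1 << 23) - 127.0f;
    }

    int main(void) {
        for (float x = 0.5f; x <= 16.0f; x *= 1.7f)
            printf("x = %8.4f   approx = %7.4f   exact = %7.4f\n",
                   x, approx_log2(x), log2f(x));
        return 0;
    }

The approximation is exact at powers of two and drifts by at most about 0.086 in between, which is exactly the "works on the mantissa without the hidden 1" observation.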
I appreciate this warning near the top: "This post contains floating point. Floating point is known to the State of California to cause confusion and a fear response in mammalian bipeds."
I wisely hit the back button :)
I saw an explanation (perhaps Fabien's) that made it click: floating point is just segment:offset addressing for numbers!
Then the full horror of it hit me.
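To spell the analogy out (my own sketch, not from the post): the exponent picks a power-of-two window, and the 23 mantissa bits say how far into that window the value sits, much like a segment plus an offset picks an address.

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>
    #include <math.h>

    /* Decompose a normal float into its "segment" and "offset":
       the exponent selects the window [2^e, 2^(e+1)), and the 23
       stored mantissa bits are the offset within that window. */
    int main(void) {
        float f = 6.5f;

        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);              /* reinterpret the bit pattern */

        int      e      = (int)((bits >> 23) & 0xFF) - 127; /* unbiased exponent    */
        uint32_t mant   = bits & 0x7FFFFF;                  /* 23 stored bits       */
        double   offset = (double)mant / (1 << 23);         /* fraction of the window */

        /* value = 2^e * (1 + offset): window base plus offset scaled to its width */
        double value = ldexp(1.0 + offset, e);

        printf("window [2^%d, 2^%d), %.1f%% of the way in -> %g\n",
               e, e + 1, 100.0 * offset, value);
        return 0;
    }

And since each "segment" is twice as wide as the last while the offset is always sliced into the same 2^23 steps, absolute precision halves every time you cross a power of two, which is where the horror sets in.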