Tue, 20 Oct 2009
The Stupidity of FatELF (was OS X Universal Binaries).
Early in 2008 I drafted a blog post about something on Mac OS X called Universal Binaries. Now the same stupid idea is available for Linux, and planned for other ELF systems. It's called FatELF.
The original post follows.
As the main author and maintainer of libsndfile, I've recently been involved in a number of email exchanges along these lines:
OS X User: Is it possible to build a Universal Binary of libsndfile?
Me: No, please see the FAQ.
OS X User: Autoconf sucks!
Me: Even if you use Xcode, the problem of testing remains.
OS X User: Other projects can build Universal Binaries, why can't yours?
to which I should probably respond, "Have you stopped beating your wife/girlfriend yet?".
For those lucky enough to be blissfully unaware of the issue, Universal Binaries are Apple's solution to the marketing problem that arose from its decision to ditch the PowerPC processors it had been using in its machines in favour of CPUs from Intel. Obviously, this change of CPU would have an impact on users, and Apple chose a two-pronged approach to ease the migration from one CPU to the other: Rosetta, a PowerPC emulator which allows PowerPC code to run on Intel-based Macs, and the Universal Binary (UB), a way of packaging binary code for both PowerPC and Intel into a single monolithic file so that the OS X operating system can run whichever version is appropriate for the CPU it finds itself on.
Building Universal Binaries on Mac OS X using Apple's own development toolset, Xcode, is apparently quite trivial. Under the hood, Xcode uses a version of the GNU GCC compiler which Apple hacked to take a single input source file, pass it through both an Intel and a PowerPC back-end, and generate a single binary containing both Intel and PowerPC executable code.
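The whole trick is visible from the command line. Here is a minimal sketch, assuming Xcode's GCC on an Intel or PowerPC Mac; hello.c is a hypothetical stand-in for any C source file:

    /* hello.c -- demonstrates a two-architecture build.
     *
     * Build and inspect with the standard Xcode tools:
     *
     *   gcc -arch i386 -arch ppc -o hello hello.c
     *   file hello        # reports a universal binary with 2 architectures
     *   lipo -info hello  # lists the slices: i386 ppc
     */
    #include <stdio.h>

    int main (void)
    {
        printf ("Hello from whichever CPU is running me.\n") ;
        return 0 ;
    }

The operating system picks the matching slice at exec time, so the same file runs natively on both kinds of Mac.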
Apple has obviously gone to a lot of trouble to make the process easy. So easy that the average OS X developer expects it to just work, without realising exactly what it is they are doing and, even more importantly, without realising that the process itself contains one very large flaw (which I'll get to shortly).
Building Universal Binaries of FLOSS (Free/Libre/Open Source Software) is a whole different matter. A large majority of FLOSS software uses things like SCons or autoconf and its related tools to detect characteristics of the target platform so that the source code can be compiled correctly for that target.
The problem is that people who know that the Xcode version of GCC can generate a universal binary in one pass over a source code file expect to take a project like libsndfile, set the CFLAGS environment variable, run configure and build a working universal binary. Anyone who knows autoconf can guess that this is simply not going to work. The main output of the configure script generated by autoconf is a header file containing definitions that capture the detected features of the CPU and operating system. The problem is that the configure script is only run once, and the characteristics detected on that run are then used to compile both the Intel and the PowerPC versions of the executable code.
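To make that concrete, here is a minimal sketch of the failure mode. WORDS_BIGENDIAN is the standard define written by autoconf's AC_C_BIGENDIAN test; the read_le32 function is hypothetical, but the pattern is typical of libraries like libsndfile that parse binary file formats:

    /* A single configure run on an Intel Mac records the endianness of
     * the build machine.  On a little-endian machine AC_C_BIGENDIAN
     * leaves WORDS_BIGENDIAN undefined in config.h: */
    /* #undef WORDS_BIGENDIAN */

    #include <stdint.h>
    #include <string.h>

    /* Read a 32 bit little-endian field from a file format header. */
    static uint32_t
    read_le32 (const uint8_t *buf)
    {
    #ifdef WORDS_BIGENDIAN
        /* The byte-swapping path is never compiled, not even for the
         * PowerPC slice, because the one configure run said
         * "little-endian". */
        return buf [0] | (buf [1] << 8) | (buf [2] << 16)
                | ((uint32_t) buf [3] << 24) ;
    #else
        /* Correct on Intel, silently wrong on PowerPC: the raw load
         * assumes the host is little-endian. */
        uint32_t value ;
        memcpy (&value, buf, sizeof (value)) ;
        return value ;
    #endif
    }

The Intel slice works perfectly; the PowerPC slice reads every multi-byte field byte-swapped, and nothing flags the error until the code actually runs on a PowerPC machine (or under Rosetta).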
The result of the above is that FLOSS developers like myself get emails from OS X users complaining that their UB version of libsndfile doesn't work. Since I first heard of Apple's Universal Binaries back in early 2006, I have personally spent somewhere between 50 and 100 hours trying to come up with a solution to an issue that only affects an operating system that I personally am not very interested in. That's a bunch of hours that I could have spent in much more enjoyable pursuits.
Now, consider how many other FLOSS projects would have been hit by similar problems. Let's say 10 hours per project across a thousand projects. That's one hell of a lot of developer time that could have been spent doing more useful things. Why didn't Apple foresee this? Why didn't Apple spend some of its resources trying to mitigate the effect of UBs on FLOSS projects? Why didn't Apple devote, say, 500 man hours of its developers' time to helping ease the transition of FLOSS projects to UB? 500 man hours of work on autoconf and automake would likely have gone a long way.
The difficulty of generating UBs for code using the autotools is not even the biggest problem with UBs. What I see as the biggest problem is the huge green elephant standing in the corner that no one wants to talk about: testing (maybe that should have a blink tag).
For nearly a decade, testing, especially reliable, repeatable, automated testing, has been considered an important part of the process of creating reliable software. Now Apple is encouraging the building of Universal Binaries, but I for one am very curious as to how these are being tested. I can think of the following possibilities:
- Ad-hoc and manual testing. Obviously this is a poor substitute for automated testing.
- Automated testing relying on copying test binaries to another machine and running the tests for the other architecture there. This requires easy access to the other machine and considerable effort to make the tests work.
- Automated testing relying on Rosetta. Like the previous option, this requires considerable effort to make the tests work (a sketch of this approach follows below).
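For the Rosetta route, the mechanics would look something like the following. This is a minimal sketch assuming an Intel Mac with Rosetta installed; lipo -thin is the standard way to extract one slice from a fat binary, and endian_test.c is a hypothetical stand-in for a real test program:

    /* endian_test.c -- the kind of check a test suite needs to run once
     * per architecture slice:
     *
     *   gcc -arch i386 -arch ppc -o endian_test endian_test.c
     *   lipo -thin ppc -output endian_test_ppc endian_test
     *   ./endian_test_ppc   # on an Intel Mac, Rosetta runs the ppc slice
     */
    #include <stdio.h>
    #include <stdint.h>

    int main (void)
    {
        const uint32_t value = 0x01020304 ;
        const uint8_t *bytes = (const uint8_t *) &value ;

        /* On a big-endian CPU the high byte comes first in memory. */
        int big_endian = (bytes [0] == 0x01) ;

        /* A real test would compare this run-time result against the
         * compile-time assumptions baked in by configure, and fail
         * loudly on a mismatch. */
        printf ("running %s-endian\n", big_endian ? "big" : "little") ;
        return 0 ;
    }

Even then, every test in the suite has to be run once per slice, which the standard "make check" machinery simply does not do.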
Given that all these options are difficult, it is highly likely that testing falls by the wayside. In short, Apple is effectively encouraging the release of untested binaries to users. When these untested binaries go wrong, who do users blame? The authors of the software, even when those FLOSS authors are not Mac users and have little or no interest in Apple and its operating system.
Furthermore, Apple is about to throw another spanner in the works: quad binaries. Quad binaries are like Universal Binaries, but contain code for both the 32 bit and 64 bit versions of PowerPC and Intel. They make the single-configure-run problem even worse, because things like the size of a long now differ between slices from the same vendor. Quad binaries will be even more difficult to test, more difficult to get right, and still more pain for FLOSS developers.
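Presumably the quad build is just a matter of passing more -arch flags to the same hacked GCC (an assumption on my part, though the flags themselves are the documented ones). The configure problem, at least, is easy to demonstrate; the same source prints different answers depending on which slice runs:

    /* quad.c -- built once, produces four architecture slices:
     *
     *   gcc -arch ppc -arch ppc64 -arch i386 -arch x86_64 -o quad quad.c
     *
     * A single configure run can only record one value for SIZEOF_LONG,
     * which is then wrong for two of the four slices. */
    #include <stdio.h>

    int main (void)
    {
        /* Prints 4 on the 32 bit slices, 8 on the 64 bit ones. */
        printf ("sizeof (long) = %zu\n", sizeof (long)) ;
        return 0 ;
    }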
Posted at: 20:53 | Category: CodeHacking | Permalink