So, glibc changes, breaks adobe, and it is adobe's fault? Even if adobe was relying on bad behavior, it was glibc's bad behavior it was depending on.
These are the kinds of problems that finally convinced me to move from Linux to OS X. I get the Unix without the egos (just the fanbois, but I can usually ignore them ;).
So, glibc changes, breaks adobe, and it is adobe's fault?
This reminds me of Joel Spolsky's article, "How Microsoft Won the API War:"
"There are two opposing forces inside Microsoft, which I will refer to, somewhat tongue-in-cheek, as The Raymond Chen Camp and The MSDN Magazine Camp.
Raymond Chen is a developer on the Windows team at Microsoft. He's been there since 1992, and his weblog The Old New Thing is chock-full of detailed technical stories about why certain things are the way they are in Windows, even silly things, which turn out to have very good reasons. . .
The other camp is what I'm going to call the MSDN Magazine camp, which I will name after the developer's magazine full of exciting articles about all the different ways you can shoot yourself in the foot by using esoteric combinations of Microsoft products in your own software."
He thinks the MSDN camp won, and that's bad, and he contrasts with, say, Apple in historical times:
"A lot of developers and engineers don't agree with this way of working. If the application did something bad, or relied on some undocumented behavior, they think, it should just break when the OS gets upgraded. The developers of the Macintosh OS at Apple have always been in this camp. It's why so few applications from the early days of the Macintosh still work. For example, a lot of developers used to try to make their Macintosh applications run faster by copying pointers out of the jump table and calling them directly instead of using the interrupt feature of the processor like they were supposed to. Even though somewhere in Inside Macintosh, Apple's official Bible of Macintosh programming, there was a tech note saying "you can't do this," they did it, and it worked, and their programs ran faster... until the next version of the OS came out and they didn't run at all. If the company that made the application went out of business (and most of them did), well, tough luck, bubby.
To contrast, I've got DOS applications that I wrote in 1983 for the very original IBM PC that still run flawlessly, thanks to the Raymond Chen Camp at Microsoft."
The thing is, Apple is still mostly like that. Want the new hotness? Upgrade. I've been using OS X since 2004 and have trouble remembering all the stuff that broke because of OS updates (NetNewsWire and printing were particularly common). I don't know if it's because of the MSDN camp in Apple, but I do find it ironic that you cite things breaking as a reason to move to Apple. There are plenty of them, but I'm not sure that's one.
I've got Mac apps from the late '80s that ran fine on the last non-Unix Mac OS, and ran fine in the Classic environment on OS X up until that was finally dropped.
Apple did in fact do a lot of bending over backwards for compatibility, at least when it was a major developer breaking the rules. System 7 had special code in the memory manager for Microsoft applications to make them work with the 32-bit memory manager and virtual memory.
That said, Microsoft does do an outstanding job in this area. I remember when Win98 was coming out, we were not in the beta program at work. We got a call from Microsoft telling us that a VxD of ours was not working on Win98, telling us what assumption it was making that was no longer valid, and inviting us into the beta program. We were not a large, well-known company. That was pretty cool.
Oddly enough, MacOS had better backward compatibility with the old applications than with the newer ones. It seemed like pretty much every app broke at least once between System 7 and System 9, even while the 1985 apps ran fine. (Perhaps the later developers were being trickier in how they abused the OS.)
Incidentally, I think this figured into Apple's thinking regarding limited back-compat. They had already unintentionally forced a number of application upgrades, so why not do something positive like move everyone to a new OS/CPU/API in the process?
"I do find it ironic that you cite things breaking as a reason to move to Apple."
To be clear, things breaking wasn't what I was referring to. Generally, things worked very well on Linux (except [at the time] suspending and wifi on my laptop) and I know they've gotten better.
On the other hand, there was far too much focus on software for the developer's sake and not software for the user's sake for my tastes (even as a developer). I'm fine with the people who are developing the (almost exclusively) open source software developing it for themselves, but it was too much headache for me, so I proverbially "voted with my wallet".
I still use Linux a lot; it is the platform my startup is deploying on, and it amazes me every time I ponder the changes to the entire community since kernel version 0.9-ish and Slackware, when I first started using Linux. I actually trust the Linux community far more than Apple to keep things working over a longer-term timeline (which is great for servers, but which I don't care much about on my desktop).
Adobe had a bug which worked because of a bug in memcpy(); the bug was fixed, and Adobe's code broke.
Win95 had a similar case with the game Civilization: they actually put code into Win95 to detect the game and change the way the OS worked. That doesn't sound like a good solution.
Read the spec for memcpy: memcpy's behavior on overlapping memory regions is undefined - not "required to corrupt memory", but undefined. Changing memcpy from not breaking on overlapping memory regions to breaking does not fix any bugs.
Adobe should not rely on non-spec-defined behavior, but there's no reason why glibc <i>should</i> be making this change without making a major version number change.
What? The whole raison d'être of Windows 95 was backwards-compatibility with primordial PC junk. The entire thing was a hack from top-to-bottom, far beyond a workaround for a particular game.
Also, to quote Linus:
"And what was the point of making [an OS] again? Was it to teach everybody a lesson, or was it to give the user a nice experience?"
> I first heard about this from one of the developers of the hit game SimCity, who told me that there was a critical bug in his application: it used memory right after freeing it [...] the Windows developers, who disassembled SimCity, stepped through it in a debugger, found the bug, and added special code that checked if SimCity was running, and if it did, ran the memory allocator in a special mode in which you could still use memory after freeing it.
Hmm. Well, I was assuming that glibc was following the spec all along but just changed some implementation detail that mattered because Adobe wasn't following the spec all along.
I think there's an argument to be made that the way an API is implemented is an implicit contract that ought to be upheld. But that's not the argument I see being made.
Anyway, this level of detail is beneath the scope of the "specs vs. pragmatism" debate that's going on.
I think there's an argument to be made that the way an API is implemented is an implicit contract that ought to be upheld.
That sounds wrong to me. If your consumers have to rely on assumptions beyond what's provided in the contract of the API, your API is leaky and/or broken.
[edit] I understand the pragmatism necessary in the case of the glibc issue, but to clarify my point I disagree with the general assertion I'm quoting.
Right, the memcpy() API is leaky because it leaves some things unspecified. You can deduce the implementation by providing various inputs to it that have "undefined behavior". This is a common problem in C, and there are usually functions that avoid it (Linus recommends memmove).
These are the kinds of problems that finally convinced me to move from Linux to OS X. I get the Unix without the egos (just the fanbois, but I can usually ignore them ;).