A myth about Linux that hardly ever goes away is that installing software is much harder than under commercial operating systems. Of course, the installation of a Linux system itself is a hard nut to crack. First, you have to choose your distribution – which can be one of the hardest tasks of the whole Linux installation. Then, after choosing a distribution, making it run and support all your hardware is a complicated (and sometimes – especially for new or extremely cheap hardware – even impossible) task. But installing actual software is usually no big deal. Every distribution has its package manager with a lot of packages, and mostly it takes either one short shell command or a few clicks in a friendly GUI to install the software you want.
It's just that most people switching from a commercial desktop OS to a Linux distribution expect to have to download the software they want from somewhere, and they expect the software to take care of its own updates. So they try to do the same under Linux, and – since compiling and installing software by hand is really complicated for a newbie – they fail.
On the other hand, the package management systems are very convenient for both users and developers. Much commercial software is already distributed in package form, and some vendors (like Sun) even maintain dpkg and RPM repositories for some of their Linux software.
Meanwhile, the installation process under Mac OS X is complicated and chaotic. You (mostly) get a DMG image, which (mostly) contains everything you need to install – sort of. Sometimes you just have to copy one app bundle somewhere you can run it, sometimes you have to open (and run) a PKG file and click through dialogs, sometimes you have to open the app so it can download the rest of the software, sometimes the app installs itself and then runs as an ordinary app, and so on. Sometimes you get a zipped PKG file. Sometimes you get a zipped DMG image. Sometimes you get a SITX archive. Sometimes it's enough to just delete the app bundle to uninstall it, sometimes there are special uninstallers you have to find and run, sometimes you have to delete directories manually.
I could tell similar stories about Windows, but at least under Windows, most software either comes with an installer that registers itself so Windows can find the uninstaller, or installs an uninstaller somewhere in its Start menu folder, or doesn't need to be installed at all and can just be run directly.
So the situation there is definitely no better, if not worse, than under most Linux distributions. But never mind – at least both Windows and Mac OS X have some central registry where installed software can register itself, and both have an integrated update mechanism for the OS itself. I wonder why every application then searches for updates on its own. Why don't Microsoft and Apple just define a default protocol for update searching and provide a central update-search mechanism for all installed programs?
Like – just downloading an RSS feed and passing it to some defined procedure, or something like that?
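To make the idea concrete, here is a toy sketch of such a protocol: each application publishes a feed whose items carry a version number, and one central updater polls all registered feeds and reports which applications have newer versions available. The feed layout, the `<version>` element, and the version scheme are all invented for this sketch – no real vendor protocol works exactly like this.

```python
import xml.etree.ElementTree as ET

# In a real updater this XML would come from an HTTP request to the
# vendor's feed URL; here it is inlined so the sketch is self-contained.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>ExampleApp updates</title>
  <item><title>ExampleApp 2.1.0</title><version>2.1.0</version></item>
  <item><title>ExampleApp 2.0.3</title><version>2.0.3</version></item>
</channel></rss>"""

def parse_version(s):
    """Turn '2.1.0' into (2, 1, 0) so versions compare numerically."""
    return tuple(int(part) for part in s.split("."))

def newest_version(feed_xml):
    """Return the highest version advertised in the feed."""
    root = ET.fromstring(feed_xml)
    versions = [parse_version(item.findtext("version"))
                for item in root.iter("item")]
    return max(versions)

def update_available(installed, feed_xml):
    """The central check every application could share."""
    return newest_version(feed_xml) > parse_version(installed)

print(update_available("2.0.3", SAMPLE_FEED))  # True – 2.1.0 is out
print(update_available("2.1.0", SAMPLE_FEED))  # False – up to date
```

The point is that the per-application part shrinks to publishing one feed; the polling, scheduling, and notification logic would live in the OS, once, instead of in every installed program.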
Well, Windows seems to have an integrated package manager for its own components – at least there is a "pkgmgr.exe" – but I don't actually know whether it is just for Windows components or can be used for other software as well. If the latter, I don't understand why so many software packages (Firefox, Adobe Reader and the Flash plugin, the Java Runtime Environment, Apple Boot Camp, etc.) ship their own update scanners instead of using it.
And many of these installers and update scanners either don't work properly or get on my nerves by constantly reminding me that the software they are upgrading is installed. And some of them just link to upgraded versions which then install themselves, etc. – I find that really annoying.
But the existing package managers on Linux, Solaris, FreeBSD, etc. also lack some features I have always wanted to see. One thing we could learn from Windows and from the app bundles of Mac OS X is to put everything an application needs into one directory, located next to the application itself, thereby avoiding many of the problems with colliding dependencies between versions (and architectures) of software. Having some sort of copy-on-write hardlink would also make it possible to install one library into many directories without a significant loss of space.
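Ordinary POSIX hardlinks already get you halfway there: one library file can appear in several per-application directories while the data exists on disk only once. What they lack is the copy-on-write part, which would protect each application from in-place modifications of the shared file. A small sketch of the hardlink half (the directory names and the fake library are made up for the demonstration):

```python
import os
import tempfile

root = tempfile.mkdtemp()
store = os.path.join(root, "store")
os.makedirs(store)

# The "canonical" copy of a shared library in a central store.
lib = os.path.join(store, "libfoo.so.1")
with open(lib, "wb") as f:
    f.write(b"\x7fELF...")  # placeholder contents, not a real ELF file

# Each application gets its own directory containing "its" copy of the
# library – but via os.link() it is a hardlink, not a duplicate.
for app in ("app-a", "app-b"):
    appdir = os.path.join(root, app)
    os.makedirs(appdir)
    os.link(lib, os.path.join(appdir, "libfoo.so.1"))

# All three names point at the same inode: one block of data, three links.
st = os.stat(lib)
print(st.st_nlink)  # 3
print(os.stat(os.path.join(root, "app-a", "libfoo.so.1")).st_ino == st.st_ino)
```

With plain hardlinks, writing through any of the three names changes all of them; the wished-for copy-on-write variant would instead split the file at the first write, which is exactly what per-app directories would need to stay isolated.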
And – something else I don't like – packages often have postinst and postrm scripts that run binaries. There is nothing wrong with that as such, but these scripts tend to do a lot of complicated stuff, and if they fail, the package manager itself cannot really undo what they have done, and the corresponding postrm scripts get confused. Having postinst and postrm scripts is not bad in itself (in some cases it is even necessary), but a good package system should provide enough additional facilities for dependent configuration settings, etc., to make them unnecessary in as many cases as possible.
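A toy illustration of that last point: if a package declares its configuration changes as data instead of performing them in an opaque postinst script, the package manager can apply them and, crucially, revert them mechanically on removal or failure. The manifest format and the settings shown are invented for this sketch; no existing package manager uses exactly this scheme.

```python
system_config = {}  # stands in for /etc, a registry, or similar

PACKAGE = {
    "name": "exampled",
    # Declarative equivalent of what a postinst script might otherwise do:
    "config": {"exampled.port": "8080", "exampled.autostart": "yes"},
}

def install(pkg):
    """Apply the declared settings, recording what to undo later."""
    undo = {}
    for key, value in pkg["config"].items():
        undo[key] = system_config.get(key)  # None means "key was absent"
        system_config[key] = value
    return undo

def remove(pkg, undo):
    """Mechanically revert the declared settings – no postrm logic needed."""
    for key, previous in undo.items():
        if previous is None:
            system_config.pop(key, None)
        else:
            system_config[key] = previous

undo_log = install(PACKAGE)
print(system_config)   # the declared settings are in place
remove(PACKAGE, undo_log)
print(system_config)   # back to the original (empty) state
```

Because install and remove are inverses computed from the same declaration, a half-failed installation can be rolled back without any package-specific cleanup code – exactly what a failing shell script cannot guarantee.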
Well, package management is a complicated thing, and any solution has to strike a balance between having no dependency handler at all and having a Turing-complete one that easily gets confused or is likely to be unusable. The main difficulty – as far as I can tell – is getting the packaged software to integrate itself into the package management. Is that really such a hard task for commercial software on a commercial OS?