Linux Desktop Done Right
Is having too many choices limiting? Cruizer, a while back, sent me an email pointing to this article: What Apple Gets Right and Why Linux Keeps Slipping Behind. theosib makes some interesting points in that post: 1) applications on Linux are scattered all over the place, as they are in Unix; 2) configuration files are hell to maintain (believe me, if you're not careful maintaining them on a Gentoo system, you might very well trash a perfectly running system); 3) applications don't interoperate, i.e. copying from the clipboard can be hell; and lastly, theosib said, ego.
First of all, as a recent switcher (from Linux to Mac, a year ago), I was simply amazed at how apps are installed. It's like this, for you non-OS X users: you download a disk image, a .dmg file. You know how you plug a flash drive in and it suddenly appears on your computer? A disk image is like having a flash drive without physically having one: it exists as a file on your computer, and "mounting" is what happens when you plug it in. You just double-click the dmg and it mounts itself. Then you find the installer. For most apps on the Mac, you simply drag the application off the mounted image into a folder called "Applications". (See the cutout from my Finder window, above left.)
So what do you do next?
Well, go to your Applications folder, double-click the app, and enjoy. That's it.
Where did the install wizard go? For most apps you'll be installing: nowhere. The dmg has all the files the app needs to run. Apps don't expect you to download this or that library or .dll file, and they don't expect you to be a geek who knows where Program Files is. It just works, out of the box, just like that. Most people never have to think about it.
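For the terminally curious, the whole drag-and-drop install can also be done from Terminal; here's a minimal sketch, assuming a made-up app called SomeApp:

```sh
# Mount the disk image, copy the app bundle into /Applications,
# unmount, and launch: the same steps the Finder does for you.
hdiutil attach SomeApp.dmg                        # image shows up under /Volumes
cp -R "/Volumes/SomeApp/SomeApp.app" /Applications/
hdiutil detach "/Volumes/SomeApp"
open "/Applications/SomeApp.app"
```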
Certainly, there is nothing stopping anyone from finding out where /Applications physically lives, or from discovering that a Mac OS X drive has /usr and /var/log in there too. For you Unix/Linux geeks: be calm, breathe in, breathe out. They're all in the same places we know and love.
I've come to appreciate this simplicity. To be sure, it hasn't stopped me from using Fink or DarwinPorts to fetch the latest Unix apps. To be honest, I found that just downloading the source for something like wget and compiling it by hand is simpler. Sure, I miss having a single place for all my Unix apps, the way Portage gives you one, or, for you Ubuntu/Debian folks, the apt-get repositories, but I don't think it's really needed on a Mac.
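And compiling by hand really is just the usual three-step dance. A rough sketch (the version and mirror below are only examples):

```sh
# Fetch a source tarball, then the classic configure / make / make install.
curl -O http://ftp.gnu.org/gnu/wget/wget-1.10.2.tar.gz
tar xzf wget-1.10.2.tar.gz
cd wget-1.10.2
./configure --prefix=/usr/local
make
sudo make install
```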
I did find that having an app that installs PHP+MySQL+Apache from a single dmg is great for development, and the same "magic" works for Ruby on Rails through Locomotive. Yes, there is no substitute for compiling these apps by hand and configuring them specifically for your needs, and you should do that if you're ever going to production or running a data center. But for most development and testing work, this is the easiest way to get started right now: no thought about how this or that works, no thought about how the sysadmins do it, just getting down to work, hassle free (or as close to it as conveniently possible).
On Windows there are similar ways of doing this. Don't ask me how, because I've never done more than install PostgreSQL and Apache on an XP machine, and that was an academic exercise more than anything. In my experience, and I don't know if things have changed in the year and a half since, nothing beats the OS X experience of installing an application.
theosib talked about how configuration files are hell to maintain. He is correct. I once had a Gentoo system that wouldn't boot properly because I had messed up the network configuration script. I had to restart the network, and to my surprise it wouldn't come back up, because I had migrated from an old net configuration to a new, "pristine" default one.
In Linux, all these configuration files can be found in one place, usually /etc. Now, Red Hat has its own way of managing itself, as does Ubuntu, as does Gentoo, and there are as many variations as there are distros. Being a Gentoo user, it always peeved me that /etc on the other distros seems to be all over the place. Take Red Hat, for example (at least back when I used it, WAY back when): the boot scripts live under /etc/rc.d/init.d and service configuration under /etc/xinetd.d. I forget where Ubuntu keeps its stuff, but in Gentoo it's either /etc/conf.d (for most of the boot stuff like net device configuration, clock, ntp-client, etc.) or /etc proper for your .conf needs.
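To see how scattered this gets, compare where the very same static IP setting lives on a Gentoo box versus a Red Hat one (the addresses are just placeholders, and the exact syntax varies by release):

```sh
# Gentoo: /etc/conf.d/net
config_eth0=( "192.168.1.10 netmask 255.255.255.0" )
routes_eth0=( "default via 192.168.1.1" )

# Red Hat: /etc/sysconfig/network-scripts/ifcfg-eth0
DEVICE=eth0
BOOTPROTO=static
IPADDR=192.168.1.10
NETMASK=255.255.255.0
GATEWAY=192.168.1.1
ONBOOT=yes
```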
For most people, who the hell cares where /etc is? Take the clock. When you adjust the clock in GNOME or KDE, the change should properly be reflected in the configuration file (whether it lives in /etc/conf.d, /etc, /var, or /whatnot), so that the next time you log on to the machine the setting sticks, right? That isn't really the case, is it?
I always end up setting the Linux clock by editing the file under /etc/conf.d in vi and installing ntp to keep it in sync. People shouldn't have to do that on their desktop, right?
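For the record, that routine on a Gentoo box looks roughly like this (a sketch of my usual steps, not a recipe; your timezone will obviously differ):

```sh
# Set the timezone and hardware-clock mode by hand...
vi /etc/conf.d/clock            # e.g. CLOCK="UTC", TIMEZONE="Asia/Manila"

# ...then install ntp and have the client sync the clock on every boot.
emerge ntp
rc-update add ntp-client default
/etc/init.d/ntp-client start
```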
Networking: I blogged about this a while back in "Why I Made the Switch" and "Will You Enlist". Networking is what makes our world go around now; just about every computer on Earth is connected to the Internet. Why is it that, in this day and age, Linux network management on the client/desktop side is so archaic that it makes Windows networking look like a brand-new luxury car? Zeroconf, NetworkManager, so many different things all doing exactly one thing: managing the network.
Then there is the whole GNOME versus KDE debacle. Honestly, I'm a GNOME person: I find it more intuitive than KDE, and I find KDE to be too much like Windows, not just in look and feel but in how it does things. Other than the time I ran a machine with Sabayon a few weeks ago, I haven't used KDE in years. Here you have two user interfaces essentially competing against each other. Are they any different? In how they do things, yes, but their end goals are the same. They both benefit from Beryl. They both benefit from X.Org. They both occupy that space between X and the user. So why can't we have just one?
People choose between GNOME and KDE based on individual preference and on which one they were trained on first. With respect to both the GNOME and KDE developers: you guys do a hell of a job, but seriously, isn't it about time to form one single, unified Linux desktop?
That is the heart of the matter, isn't it? Has having too many choices, so much parallel development, created a mediocre ecosystem?
Don't get me wrong: Linux is robust. Its performance as a kernel is well known. It is a great kernel, and the distros, all of them, do a great job of delivering great software. Technically speaking, Linux is superior technology. But I must ask: hasn't the proliferation of so many distributions, essentially the same thing with slight, almost quirk-like variations, hampered a combined effort to build the true Linux desktop interface?
Anyone can build a Linux distribution. Every Gentoo box is compiled from source, and in theory you can configure it to follow your every quirk; from that standpoint, each one is unique. As a Linux user, I don't care if I have to run VLC or MPlayer off the command line. I don't care if, after popping a DVD into the drive, I have to type the mount command and then issue the mplayer command by hand to play it.
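Which, spelled out, goes something like this (a sketch; device and mount-point paths vary from box to box):

```sh
# Either mount the disc and browse its files, or let mplayer read it directly.
mount /dev/dvd /mnt/dvd                  # assumes /dev/dvd and /mnt/dvd exist
mplayer dvd://1 -dvd-device /dev/dvd     # play title 1 straight off the drive
```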
On the other hand, the keyboard is often the quickest way between two points; why else do you think an app like Quicksilver on the Mac is such a great productivity tool?
Put it this way: as geeks, we don't care if the car has a cup holder, we just need the car to go really fast. My point is that Linux is a geek thing, and things like user interfaces often get lost in translation: "normal" people can't necessarily follow what comes easily to most geeks. Linux inherits this from Windows, because Windows' greatest failing is that it tries to apply a geek thing to what normal people do.
OS X's human-centric user interface brings a little humanity to computing.
People rave about how great Feisty Fawn is, and about how Linux is finally getting user friendly. If by user friendly you mean you can install apps as easily as you do on Windows, may I submit that this is still not the best way to do it? Still, it has done a remarkable job.
By being Linux users, we've changed the world. We've challenged the domination of mediocre technology like Windows, with all its holes and quirks and problems, and we are slowly turning the tide. Windows, for all its faults, is in this rat hole not just because it wasn't ready for a networked world, but because it was a victim of its own successful ecosystem.
May I submit that we may be exchanging one dog for a similar one? For all the technological greatness of Linux under the hood, it is still that roadster you built in the garage. It isn't at that level of user-ready sophistication; it is still very geek, and not elegant. I will trust an e-commerce site to run great on Linux. I will trust a production database to Linux, or the network (DHCP, firewall), or a supercomputer's OS. The desktop is another matter.
In many ways, the very thing that made Linux so successful, its technological prowess, is now stifling it. Where is our creativity?
Then again, is it still true that hackers rule the kernel? Take the post "Who wrote Linux 2.6.20?" How large is the community? Aren't most of them now being paid by corporations to do this work? Hackers should be paid for their efforts, of course; my real point is this: why can't those same organizations and communities band together and focus on building a humanized, next-generation user interface?
The whole cathedral-and-bazaar thing: we've proven that it works. Linux, the poster child of open source, is the best example. Heck, the open source model is being replicated around the world in things like the Open Architecture Network, which wants to improve the living standards of millions of people.
Linux is in every little space on the planet. It is supported by companies around the world that are making a profit off the open source model, and why shouldn't they? It's a good idea. Linux can be found on phones and on huge machines; it runs most of the Top 500 supercomputers on the planet. What Linux has failed to capture is the desktop. It has eluded it. Why?
Apple has a Unix-based operating system as well, with open source at its core, and it is found on at least 20 million computers across the planet. A small number, sure, but Linux could easily dwarf that, right? Windows is so bug-filled that you used to expect the computer to restart several times a day (to be fair, that is starting not to be the case anymore). And the viruses, worms and trojans are such a huge problem that one entire industry has grown up to prevent them and another to use them as vehicles for nefarious schemes.
There isn't really a lack of applications on Linux, not anymore. With Wine you can run Photoshop on Linux, or you can go with the GIMP. PDF readers are free on Linux; one comes with both GNOME and KDE. OpenOffice is good enough for most new documents; it's your old Word docs that are really the problem, along with interoperability with just about everyone else.
Linux is "free", compared to Microsoft's Windows or Apple's Mac OSX. For one thing, you don't need to pirate something that retails for US$0.00. You don't need specialized hardware to be running Linux like Mac hardware (yes, they're still somewhat specialized in light of the Intel-switch). But people will pirate Windows and pay a few dollars for a pirate copy of it because it works with their PC and its devices.
So what gives?
People tend to forget that Linux is just a kernel. It's a brain that can coordinate the eyes, the ears, the hands, the skin. It is a great piece of software that can be ported to just about every space there is. Embedded? Done. Supercomputing? Done. Video recorders (PVR/DVR)? Done. Desktop? You bet. Technically speaking, anything and everything can be built on top of the Linux kernel. This configurability, this ability to customize, is the most powerful feature of Linux.
Don't believe me?
Just look at Wine. It is a free implementation of the Windows API on Linux; long story short, it translates a Windows application's instructions to the computer into Linux-speak so that the app can run under any Linux distribution.
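In practice that means you run the Windows installer and the installed program straight from your shell; a sketch, with a made-up installer and app name:

```sh
winecfg                                  # first run creates ~/.wine and lets you tweak settings
wine setup.exe                           # the Windows installer runs as if it were on Windows
wine ~/.wine/drive_c/Program\ Files/SomeApp/someapp.exe
```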
What is to stop Microsoft from doing what Apple did with Rosetta? Microsoft could, in a heartbeat, decide to build its next-generation operating system around the Linux kernel, or a BSD one. It might take a few years and several billion dollars, but it could be done. What better weapon is there than to use your enemy's knowledge (i.e. open source) against them? (It can happen; no one thought Apple would go Intel.)
In the same vein, you could build a whole new user interface around the Linux kernel: a single, unified user interface. After all, Linux is just a kernel.
Distros need to standardize more on common functionality; we should stop reinventing the wheel. But in doing so, would we lose the diversity that distros now enjoy?
As I explained in my post "Why I Made the Switch", I don't use Linux because it is open source, nor do I avoid Windows because it isn't:
"The deal breaker for me, why my primary machine is OS X, was the ability of Apple to put together this Unix-based desktop workstation that had grace, flexibility and power: that's why I made the switch."

At a time when Apple Macs and Windows PCs cost roughly the same, and unless you're really going for the budget machine (i.e. the lowest total acquisition cost), why would you go Linux when you can already find the perfect Unix desktop in Mac OS X?

Mac OS X is the Linux desktop without all the headache and heartache of making your computer work. Hint to developers and manufacturers: anything short of "it just works" no longer works.
Apple showed us that an operating system with open source at its core can be a beautiful, elegant, and powerful piece of software. They built what is now, without doubt, the reigning gold standard in user interfaces. What Linux needs, if it wants to rule the world, is a humanized, unified user interface that makes the computer work for us, and not the other way around. It can. It should. It can leapfrog existing user interfaces and "wow" us (and I'm not just talking about eye candy like Beryl, which, by the way, is good). It is a higher level of creativity that we need to cultivate for the next user interface. Today, Mac OS X is the Linux desktop done right.