CLI versus GUI argument

Discussion topics, Linux related - not requests for help

Moderators: ChrisThornett, LXF moderators

CLI versus GUI argument

Postby brothers » Mon Oct 18, 2010 6:15 pm

In a recent podcast, and in a newsletter, there has been discussion to the effect that the CLI scares newbies away. But surely there are some who avail themselves of the many tutorials and hints for beginners, roll up their sleeves, and pretty soon learn to love it.
For me, forays onto the command line led to some beginner bash scripts; then I hired someone to tutor me for the things I could not figure out on my own; then I started to learn Perl. I am progressing and having a great time.
I think the CLI can (1) be loved by the long-time Linux user, (2) be too daunting for some beginning users, and (3) open up a whole world to the interested beginner. There is a lot of positive in (3) that doesn't seem to get talked about.
brothers
 
Posts: 2
Joined: Thu Nov 26, 2009 9:33 pm
Location: Santa Monica, CA

Postby guy » Mon Oct 18, 2010 7:49 pm

My first decade of computing was spent on punched cards and tape - my "command line" was a golfball typewriter hooked into the tape punch, debugging involved holding the tape up to the light and looking for rogue binary patterns.

My second decade was (apart from the odd hex keypad and row of LEDs) spent on terminals with a command line along the bottom and a variety of insane things going on above - PDP11, Sinclair Spectrum, BBC Micro and friends.

Then I got an Archimedes - yay! a real GUI! Luckily, a command shell was but a couple of keys away. Like the once-famous WordPerfect, you could run the GUI above and the command shell below. Tho' truth is, being lazy I usually stuck with the default GUI-only experience.

Nowadays I spend so much time doing graphical things that a pure command shell would be useless. But a pure GUI never quite cuts it. Applications > Accessories > Terminal helps, but I long for that old direct interaction between command line and display. Some web editors come close, which is probably why I got hooked on tech authoring for so long - and why I have migrated to wikis with such enthusiasm.
"Klinger, do you know how many zoots were killed to make that one suit?" — BJ Hunnicutt
guy
LXF regular
 
Posts: 1100
Joined: Thu Apr 07, 2005 12:07 pm
Location: Worcestershire

Postby DocMindwipe » Wed Oct 20, 2010 9:29 am

I started out with an Amiga A500 with Kickstart 1.3.

At first I thought the GUI was a wonderful thing, but then I realised you can only do so much with the GUI-only approach. And so I started to use the Shell/CLI a bit more...

And it's hung in there ever since. Sometimes the GUI approach is all I need; sometimes I find I need to use the CLI. Redmond's move away from the CLI in recent years has been a bit annoying to me, because I want to use the CLI for some (many) things.

How do you make a script in a GUI-only environment? Point and click at what you want included, with all the optional switches and keys? I've still not seen a GUI where you can make use of the switches to many of the commands without resorting to the CLI.
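As a small illustration of the point about switches (a hypothetical scratch directory, not a real system path):

```shell
# Build a scratch directory, then combine switch-driven tools to find
# the largest file in it -- each flag changes behaviour in a way a
# point-and-click dialog would need a dedicated widget for.
DIR=$(mktemp -d)
printf 'aaaaaaaaaa' > "$DIR/big"    # 10 bytes
printf 'aa'         > "$DIR/small"  # 2 bytes

# ls -1S: one name per line (-1), sorted by size, largest first (-S);
# head -n 1 then picks the biggest file
LARGEST=$(ls -1S "$DIR" | head -n 1)
echo "$LARGEST"
```

Chaining two more switches onto the pipeline costs a few keystrokes; in a GUI it costs a redesign.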

In Win7, you can still open a CLI and type "format c: /u", for example.... or "dir c:\windows\a*.exe"

There's little or no way around the cli for that sort of thing.

However.... we (as in the Linux community as a whole) should try to make the approach to (for example) compiling a new kernel and installing it much more user-friendly.

It could even be an option in (Ubuntu's) System -> Administration menu, "Compile and install new kernel", where you basically get "make gconfig" and a GUI approach to that, and at the bottom you get buttons for "Compile new kernel" (which does the tough job of typing "make bzImage" on the command line), "Check kernel for errors" (which checks that the kernel will actually work when you boot from it) and "Install new kernel" (make install).
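A sketch of what such a menu entry would run behind the scenes (assuming a kernel source tree, with root needed for the install steps; button names are the hypothetical ones from the post):

```shell
# Hypothetical commands behind each proposed GUI button,
# run from inside the kernel source tree:
make gconfig                 # GTK-based configuration dialog
make bzImage modules         # "Compile new kernel"
sudo make modules_install    # "Install new kernel": modules first...
sudo make install            # ...then the image and bootloader entry
```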

This will hopefully help newly converted people make fewer errors and, more importantly, help them make a kernel that is tailor-made for THEIR hardware.

Another thing that could go into the kernel making could be "check hardware and autoselect", where the underlying scripts probe your hardware and select the kernel options for you... the more advanced of us will always go in and check the options anyway, but for new people it should be a huge help.

In any case, that's just one way of making Linux a little bit less shell-dependent. Some will, no matter how good the GUIs are, always revert to the CLI for some tasks even so; however, for some people, a GUI approach like the one mentioned above will be far preferable to the CLI.
DocMindwipe
 
Posts: 15
Joined: Tue Jun 29, 2010 2:39 pm

Postby nelz » Fri Oct 22, 2010 1:41 pm

The GUI doesn't have arguments, only the command line :P
"Insanity: doing the same thing over and over again and expecting different results." (Albert Einstein)
nelz
Site admin
 
Posts: 8553
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

Postby Bazza » Fri Oct 22, 2010 3:18 pm

Hi DocMindWipe...

"However.... we (as in the Linux community as a whole) should try to make the approach to (for example) compiling a new kernel and installing it, much more userfriendly."

Jeez, if only I were that clever.

I barely know how to turn on a computer. :-O
73...

Bazza, G0LCU...

Team AMIGA...
Bazza
LXF regular
 
Posts: 1482
Joined: Sat Mar 21, 2009 11:16 am
Location: Loughborough

Postby wyliecoyoteuk » Fri Oct 22, 2010 4:32 pm

Even Windows and Macs still have quite a few things that you need a CLI for, usually "sysadmin" type things.

For example, enabling an anonymous local relay (for example to allow a scanner to send emails) for exchange 2007 requires this command, after adding a new receive connector for the IP of the scanner:

Get-ReceiveConnector "connector name" | Add-ADPermission -User "NT AUTHORITY\ANONYMOUS LOGON" -ExtendedRights "ms-Exch-SMTP-Accept-Any-Recipient"

Changing the default screen mode for an unrecognised monitor on a Mac involves editing a text file.

There is no GUI method for these tasks, and quite a few more, particularly for Exchange servers.
The sig between the asterisks is so cool that only REALLY COOL people can even see it!

*************** ************
wyliecoyoteuk
LXF regular
 
Posts: 3465
Joined: Sun Apr 10, 2005 10:41 pm
Location: Birmingham, UK

Postby nelz » Fri Oct 22, 2010 7:51 pm

It's already fairly easy to generate a kernel for your existing hardware. There's a make target that selects all the modules currently in use. The problem with this approach is that modern computers (desktop and laptop) don't have a fixed set of hardware. People are always plugging in USB hard drives, cameras, webcams, microphones, speakers and more. So you end up needing a kernel with everything available as a module, which is what most distros do.

Compiling a kernel is dead easy
Code:
make all modules_install install

it's selecting the right modules that is tricky and needs either plenty of reading of the help messages or the generic build-everything-as-a-module approach.
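For reference, the "selects what's currently in use" approach has a make target of its own (localmodconfig, which trims the config down to the modules loaded right now — assuming a reasonably recent kernel tree):

```shell
# Run from the kernel source tree, with the current config as a base:
make localmodconfig              # keep only currently loaded modules
make all modules_install install # then build and install, as above
```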
nelz
Site admin
 
Posts: 8553
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

Postby DocMindwipe » Sat Oct 23, 2010 8:05 am

nelz wrote:it's selecting the right modules that is tricky and needs either plenty of reading of the help messages or the generic build-everything-as-a-module approach.
Which is why a wizard-ish approach would be a lot less daunting to freshly converted people....

The wizard should be able to scan YOUR hardware, and select modules/built-in drivers automatically. No need to select USB support if your computer doesn't have any USB ports, for example....

Of course, "oldtimers" enjoy the make menuconfig approach (I know I do; I started with Linux when 3.2 was the newest Slackware available, so make menuconfig is something I never quite "got past"); however, if a Mac user comes along and tries that, he'll probably go away after 5 seconds with both brown AND wet underwear.
DocMindwipe
 
Posts: 15
Joined: Tue Jun 29, 2010 2:39 pm

Postby Rhakios » Sat Oct 23, 2010 1:37 pm

Why would a user, new to Linux, want to compile their own kernel? I have played around with self-compiled kernels in the past, but for routine desktop use I cannot discern any real use.

I would have thought that people who need to compile their own kernels would not find the modest technicalities required beyond their grasp.
Bye, Rhakios
Rhakios
Moderator
 
Posts: 7634
Joined: Wed Apr 06, 2005 11:18 pm
Location: Midlands, UK

Postby Bazza » Sat Oct 23, 2010 1:43 pm

Hi DocMindwipe...

> Which is why a Wizard-ish approach would be alot less
> daunting to freshly converted people....

> The wizard should be able to scan YOUR hardware, and
> select modules/builtin drivers automatically. No need to
> selct USB support if your computer don't ahve any
> USB-ports, for example....

So long as an installation method works does it really
matter?

Nowadays new Linux installs are as easy as on any other
platform, sometimes EASIER...

:shock: Did I actually say that! :shock:
(Nelz keep quiet... :) )

If it is memory usage that is your worry then get more
memory.

If it is a performance hit that is your worry, then I suspect
even my 300MHz PII shite won't be too affected by driver
code that MIGHT be present in the kernel. Again, why worry?

IMO the odds of any unused kernel code that MAY be
loaded into memory causing a performance hit will probably
be " >NIL: "... ;o)

As I mentioned before, new Linux installs are as easy as on
any other platform. :shock:
Bazza
LXF regular
 
Posts: 1482
Joined: Sat Mar 21, 2009 11:16 am
Location: Loughborough

Postby DocMindwipe » Sat Oct 23, 2010 7:42 pm

Bazza: I DO agree with that... Ubuntu and Fedora and SuSE and all those are very easy to INSTALL.

What you don't seem to get in my suggestion, is this:

COMPILING the kernel, EDITING the config scripts.... while oldtimers are comfortable with it, the vast majority of new Linux users come from the Redmond wannabe OS. And they've never realised that compiling a custom kernel can have HUGE advantages for their system, because the wannabe only comes with one kernel, and "one kernel fits all" according to them.

For us enlightened people, one kernel is what it takes to install; after that, we compile a custom kernel, usable only on that one computer and no one else's, unless they've got the EXACT same setup.

I compile my own kernel. But I do realise that (in Ubuntu) pressing CTRL+ALT+F1, logging in as myself, typing su, entering the root password, then "make menuconfig", doing my things in the menus, then typing "make modules_install install" and waiting for however long it takes to compile the new kernel, can be somewhat of a challenge for those who don't realise it's actually easy. And far preferable to the wannabe approach (well, IMO, of course).

Or if you add another HD to your system: trying to figure out /etc/fstab and all that.... getting the HD partitioned and formatted with the right filesystem(s) and mounting the partitions in the right places.... _I_ know how to do that. _YOU_ know how to do that.
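For anyone following along, this is the sort of /etc/fstab line the newcomer has to get right for that second drive (device name and mount point here are hypothetical):

```
# device     mount point  fstype  options   dump  fsck pass
/dev/sdb1    /mnt/data    ext4    defaults  0     2
```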

And while the Linux community is helpful, GParted can be somewhat daunting to someone who's used to "new hardware detected, please wait" --> "new harddrive detected, please wait while configuring" --> "New harddrive available as drive G:" --> "Your new hardware is ready to use"

You see, I've got a feeling that some basic tasks, like adding a new HD to your rig, are not as easy for a wannabe-converted person as they are for you and me and most other people who've been playing around with many different computers for 20+ years.

The wannabe has ripped off other UIs and other systems' approaches since day one. Time to implement their "wizards" as they should've been.

I'm thinking something like the Amiga's "Installer" programme.... one binary doing the behind-the-scenes work as instructed by the script: adding the necessary line(s) to fstab, hosts, manpath, and all the other files in /etc that might need editing, running partition editors, formatting drives, and all that.

And not just for the kernel compilation. Using it for the config editing, installing new programmes, etc. would be helpful for the Linux community as a whole: have ONE way of installing a programme, instead of tarballs (Slackware), source tarballs (Gentoo), RPM (Red Hat) or DEB (Debian) packages.

One system which automatically (because the actual instructions are an "installer script", if you think AmigaOS 3.x) does the correct handling of dependencies and paths and scripts, rather than what it is now: at least 4 different ways of handling all that. Plus, while you can use Debian .debs in Ubuntu, I've been told you can't use Ubuntu .debs in Debian, as Canonical has slightly changed the format to work with Ubuntu rather than be "backwards compatible" with Debian.

This myriad of package formats, and the compatibility in between them, is also a reason why most software vendors can't be a***d making software for Linux. Far too much diversity in our packaging systems. And if they only make .debs.... then the Red Hat (and derivatives) crowd will cry foul. And the other way around.

So my wizard-ish idea still holds water as an all-round installer/configurator; the software vendors can then just make ONE script, and it should work on ALL distros.

PLUS, it'll make the wannabe converts less afraid to do things that might break their system.
DocMindwipe
 
Posts: 15
Joined: Tue Jun 29, 2010 2:39 pm

Postby nelz » Sat Oct 23, 2010 7:51 pm

DocMindwipe wrote:The wizard should be able to scan YOUR hardware, and select modules/built-in drivers automatically. No need to select USB support if your computer doesn't have any USB ports, for example....


The point I failed to make is that it isn't enough to add support for your USB ports (how many desktop and laptop computers don't have them?). You also need support for anything you have plugged into them (easy) or may plug into them in the future.

How can an automagic kernel compiler know what you have in your camera bag?
nelz
Site admin
 
Posts: 8553
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

Postby wyliecoyoteuk » Sat Oct 23, 2010 7:58 pm

I am a little puzzled.
If you plug in a formatted hard drive to most distros, it will be automatically mounted and appear on the desktop.
If you plug it into Windows, it will be mounted and appear in "My Computer".
If you plug in an unformatted or unrecognised drive with either system, it is either ignored, or Windows offers to format it for you, even if it has data on it.
Adding a new unformatted hard disk to Windows is no easier than in Linux.

So what are you complaining about?
wyliecoyoteuk
LXF regular
 
Posts: 3465
Joined: Sun Apr 10, 2005 10:41 pm
Location: Birmingham, UK

Postby nelz » Sat Oct 23, 2010 8:02 pm

DocMindwipe wrote:I'm thinking something like the Amiga's "Installer" programme.... one binary doing the behind-the-scenes work as instructed by the script: adding the necessary line(s) to fstab, hosts, manpath, and all the other files in /etc that might need editing, running partition editors, formatting drives, and all that.


You have clearly never written Amiga installer scripts. Having written a 2000+ line Amiga installer script, I can tell you I'd far rather use the standard GNU tools like sed, grep and awk.

The pipeline approach of the GNU tools makes chaining everything together easy and the whole thing can run from a shell script. You don't need a dedicated binary when you have an environment that makes all of this so easy.
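A sketch of that pipeline style (hypothetical fstab entry, working on a scratch copy rather than the real /etc/fstab):

```shell
# Idempotently append a config line using plain GNU tools, the way a
# shell-script installer would. A scratch file stands in for /etc/fstab.
FSTAB=$(mktemp)
printf '/dev/sda1 / ext4 defaults 0 1\n' > "$FSTAB"

ENTRY='/dev/sdb1 /mnt/data ext4 defaults 0 2'

# grep -qF: quiet, fixed-string match; append only when the line is absent
grep -qF "$ENTRY" "$FSTAB" || printf '%s\n' "$ENTRY" >> "$FSTAB"
# Running it a second time is a no-op, so the installer is safe to re-run
grep -qF "$ENTRY" "$FSTAB" || printf '%s\n' "$ENTRY" >> "$FSTAB"

wc -l < "$FSTAB"   # still 2 lines after both runs
```

No dedicated installer binary needed: the test-then-append idiom composes from tools every distro already ships.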
nelz
Site admin
 
Posts: 8553
Joined: Mon Apr 04, 2005 11:52 am
Location: Warrington, UK

Postby Rhakios » Sat Oct 23, 2010 9:36 pm

DocMindwipe wrote:For us enlightened people, one kernel is what it takes to install; after that, we compile a custom kernel, usable only on that one computer and no one else's, unless they've got the EXACT same setup.


Oh dear! Then it seems I'm not enlightened, as I run with the standard kernel and accept the distribution updates as they become available. My system must cost me many milliseconds in wasted time every day. I can only hope that the tens of minutes I don't spend compiling my own kernel in some way compensates. :roll:
Rhakios
Moderator
 
Posts: 7634
Joined: Wed Apr 06, 2005 11:18 pm
Location: Midlands, UK
