From: Clayton Weaver
Subject: Re: argc (omitted argv[0] in count, corrected)
Date: 
Message-ID: <Pine.SUN.3.96.1000622065000.9473B-100000@eskimo.com>
#>printargc -a -b
3

#>printargc "-a -b"
2

(reminder to scan argv[n] for embedded field separators still stands)
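
For reference, a minimal sketch of a printargc along these lines (assuming it
does nothing but print the count it receives, argv[0] included):

  #include <stdio.h>

  int main(int argc, char **argv)
  {
      (void) argv;             /* only the count matters here */
      printf("%d\n", argc);    /* printargc -a -b -> 3, printargc "-a -b" -> 2 */
      return 0;
  }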

-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)

From: Joe Marshall
Subject: Re: argc (omitted argv[0] in count, corrected)
Date: 
Message-ID: <r99pn3ve.fsf@alum.mit.edu>
Clayton Weaver <······@eskimo.com> writes:

> #>printargc -a -b
> 3
> 
> #>printargc "-a -b"
> 2
> 
> (reminder to scan argv[n] for embedded field separators still stands)

That would likely be an incorrect `solution'.  If you did this
unconditionally, then it would be impossible for the calling process
to pass in an argument that contains that particular field separator.
This occurs way too often in Windows, where people like to put spaces
in filenames, but is not unknown to happen in other OS's.
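
To make the failure mode concrete, here is a rough sketch of what unconditional
splitting amounts to; a hypothetical program that did this could never receive a
literal "My Documents" as a single argument, no matter how carefully the caller
quoted it:

  #include <stdio.h>
  #include <string.h>

  int main(int argc, char **argv)
  {
      int i;
      char *tok;

      for (i = 1; i < argc; i++) {
          /* unconditional re-splitting on the field separator: a single
             argv element like "My Documents" is handed on as two pieces */
          for (tok = strtok(argv[i], " "); tok != NULL; tok = strtok(NULL, " "))
              printf("argument: %s\n", tok);
      }
      return 0;
  }
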
From: Clayton Weaver
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <Pine.SUN.3.96.1000622233226.27028D-100000@eskimo.com>
> > #>printargc -a -b
> > 3

> > #>printargc "-a -b"
> > 2

> > (reminder to scan argv[n] for embedded field separators still stands)

> That would likely be an incorrect `solution'.  If you did this
> unconditionally, then it would be impossible for the calling process
> to pass in an argument that contains that particular field separator.
> This occurs way too often in Windows, where people like to put spaces
> in filenames, but is not unknown to happen in other OS's.

For file/directory/folder names that exist as a file, you can look for
path separators (/ and \) in the string as a possible indication that it
is a file and try to open it as a file. If you can't open it, you can't
very well operate on it either, so it must not be a pathname, or it is
simply an error. Even without path separators (it might be in the current
directory, assuming that the operating environment even has directories),
you can try opening the whole string, field separators and all, if the
argv[] element with embedded field separator characters is in a position
where a pathname is expected, or at least in a position in the arg array
where a pathname could occur and be a valid argument to the program.
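
A rough sketch of that probe, assuming "operate on it" means opening the
file for reading:

  #include <stdio.h>
  #include <string.h>

  /* Heuristic sketch: treat an argv element as a candidate pathname if it
     contains a path separator, and check whether it can actually be opened. */
  static int opens_as_file(const char *arg)
  {
      FILE *fp;

      if (strchr(arg, '/') == NULL && strchr(arg, '\\') == NULL)
          return 0;            /* no path separator: don't guess from this alone */
      if ((fp = fopen(arg, "r")) == NULL)
          return 0;            /* can't open it, so can't operate on it either */
      fclose(fp);
      return 1;
  }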

But you still need to check for commands or scripts that simply bracket
multiple, space-separated command line options with quotes (to keep the
shell from doing odd things with shell metacharacters that happen to be in
the option strings that end up as argv[]). A more careful user may quote
each argument that needs protection from shell interpretation
individually, but a program that successfully handles the user or script 
that quotes a whole set of arguments to the program as one quoted string
is more robust.

If users go to the trouble of reading enough of the documentation to know
what the options are, the least we can do is handle any quoting style that
delivers all of the appropriate arguments to the program, even if they
occasionally all end up in one argv[] element.
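
A sketch of the kind of check being argued for here, assuming space is the
separator of interest and that an option always starts with '-'; it only
splits when every embedded piece looks like an option:

  #include <string.h>

  /* Sketch: if a single argv element contains spaces and every
     space-separated piece starts with '-', split it in place and store
     the pieces.  Returns the number of pieces, or 0 to leave it alone. */
  static int split_option_bundle(char *arg, char **pieces, int max)
  {
      const char *p = arg;
      char *tok;
      int n = 0;

      if (strchr(arg, ' ') == NULL)
          return 0;                            /* nothing embedded */

      while (*p != '\0') {                     /* verify before touching arg */
          if (*p != '-')
              return 0;                        /* one piece isn't an option */
          while (*p != '\0' && *p != ' ')
              p++;
          while (*p == ' ')
              p++;
      }

      for (tok = strtok(arg, " "); tok != NULL && n < max; tok = strtok(NULL, " "))
          pieces[n++] = tok;
      return n;
  }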

Regards,



-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)
From: Joe Marshall
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <r99o34h2.fsf@alum.mit.edu>
Clayton Weaver <······@eskimo.com> writes:

> > > #>printargc -a -b
> > > 3
> 
> > > #>printargc "-a -b"
> > > 2
> 
> > > (reminder to scan argv[n] for embedded field separators still stands)
> 
> > That would likely be an incorrect `solution'.  If you did this
> > unconditionally, then it would be impossible for the calling process
> > to pass in an argument that contains that particular field separator.
> > This occurs way too often in Windows, where people like to put spaces
> > in filenames, but is not unknown to happen in other OS's.
> 
> For file/directory/folder names that exist as a file, you can look for
> path separators (/ and \) in the string as a possible indication that it
> is a file and try to open it as a file.  If you can't open it, you can't
> very well operate on it either, so it must be not a pathname or simply an
> error. 

Just because you can't open it doesn't mean that the name is not
useful!  (Ever create a file?)

> Even without path separators (it might be in the current directory,
> assuming that the operating environment even has directories), you can try
> opening the whole string, field separators and all, if the element of
> argv[] with embedded field separator characters is in a position where a
> pathname is expected or a position in the arg array where a pathname could
> occur and be a valid argument to the program.

Let's consider the `touch' command.  It sets the last write time on a
file to the current time, and if the file doesn't exist, it creates
the file first.  Touch can take multiple files as arguments.

Now I write

touch foo bar/baz

Do I want to create one file, "foo bar/baz", or two, "foo" and
"bar/baz"?

Suppose I want to create three files called "-t", "06071142", and
"foo".  What do I type?  (Hint: Not `touch -t 06071142 foo')

Suppose I want to create a file called "Quote marks (\") suck"?

> But you still need to check for commands or scripts that simply bracket
> multiple, space-separated command line options with quotes (to keep the
> shell from doing odd things with shell metacharacters that happen to be in
> the option strings that end up as argv[]). A more careful user may quote
> each argument that needs protection from shell interpretation
> individually, but a program that successfully handles the user or script 
> that quotes a whole set of arguments to the program as one quoted string
> is more robust.

This simply cannot be done because the mapping from input strings (the
ones that appear in argv) to arguments (the final form of what you
think the arguments are) is not invertible.  Without a one-to-one
mapping, there will be certain strings that you will be unable to pass
to your program.

Not only that, a clear and concise description of exactly which
strings are not allowed would be difficult to create.

> If users go to the trouble of reading enough of the documentation to know
> what the options are, the least we can do is handle any quoting style that
> delivers all of the appropriate arguments to the program, even if they
> occasionally all end up in one argv[] element.

*any* quoting style?  There is no way for a program to know how it got
started.  Sure, that stray " may have accidentally got through because
the user didn't know how to escape it for the shell.  On the other
hand, maybe the user isn't using CSH or BASH and is instead using SCSH
which uses a *completely different* quoting style!

The program should expect that the arguments it got were exactly what
the user wanted it to get.  It shouldn't try to second-guess the user
or his shell.

If the user has problems getting the quoting right, he should consider
a better shell.
From: Clayton Weaver
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <Pine.SUN.3.96.1000624095211.6070D-100000@eskimo.com>
On 23 Jun 2000, Joe Marshall wrote:

> Clayton Weaver <······@eskimo.com> writes:
> 
> > > > #>printargc -a -b
> > > > 3
> > 
> > > > #>printargc "-a -b"
> > > > 2
> > 
> > > > (reminder to scan argv[n] for embedded field separators still stands)
> > 
> > > That would likely be an incorrect `solution'.  If you did this
> > > unconditionally, then it would be impossible for the calling process
> > > to pass in an argument that contains that particular field separator.
> > > This occurs way too often in Windows, where people like to put spaces
> > > in filenames, but is not unknown to happen in other OS's.
> > 
> > For file/directory/folder names that exist as a file, you can look for
> > path separators (/ and \) in the string as a possible indication that it
> > is a file and try to open it as a file.  If you can't open it, you can't
> > very well operate on it either, so it must be not a pathname or simply an
> > error. 
 
> Just because you can't open it doesn't mean that the name is not
> useful!  (Ever create a file?)

We were discussing whether scanning for embedded field separator
characters in argv[n] strings (or any equivalent presented by
the shell to the program as a single command argument) is necessary
and useful. All of the possibilities of what can go wrong with
file/document/folder/persistent_storage_object identifiers supplied
by the user, independent of the practice of simply quoting multiple
arguments together, are a separate issue that needs separate code
to handle it.

> Let's consider the `touch' command.  It sets the last write time on a
> file to the current time, and if the file doesn't exist, it creates
> the file first.  Touch can take multiple files as arguments.
 
> Now I write
> 
> touch foo bar/baz
> 
> Do I want to create one file, "foo bar/baz", or two, "foo" and
> "bar/baz"?

Two, assuming that neither already exists. Since you didn't quote them
together, touch will assume two independent pathnames.
 
> Suppose I want to create three files called "-t", "06071142", and
> "foo".  What do I type?  (Hint: Not `touch -t 06071142 foo')

It's ambiguous to touch if you include them all on the same command
line, whether you quote them together or not. You need to supply
the "-t" filename to a touch command by itself or at any rate without
an argument following it that can be interpreted by touch as a timestamp.

An artificial case that demonstrates ambiguity does not change the
simple wisdom of checking command argument strings to see if they contain
multiple options to a program.

> Suppose I want to create a file called "Quote marks (\") suck"?

Feel free to figure out how to do that.
 
> > But you still need to check for commands or scripts that simply bracket
> > multiple, space-separated command line options with quotes (to keep the
> > shell from doing odd things with shell metacharacters that happen to be in
> > the option strings that end up as argv[]). A more careful user may quote
> > each argument that needs protection from shell interpretation
> > individually, but a program that successfully handles the user or script 
> > that quotes a whole set of arguments to the program as one quoted string
> > is more robust.
 
> This simply cannot be done because the mapping from input strings (the
> ones that appear in argv) to arguments (the final form of what you
> think the arguments are) is not invertible.  Without a one-to-one
> mapping, there will be certain strings that you will unable to pass to
> your program.

Of course it can be done. It can still fail in the presence of
ambiguous args, as you demonstrate above with the touch example, but that
doesn't mean that such checking will never allow a program to operate
correctly where it would otherwise fail on multiple arguments packed into
a single command string. It's a matter of restricting the failure modes of
the program in the presence of quoted command line args, rather than of
eliminating them completely. The fact that a determined user can still
provide a sequence of options that the program will find ambiguous does
not mean that such argument scanning will not save most users from
accidentally breaking it by quoting multiple arguments together.
 
> Not only that, a clear and concise description of exactly which
> strings are not allowed would be difficult to create.

Well, it's part shell metacharacter syntax, part quoting, and part command
semantics. The description would vary with what command exactly you want
to run with which options. It's a bit difficult to have a clear and
concise description of an issue that is open-ended in the context of
what programs may have for options. The best you can do is describe
what characters the shell considers to be metacharacters, how quoting
affects shell interpretation of those characters, what characters
are not allowed in pathnames by the operating system (consider a 0
byte for example), and what a command expects for options.
 
> > If users go to the trouble of reading enough of the documentation to know
> > what the options are, the least we can do is handle any quoting style that
> > delivers all of the appropriate arguments to the program, even if they
> > occasionally all end up in one argv[] element.
 
> *any* quoting style?

Any quoting style and set of arguments that it is possible to make
unambiguous by finding multiple arguments in a single command string
that would remain ambiguous for a simplistic program that performed
no such inspection.

>  There is no way for a program to know how it got
> started.  Sure, that stray " may have accidentally got through because
> the user didn't know how to escape it for the shell.  On the other
> hand, maybe the user isn't using CSH or BASH and is instead using SCSH
> which uses a *completely different* quoting style!

How is this an issue with checking for multiple args in a single
argv[] element in a C program that uses argv[], or in any other
program where multiple args quoted together can end up in a single
"command argument" object when the program inspects its arguments,
which was the original subject of discussion?

> The program should expect that the arguments it got were exactly what
> the user wanted it to get.  It shouldn't try to second-guess the user
> or his shell.

> If the user has problems getting the quoting right, he should consider
> a better shell.
 
Ok, we'll just throw out that % of the users who fail to figure
out that quoting arguments individually works but aggregating them
together between quotes does not. We'll lose work that we could have
saved by detecting the simple cases that aren't ambiguous if we split
the argument string into what would be the most likely intent of
the user in the normal case, all to accommodate people who want to have
bizarre filenames in an abnormal case.

Right, I can sell that.

Regards,

Clayton Weaver
<·············@eskimo.com>
(Seattle)

"Everybody's ignorant, just in different subjects."  Will Rogers


From: Rob Warnock
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <8j3o5m$2eqi7$1@fido.engr.sgi.com>
Clayton Weaver  <······@eskimo.com> wrote:
+---------------
| On 23 Jun 2000, Joe Marshall wrote:
| > Suppose I want to create three files called "-t", "06071142", and
| > "foo".  What do I type?  (Hint: Not `touch -t 06071142 foo')
| 
| It's ambiguous to touch if you include them all on the same command
| line, whether you quote them together or not. You need to supply
| the "-t" filename to a touch command by itself...
+---------------

Not really, if you use the standard Unix idiom for that very case:

	% touch ./-t 06071142 foo
	% ls -l
	total 0
	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 -t
	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 06071142
	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 foo
	% 


-Rob

-----
Rob Warnock, 41L-955		····@sgi.com
Applied Networking		http://reality.sgi.com/rpw3/
Silicon Graphics, Inc.		Phone: 650-933-1673
1600 Amphitheatre Pkwy.		PP-ASEL-IA
Mountain View, CA  94043
From: Clayton Weaver
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <Pine.SUN.3.96.1000625001108.27977A-100000@eskimo.com>
> | On 23 Jun 2000, Joe Marshall wrote:
> | > Suppose I want to create three files called "-t", "06071142", and
> | > "foo".  What do I type?  (Hint: Not `touch -t 06071142 foo')

> | It's ambiguous to touch if you include them all on the same command
> | line, whether you quote them together or not. You need to supply
> | the "-t" filename to a touch command by itself...
> +---------------

> Not really, if you use the standard Unix idiom for that very case:

> 	% touch ./-t 06071142 foo
> 	% ls -l
> 	total 0
> 	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 -t
> 	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 06071142
> 	-rw-r--r--    1 rpw3     engr           0 Jun 24 18:43 foo
> 	% 

That works. If I ever actually need a file named "-[char]" I can
use that.

I will finish with the opinion that nitpicking the corner cases where
it can do the wrong thing does not change the benefit of splitting command
arguments with embedded field separators in them to see if they make sense
as discrete command options in the average case. If there are security
implications, then deal with those for those arguments with security
issues in whatever way is appropriate.

Anyone who suspects that their command line option parser will not be able
to disambiguate multiple options in a single argument string from
persistent storage object names with embedded field separators often
enough for the code to do that to be worth the effort of designing
that functionality into their program is free to not implement it.

Regards,

(RSN)


-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)
From: Erik Naggum
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <3170917784127386@naggum.no>
* Clayton Weaver <······@eskimo.com>
| I will finish with the opinion that nitpicking the corner cases
| where it can do the wrong thing does not change the benefit of
| splitting command arguments with embedded field separators in them
| to see if they make sense as discrete command options in the average
| case.  If there are security implications, then deal with those for
| those arguments with security issues in whatever way is appropriate.

  This could have been fine if there were only a fixed set of field
  separators.  There aren't.  The shell user/programmer may set the
  internal field separators (shell variable IFS) to whatever he wants
  precisely to avoid parsing problems with random user input.  IFS
  does not have to be exported to the environment, so there's no
  telling what the internal field separators were.

| Anyone who suspects that their command line option parser will not
| be able to disambiguate multiple options in a single argument string
| from persistent storage object names with embedded field separators
| often enough for the code to do that to be worth the effort of
| designing that functionality into their program is free to not
| implement it.

  Oh, great!  Let's have all programs differ in the way they do this!

  My conclusion: Beware of people who speak of the "average case" or
  "often enough" when designing user or programming interfaces.

#:Erik
-- 
  If this is not what you expected, please alter your expectations.
From: Clayton Weaver
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <Pine.SUN.3.96.1000625083242.21212A-100000@eskimo.com>
> | I will finish with the opinion that nitpicking the corner cases
> | where it can do the wrong thing does not change the benefit of
> | splitting command arguments with embedded field separators in them
> | to see if they make sense as discrete command options in the average
> | case.  If there are security implications, then deal with those for
> | those arguments with security issues in whatever way is appropriate.

>   This could have been fine if there were only a fixed set of field
>   separators.  There aren't.  The shell user/programmer may set the
>   internal field separators (shell variable IFS) to whatever he wants
>   precisely to avoid parsing problems with random user input.  IFS
>   does not have to be exported to the environment, so there's no
>   telling what the internal field separators were.
 
> | Anyone who suspects that their command line option parser will not
> | be able to disambiguate multiple options in a single argument string
> | from persistent storage object names with embedded field separators
> | often enough for the code to do that to be worth the effort of
> | designing that functionality into their program is free to not
> | implement it.

>   Oh, great!  Let's have all programs differ in the way they do this!

They are going to anyway, aren't they? This is what I was saying about
the difficulty of clearly and concisely documenting quoted argument 
behavior. You can only document what your shell passes to a program
execed with quoted command options. You can't document in advance,
all in one convenient place, what programs that haven't been written yet
are going to do with those argument strings.

So you think there should be a Posix standard for argument splitting
inside programs?

>   My conclusion: Beware of people who speak of the "average case" or
>   "often enough" when designing user or programming interfaces.

Actually I agree that this is a user interface problem, and the nature
of the problem is reuse of the same characters in different contexts,
the overlap between argument separation and the arguments themselves.

A user who knows enough about shells to change the IFS isn't likely
to aggregate options into a single argument object and then fail to
anticipate how the program he/she is running will react to that, but I
admit that a program that splits args on whitespace when something else
is the IFS for the command line could have unexpected results. I still
think that depends on the program itself, what it does, and its option
grammar. Can you disambiguate by looking for an "option_start" character
or string after the whitespace that would be an illegal character in
identifiers?

How can one construct one's option grammar so that whitespace is either
clearly an argument separator or clearly part of an argument string,
and no ambiguity is possible, whether or not the arguments arrive in a
single command argument upon program invocation?

I was considering the user that doesn't know that a shell aggregates
anything inside a pair of matching quotes into a single argument,
but who still needs to run the occasional batch command (quite a common
case, I'd bet). How can a program remove en masse quoting of option sets
on a command line as a source of error?

Interactively one could use a gui list box with radio buttons and
text entry fields, and the user never sees how the command options are
separated in the program's argument list, so quoting is not an issue.

Why can't this sort of separation of context be exported to command line
interfaces for systems that don't run guis? What about "ctrl-_"
(control-underscore, "unit separator", ASCII 0x1f): what was the original
function of that? It looks like it was designed for something like this.
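
As a sketch of that idea, assuming US (0x1f) were adopted as the in-string
argument delimiter, the program side of the split is trivial:

  #include <stdio.h>
  #include <string.h>

  /* Sketch: split one argument string on the ASCII unit separator (0x1f),
     a character far less likely than a space to appear inside a filename. */
  static void print_units(char *arg)
  {
      char *tok;

      for (tok = strtok(arg, "\x1f"); tok != NULL; tok = strtok(NULL, "\x1f"))
          printf("argument: %s\n", tok);
  }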

Regards,



-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)
From: Joe Marshall
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <1z1lpmmx.fsf@alum.mit.edu>
Clayton Weaver <······@eskimo.com> writes:

> > | I will finish with the opinion that nitpicking the corner cases
> > | where it can do the wrong thing does not change the benefit of
> > | splitting command arguments with embedded field separators in them
> > | to see if they make sense as discrete command options in the average
> > | case.  If there are security implications, then deal with those for
> > | those arguments with security issues in whatever way is appropriate.
> 
> >   This could have been fine if there were only a fixed set of field
> >   separators.  There aren't.  The shell user/programmer may set the
> >   internal field separators (shell variable IFS) to whatever he wants
> >   precisely to avoid parsing problems with random user input.  IFS
> >   does not have to be exported to the environment, so there's no
> >   telling what the internal field separators were.
>  
> > | Anyone who suspects that their command line option parser will not
> > | be able to disambiguate multiple options in a single argument string
> > | from persistent storage object names with embedded field separators
> > | often enough for the code to do that to be worth the effort of
> > | designing that functionality into their program is free to not
> > | implement it.
> 
> >   Oh, great!  Let's have all programs differ in the way they do this!
> 
> They are going to anyway, aren't they? 

No.  They could all do exactly the same thing --- nothing at all.

> This is what I was saying about the difficulty of clearly and
> concisely documenting quoted argument behavior.  You can only
> document what your shell passes to a program execed with quoted
> command options.  You can't document in advance, all in one
> convenient place, what programs that haven't been written yet are
> going to do with those argument strings.

I can.  We simply adopt the convention that *no matter what the shell
does with the argument strings, the program accepts the input without
questioning its origins*.

> A user that knows enough about shells to change the IFS isn't likely
> to have problems with option aggregation in single argument objects
> and failing to anticipate how the program that he/she is running
> is going to react to that, 

He will if the ultimate program starts munging around with the arguments.

> but I admit that a program that splits args on
> whitespace when something else is the IFS for the command line could have
> unexpected results. I still think that depends on the program
> itself, what it does, and it's option grammar. Can you disambiguate by
> looking for an "option_start" character or string after the whitespace
> that would be an illegal character in identifiers?

It's simple.  You do not disambiguate at all.
From: Clayton Weaver
Subject: Re: argc (not exported IFS was the game breaker)
Date: 
Message-ID: <Pine.SUN.3.96.1000625181351.5189A-100000@eskimo.com>
> > >   Oh, great!  Let's have all programs differ in the way they do this!

> > They are going to anyway, aren't they? 

> No.  They could all do exactly the same thing --- nothing at all.

Ok, thinking about redefining IFS in a shell and not exporting it into a
program's environment, I admit that I was wrong about this (I tested
it with a small program to see if IFS was really not exported into
the execed program's environment by default). If the program doesn't know
what the shell that called it is actually using for argument separators,
it has no basis on which to judge whether splitting an argument string on
the default argument separators for some particular group of well-known
command line shells is really in the user's best interest, so it shouldn't
muck about with the arguments containing those characters at all.

This is what the earlier scsh comment was intended to impart.

You can still split args when they aren't separated by anything at
all and your program knows to look for merged options (like

  gzip -fqr9 /dir/

for example, which amounts to command name, four option args, and a
filesystem object arg in what the program will see as a 3 token command
line), but that wasn't the original issue. I only mention it for
completeness.
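
That merged-option case is the standard getopt-style bundle; a minimal
sketch of walking one such token (single-character flags with no option
arguments assumed):

  #include <stdio.h>

  /* Sketch: walk a merged option token like "-fqr9" one character at a
     time, the way getopt-style parsers handle bundled flags. */
  static void scan_merged_options(const char *arg)
  {
      const char *p;

      if (arg[0] != '-' || arg[1] == '\0')
          return;                      /* not an option bundle */
      for (p = arg + 1; *p != '\0'; p++)
          printf("option: -%c\n", *p);
  }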

(But I don't really think that "quotes suck" and that command line shells
are antique, user unfriendly command interfaces. They seem to me just
flexible tools that require some study to use effectively, and generally
lacking in irritating BSOD-type behavior.)

Regards,


-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)
From: Clayton Weaver
Subject: Re: argc (being wrong, and back to the beginning)
Date: 
Message-ID: <Pine.SUN.3.96.1000626022044.19570A-100000@eskimo.com>
I hate to be wrong in public technical discussions, but that's not a valid
technical reason for holding to an indefensible position, so it's out of
scope in the arg-splitting-on-arg-separator-character discussion. I
weaseled on it as much as I could in good conscience; any more would have
been simply not facing facts, arbitrarily shifting the semantic environment
of the discussion from the issue of relative technical merit to personal
issues.

This is how most Usenet flame wars get started, as a shift in semantic
environment from discussing issues of public interest to issues of
only personal interest, for no good reason.

--

Note on the original "argc was clueless design" comment:

The shell already has the argument count to set up argv[] properly.
Why is passing it to the execed program excessive? Why count them
again or do the array_base+offset calculation in a

  if (argv[n] != NULL) {

condition test if you can do a 

  if (n != argc) {

test instead? Where is the hidden cost?
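
Side by side, since argv[argc] is guaranteed to be a null pointer, either
loop walks the same arguments:

  #include <stdio.h>

  int main(int argc, char **argv)
  {
      int n;

      /* counting against argc */
      for (n = 1; n != argc; n++)
          printf("argv[%d] = %s\n", n, argv[n]);

      /* or scanning for the terminating null pointer instead */
      for (n = 1; argv[n] != NULL; n++)
          printf("argv[%d] = %s\n", n, argv[n]);

      return 0;
  }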

Regards,




-- 

Clayton Weaver
<·············@eskimo.com>
(Seattle)
From: Clayton Weaver
Subject: Re: argc (is argc bogus in general, and, "What's a pipe?")
Date: 
Message-ID: <Pine.SUN.3.96.1000627011323.10553A-100000@eskimo.com>
Correction: it's not the shell or other caller of a C program, which
already knows argc as one less than the number of elements in argv[], that
computes argc for main(). Looking at execve(), it must be the C runtime
that computes argc (by iterating over argv[] until it finds the NULL) and
puts it in main()'s arguments.

So argc is computed whether the program actually takes any arguments
or not (if it didn't take command line arguments and didn't change
its behavior based on what it was called as, it wouldn't need argc or
argv[] at all). Is that the only issue, that argc is unnecessary
work by the C runtime at program startup in the context of
that particular sort of program?

-- 

Further thought on GUIs for the user interface part of layered interfaces
that separate what the user has to do to keep command arguments separated
(use a dialog box that frees the user from manually concatenating
arguments) from what the command interpreter needs to accomplish the same
disambiguation (appropriately quote overloaded separator or metacharacters
embedded in argument strings to tunnel them unmolested through to the
program):

"What's a pipe?"

This is the usual problem. When you need to string multiple commands
together in a pipeline, you have to open a console window with a
command line interface, which means that now you need to know shell
quoting after all.

So the GUI only solves the "arcane complexity of command options on a
command line" problem part of the time. You could dispense with the GUI
user interface altogether except for programs that only run
interactively and/or require a graphic display, since you need to
know shell command line syntax anyway to use the workstation or server 
to its full potential.

And if the console shell that you have is some crippled junk that doesn't
even approach the functionality of 20 year old unix shells, God help you
when you need to string the output of one command into the input of
another or use a pattern matching expression or operate on files/folders
en masse.

If you only use interactive programs and don't do any batch filtering,
GUIs being pipe-ignorant doesn't cost you anything (common enough case
with casual users and even some workstation users).

Regards,

Clayton Weaver
<·············@eskimo.com>
(Seattle)

"Everybody's ignorant, just in different subjects."  Will Rogers



From: Lieven Marchand
Subject: Re: argc ("-handling -multiple -args -that -look -like -1 -arg")
Date: 
Message-ID: <m3u2ekmiaq.fsf@localhost.localdomain>
Clayton Weaver <······@eskimo.com> writes:

> If users go to the trouble of reading enough of the documentation to know
> what the options are, the least we can do is handle any quoting style that
> delivers all of the appropriate arguments to the program, even if they
> occasionally all end up in one argv[] element.

It's been tried in Lisp before. See the second definition of DWIM in
the jargon file. And see the story attached to it on why it isn't a
good idea.

-- 
Lieven Marchand <···@bewoner.dma.be>
When C++ is your hammer, everything looks like a thumb.      Steven M. Haflich
From: Erik Naggum
Subject: Re: argc (omitted argv[0] in count, corrected)
Date: 
Message-ID: <3170681795858375@naggum.no>
* Joe Marshall <·········@alum.mit.edu>
| That would likely be an incorrect `solution'.

  Quite the reverse of what we want, actually.  It's a design flaw
  that arguments to programs aren't typed and that quoting, etc., is
  stripped by the time you get it in the program.  The effort required
  of a shell script to ensure that random arguments to random programs
  are not misinterpreted as options is distinctly non-trivial.

| If you did this unconditionally, then it would be impossible for the
| calling process to pass in an argument that contains that particular
| field separator.  This occurs way too often in Windows, where people
| like to put spaces in filenames, but is not unknown to happen in
| other OS's.

  The user tendency to restrict the names of files to whatever is
  convenient to type in the shell is a symptom of the design flaw.
  Files should be named whatever the user thinks is descriptive.  We
  have thoroughly abandoned the six- or eight-character filename.

#:Erik
-- 
  If this is not what you expected, please alter your expectations.
From: Christopher C Stacy
Subject: Re: argc (omitted argv[0] in count, corrected)
Date: 
Message-ID: <x8lhfakdi43.fsf@world.std.com>
>>>>> On 22 Jun 2000 16:56:35 +0000, Erik Naggum ("Erik") writes:
 Erik>   The user tendency to restrict the names of files to whatever is
 Erik>   convenient to type in the shell is a symptom of the design flaw.
 Erik>   Files should be named whatever the user thinks is descriptive.  We
 Erik>   have thoroughly abandoned the six- or eight-character filename.

The six character file name on ITS could contain spaces.
The syntax was "DEV:DIR;FN1 FN2".  There is also a space
that separates FN1 and FN2.  Like, "DSK:FOO;READ ME".
But you could make "DSK:FOO;RE AD ME" if you wanted.
The native file system used SIXBIT to represent the filename components.
(The system also handled foreign filenames, which were arbitrary ASCII
strings of some long length.)