Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site harvard.ARPA
Path: utzoo!linus!philabs!cmcl2!harvard!kosower
From: koso...@harvard.ARPA (David A. Kosower)
Newsgroups: net.internat
Subject: Languages, Computers, and Problems (Long message)
Message-ID: <471@harvard.ARPA>
Date: Sun, 3-Nov-85 03:14:03 EST
Article-I.D.: harvard.471
Posted: Sun Nov  3 03:14:03 1985
Date-Received: Tue, 5-Nov-85 08:02:11 EST
Distribution: net
Organization: Aiken Computation Laboratory, Harvard
Lines: 165

[Munch, munch]

   Recently, there has been a fair amount of discussion in this 
newsgroup on the dual subjects of [human] languages and character
sets.  Several points that ought to be made have not been, and so
I would like to make them here.

   At the moment, there are two basic kinds of activities people
use computers for: programming, and text-processing.  This
is a crass generalization, but I believe it captures the distinction
between handling material intended primarily for a mechanical or
technical audience and handling material intended primarily for a
human audience.  

   There is no question that the primary human language in the
first activity is English, and that the widely-used computer languages
have their roots in an English-speaking milieu.  There is little doubt
in my mind that this situation will continue for a long time,
certainly well into the coming century.  The reasons are varied but
powerful, including the near-universal use of English in scientific
research of any consequence, and the enormous size of and intense
activity within the American computer industry and markets.  The
relative ease with which one can introduce new terminology into
the language, and the lack of barriers to acquiring a basic
facility in English also play a role.

   This suggests that every reasonable computer system in coming
years will have to handle English and the ASCII character set
(witness that even IBM was forced to use ASCII for
its ventures into the PC market -- and there aren't any other
companies able to impose their own low-level standards on some
other segment of the market).  Programming and system internals
will continue to be done as they are done in the US, if only 
because the overwhelming fraction of programs written will 
continue to be written by English-speaking individuals.  Folks
doing things differently elsewhere would be wasting their time
and dooming themselves to incompatibility for its own sake.

   On the other hand, there is a vast Babel of languages used by 
people to communicate with other people, and I do not expect that
situation to change in my lifetime, either.  I wouldn't want it to;
we would lose many cultural riches were that to happen.  In order
that non-technical folk be willing to use computers as intermediaries,
computers must be able to handle their *language*.  This certainly
includes the ability to handle non-ASCII character sets, but much
more than that: editors that give help messages in the native language,
dictionaries, spelling and syntax checkers in that language, text
processors that understand how to hyphenate in the native language,
and much more.

   This suggests that foreign-language utilities will appear as a
veneer on top of an English-based system.  (The Xerox Star is an example of
a well-designed system of this kind).

   People will certainly want the ability to use *languages* and
character sets other than their own (this is far more important
outside the US than inside).
But the importance, and need for efficiency, are not independent of
distance: it is much more important for a Dane, say, to be able to
write in German than it is for him to be able to write in Korean.
Thus the Dane's text-handling utilities should be optimized for
handling European languages and character sets, though they should
be *able* to handle any [reasonable] language or character set.
The situation is not symmetrical, because of the world-wide 
importance of English, but the general idea is that "local"
languages should be handled more fully and more efficiently.
This has direct bearing on the question of character set
representation; it is likely that the correct choice for intra-
computer representation of a text is different on machines in
different parts of the world: the Dane might use an 8-bit
representation for characters, with escape sequences for Oriental
characters, while the Korean would use 16-bit characters internally,
perhaps mixed with an 8-bit representation for English.  As an aside,
note that the information density comes out to be roughly the same
(actually, I can't vouch for Korean, but this is more-or-less true
for Japanese and Chinese): although each character takes up more
bits, it also conveys more information, typically a phoneme (2-3
characters in English) rather than merely a letter.
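The trade-off sketched above can be made concrete. The sketch below uses modern Unicode encodings -- UTF-8 as a variable-width 8-bit-oriented scheme, UTF-16 as a 16-bit scheme -- as stand-ins for the national representations the article describes; these encodings postdate the article and are an assumption of the example, not something it proposes.

```python
# Sketch: byte cost of an 8-bit-oriented vs a 16-bit-oriented encoding.
# UTF-8 and UTF-16 stand in for the hypothetical national encodings.
danish = "København"        # Latin text with one non-ASCII letter
japanese = "こんにちは"      # five kana characters

# For mostly-Latin text, the 8-bit-oriented encoding is compact:
assert len(danish.encode("utf-8")) == 10        # 8 ASCII bytes + 2 for 'ø'
assert len(danish.encode("utf-16-le")) == 18    # 2 bytes per character

# For CJK text the 16-bit encoding wins:
assert len(japanese.encode("utf-8")) == 15      # 3 bytes per kana
assert len(japanese.encode("utf-16-le")) == 10  # 2 bytes per character
```

Either way the per-character cost roughly tracks how much information each character carries, which is the density argument made above.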

   What about inter-computer transmission of information?  Where
should the conversion take place?  On both sides, of course: neither
representation is the most efficient for transmitting information,
so a standard allowing for data compression should be utilized for
this purpose.

   Such a standard must also be able to handle the meta-questions
of transmission: what happens if the recipient cannot represent all
of the information in the file being transmitted?  This could happen
because the recipient's system is more limited in its ability to
handle foreign character sets, or because the information transmitted
contained characters private to the transmitting site (e.g. logos --
even 16 bits won't be enough if we want our universal character set
to include everyone's logos and special symbols!).

   I am suggesting that the low-level internal character-set
standards -- the questions of which bit patterns represent which
abstract characters, other than the omnipresent ASCII -- are not
likely to be consistent from one machine to another.  Nor is it
really that important; the more abstract issues of designing and 
implementing foreign-language and multi-lingual applications to sit 
on top of the operating system are much more crucial.  There is, for 
example, a large body of knowledge that has been built up over the years
on the proper and elegant way to handle hyphenation automatically
in English.  There are a variety of algorithms and methods that
text formatters use.  Unless I am mistaken, there is a good deal
less known about such questions even in most European languages,
let alone others (say, Hebrew).  It is these questions we ought
to be applying ourselves to.

  To pick another example, lexical sorting, a trivial task in
ASCII-encoded English, becomes somewhat non-trivial for foreign
languages.  Of course, for a *single* foreign language, one can
handle this by replacing the character-comparison loop with a table-
lookup scheme, or a mixed scheme.  There are even ways to order kanji
(after all, the Japanese have dictionaries, too), and these can be
taught to a computer.  The real problems arise in mixed-language situations;
what is the appropriate ordering in that case?  One way out is
to have a [hierarchical] notion of `default' or `environment'
language.  Thus, a few foreign words, representable in a Latin
alphabet, appearing in an English document on an American computer
system, should probably sort in the index as though they were
English words; this is what is most intuitive to an English
speaker.  A user on a foreign system might want the index
to be sorted in the same fashion (on the grounds that he
must understand English anyhow), or he might want it to be sorted
according to his native language's sorting rules, since he feels
more comfortable with those.  It is quite likely that different
users will feel differently about such issues, indeed even the
same user might make different decisions about different documents.
The text-processing applications will thus have to be more
flexible in dealing with such issues; a sorting order cannot be
hard-coded into a specific bit representation, since it may be
context-sensitive.  Implicit in this attitude, incidentally, is
the viewpoint that a document manipulated on a computer is a far
more fluid and flexible object than some instantiation of it
produced by a laser printer.  After all, we may have different,
even mutually incompatible, views of a given document in different
contexts; why shouldn't our computers be the same?  (This is NOT
a plea for some utopian God-and-reality AI system; I am not expecting
the computer to *understand* my document, merely to be able to
show it differently when contexts change).
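The table-lookup scheme mentioned above fits in a few lines. The sketch below uses the real Danish convention that æ, ø, and å collate after z, but the table is deliberately minimal and the function name is invented for illustration.

```python
# Sketch of table-lookup collation: map each character to a rank in the
# local alphabet, then compare rank tuples instead of raw code points.
import string

# Minimal Danish-flavoured table: a-z in order, then æ, ø, å.
DANISH_ORDER = string.ascii_lowercase + "æøå"
RANK = {ch: i for i, ch in enumerate(DANISH_ORDER)}

def danish_key(word):
    """Collation key: tuple of per-character ranks (unknown chars sort last)."""
    return tuple(RANK.get(ch, len(DANISH_ORDER)) for ch in word.lower())

words = ["åben", "zebra", "ære", "ko"]
assert sorted(words, key=danish_key) == ["ko", "zebra", "ære", "åben"]
```

Swapping in a different table (or a hierarchy of tables, per the "environment language" idea above) changes the ordering without touching the comparison loop, which is exactly why a sorting order should not be hard-coded into the bit representation.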

  I close with a list, intended to be thought-provoking, not
exhaustive, of problems for automated or automation-assisted tools
that I believe are interesting and merit attention:

   Multi-lingual sorting.
   Hyphenation in foreign languages.
   Checking spelling, syntax, agreement, and usage in foreign
      languages and in multi-lingual documents.
   Language-to-language dictionaries and other translation aids.
   Aesthetics of text formatting, especially in Oriental languages.
   Handling dialects.
   Transmission of information to and through systems with a more
      limited linguistic ability.
   What is a property of a document, and what is a property of
      a local computer system?  (E.g., do the special symbols, and fonts
      associated with a document travel along with it?)
   Efficient search and match algorithms in foreign languages.  
      (Especially in Oriental languages, where users may want to search
       for characters that are part of other characters, perhaps in
       a [visually] altered form).

                                     David A. Kosower
                                     koso...@harvard.HARVARD.EDU.ARPA

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site mmintl.UUCP
Path: utzoo!linus!philabs!pwa-b!mmintl!franka
From: fra...@mmintl.UUCP (Frank Adams)
Newsgroups: net.internat
Subject: Hyphenation
Message-ID: <773@mmintl.UUCP>
Date: Tue, 5-Nov-85 10:52:56 EST
Article-I.D.: mmintl.773
Posted: Tue Nov  5 10:52:56 1985
Date-Received: Fri, 8-Nov-85 08:24:37 EST
References: <471@harvard.ARPA>
Reply-To: fra...@mmintl.UUCP (Frank Adams)
Distribution: net
Organization: Multimate International, E. Hartford, CT
Lines: 19

[Not food]

In article <4...@harvard.ARPA> koso...@harvard.ARPA (David A. Kosower) writes:
>There is, for 
>example, a large body of knowledge that has been built up over the years
>on the proper and elegant way to handle hyphenation automatically
>in English.  There are a variety of algorithms and methods that
>text formatters use.

Yes, and none of them are any good.  Have you seen the things those
algorithms do?  The only successful hyphenation algorithm is to look
the word up in a dictionary.

There are probably more and better on-line dictionaries available for
English than for any other language.  This is an issue that must be
addressed.

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site mmintl.UUCP
Path: utzoo!watmath!clyde!bonnie!akgua!gatech!seismo!cmcl2!philabs!pwa-b!mmintl!franka
From: fra...@mmintl.UUCP (Frank Adams)
Newsgroups: net.internat
Subject: Re: Hyphenation (Long message)
Message-ID: <795@mmintl.UUCP>
Date: Sat, 16-Nov-85 01:19:41 EST
Article-I.D.: mmintl.795
Posted: Sat Nov 16 01:19:41 1985
Date-Received: Wed, 20-Nov-85 01:08:01 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <968@enea.UUCP> <501@harvard.ARPA>
Reply-To: fra...@mmintl.UUCP (Frank Adams)
Distribution: net
Organization: Multimate International, E. Hartford, CT
Lines: 45


In article <5...@harvard.ARPA> koso...@harvard.ARPA writes:
>Most native speakers will probably hyphenate
>at least a fair percentage of words by... looking them up in
>a printed dictionary.

Actually, I think most native speakers will put a hyphen in in a place
where they are reasonably sure one belongs, and will achieve a rather
high success rate at doing so.  I do agree that fully interactive
hyphenation is unacceptable.  However, a reasonably sized dictionary,
with resort to interaction instead of to an algorithm, seems to me
to be a viable option in many cases.  From experience, I would say
that most words not found in a 30,000 or so word dictionary are proper
nouns, and not likely to be found even in a much larger dictionary.

>In fact, there are three significant numbers about any hyphenation
>mechanism ("mechanism" here includes dictionary lookup):
>
>   o  The percentage of incorrect hyphenations it produces.
>
>   o  The percentage of all possible hyphenations that it actually
>      finds.
>
>   o  Its efficiency.
>
>Both of the first two numbers should of course be measured for realistic
>text samples, i.e. they should be weighted for REALISTIC frequencies
>of word appearances.  We want the first number to be as close to
>zero as possible, and the second number to be as close to 100%
>as possible.  But while we would probably not tolerate a percentage
>of incorrect hyphens greater than about 5% (remember that hyphenation
>isn't all that frequent in most documents, so this already amounts
>to a rather infrequent error), we might well tolerate an algorithm
>that produces significantly less than 100% of all possible hyphens,
>especially if the hyphens it does find break the word up into
>small enough chunks; I would estimate that a figure as low as 70 to
>80% might be acceptable here.

I would quibble with these figures.  I think you want the first number
under 1% for a general purpose algorithm.  On the other hand, I think
even 50% is quite adequate for the second.  Since the Knuth-Liang
algorithm [description in original article not quoted here] apparently
meets these criteria, I will withdraw my claim.
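The two accuracy figures under debate can be measured mechanically once a reference dictionary of correct break points exists. A minimal sketch follows; the word list, break positions, and function names are all invented for illustration, and the tallies are unweighted (the article asks for frequency weighting).

```python
# Sketch: score a hyphenation mechanism against a reference dictionary.
# REFERENCE maps each word to its set of correct break positions.
REFERENCE = {
    "hyphenation": {2, 6, 7},   # hy-phen-a-tion
    "record": {3},              # rec-ord (noun)
}

def score(hyphenator, reference):
    """Return (bad_break_rate, coverage) over the reference words."""
    proposed = bad = found = total_correct = 0
    for word, correct in reference.items():
        breaks = hyphenator(word)
        proposed += len(breaks)
        bad += len(breaks - correct)        # incorrect hyphenations
        found += len(breaks & correct)      # correct ones actually found
        total_correct += len(correct)
    return bad / max(proposed, 1), found / total_correct

# A cautious mechanism that only ever breaks "record":
cautious = lambda w: {3} if w == "record" else set()
bad_rate, coverage = score(cautious, REFERENCE)
assert bad_rate == 0.0
assert coverage == 0.25   # finds 1 of the 4 correct break points
```

By Frank's figures, this cautious mechanism passes the first test (under 1% bad breaks) and fails the second (well under 50% coverage).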

Frank Adams                           ihpn4!philabs!pwa-b!mmintl!franka
Multimate International    52 Oakland Ave North    E. Hartford, CT 06108

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site brl-tgr.ARPA
Path: utzoo!watmath!clyde!cbosgd!gatech!seismo!brl-tgr!wmartin
From: wmar...@brl-tgr.ARPA (Will Martin )
Newsgroups: net.internat
Subject: Re: Hyphenation (Long message)
Message-ID: <3353@brl-tgr.ARPA>
Date: Mon, 18-Nov-85 15:39:51 EST
Article-I.D.: brl-tgr.3353
Posted: Mon Nov 18 15:39:51 1985
Date-Received: Tue, 19-Nov-85 06:03:27 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <968@enea.UUCP> <501@harvard.ARPA> <795@mmintl.UUCP>
Distribution: net
Organization: USAMC ALMSA, St. Louis, MO
Lines: 31

Actually, it seems to me that you are using this hyphenation difficulty
in the wrong way. Instead of going to great lengths to overcome it, you
could instead use it as a tool to eliminate the bad and noxious practice
of hyphenation itself! As more and more text-production facilities
become computerized, any restraints and limitations imposed by the
computerization will become de facto industry standards. So those who
want hyphenated text for justified right margins or whatever other
reasons could eventually become segregated into the manual-production
part of the field, IF you people, who are the ones that make
computerized hyphenation possible, will simply stick together against it!

There is NO *real* reason to hyphenate words to split them across lines;
it is merely a convention, established over hundreds of years by the
printing establishment. We have a chance here to overcome this hidebound
and annoying custom, and establish instead either variable spacing for
justified right margins, or, better yet, settle on irregular right
margins as the new standard.

Don't cooperate! Instead of making an effort to get the machines to do
what the people say they want, instead expend that same effort to
convince the people that they need no longer indulge in the antiquated
custom of hyphenating at all. And if no computer types do the bidding of
those wanting hyphenation, it will simply die out as the computerized
text-handling becomes ubiquitous. (You don't spend effort to get the
trailing "s" character to print like "f", do you? Treat hyphenation the
same way!)

Will

(If you hadn't guessed by now, I am against hyphenation, and never do it
myself. :-)

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site sunybcs.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!princeton!rocksvax!sunybcs!colonel
From: colo...@sunybcs.UUCP (Col. G. L. Sicherman)
Newsgroups: net.internat
Subject: Re: why hyphenate
Message-ID: <2539@sunybcs.UUCP>
Date: Fri, 22-Nov-85 12:38:45 EST
Article-I.D.: sunybcs.2539
Posted: Fri Nov 22 12:38:45 1985
Date-Received: Sun, 24-Nov-85 05:35:52 EST
References: <501@harvard.ARPA> <795@mmintl.UUCP> <3353@brl-tgr.ARPA>
Distribution: net
Organization: Save the Dodoes Foundation
Lines: 16

[.hw hyp-hen]

> There is NO *real* reason to hyphenate words to split them across lines;
> it is merely a convention, established over hundreds of years by the
> printing establishment. We have a chance here to overcome this hidebound
> and annoying custom, and establish instead either variable spacing for
> justified right margins, or, better yet, settle on irregular right
> margins as the new standard.

This view is rather naive.  Right margins are justified to make the text
easier to read, and centuries of experience have vindicated the practice.
-- 
Col. G. L. Sicherman
UU: ...{rocksvax|decvax}!sunybcs!colonel
CS: colonel@buffalo-cs
BI: csdsicher@sunyabva

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site othervax.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!mcnc!philabs!micomvax!othervax!ray
From: r...@othervax.UUCP (Raymond D. Dunn)
Newsgroups: net.internat,net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <731@othervax.UUCP>
Date: Fri, 29-Nov-85 14:20:15 EST
Article-I.D.: othervax.731
Posted: Fri Nov 29 14:20:15 1985
Date-Received: Sun, 1-Dec-85 03:43:40 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> etc
Reply-To: r...@othervax.UUCP (Raymond D. Dunn)
Followup-To: net.text
Distribution: net
Organization: Philips Information Systems - St. Laurent  P.Q., Canada
Lines: 107
Summary: 

(Double posted to net.text as this is where the discussion probably
belongs - follow-ups to there)

Having worked for seven years for a developer/manufacturer of
typesetting and other equipment for both the newspaper and general
graphics arts industries, I would like to add my two cents worth.

It is interesting to note that the graphic *arts* industry is one
which has retained the concepts of style and attention to detail, and
has laudably forgone the all too commonly seen solution of making do
with what automation can provide "easily".

Instead, it has continuously forced the typography equipment
manufacturers to meet their stringent subjective standards of what is
"right" and what is "wrong" in typeset material.  This includes some
exceedingly hard to implement requirements which gave (to
non-insiders) very marginal improvements in "quality".

(An interesting aside, even these standards were not enough for
Knuth, who set off on his (excellent) TeX and Metafont tangent
because of his dissatisfaction with the typesetting of his Life's
Work.  This contains much "scientific" content, a particularly
difficult typography task.  It's only a pity that he chose a
traditional embedded command approach to the typesetting problem,
rather than something more interactive and immediate).

Newspaper production should be disassociated from any serious
discussion about hyphenation, style etc.  Newspapers work to
different rules - the papers must hit the streets.  If several
consecutive lines contain hyphenations, paragraphs contain massive
rivers, or there is more white-space in a line than text - WHO CARES
(they don't)!  However, what an opportunity: if we provide adequate
tools, newspapers may become readable (:-)!

Hyphenation is generally (correctly) regarded as a "Bad Thing".
Unfortunately, it is necessary when meeting the other (subjectively
more important) objectives of layout and style.  These in general
conform to the rule that, when glancing at a typeset page or
paragraph, one's eyes should not be drawn automatically to any place
not specifically intended by the typographer.  In general, although
specific parts of the text may be harder to read, a "noisy" page is
regarded as being more difficult to read overall, than a "quiet" one.

Any arguments in this context, for and against hyphenation in
general, and concerning justification/ragged-right, are specious.
They fall into the category of "I like/hate Picasso".  Certainly
there is room for other styles, and we must provide technological
solutions for *all* of them.

Traditionally, hyphenation has been implemented by algorithm, with an
associated exception-word-dictionary. This was the case *only*
because it was impractical to store and access a full dictionary.

It *IS NOT* possible to implement acceptable hyphenation solely by
algorithm (in English certainly).  There are many classical examples,
the one that immediately comes to mind is "therapist", "the-
rapist" (I hope this is not Freudian).  If your pet algorithm can
handle this one, then there will be other examples on which it too
will fail.

It *IS* by definition possible to implement hyphenation solely by
dictionary.  If the dictionary is large enough, the assumption that a
word is non-hyphenable if it does not appear there is perfectly
acceptable.  As has already been pointed out in previous articles, a
dictionary can easily be structured to handle all the "peculiars",
like hyphenation also causing a word to change its spelling (this was
news to me).
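A dictionary-only hyphenator of the kind described is easy to sketch. The entries and function name below are invented for illustration; the "spelling changes on break" case is familiar from old-orthography German, where "Zucker" breaks as "Zuk-ker", and is handled by letting an entry store the broken pieces explicitly.

```python
# Sketch: hyphenation purely by dictionary lookup.
# Each entry lists the broken pieces; absent words are never broken.
HYPHENS = {
    "therapist": ["ther", "a", "pist"],   # avoids "the-rapist"
    "zucker": ["zuk", "ker"],             # spelling changes on break (old German rule)
}

def break_points(word):
    """Return the hyphenation pieces, or the whole word if it is unknown."""
    return HYPHENS.get(word.lower(), [word])

assert break_points("therapist") == ["ther", "a", "pist"]
assert break_points("zucker") == ["zuk", "ker"]
assert break_points("Usenet") == ["Usenet"]   # unknown word: leave whole
```

The "non-hyphenable if absent" default is exactly the assumption argued for above: with a large enough dictionary, refusing to break unknown words costs little and can never produce a "the-rapist".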

Now to get the arguments rolling (:-) :

It is almost certain that as the use of What-You-See-Is-What-You-Get
systems increases, as storage costs go down, and *SPELLING CORRECTION
DICTIONARIES* become the norm on text manipulation systems,
hyphenation *WILL* be done automatically solely by (that) dictionary.

TeX, and the current UNIX tools for typeset text preparation, are
rapidly becoming dinosaurs - they probably have already become so.
Visible typography commands embedded in text, and separate H & J/page
makeup runs are passe (see - we need an extended character set even
for English (:-)), even if we have a "soft typesetter" screen to see
the results before we commit the text to the typesetter/printer.

You cannot expect the "average" user to struggle with an embedded
typesetting language in which (s)he has to go through a mental
mapping process from ad-hoc command to spatial effect, and this user
will increasingly demand full typographic features as (s)he fully
realises the capabilities of laser printers.

WYSIWYG systems (with the associated demise of much of the graphic
arts industry) are becoming increasingly practical and popular, from
Interleaf to the good old "Mac".  The drop in price of both quality
laser printers, RAM, and the obvious need to manipulate text and
graphics together (both pictures and line drawings), can only speed
up this trend.

For the doubters, even within the traditional graphics arts industry
WYSIWYG systems were always regarded as the favoured solution.  They
have been around for at least 10 years in specific applications like
display-ad make-up, and were only limited by their lack of
appropriate cost effective technology (both hardware and software).


Ray Dunn.   ..philabs!micomvax!othervax!ray

Disclaimer: The above opinions are my own, for what they are worth,
            and I have no direct connection with the current graphics
            arts industry.

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site glacier.ARPA
Path: utzoo!watmath!clyde!burl!ulysses!mhuxr!mhuxn!ihnp4!nsc!glacier!reid
From: r...@glacier.ARPA (Brian Reid)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <1861@glacier.ARPA>
Date: Sun, 1-Dec-85 11:56:36 EST
Article-I.D.: glacier.1861
Posted: Sun Dec  1 11:56:36 1985
Date-Received: Tue, 3-Dec-85 07:34:14 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP>
Reply-To: r...@glacier.UUCP (Brian Reid)
Distribution: net
Organization: Stanford University, Computer Systems Lab
Lines: 27

In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) writes:
>Tex, and the current UNIX tools for typeset text preparation, are
>rapidly becoming dinosaurs - they probably have already become so.
>Visible typography commands embedded in text, and separate H & J/page
>makeup runs are passe

BALONEY. There is a place in the world for WYSIWYG systems that do not use
embedded commands, but there is a large class of documents that 
cannot be done at all well with the kind of interactive system that you are
talking about. Anything where the structure is as important as the content.
Cookbooks like the @i[Joy of Cooking]. Encyclopedias. Airline schedules.
Dictionaries. Reference manuals for computer software.

Interactive systems are just fine for small documents, and for documents
whose appearance is extremely important with respect to their content. They
are not OK for large documents.

Whether or not interactive systems will EVER be ok for this kind of material
is an open research topic. My own belief is that it is possible to build an
interactive system that does not throw away all of the extra capability that
the batch systems give you right now.  However, as long as the interactive
text formatting programs are being programmed by people who think that
interactive systems are inherently better, there is no danger of them
becoming better.
-- 
	Brian Reid	decwrl!glacier!reid
	Stanford	r...@SU-Glacier.ARPA

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site umd5.UUCP
Path: utzoo!watmath!clyde!bonnie!akgua!gatech!seismo!umcp-cs!cvl!umd5!zben
From: z...@umd5.UUCP
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <803@umd5.UUCP>
Date: Sun, 1-Dec-85 13:13:32 EST
Article-I.D.: umd5.803
Posted: Sun Dec  1 13:13:32 1985
Date-Received: Tue, 3-Dec-85 08:28:09 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP>
Reply-To: z...@umd5.UUCP (Ben Cranston)
Distribution: net
Organization: U of Md, CSC, College Park, Md
Lines: 43
Summary: My two cents for yours

In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) writes:

>It *IS* by definition possible to implement hyphenation solely by
>dictionary.  If the dictionary is large enough, the assumption that a
>word is non-hyphenable if it does not appear there is perfectly
>acceptable.  As has already been pointed out in previous articles, a
>dictionary can easily be structured to handle all the "peculiars",
>like hyphenation also causing a word to change its spelling (this was
>news to me).

Oh really.  What then, pray tell, would your dictionary entry for the word
"record" contain?  When used as a verb ("to record the data") it should be
"re-cord", but when used as a noun ("give me the record") it should be
"rec-ord" (assuming one hyphenates at syllables, anyway)...

Oh, I guess you'd leave that word out...  :-)
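Ben's objection can be met, at a price: the dictionary key must include the part of speech, and some component upstream of the hyphenator has to disambiguate. In the sketch below the invented `pos` argument stands in for that component; the dictionary alone cannot decide.

```python
# Sketch: hyphenation entries keyed by (word, part of speech).
# The caller must supply the tag; the dictionary alone cannot decide.
HYPHENS = {
    ("record", "noun"): ["rec", "ord"],   # "give me the record"
    ("record", "verb"): ["re", "cord"],   # "to record the data"
}

def pieces(word, pos):
    """Return hyphenation pieces for (word, pos), or the whole word."""
    return HYPHENS.get((word.lower(), pos), [word])

assert pieces("record", "noun") == ["rec", "ord"]
assert pieces("record", "verb") == ["re", "cord"]
assert pieces("record", None) == ["record"]   # undisambiguated: don't break
```

Note the safe fallback: when no tag is available, the word is simply not broken, which loses a hyphenation opportunity but never prints a wrong one.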

>WYSIWYG systems (with the associated demise of much of the graphic
>arts industry) are becoming increasingly practical and popular, from
>Interleaf to the good old "Mac".  The drop in price of both quality
>laser printers, RAM, and the obvious need to manipulate text and
>graphics together (both pictures and line drawings), can only speed
>up this trend.

WYSIWYG systems have their proponents and their uses.  They are VERY good
for novice users, and given the way this field is growing I should think
that "novice users" are going to be the MAJORITY of users until the entire
society is computer literate.  (This is much like "automobile literate" was
the thing to be when I was a teenager - something 19 year old males can be
macho about...)

However, there are times when the WYSIWYG paradigm breaks down badly.  As
a somewhat strained analogy, a strict WYSIWYG system might have you use a
mouse to pick out letters from a menu, rather than using a conventional
keyboard.  This would be easier for the "novice user" than learning to type,
but would ultimately limit data-entry rates to values far below those
attainable by a practiced keyboard operator...

Admittedly a strained example, but take a look around for such pathological
cases the next time you study a WYSIWYG system...

-- 
Ben Cranston  ...{seismo!umcp-cs,ihnp4!rlgvax}!cvl!umd5!zben  z...@umd2.umd.edu

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10 beta 3/9/83; site utecfc.UUCP
Path: utzoo!utcsri!utai!uthub!utecfa!utecfc!dennis
From: den...@utecfc.UUCP (Dennis Ferguson)
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <46@utecfc.UUCP>
Date: Sun, 1-Dec-85 16:37:32 EST
Article-I.D.: utecfc.46
Posted: Sun Dec  1 16:37:32 1985
Date-Received: Sun, 1-Dec-85 21:32:12 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> <731@othervax.UUCP>
Reply-To: den...@utecfc.UUCP (Dennis Ferguson)
Distribution: net
Organization: Mechanical Engineering, University of Toronto
Lines: 52
Summary: 

In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) writes:
>Having worked for seven years for a developer/manufacturer of
>typesetting and other equipment for both the newspaper and general
>graphics arts industries, I would like to add my two cents worth.
>
>It is interesting to note that the graphic *arts* industry is one
>which has retained the concepts of style and attention to detail, and
>has laudably forgone the all too commonly seen solution of making do
>with what automation can provide "easily".
...
>Hyphenation is generally (correctly) regarded as a "Bad Thing".
>Unfortunately, it is necessary when meeting the other (subjectively
>more important) objectives of layout and style.  These in general
>conform to the rule that, when glancing at a typeset page or
>paragraph, one's eyes should not be drawn automatically to any place
>not specifically intended by the typographer.  In general, although
>specific parts of the text may be harder to read, a "noisy" page is
>regarded as being more difficult to read overall, than a "quiet" one.
>
>Any arguments in this context, for and against hyphenation in
>general, and concering justification/ragged-right, are specious.
>They fall into the category of "I like/hate Picasso".  Certainly
>there is room for other styles, and we must provide technological
>solutions for *all* of them.

If this is true, I find quite interesting the divergence of the
`subjective' opinion of the graphics arts industry, concerning what
looks prettier on the page, from the objectively-established opinion
of the scientific community concerning what is easier to read.

I spent several years working in a psychology lab for a professor whose
research interests included the acquisition of written language.  Our
own work, which involved the evaluation of readability of text by the
analysis of eye movement data, concurred with the great body of existing
experimental measurements of such things as understanding, retention
and speed of reading of written language in showing that text was most
easily and efficiently read when it was unhyphenated and unleaded, with
a ragged right.  In fact, during the period I worked there, the professor
was involved with the organization of a conference devoted to the topic.
The proceedings, which he edited, were typeset entirely in this form, with
the right ragged.

While my memory is dim, I recall that the original reason for right
justification was technical.  Early printing presses, the kind with actual
lead type, required that the text be set in a square block to keep even
pressure over the paper, preventing slippage and consequent
smearing of the right-hand ends of long lines.  While the technical reasons
for right justification have long since disappeared, I guess old habits die
hard.
---
				    Dennis Ferguson
				    ...!{decvax,ihnp4}!utcsri!utecfc!dennis

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site mips.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!decwrl!
glacier!mips!mash
From: m...@mips.UUCP (John Mashey)
Newsgroups: net.text
Subject: Re: embedded-command text systems [vs WYSIWYG, support for Reid]
Message-ID: <250@mips.UUCP>
Date: Mon, 2-Dec-85 03:46:26 EST
Article-I.D.: mips.250
Posted: Mon Dec  2 03:46:26 1985
Date-Received: Thu, 5-Dec-85 04:44:33 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP> <1861@glacier.ARPA>
Distribution: net
Organization: MIPS Computer Systems, Mountain View, CA
Lines: 71

Brian Reid, decwrl!glacier!reid, writes:
> In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) writes:
> >Tex, and the current UNIX tools for typeset text preparation, are
> >rapidly becoming dinosaurs - they probably have already become so....

> BALONEY. There is a place in the world for WYSIWYG systems that do not use
> embedded commands, but there is a large class of documents that 
> cannot be done at all well with the kind of interactive system that you are
> talking about. Anything where the structure is as important as the content....
> 
> Whether or not interactive systems will EVER be ok for this kind of material
> is an open research topic. My own belief is that it is possible...

1] reid's comments are good. Although I get good use from my Mac, and Interleaf
is fine, I still find Scribe, TeX, or troff+friends unavoidable for
some classes of documents.  Considering the number of people who have
access to both classes of support, and how many still burn billions of
CPU cycles running the latter, others must share this opinion.

Certainly, at least in the late 70s inside Bell Labs, it was almost
impossible to "sell" easier-to-use, but more restricted facilities over
harder-to-learn, but more powerful ones.  For example, that's how the
-MM macros got to be huge: we wanted to snuff out most of the (slightly
different) macro packages that were springing up and causing total chaos,
and had to preserve as much flexibility as possible, even at the
expense of adding hordes of options and consuming CPU cycles.  [This does
not imply that Bell Labs is necessarily a "typical" environment [it isn't],
but that there exists a sizable audience for very powerful facilities.]

2] In one sense the "dinosaur" comment is sadly true.  After all, the
fundamental ideas of nroff/troff derive from 20-year-old runoff;
the troff/tbl/eqn/(-ms or -mm) group was all there by late 1976.
[Much useful work has been done since, on portability, device-independence,
new filters like pic & ideal, etc.  Nevertheless, most of what most people
use still matches what was there 9-10 years ago].  As I recall, Scribe
and TeX appeared in 1978 [reid, correct me please!]  We've certainly
made engineering progress in the use and support of these things; what's
not clear is how much fundamental progress we've made.

3] I too believe that it is possible to build interactive systems that
don't throw away the power of the most powerful current formatters.
I sure hope so: it would be pleasant to have something that would
totally replace troff+friends (or equivalent) earlier than 10 years
from now.  Maybe reid would give some pointers to a
few of the most interesting current research efforts
[i.e., ones that combine good features of WYSIWYG and markup languages]?
I've generally thought that the text-processing system I've always
wanted on my desk needed
a) WYSIWYG editing + the best of structural description.  The latter should
be able to do about as well as Scribe or troff -MM, else no go.
b) Integrated graphics, images
c) interactive eqn, and especially tbl equivalent.
d) Multiple concurrent views that let me edit at least the formatted
(WYSIWYG) view or the markup-language view [I'd like Hypertext-like features,
and holophrastic displays on document structure, and a bunch of others,
but no need to get greedy.]
e) Integrated spelling checker, Writer's Workbench, etc.
f) Desktop workstation with 8-10X VAX-780 integer performance, 8-32MB memory,
[my guess at what it takes to do a)-f) with reasonable programming.]

Now, e) exists commercially, b) is there now and/or coming soon;
examples of d) have been around, more-or-less. I haven't yet seen a
good version of c).  f) is clearly here within 2-3 years.
That really leaves a), so I hope people are working hard on the problem!
I WANT one of these things, so I hope there are some running in the
lab now, so that the software might be there when the hardware is.
-- 
-john mashey
UUCP: 	{decvax,ucbvax,ihnp4}!decwrl!mips!mash
DDD:  	415-960-1200
USPS: 	MIPS Computer Systems, 1330 Charleston Rd, Mtn View, CA 94043

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site glacier.ARPA
Path: utzoo!watmath!clyde!cbosgd!ihnp4!nsc!glacier!reid
From: r...@glacier.ARPA (Brian Reid)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <1919@glacier.ARPA>
Date: Tue, 3-Dec-85 11:48:16 EST
Article-I.D.: glacier.1919
Posted: Tue Dec  3 11:48:16 1985
Date-Received: Thu, 5-Dec-85 07:49:46 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<1861@glacier.ARPA> <116@utastro.UUCP>
Reply-To: r...@glacier.UUCP (Brian Reid)
Distribution: net
Organization: Stanford University, Computer Systems Lab
Lines: 35


If you look at the interaction of technology and industry for the past few
hundred years, you will see a recurring theme. A new technology gets
invented. The in-place industry applies that technology to automate what
they are currently doing. This is often inefficient, as the new technology
is often better applied by changing the fundamental premises of the
industry. Gradually new companies grow up, which use the new technology in a
different way, and if it is more cost-effective, then the new industry
drives the old one out of business.

The "obvious" application of computer technology to the graphic arts
industry is to give them computer systems that mimic the way they have been
doing business--wysiwyg systems. Naturally they will prefer this. A
non-obvious approach is to eliminate the graphic arts industry, applying the
new technology to make 80% of its work force redundant. Then it doesn't
matter what they think.

I claim that, for a wide range of publications, the traditional graphic-arts
industry approach of cutting, pasting, and wysiwyg systems simply cannot be
competitive with more software-intensive approaches such as embedded-command
systems. You will be trading one programmer for 4 graphic artists.

At the moment we are in a transition phase. The graphic arts industry is
discovering computers, and they are molding them in their own image, taking
the things that they have done by hand since the invention of cold type and
putting them isomorphically onto the computer. Simultaneously, however,
thousands of businesses are discovering that they don't NEED the graphic
arts industry. With simple computer tools they can achieve their end
results--the publication of books or newsletters or catalogs--without
graphic artists. If history serves as any guide, then in half a generation
the traditionalist approach will no longer be competitive and will have to
pull out of those markets completely.
-- 
	Brian Reid	decwrl!glacier!reid
	Stanford	r...@SU-Glacier.ARPA

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site osu-eddie.UUCP
Path: utzoo!watmath!clyde!cbosgd!osu-eddie!elwell
From: elw...@osu-eddie.UUCP (Clayton M. Elwell)
Newsgroups: net.text
Subject: Typesetting systems / WYSIWYG / ...
Message-ID: <903@osu-eddie.UUCP>
Date: Tue, 3-Dec-85 15:49:33 EST
Article-I.D.: osu-eddi.903
Posted: Tue Dec  3 15:49:33 1985
Date-Received: Thu, 5-Dec-85 06:02:30 EST
Distribution: net
Organization: Ohio State Univ., CIS Dept., Cols, Oh.
Lines: 58

In this argument about the suitability of embedded-command vs. WYSIWYG and
laser printer vs. photocomposer, several distinct issues seem to have
become confused.

WYSIWYG systems are extremely useful for some applications.  Embedded-command
systems are just as useful for other applications.  For example, if I were
producing a newsletter for limited distribution (such as for
a club or other local organization), I would use a WYSIWYG page layout system
and a small laser printer.  A Macintosh with Aldus PageMaker and a LaserWriter
would be the way to go.  It would allow me to do a ONE-TIME layout quickly and
accurately, with fast proofing and output quality good enough for xerographic
reproduction.  It certainly beats a Selectric.  This is being popularly
referred to as ``desktop publishing.''  This is what such products as PageMaker
were designed for.

On the other hand, if I were writing a book that would be conventionally
printed, I would use TeX (or a TeX macro package such as LaTeX) with a
laser printer for proofing and a photocomposer for reproduction masters.
Put simply, I have not found a better system for getting the highest quality
output with the smallest expenditure of effort.  I don't WANT to manually
lay out each page of my document.  I don't even want to manually lay out
section headings and the like.  I want to specify the format I want once,
in excruciating detail if necessary, and not worry about it again.  Aside
from that, TeX's handling of kerning, page and line breaking, etc., works
correctly.  In the uncommon situation that you need it to act differently,
it can.  All you have to do is tell it what you want.  If you want
unhyphenated, ragged-right text set without leading, you can do it by putting
one line at the beginning of your text.  If you decide it was a bad idea
after all, take out that line.  Voila!
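Elwell's "one line at the beginning of your text" could plausibly look
like the following plain-TeX fragment.  This is a sketch only; the exact
control sequences and values are my assumptions, not a quote from his
actual setup:

```latex
% Sketch: unhyphenated, ragged-right, solid (unleaded) setting in
% plain TeX.  The 10pt value assumes 10pt type; all values illustrative.
\raggedright \hyphenpenalty=10000 \exhyphenpenalty=10000 \baselineskip=10pt
```

Deleting the line restores the format's defaults (justified, hyphenated,
normally leaded), which is the point being made.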

Ah, but I hear the objection that it wastes paper to reformat and print it
to make sure it came out the way you wanted it.  This is not true.  Since
TeX puts out a device independent file, I can preview it on my screen first,
and then print out selected pages.

As to output devices, laser printers are very nice toys.  They allow
quite reasonable-looking output on a demand basis at a fairly low cost.
If a photocopy is good enough quality (such as for a reference manual
for a computer program, high-class form correspondence, etc.), a laser
printer is usually the right solution.  For professional printing,
however, there is no substitute for a photocomposer.  I can see jaggies
on a 300 dpi LaserWriter.  It does an admirable approximation, but it
isn't the same, especially when reproduced.

What type of software & hardware is the ``best way'' depends on what you
are doing.  Let me draw an analogy--If I want to move my belongings from
Ohio to California, I'll use a moving van.  If I want to get from my home
to a conference in another state, I'd rather have a Ferrari.  Neither
is better than the other.  What's important is to use the right tools
for the job at hand.

-- 
				-- Clayton Elwell
				Elw...@Ohio-State.CSNET
				Elwell%Ohio-St...@CSNET-RELAY.ARPA
				...!cbosgd!osu-eddie!elwell
-----------------
"Roads? Where we're going, we don't need roads..."

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!henry
From: he...@utzoo.UUCP (Henry Spencer)
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <6201@utzoo.UUCP>
Date: Tue, 3-Dec-85 17:07:01 EST
Article-I.D.: utzoo.6201
Posted: Tue Dec  3 17:07:01 1985
Date-Received: Tue, 3-Dec-85 17:07:01 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> etc, <731@othervax.UUCP>
Organization: U of Toronto Zoology
Lines: 22

> ...  It's only a pity that [Knuth] chose a
> traditional embedded command approach to the typesetting problem,
> rather than something more interactive and immediate...

He really didn't have any choice, since he probably didn't feel like spending
$50k or so (remember this was some years ago) for the sort of equipment he'd
need to build something more interactive and immediate.  He probably also
felt that it would be nice if what he did were usable from an ordinary ASCII
terminal, so that it could be used by the masses instead of just the lucky
few.  (Even today, most of us still work on ASCII terminals.)

A contributing consideration may have been the desire to produce documents
that could be compiled for different output devices without needing manual
reworking.  This implies that the document must be specified in fairly
abstract ways, not in terms of exactly how it looks.  It is possible to
combine this kind of high-level document specification with interactive
immediacy, but it is harder.  Note that Knuth works hard to do things like
"hyphenating" equations well automatically, to avoid manual tuning even in
that fairly-extreme case.  (And you thought hyphenating English was bad...)
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site othervax.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!ucbvax!
ucdavis!lll-crg!gymble!umcp-cs!seismo!cmcl2!philabs!micomvax!othervax!ray
From: r...@othervax.UUCP (Raymond D. Dunn)
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <733@othervax.UUCP>
Date: Wed, 4-Dec-85 12:05:40 EST
Article-I.D.: othervax.733
Posted: Wed Dec  4 12:05:40 1985
Date-Received: Sat, 7-Dec-85 03:18:57 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP> <803@umd5.UUCP>
Reply-To: r...@othervax.UUCP (Raymond D. Dunn)
Distribution: net
Organization: Philips Information Systems - St. Laurent  P.Q., Canada
Lines: 93

In article <8...@umd5.UUCP> z...@umd5.UUCP (Ben Cranston) responds to my
earlier posting:

>> It *IS* by definition possible to implement hyphenation solely by
>> dictionary.  If the dictionary is large enough, the assumption that a
>> word is non-hyphenable if it does not appear there is perfectly
>> acceptable...                ^^^^^^^^^^^^^^^
            [I should have said "does not contain any hyphenation points"]

> Oh really.  What then, pray tell, would your dictionary entry for the word
> "record" contain?  When used as a verb ("to record the data") it should be
> "re-cord", but when used as a noun ("give me the record") it should be
> "rec-ord" (assuming one hyphenates at syllables, anyway)...
>
> Oh, I guess you'd leave that word out...  :-)

To be fair, I missed examples of this type (even if I don't
necessarily agree with your hyphenation of "rec-ord").

However this does *not* contradict the dictionary argument, in fact
it enhances it.

Even assuming a parser were used to determine the part of speech of a
word, no practical hyphenation *algorithm* could be devised to
hyphenate words accordingly.  The dictionary, of course, *could*
easily be constructed to contain different hyphenation points for
different uses of a word when necessary.
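The dictionary scheme described above can be sketched in a few lines.
This is a hypothetical illustration of the idea, not any real
hyphenation library; the names and entries are mine, and the rule that
an absent word is treated as having no break points follows the posting:

```python
# Hypothetical sketch of a hyphenation dictionary whose entries carry
# separate break points per part of speech.  A word (or sense) not in
# the dictionary is assumed to contain no hyphenation points.
HYPHEN_DICT = {
    # word -> {part_of_speech: spelling with breaks marked by '-'}
    "record": {"noun": "rec-ord", "verb": "re-cord"},
    "present": {"noun": "pres-ent", "verb": "pre-sent"},
}

def hyphenation_points(word, part_of_speech):
    """Return the word with '-' at permissible break points."""
    senses = HYPHEN_DICT.get(word.lower())
    if senses is None:
        return word  # unknown word: never hyphenate
    return senses.get(part_of_speech, word)

print(hyphenation_points("record", "verb"))  # re-cord
print(hyphenation_points("record", "noun"))  # rec-ord
print(hyphenation_points("modem", "noun"))   # modem (not in dictionary)
```

The parser supplies the part of speech; the dictionary, not an
algorithm, supplies the sense-dependent break points.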

>> WYSIWYG systems (with the associated demise of much of the graphic
>> arts industry) are becoming increasingly practical and popular, from
>> Interleaf to the good old "Mac"...

> WYSIWYG systems have their proponents and their uses.  They are VERY good
> for novice users, and given the way this field is growing I should think
> that "novice users" are going to be the MAJORITY of users until the entire
> society is computer literate.  (This much like "automobile literate" was
> the thing to be when I was a teenager - something 19 year old males can be
> macho about...)

By your definition then, the majority of automobile users are
novices, and always will be.  Their literacy does not extend further
than the use of five or six controls.  There is no desire in the
majority, nor *need*, to become "automobile literate" in your sense;
the user interface has been designed that way.  The same argument
applies to computer systems.

> However, there are times when the WYSIWYG paradigm breaks down badly.  As
> a somewhat strained analogy, a strict WYSIWYG system might have you use a
> mouse to pick out letters from a menu, rather than using a conventional
> keyboard.....

Not just a strained analogy, totally irrelevant.  It's like saying "a
keyboard *might* have just one key which you hit repeatedly until the
character of choice appears, thus any computer system which uses a
keyboard is ...".  

We are discussing WYSIWYG systems, and the ability of the general
user to do typesetting, not the pros and cons "of mice over
keyboards" (gosh there's a title for a paper (:-)).  WYSIWYG implies
an *approach*, not necessarily a specific user interface.

>Admittedly a strained example, but take a look around for such pathological
>cases the next time you study a WYSIWYG system...

It is difficult enough dealing with the real pathological cases
without trying to handle imaginary ones!

OK, so you're trying to make a point on "efficiency".  Good, that's
what I'm doing as well.  With complex tasks like typesetting, to
reduce this to a measure of keystroke counts is absurd.

The use of a traditional typesetting system requires much dedication
and training, and the ability to visualise the mapping from embedded
commands to the resulting typeset page.  (Even with an expert,
several trial runs on the hardcopy typesetter, or to a "soft" screen,
are often required before the desired effect is achieved).

Many people do not, and can never, have this ability, nor should they
be *required* to train themselves for tasks ancillary to their
mainstream interest.  They didn't in the past; they turned to an
"expert" (and paid him big bucks).  They shouldn't have to now; they
turn to a computer.  Their literacy need extend only to driving the
thing in a way natural to them, not to manipulating the
nuts and bolts.

*That* is efficiency!

A last point.  Compare this area of expertise with what has happened
in the computerisation of other disciplines (spreadsheets, data
managers, report generators, and the birth of the prime example,
expert systems).

Ray Dunn.  ..philabs!micomvax!othervax!ray

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site umd5.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!ucbvax!
ucdavis!lll-crg!gymble!umcp-cs!cvl!umd5!zben
From: z...@umd5.UUCP
Newsgroups: net.text
Subject: Re: WYSIWYG
Message-ID: <807@umd5.UUCP>
Date: Wed, 4-Dec-85 18:13:50 EST
Article-I.D.: umd5.807
Posted: Wed Dec  4 18:13:50 1985
Date-Received: Fri, 6-Dec-85 07:46:38 EST
References: <280@opus.UUCP>
Reply-To: z...@umd5.UUCP (Ben Cranston)
Organization: U of Md, CSC, College Park, Md
Lines: 43
Summary: Some war stories

To be perfectly honest I should inform the reader that I have spent a
significant amount of time over the past 10 years writing and enhancing
an "embedded command" typesetting program (called DPS) that runs on large
Sperry Univac mainframes.

In article <2...@opus.UUCP> r...@opus.UUCP (Dick Dunn) writes:

>One of the reasons that an H&J pass is a problem is that you can't
>generally find all of the problems (bad breaks, rivers, etc.) without
>looking at the final page.  If you have to wait for the entire document to
>be formatted and a proof copy printed, the sort of fine-tuning necessary
>for really high-quality output can be very tedious.  Once you find and fix,
>say, a bad line break, you have to re-run a proof copy and check everything
>after that until a boundary that puts you back to the old document (meaning
>at least the end of a page and perhaps the end of a chapter).  It's like
>fixing compilation errors in a program when the compiler only shows you a
>few at a time.

One of my more onerous tasks is doing the typesetting for the Undergraduate
and Graduate Catalogs of the University.  These are LARGE documents, about
40000 lines of 128 column source text, and about 200 to 250 typeset output
pages.  The first runs took 8 hours clock on an 1100/42; I am down to less
than a half-hour clock on an 1100/92.

There is a tremendous pressure to keep the (physical) weight of the Undergrad
catalog below one pound.  We mail these catalogs to high schools around the
country, and going over a pound would kill us in postage!  Some refugee from
a Thanksgiving dinner decided to compress the document by not starting each
section on a new page...

So, this problem Dick mentions is worst-case here.  It doesn't get back into
sync until the next CHAPTER break.  Needless to say, this makes it virtually
impossible to do the kind of fine-tuning described.  The whole rest of the
chapter shifts, and you might end up creating dozens of bad widows in trying
to fix one.  Given the extreme time pressure (a week or two from when I get
the data till the printer needs the typeset pages) it is impossible to do any
fine tuning at all.

Thank <insert-euphemism-for-deity-here> that the powers-that-be are more
interested in getting the thing out on time than the subtle, finer points
of typesetting...
-- 
Ben Cranston  ...{seismo!umcp-cs,ihnp4!rlgvax}!cvl!umd5!zben  z...@umd2.umd.edu

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site umcp-cs.UUCP
Path: utzoo!watmath!clyde!cbosgd!gatech!seismo!umcp-cs!chris
From: ch...@umcp-cs.UUCP (Chris Torek)
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <2425@umcp-cs.UUCP>
Date: Thu, 5-Dec-85 01:40:54 EST
Article-I.D.: umcp-cs.2425
Posted: Thu Dec  5 01:40:54 1985
Date-Received: Fri, 6-Dec-85 06:44:12 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> <731@othervax.UUCP> <46@utecfc.UUCP>
Distribution: net
Organization: U of Maryland, Computer Science Dept., College Park, MD
Lines: 48

In article <4...@utecfc.UUCP> den...@utecfc.UUCP (Dennis Ferguson) writes:

>In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) writes:
>>... Any arguments in this context, for and against hyphenation in
>>general, and concerning justification/ragged-right, are specious.
>>They fall into the category of "I like/hate Picasso".  Certainly
>>there is room for other styles, and we must provide technological
>>solutions for *all* of them.

This is important!  Back to den...@utecfc.UUCP:

>If this is true, I find the divergence of the `subjective' opinion of
>the graphics arts industry concerning what looks prettier on the page
>with the objectively-established opinion of the scientific community
>concerning what is easier to read quite interesting. ...

>[our work] concurred with the great body of existing experimental
>measurements of such things as understanding, retention and speed
>of reading of written language in showing that text was most easily
>and efficiently read when it was unhyphenated and unleaded, with
>a ragged right. ...

I will assume these measurements have been made with existing
typographics; or if not, that you were careful to bring in the
graphics arts folks first.  Done wrong, right justification seems
to me much worse than ragged right.  Even if you did your own
typesetting, this is still a lesser point:

>While the technical reasons for right justification have long since
>disappeared, I guess old habits die hard.

*This* is important.  Old habits do die hard; yet they are not only
on the part of the typesetters, but also on that of the readers.
As an anecdotal example, I recently bought a collection of Twain's
writings.  It is set ragged-right, unleaded, and unhyphenated.  I
find that the right margin keeps bothering me.  But of course I
have been `conditioned' to expect a flush right margin in typeset
text.

But that I have been `conditioned' does not mean that I am in the
wrong, and that all text should forevermore be printed ragged-right!

There is room for many styles, and we must provide technological
solutions for *all* of them.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 4251)
UUCP:	seismo!umcp-cs!chris
CSNet:	chris@umcp-cs		ARPA:	ch...@mimsy.umd.edu

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site othervax.UUCP
Path: utzoo!linus!philabs!micomvax!othervax!ray
From: r...@othervax.UUCP (Raymond D. Dunn)
Newsgroups: net.text
Subject: Re: WYSIWYG and lasers printers versus PTS's
Message-ID: <735@othervax.UUCP>
Date: Thu, 5-Dec-85 14:59:21 EST
Article-I.D.: othervax.735
Posted: Thu Dec  5 14:59:21 1985
Date-Received: Sat, 7-Dec-85 16:16:17 EST
References: <280@opus.UUCP>
Reply-To: r...@othervax.UUCP (Raymond D. Dunn)
Organization: Philips Information Systems - St. Laurent  P.Q., Canada
Lines: 77
Summary: 


Fantastic!  Other than the "BALONEY" comment (and to a certain extent my
response to it - sorry), the content-to-noise/flame ratio in this
discussion is, for USENET, just about perfect.  People are even
admitting their vested-interest biases for heaven's sake!

Dick Dunn's response in article <2...@opus.UUCP> covers virtually all
the bases.  He said much that I left unsaid or wanted to respond with
(I only disagree with some small nuances - you wouldn't be interested).

The product development he is associated with sounds *exactly* like
the one my group tried to get off the ground between 1979 and 1981
(it was canned, but that's another story - anyone interested in reams
of unused software (:-)).

There is one point I feel should be added to the discussion however,
in article <9...@osu-eddie.UUCP> Clayton M. Elwell says:

>In this argument about the suitability of embedded-command vs. WYSIWYG and
>laser printer vs. photocomposer, several distinct issues seem to have
>become confused.
>...
>As to output devices, laser printers are very nice toys.  They allow
>quite reasonable-looking output on a demand basis at a fairly low cost.
>If a photocopy is good enough quality....        ..... a laser
>printer is usually the right solution.  For professional printing,
>however, there is no substitute for a photocomposer.

The differences between laser printers and photo-typesetters (PTS's),
are small, and getting less.  In 1980 Mergenthaler Linotype
introduced a laser PTS at 720dpi which produced camera ready copy
(i.e. positive) and used a xerographic process similar to today's
laser printers (but took 6 minutes per page!!).

720dpi (1/100th point resolution) is regarded as being at the bottom
of acceptable typesetter resolution.

I am not acquainted with the currently available laser typesetter
offerings.

At Comdex, several suppliers announced the imminent launch of 600dpi
printers (getting close!).  I personally know of at least one laser
printer in final development (to sell into the PTS's market) which
produces 120 pages/minute at 600dpi, with a 1200dpi version in early
development (calculate the available time for computing each dot!).
These use full PTS like digitised fonts, and can generate a wide
range of point sizes, "electronic" italic, character rotation etc etc
from the basic fonts.  Online fonts are only limited by the
configuration chosen.  Admittedly these are *expen$ive*, as they
require very fast parallel-processing bit-sliced architectures.
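Taking up the parenthetical invitation, the time budget per dot works
out roughly as follows.  The 8.5 x 11 inch fully-imaged page is my
assumption; the posting gives only the 120 pages/minute and 600dpi
figures:

```python
# Back-of-the-envelope: time available per dot for a 600dpi engine
# running at 120 pages/minute.  Page size (8.5 x 11 in) is assumed.
dpi = 600
pages_per_minute = 120
dots_per_page = (8.5 * dpi) * (11 * dpi)          # 5100 x 6600 dots
dots_per_second = dots_per_page * pages_per_minute / 60.0
ns_per_dot = 1e9 / dots_per_second

print(int(dots_per_page))    # 33660000 dots per page
print(round(ns_per_dot, 1))  # ~14.9 ns available per dot
```

At roughly 15 nanoseconds per dot, the need for the fast
parallel-processing bit-sliced architectures mentioned above is clear.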

Combine this with the fact that "laser plate makers" have been around
for at least 6 years (they produce the litho printing plates for
large-run jobs - many newspaper etc systems go straight to plate
maker without any photographic process involved).  It seems that the
two output methods will merge very quickly,  except for specialised
applications.

BTW, someone implied that the graphics arts industry was
[paraphrased] "just learning how to deal with computers".  This is
demonstrably false, I believe.  PTS's (paper-tape in, wet-process
developed "galley" output), and front-end systems to drive them
(often off-line), go back to the sixties.  What the graphics arts
industry *is* learning to deal with is the fact that computer
solutions are removing their customer base, and whereas in the past
the computer industry provided them with highly tailored front-end
systems to suit their application, they are now often being thrown to
the wolves with general purpose machines - there's another
interesting subject for discussion.

Ray Dunn.   ..philabs!micomvax!othervax!ray

Disclaimer again: I have no current commercial vested interest in the
                  graphic arts industry, nor does my direct employer,
                  although other divisions of Philips *do* make laser
                  printers, and probably somewhere there is a division
                  which.....

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site othervax.UUCP
Path: utzoo!watmath!clyde!bonnie!akgua!gatech!seismo!cmcl2!philabs!micomvax!
othervax!ray
From: r...@othervax.UUCP (Raymond D. Dunn)
Newsgroups: net.text
Subject: Re: WYSIWYG and lasers printers versus PTS's
Message-ID: <737@othervax.UUCP>
Date: Fri, 6-Dec-85 16:13:09 EST
Article-I.D.: othervax.737
Posted: Fri Dec  6 16:13:09 1985
Date-Received: Mon, 9-Dec-85 03:04:51 EST
References: <280@opus.UUCP> <735@othervax.UUCP>
Reply-To: r...@othervax.UUCP (Raymond D. Dunn)
Organization: Philips Information Systems - St. Laurent  P.Q., Canada
Lines: 7

In article <7...@othervax.UUCP> I wrote:

>720dpi (1/100th point resolution) ....

Typo! - 72 points to the inch, i.e. 1/10th point resolution!

Ray Dunn.  ..philabs!micomvax!othervax!ray

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84 SMI; site sun.uucp
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!decwrl!
sun!guy
From: g...@sun.uucp (Guy Harris)
Newsgroups: net.text
Subject: Re: embedded-command text systems [vs WYSIWYG, support for Reid]
Message-ID: <3064@sun.uucp>
Date: Sat, 7-Dec-85 02:23:20 EST
Article-I.D.: sun.3064
Posted: Sat Dec  7 02:23:20 1985
Date-Received: Mon, 9-Dec-85 03:33:51 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP> <1861@glacier.ARPA> <250@mips.UUCP>
Distribution: net
Organization: Sun Microsystems, Inc.
Lines: 57

> 2] In one sense the "dinosaur" comment is sadly true.  After all, the
> fundamental ideas of nroff/troff derive from 20-year-old runoff...
> the troff/tbl/eqn/(-ms or -mm) group was all there by late 1976.
> ...As I recall, Scribe and TeX appeared in 1978 [reid, correct me please!]
> We've certainly made engineering progress in the use and support of these
> things; what's not clear is how much fundamental progress we've made.

Well, I certainly consider the change from the "tell it what to do, in
detail" model to the "tell it what you want" model (more particularly, the
change from ".in", ".ti", traps, diversions, etc. to the object/stylesheet
model) to be fundamental progress (although various macro packages and
preprocessors put a model like this on top of, to use the wonderful phrase
of someone here, "full-frontal troff").

> I've generally thought that the text-processing system I've always
> wanted on my desk needed
> a) WYSIWYG editing + the best of structural description.  The latter should
> be able to do about as well as Scribe or troff -MM, else no go.
> b) Integrated graphics, images

What's missing in systems like Interleaf?  It does have an object/stylesheet
model, so it provides some amount of structural description (derived,
according to somebody from Interleaf, from the model of the Etude system at
MIT).  It also has integrated graphics, and even MacPaint-ish images in the
latest release of the top-of-the-line version.

> c) interactive eqn, and especially tbl equivalent.

Interleaf doesn't have an interactive EQN equivalent, and it supports
multiple varieties of tabs but no TBL equivalent, to my knowledge.  I
believe the Xerox Star software does have interactive EQN and possibly TBL
equivalents, though.

> d) Multiple concurrent views that let me edit at least the formatted
> (WYSIWYG) view or the markup-language view [I'd like Hypertext-like
> features, and holophrastic displays on document structure, and a bunch
> of others, but no need to get greedy.]

Sounds somewhat like IBM's Janus, where there were two screens, one of which
displayed the formatted text and one of which displayed the text+markup
language.  My own prejudice is that I'd rather spend 99-100% of my time
editing the formatted view, and maybe have an Etude/Interleaf-style sidebar
showing the object types of the markup and pop-up property sheets to show
the object attributes.  (I find markup information quite distracting, except
when I'm actually editing it; most of the time, I'm editing the content of
the document, not its style.)

> f) Desktop workstation with 8-10X VAX-780 integer performance, 8-32MB
> memory, [my guess at what it takes to do a)-f) with reasonable programming.]

I think this is rather more than what's needed.  From what I've seen, a
system like Interleaf seems fast enough, and it runs on a desktop
workstation with (if I remember our corporate propaganda correctly) ~1X
VAX-750 integer performance and 2-4MB memory.  (And no, Interleaf does NOT
require the raster-op chip - it runs on the newer Suns which don't have it.)

	Guy Harris

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site unc.unc.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!mcnc!
unc!rentsch
From: rent...@unc.UUCP (Tim Rentsch)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <705@unc.unc.UUCP>
Date: Sat, 7-Dec-85 20:15:17 EST
Article-I.D.: unc.705
Posted: Sat Dec  7 20:15:17 1985
Date-Received: Mon, 9-Dec-85 03:39:25 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<1861@glacier.ARPA> <116@utastro.UUCP> <1919@glacier.ARPA>
Reply-To: rent...@unc.UUCP (Tim Rentsch)
Distribution: net
Organization: CS Dept, U. of N. Carolina, Chapel Hill
Lines: 91
Summary: 

In article <1...@glacier.ARPA> r...@glacier.UUCP (Brian Reid) writes:
>
>If you look at the interaction of technology and industry for the past few
>hundred years, you will see a recurring theme. A new technology gets
>invented. The in-place industry applies that technology to automate what
>they are currently doing. This is often inefficient, as the new technology
>is often better applied by changing the fundamental premises of the
>industry. Gradually new companies grow up, which use the new technology in a
>different way, and if it is more cost-effective, then the new industry
>drives the old one out of business.
>
   ...
>
>At the moment we are in a transition phase. The graphic arts industry is
>discovering computers, and they are molding them in their own image, taking
>the things that they have done by hand since the invention of cold type and
>putting them isomorphically onto the computer. Simultaneously, however,
>thousands of businesses are discovering that they don't NEED the graphic
>arts industry. With simple computer tools they can achieve their end
>results--the publication of books or newsletters or catalogs--without
>graphic artists. If history serves as any guide, then in half a generation
>the traditionalist approach will no longer be competitive and will have to
>pull out of those markets completely.

Brian is absolutely right that new technology tends to replace older
industry, and does this by delivering cheaper products.  The
operative word, however, is "cheaper", not "better".  Sad to say,
cheaper usually also means worse.  The new technology survives
because (1) it's usually only slightly worse, and so most people
don't care, (2) capitalism works well with mass markets, since by
definition the average customer is not as demanding as the more
discerning customer, and (3) eventually people get acclimated and no
one remembers the advantages offered by the older (higher priced)
product.  [Incidental note:  the new technology may also offer other
advantages, such as smaller size or weight.  Again, these are
engineering improvements and do not directly relate to the quality
of the final product (it being understood that the product is
*produced* by the technology, the technology is not the product
itself).]

This is exactly what we see with document processing systems and
technical typesetters.  Pick up a copy of Ullman's book on
Databases, typeset with TeX.  The typesetting is inexcusably bad!
(The standard TeX fonts are also terrible, but that doesn't affect the
typesetting.)  Why then was TeX used, rather than conventional
typesetting?  Almost certainly it was to lower the cost of producing
the book, with the attitude that TeX output was "good enough".

(I have heard that Don Knuth developed TeX in response to his
publisher's statement that re-typesetting second editions of Knuth's
books would be too expensive.  I believe this to be true, but I
cannot remember the source.)  

By the way, don't take my word for it;  get Ullman's book and try
reading two consecutive chapters.  Don't just skim them (I suspect
you will find yourself wanting to do this, because of the
subconscious resistance to the bad typesetting), but make yourself
read and try to digest the book as a text.  Measure the results by
how you respond to the material in the book, and how little you were
bothered by the typesetting.  (In a perfectly typeset book the
typesetting completely disappears, so that it is never noticed by
itself.)

Of course it is not fair to judge a typesetter by only one of its
uses.  Look around.  Almost all of the documents (books, papers) I
have read that were typeset with TeX are awful.  If this is the
future of technical typesetting, I don't want it -- and it doesn't
matter whether it was done with TeX, Scribe, WYSIWYG, or chiseling
stone tablets.

Rather than continue in the style of a debate, let's look at the
good points of each of the two approaches.

WYSIWYG is good at:
	local things (i.e., a screenful's worth)
	appearance
	user feedback

Text processors are good at:
	document structure
	textual computation (referencing, indexing, etc.)
	preserving intention

I see no contradiction in integrating both sets of good points into
one system.  What we lack is a good language to express both sets of
things conveniently.  What the WYSIWYG people (I confess I am in this
camp) ought to be doing is trying to find out how to incorporate the
good features of text processors into interactive systems.  That way
we wouldn't need those document processors (to be fair, we wouldn't
need any of the existing WYSIWYG systems either), and we could all go
on to more entertaining and more productive discussions.

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/5/84; site umd5.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!panda!talcott!
harvard!seismo!umcp-cs!cvl!umd5!zben
From: z...@umd5.UUCP
Newsgroups: net.text
Subject: Re: Hyphenation, Re: Why Hyphenate
Message-ID: <811@umd5.UUCP>
Date: Sun, 8-Dec-85 01:54:55 EST
Article-I.D.: umd5.811
Posted: Sun Dec  8 01:54:55 1985
Date-Received: Mon, 9-Dec-85 03:25:43 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> 
<731@othervax.UUCP> <803@umd5.UUCP> <733@othervax.UUCP>
Reply-To: z...@umd5.UUCP (Ben Cranston)
Distribution: net
Organization: U of Md, CSC, College Park, Md
Lines: 145
Summary: And the bits go on...

In article <7...@othervax.UUCP> r...@othervax.UUCP (Raymond D. Dunn) responds
to my perhaps too-hastily posted flame:

>In article <8...@umd5.UUCP> z...@umd5.UUCP (Ben Cranston) responds to my
>earlier posting:

>>> It *IS* by definition possible to implement hyphenation solely by
>>> dictionary.  If the dictionary is large enough,  ...

>> "record"...  verb "re-cord" noun "rec-ord" ...

>To be fair, I missed examples of this type (even if I don't
>necessarily agree with your hyphenation of "rec-ord").
>However this does *not* contradict the dictionary argument, in fact
>it enhances it.
>Assuming a parser was used to determine the part of speech of a word,
>no practical hyphenation algorithm could be devised to hyphenate
>words accordingly.  The dictionary of course *could* easily be
>constructed to contain different hyphenation points for different
>uses of a word when necessary.

All true, and my posting was probably out of line.  I just thought the
claim "it can be done solely by dictionary" a bit too general to
pass without at least a token challenge...
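
A footnote on mechanism: troff itself already carries a tiny version of
the dictionary approach, the .hw exception list, in which each word is
spelled with its legal break points.  A sketch (the word list is my own,
for illustration):

```troff
.\" Hyphenation exception words, each spelled with its break points:
.hw rec-ord pro-gram-mer type-set-ting
.\" Enable automatic hyphenation for everything not in the list:
.hy
```

Note that .hw keys on spelling alone, so it cannot give the verb
"re-cord" and the noun "rec-ord" different break points -- exactly the
case where the parser-plus-dictionary scheme Ray describes would be
needed.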

>>> WYSIWYG systems (with the associated demise of much of the graphic
>>> arts industry) are becoming increasingly practical and popular, from
>>> Interleaf to the good old "Mac"...

>> WYSIWYG systems have their proponents and their uses.  They are VERY good
>> for novice users, and given the way this field is growing I should think
>> that "novice users" are going to be the MAJORITY of users until the entire
>> society is computer literate.  (This is much like "automobile literate" was
>> the thing to be when I was a teenager - something 19 year old males can be
>> macho about...)

>By your definition then, the majority of automobile users are
>novices, and always will be.  Their literacy does not extend further
>than the use of five or six controls.  There is no desire in the
>majority, nor *need*, to become "automobile literate" in your sense,
>the user interface has been designed that way.  The same argument
>applies to computer systems.

I seem to remember a posting from Brian some time ago making an analogy
between some task (which escapes me) and the design of carburetors.  He
claimed there were <small integer> number of people in the country who can
actually design such a beast, that it is fraught with black magic, etc.
I believe him - I can barely manage a rebuild :-)

Now, does this qualify or disqualify me as "automobile literate"?  After
all, I don't smelt my own silicon either...

>> However, there are times when the WYSIWYG paradigm breaks down badly.  As
>> a somewhat strained analogy, a strict WYSIWYG system might have you use a
>> mouse to pick out letters from a menu, rather than using a conventional
>> keyboard.....

>Not just a strained analogy, totally irrelevant.  It's like saying "a
>keyboard *might* have just one key which you hit repeatedly until the
>character of choice appears, thus any computer system which uses a
>keyboard is ...".  

>We are discussing WYSIWYG systems, and the ability of the general
>user to do typesetting, not the pros and cons "of mice over
>keyboards" (gosh there's a title for a paper (:-)).  WYSIWYG implies
>an *approach*, not necessarily a specific user interface.

I don't see the two approaches as mutual exclusives, either.  A screen with
one window on the source script, another window on output document, and
real-time updating (:-) would be just dandy.  And yes, I am quite aware of
the resources such a beast would consume.  The 4k by 4k bitmapped terminal
wouldn't come cheap either.  But, the availability of such a beast could
really help with the training of users (more on this later).

>OK, so you're trying to make a point on "efficiency".  Good, that's
>what I'm doing as well.  With complex tasks like typesetting, to
>reduce this to a measure of keystroke counts is absurd.

>The use of a traditional typesetting system requires much dedication
>and training, and the ability to visualise the mapping from embedded
>commands to the resulting typeset page.  (Even with an expert,
>several trial runs on the hardcopy typesetter, or to a "soft" screen,
>are often required before the desired effect is achieved).

>Many people do not, and can never, have this ability, nor should they
>be *required* to train themselves for tasks ancillary to their
>mainstream interest.  They didn't in the past, they turned to an
>"expert" (and paid him big bucks).  They shouldn't have to now, they
>turn to a computer.  Their literacy need only be how to drive the
>thing in a natural way to them, not to be able to manipulate the
>nuts and bolts.

>*That* is efficiency!

If people could CHEAPLY answer questions like "what would happen if we 
decided to use Basketball Oversize instead of Bimbo Stencil for that table 
on page three" (the experimental approach of the system described above), it
could go a long way toward helping people *develop* such abilities.

Of course, your argument is that they should not be *forced* to develop
those abilities.  I can only claim that *someone* will, because the very
high-level ideas of how *I* want to "drive the thing" will have to be
somehow translated into the low-level commands to the output device.  If
that takes an expert and big bucks, OK.  If it takes a computer, you will
be spending some bucks for that solution too.

Isn't your "in a way natural to them" a bit ambitious?  It seems to me 
that here you subsume a lot of the functionality of that "expert" who
you are cutting out of the circuit because his "bucks" are too "big".
You run the risk of turning people loose with too much freedom and too
little guidance.  SOMEBODY with SOME amount of graphics art knowlege and
experience is going to have to be around.

>A last point.  Compare this area of expertise with what has happened
>in the computerisation of other disciplines (spreadsheets, data
>managers, report generators, and the birth of the prime example,
>expert systems).

Ya know, I'd feel a whole lot better about these expert systems if we
knew more about how bad rules would affect system performance.  It's just
like us dumb old humans to make conflicting rules and then refuse to
acknowledge the conflicts, and then some poor innocent gets really screwed.

I think one of the things really wrong with the present scheme of things is
that those people who really have the clout to make decisions and change
things are hidden away from the world and kept apart from the public by
massive bureaucracies.  If you don't know what I mean, try complaining to the
lady behind the desk at the airline counter at an airport.  Sure, she's
hired to be there and talk to you, but you can yell at her until you're blue
in the face and it still won't get back to that incompetent manager three
levels up the totem pole.

And now, not only are they hiding behind people, but you're going to have
them hide behind computers too.  Not to mention the possibility of some
brass hat general promulgating a rule that "airplanes from Cuba are to be
nuked without warning" into SDI, and ending up French-frying the last of
the Cuban capitalists out...

Other disciplines?  OK, companies are processing more bits with fewer workers
than ever before, and that may well be your idea of success.  But when 
things DO mess up, it's a doozy.  That recent SNAFU over the wire-transfer 
switch in New York would have been hilarious except that it had a measurable
effect on the national economy...
-- 
Ben Cranston  ...{seismo!umcp-cs,ihnp4!rlgvax}!cvl!umd5!zben  z...@umd2.umd.edu

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site glacier.ARPA
Path: utzoo!watmath!clyde!cbosgd!ihnp4!nsc!glacier!reid
From: r...@glacier.ARPA (Brian Reid)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <2168@glacier.ARPA>
Date: Sun, 8-Dec-85 11:23:07 EST
Article-I.D.: glacier.2168
Posted: Sun Dec  8 11:23:07 1985
Date-Received: Mon, 9-Dec-85 06:31:18 EST
References: <705@unc.unc.UUCP>
Reply-To: r...@glacier.UUCP (Brian Reid)
Distribution: net
Organization: Stanford University, Computer Systems Lab
Lines: 35

In article <7...@unc.unc.UUCP> rent...@unc.UUCP (Tim Rentsch) writes:
>This is exactly what we see with document processing systems and
>technical typesetters.  Pick up a copy of Ullman's book on
>Databases, typeset with TeX.  The typesetting is inexcusably bad!

Actually, the typesetting isn't all THAT bad. What is inexcusably bad, at
least in my copy of that book, is the type imaging and printing. The pages
are fuzzy. Most of this is a failure of the printer when making lithographic
plates. Some of it is the design of the type face. Neither of those has
anything to do with TeX, save that TeX only knows how to work with its own
type faces.

>Of course it is not fair to judge a typesetter by only one of its
>uses.  Look around.  Almost all of the documents (books, papers) I
>have read that were typeset with TeX are awful.  If this is the
>future of technical typesetting, I don't want it -- and it doesn't
>matter whether it was done with TeX, Scribe, WYSIWYG, or chiseling
>stone tablets.

You are right that most of what is done with TeX is ugly, but the reason for
this has nothing to do with TeX. It has to do with the aesthetic sense of
the person using TeX. Systems like TeX give the author too much control over
the appearance of the document, and if the author misuses that control the
resulting document is ugly.

I would like to offer up the new Addison-Wesley PostScript reference manual
as an example of an attractive book typeset in Scribe. The reason it is
attractive is that its appearance was specified by a professional graphic
designer and not by a programmer. 

WYSIWYG systems give the user even more control over the appearance than TeX
does--with the concomitant possibility of even more abuse of that control.
-- 
	Brian Reid	decwrl!glacier!reid
	Stanford	r...@SU-Glacier.ARPA

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site mips.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!decwrl!
glacier!mips!mash
From: m...@mips.UUCP (John Mashey)
Newsgroups: net.text
Subject: Re: embedded-command text systems [vs WYSIWYG, support for Reid]
Message-ID: <258@mips.UUCP>
Date: Tue, 10-Dec-85 15:34:41 EST
Article-I.D.: mips.258
Posted: Tue Dec 10 15:34:41 1985
Date-Received: Thu, 12-Dec-85 05:27:10 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP>
Distribution: net
Organization: MIPS Computer Systems, Mountain View, CA
Lines: 95

[> > are me, > are Guy Harris]
> > 2] In one sense the "dinosaur" comment is sadly true.  After all, the
> > fundamental ideas of nroff/troff derive from 20-year-old runoff...
> > the troff/tbl/eqn/(-ms or -mm) group was all there by late 1976.
> Well, I certainly consider the change from the "tell it what to do, in
> detail" model to the "tell it what you want" model (more particularly, the
> change from ".in", ".ti", traps, diversions, etc. to the object/stylesheet
> model) to be fundamental progress (although various macro packages and
> preprocessors put a model like this on top of, to use the wonderful phrase
> of someone here, "full-frontal troff").
That was the point: -ms (& especially -mm, no bias here!) did that in
1975/1976, to the extent that it was possible on top of troff.  For example,
see the paper on PWB/MM in the 2nd Intl Conf on Soft. Eng, 1976:
("Documentation Tools and Techniques", J. R. Mashey, D. W. Smith):

"PWB/MM features permit the user to concentrate on the logical structure
of the document, not on its eventual appearance. We feel this is a desirable
direction of evolution for text processing. Some specific examples include
the implementations of headings, various styles of lists, and footnotes....
What must be noted is that though the style may vary, the way of typing a
heading does not. A few global parameters control the overall final appearance."
[This is, of course, object/stylesheet, although without the good graphics.]

"This approach not only contributes to the uniformity of style within a
document, but also allows the user to make radical changes in style after
the document has been entered.  Finally, the same text can be included in
several documents that must adhere to differing standards, as in the case when
an internal report is submitted to a journal that requires another format."
This stuff was designed in 1975, and in widespread use by 1977.
> 
> > I've generally thought that the text-processing system I've always
> > wanted on my desk needed
> > a) WYSIWYG editing + the best of structural description.  The latter should
> > be able to do about as well as Scribe or troff -MM, else no go.
> > b) Integrated graphics, images
> 
> What's missing in systems like Interleaf?  It does have an object/stylesheet
> model, so it provides some amount of structural description (derived,
> according to somebody from Interleaf, from the model of the Etude system at
> MIT).  It also has integrated graphics, and even MacPaint-ish images in the
> latest release of the top-of-the-line version.
Interleaf is fine.  When it can do what -mm .H and .LI do I'll be happier.
The right model is there, but some of the details aren't there yet. As for
why I care, we looked at documents at BTL long ago, and the most frequent
-mm commands were .P (paragraph), .LI (list item), and .H (heading);
auto-everythinged lists and headers are very important in some classes of
documentation.
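
For readers who have not met -mm, those three commands are the bulk of a
typical document source.  A small invented fragment:

```troff
.H 1 "Introduction"
.P
A paragraph of running text; the macros, not the typist,
decide fonts, spacing, and heading numbers.
.AL
.LI
first item, numbered automatically by the .AL list
.LI
second item
.LE
.P
Changing a few global registers restyles every heading and
list in the document without touching this input.
```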
> 
> > d) Multiple concurrent views that let me edit at least the formatted
> > (WYSIWYG) view or the markup-language view [I'd like Hypertext-like
> > features, and holophrastic displays on document structure, and a bunch
> > of others, but no need to get greedy.]
> 
> Sounds somewhat like IBM's Janus, where there were two screens, one of which
> displayed the formatted text and one of which displayed the text+markup
> language.  My own prejudice is that I'd rather spend 99-100% of my time
> editing the formatted view, and maybe have an Etude/Interleaf-style sidebar
> showing the object types of the markup and pop-up property sheets to show
> the object attributes.  (I find markup information quite distracting, except
> when I'm actually editing it; most of the time, I'm editing the content of
> the document, not its style.)
I don't think we're actually arguing here, but rather expressing preferences.
Maybe I've spent more time dealing with heavily structured documentation,
and like to see the structuring information more often. I think the
fundamental wish we both have is to see the document in different ways
whenever we want it, and manipulate the whole thing in whichever way is
more convenient.  I like to be able to see a document as one big tree
sometimes, with most of the details suppressed, when doing big
rearrangements, assuring parallel constructions, etc.
> 
> > f) Desktop workstation with 8-10X VAX-780 integer performance, 8-32MB
> > memory, [my guess at what it takes to do a)-f) with reasonable programming.]
> 
> I think this is rather more than what's needed.  From what I've seen, a
> system like Interleaf seems fast enough, and it runs on a desktop
> workstation with (if I remember our corporate propaganda correctly) ~1X
> VAX-750 integer performance and 2-4MB memory.  (And no, Interleaf does NOT
> require the raster-op chip - it runs on the newer Suns which don't have it.)

Interleaf is certainly fast enough at what it does. I just want some more
things that I have reason to believe have heavy computational loads; the
Interleaf people worked pretty hard to tune it as it is, and it has to bypass
windowing systems in some cases for enough performance [on 68010s, maybe not
on 68020s].  I don't think the RasterOp part of it is the expensive part,
but rather the rest of the calculations, which can easily turn into
fairly expensive constraint-based things (like IDEAL, for example).
NOTE: none of the above should be construed as criticism of Interleaf or
existing workstations, merely an observation that what I really want is
the best of both worlds, and that I'd like to be able to stop using troff
without having to restrict document formats more than I'd like.
-- 
-john mashey
UUCP: 	{decvax,ucbvax,ihnp4}!decwrl!mips!mash
DDD:  	415-960-1200
USPS: 	MIPS Computer Systems, 1330 Charleston Rd, Mtn View, CA 94043

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site mips.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!allegra!mit-eddie!genrad!decvax!decwrl!
glacier!mips!mash
From: m...@mips.UUCP (John Mashey)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <259@mips.UUCP>
Date: Tue, 10-Dec-85 15:44:02 EST
Article-I.D.: mips.259
Posted: Tue Dec 10 15:44:02 1985
Date-Received: Thu, 12-Dec-85 05:27:32 EST
References: <705@unc.unc.UUCP> <2168@glacier.ARPA> <151@utastro.UUCP>
Distribution: net
Organization: MIPS Computer Systems, Mountain View, CA
Lines: 29

Ed Nather writes:
> While I agree that ugliness often lies in an underdeveloped sense of style,
> I can't agree with Brian's solution, as exemplified by the Scribe text
> formatter: don't let the user get at the machinery because he'll muck it up.
>...
> 
> But it's not clear to me you can't have a system that allows the user to
> specify, in a general way, what he's after, and still permit him to reach
> under the hood and twiddle if he's willing to learn how.  Since many ...

This is something we tried to do with the old -MM macros, but it always
seems that for any given technology, you reach a point where it is very
hard to maintain the extensibility without severe performance penalties,
or without incredible complexification.  This seems to be a fundamental
design problem: how do you build systems that are easy to learn, that
give people a lot of leverage, whose modification costs rise linearly
as you depart from the defaults, and which can still be modified to
reach a long distance from those defaults?
For example:
	raw troff lets you do anything, but nothing is simple.
	troff + (good macro package) lets you do many things easily,
	sometimes with serious performance cost.  Certain extensions are
	easily done, but with others, you fall off the edge of the world,
	and need to start from scratch.
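
To make the contrast concrete, here is a rough sketch (not the actual
-mm implementation, which also involves traps and diversions, and with
invented heading text) of what a single heading costs in raw troff:

```troff
.\" By hand, one heading means juggling several low-level requests:
.sp 2	\" space above the heading
.ne 4	\" don't strand the heading at the bottom of a page
.ft B	\" switch to bold
2.1  Performance Considerations
.ft R	\" back to roman
.sp 1	\" space below
```

With a macro package, all of that (plus automatic numbering and
table-of-contents collection) collapses to the single line
.H 2 "Performance Considerations".
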
-- 
-john mashey
UUCP: 	{decvax,ucbvax,ihnp4}!decwrl!mips!mash
DDD:  	415-960-1200
USPS: 	MIPS Computer Systems, 1330 Charleston Rd, Mtn View, CA 94043

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10 5/3/83; site utzoo.UUCP
Path: utzoo!henry
From: he...@utzoo.UUCP (Henry Spencer)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <6214@utzoo.UUCP>
Date: Tue, 10-Dec-85 17:20:30 EST
Article-I.D.: utzoo.6214
Posted: Tue Dec 10 17:20:30 1985
Date-Received: Tue, 10-Dec-85 17:20:30 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc>
Organization: U of Toronto Zoology
Lines: 26

> Incidentally, they do not call laser printer output "typeset material."  The
> discernible resolution and poor kerning are still too crude, in their opinion.
> To us computer types, used to crummy dot-matrix output, it looks great.  To
> the professional typographer, the one with the 20X loupe magnifier in his
> shirt pocket, it is simply amusing.

To those of us who read text *without* using a 20X loupe magnifier, this
manic concern with how text looks when a page is blown up to the size of
a football field is simply amusing.

Yes, I'm aware that readability can be affected by subtle issues, and that
the 20X loupe can help you spot such problems, but professional typographers
have a tendency to push this far beyond the point of diminishing returns.
300/inch laser printers will not replace typesetting machines, but they're
going to steal a lot of the typesetting-machine market.  Unless it is pointed
out to them most explicitly, most users can see the difference only with poor
fonts or difficult jobs.  Which means that most users will accept 300/inch
as adequate for most jobs.  They don't feel that the increment of quality
gained by going to real typesetting is worth the hassle and expense, barring
the occasional job where maximum quality is an explicit objective.

(Obviously I am talking about competently-"laserset" documents printed using
good fonts, not about the typical output of MacWrite enthusiasts.)
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site unc.unc.UUCP
Path: utzoo!watmath!clyde!bonnie!akgua!mcnc!unc!rentsch
From: rent...@unc.UUCP (Tim Rentsch)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <719@unc.unc.UUCP>
Date: Wed, 11-Dec-85 00:29:44 EST
Article-I.D.: unc.719
Posted: Wed Dec 11 00:29:44 1985
Date-Received: Thu, 12-Dec-85 06:46:15 EST
References: <705@unc.unc.UUCP> <2168@glacier.ARPA>
Reply-To: rent...@unc.UUCP (Tim Rentsch)
Distribution: net
Organization: CS Dept, U. of N. Carolina, Chapel Hill
Lines: 41
Summary: 

In article <2...@glacier.ARPA> r...@glacier.UUCP (Brian Reid) writes:
>You are right that most of what is done with TeX is ugly, but the reason for
>this has nothing to do with TeX. It has to do with the aesthetic sense of
>the person using TeX. Systems like TeX give the author too much control over
>the appearance of the document, and if the author misuses that control the
>resulting document is ugly.
>
>I would like to offer up the new Addison-Wesley PostScript reference manual
>as an example of an attractive book typeset in Scribe. The reason it is
>attractive is that its appearance was specified by a professional graphic
>designer and not by a programmer. 

Brian is quite right to point this out.  I have not looked at the
book he mentions but think his argument is valid regardless.  [To be
fair to the other side I would have to say that I think it is just as
easy to produce good documents as bad with a WYSIWYG system, but that
in my experience it is always harder to produce good documents with a
text processing system.  Scribe seems better than TeX in this regard,
but then I've never been a Scribe database administrator! :-) ] 

Furthermore this points out an item missing from my list comparing
WYSIWYG's and Scribe-like systems.  In particular, text processing
systems generally are better at *cataloging* document structures so
that the predefined structures can be retrieved rather than being
re-written by every document hack under the sun.  This is
somewhat analogous to a subroutine library -- rather than
reprogramming sin or some such <genericDeity>-awful function, we can
just get it out of the library.

So, you WYSIWYG'ers, add that to your task list!  Our interactive
systems should be able to catalogue and use document structures and
templates as well as text processors do.  But for the time being it
is still true that WYSIWYG's are better for some things, text
processors better for others.  

(Should I even bother to say I think that document structure and
cataloging is best done graphically and interactively?  :-)  )

cheers,

Tim

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.1 6/24/83; site umcp-cs.UUCP
Path: utzoo!watmath!clyde!burl!ulysses!mhuxr!mhuxt!houxm!whuxl!whuxlm!akgua!gatech!seismo!umcp-cs!chris
From: ch...@umcp-cs.UUCP (Chris Torek)
Newsgroups: net.text
Subject: Re: embedded-command text systems
Message-ID: <2555@umcp-cs.UUCP>
Date: Fri, 13-Dec-85 22:34:34 EST
Article-I.D.: umcp-cs.2555
Posted: Fri Dec 13 22:34:34 1985
Date-Received: Mon, 16-Dec-85 03:50:46 EST
References: <471@harvard.ARPA> <773@mmintl.UUCP> <734@tpvax.fluke.UUCP> <etc> <6214@utzoo.UUCP>
Organization: U of Maryland, Computer Science Dept., College Park, MD
Lines: 18

In article <6...@utzoo.UUCP> he...@utzoo.UUCP (Henry Spencer) writes:
> [...] 300/inch laser printers will not replace typesetting
> machines, but they're going to steal a lot of the typesetting-
> machine market.  Unless it is pointed out to them most explicitly,
> most users can see the difference only with poor fonts or difficult
> jobs.

... or with small fonts (7 point and under), where some characters
turn into indecipherable blobs.  But you are correct in stating that
many people do not care about the (still visible at 300dpi) difference.

Beware, however, of using 300 dpi laser printer output as a master
for copies.  Unless your copier is in good shape, the results are
often nearly unreadable.  Using larger fonts (11 or 12 point) helps.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 4251)
UUCP:	seismo!umcp-cs!chris
CSNet:	chris@umcp-cs		ARPA:	ch...@mimsy.umd.edu

Relay-Version: version B 2.10 5/3/83; site utzoo.UUCP
Posting-Version: version B 2.10.2 9/18/84; site glacier.ARPA
Path: utzoo!watmath!clyde!burl!ulysses!gamma!epsilon!zeta!sabre!petrus!bellcore!decvax!decwrl!glacier!reid
From: r...@glacier.ARPA (Brian Reid)
Newsgroups: net.text
Subject: so-called 'power' of TeX/Scribe/troff
Message-ID: <2455@glacier.ARPA>
Date: Sun, 15-Dec-85 05:20:02 EST
Article-I.D.: glacier.2455
Posted: Sun Dec 15 05:20:02 1985
Date-Received: Mon, 16-Dec-85 04:46:50 EST
Organization: Stanford University, Computer Systems Lab
Lines: 33

In this argument I claim these things:
  (1) Troff is the most powerful formatting language, because it is the
      only one of the three mentioned that can compute generalized functions
      of the image state. Neither TeX nor Scribe can examine the image state.
  (2) I can't offhand see why that is a big deal.
  (3) I consider that the very best feature of Scribe is that nobody is able
      to make it process TeX-like input. The whole point of Scribe is that
      it enforces a certain amount of rigor and discipline on the preparation
      of documents. If I had wanted to program a Turing-complete system that
      could emulate somebody else's formatter, I would have done so.
  (4) Arguments that the ugliness of published documents is not the fault of
      the text formatter are somewhat analogous to arguments that handguns
      don't kill people, rather people kill people. My stand on text formatters
      is the same as my stand on handguns: I don't want the average guy in the
      street to be carrying a gun, and I don't want the average hack at a
      word processing terminal to be using a text formatter that will let him
      have any control whatsoever over the appearance of the finished document.

Stanley Morison, the eminent British typographer and Cambridge University
Press author, wrote that the purpose of typesetting is to be invisible, and that
if the reader even notices the typesetting of a book then the typesetting has
failed. Most modern psychology researchers who have studied legibility of
typewritten material (Burt [1959], Tinker [1963], Zachrisson [1965]) report
that people find most legible that with which they are most familiar. Everyone
finds his own newspaper to be much more readable than the New York Times
(unless his own newspaper IS the NYT). Everyone finds his own handwriting 
legible. Everyone learns to like the fonts on his own laser printer. This 
suggests that we should all just pick some standard, however ugly, and all
use it, because if we all use it then we will all be able to read each other's
material more easily. Anyone game?
-- 
	Brian Reid	decwrl!glacier!reid
	Stanford	r...@SU-Glacier.ARPA

			        About USENET

USENET (Users’ Network) was a bulletin board shared among many computer
systems around the world. USENET was a logical network, sitting on top
of several physical networks, among them UUCP, BLICN, BERKNET, X.25, and
the ARPANET. Sites on USENET included many universities, private companies
and research organizations. See USENET Archives.

