The Development of the Analysis of Arguments




Contents

1 Introduction

2.1 The ancient approach to quantified sentences
2.2 Propositional logic
2.2.1 The analysis of the validity of arguments in propositional logic
2.2.2 Limitations of propositional logic
2.3 Quantified predicative logic
2.4 Componential Analysis
2.4.1 Problems of Componential Analysis

3 Conclusion

1 Introduction

If Francis Bacon actually wrote all the plays that are thought to be written by Shakespeare, then Francis Bacon was a great writer. Francis Bacon indeed was a great writer, so Francis Bacon must have written all the plays that are thought to be written by Shakespeare. A sensation!

The same holds true for Lewis Carroll. If he actually wrote the plays of Shakespeare, then he was a great writer - nobody would doubt that. Ever since we met Alice in Wonderland, we all know that Carroll indeed was a great writer. It is obvious that he must have written Shakespeare's plays too.

Since this argument holds true for every great writer in history, there must be something odd about it. Throughout the history of science and the humanities, attempts have been made to distinguish between arguments that really show us something new about the world we live in and others that merely claim connections that don't really exist. The barber of Seville was invented with the self-contradictory claim that he shaves all the men in Seville who don't shave themselves. Claims about the omnipotence of God have been reduced to the question whether he could create a stone so heavy that he cannot lift it.[1] In the end, a system of growing complexity was created to be employed as a universal strategy for proving the truth or falsity of arguments of any kind. My task in the following pages is to outline the development of that system and to display its power and its shortcomings at each stage.

2.1 The ancient approach to quantified sentences

The first approach to text analysis that I have found was taken by Aristotle, the Greek philosopher. In the beginning, he investigated so-called quantified sentences[2], that is, sentences that possess a "quantifier". Quantifiers are the words "all", "some" and "no". Examples of quantified sentences, as Aristotle used them, are: "All owls are in Athens.", "Some human beings are mortal." or "No fish eats a dog."

He found that these sentences have a structure of the following form:

Q A be B, with Q = quantifier; A, B = noun phrases; be = a form of "to be".

All simple main clauses that contain a quantifier can be reformulated to fit that pattern. For example: "Hungry cats eat mice." can be transformed into "All hungry cats are mice-eaters." and hence into Q A be B, with Q = all, A = hungry cats, B = mice-eaters and be = are. (see Copi, 1983, p.81-83)
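To make the pattern concrete, here is a minimal sketch of how such a sentence could be evaluated over finite sets. It is my own illustration, not part of the paper; the function holds and the example sets hungry_cats and mice_eaters are invented for this purpose.

# A minimal sketch (not from the paper): evaluating a sentence of the form
# "Q A be B" over finite sets. The example sets are invented for illustration.
def holds(quantifier, A, B):
    """Return True if the quantified sentence 'Q A be B' is true for sets A, B."""
    if quantifier == "all":
        return A <= B            # every member of A is also in B
    if quantifier == "some":
        return bool(A & B)       # at least one member of A is in B
    if quantifier == "no":
        return not (A & B)       # no member of A is in B
    raise ValueError(f"unknown quantifier: {quantifier}")

hungry_cats = {"Tom", "Felix"}
mice_eaters = {"Tom", "Felix", "Owl"}
print(holds("all", hungry_cats, mice_eaters))   # "All hungry cats are mice-eaters." -> True

On this reading, a quantified sentence is simply a claim about how the sets A and B are related.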

Aristotle found that these sentences and their negations show a certain symmetry that is completely independent of the things substituted for A and B. To see this symmetry we have to categorize the combinations of quantifiers and negations that can possibly occur in a sentence. As I said, there are three quantifiers: "all", "some" and "no". Combined with negation, there are six possibilities:

(1) "all" unnegated, (2) "all" negated

(3) "some" unnegated (4) "some" negated

(5) "no" unnegated (6) "no" negated

Now let’s have a closer look at the shape of the negation.[3] The sentences

"All human beings are mortal." and

"No human being is not mortal." have the same meaning

And the sentences

"No human being is mortal." and

"All human beings are not mortal." have the same meaning too.

As we can see, the quantifier "no" isn't needed, because it can also be expressed by simply negating the last term of an all-quantified sentence. Thus "All doves are not black." can be derived from "No dove is black.", and "All doves are white." is derived from "No dove is not white."[4]
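This equivalence can be illustrated with a small sketch (my own, with invented example sets), reading "not-B" as the complement of B within some domain:

# A small sketch (invented example sets): "No dove is black." agrees with
# "All doves are not black." when "not black" is the complement of black in the domain.
domain = {"dove1", "dove2", "raven"}
doves  = {"dove1", "dove2"}
black  = {"raven"}

no_dove_is_black    = not (doves & black)        # "No dove is black."
all_doves_not_black = doves <= (domain - black)  # "All doves are not black."
print(no_dove_is_black == all_doves_not_black)   # True

As long as both sets are drawn from the same domain, the two sentences receive the same truth value, whichever sets are chosen.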

So only the combinations (1)-(4) remain. Although none of the remaining combinations can be reduced to another, there are certain relations between them. Aristotle sketched these relations by means of a square:

[Illustration not visible in this excerpt: Aristotle's square of relations]

(Picture 1, Copi, 1983, Volume 1, p.96)

"Contradicting" means that two sentences in which the same terms are substituted for A and B exactly exclude each other if the one has structure (1) and the other structure (4), or if the first has structure (2) and the second structure (3).

For example:

"All birds are winged." contradicts

"Some birds are not winged." because both sentences cannot be true at the same time AND both sentences cannot be false at the same time.

"Some birds are winged." contradicts

"All birds are not winged." because of the same reasons.

(see Copi, 1983, Volume 1, p.91)

"Contrary" means that the two sentences cannot be true at the same time if A and B contain the same objects. But "contrary" sentences can be false at the same time.

For example:

"All birds are flying animals." and "All birds are not flying animals."

cannot be true at the same time. Both can be false simultaneously - if there are flying birds (like the sparrow) and birds that cannot fly (like the ostrich). (see Copi, 1983, Volume 1, p.93)

If two sentences are "subcontrary", they can’t be false at the same time, but they easily can both be true. (see Copi, 1983, Volume 1, p.94)

"Some birds are flying animals." and

"Some birds are not flying animals."

is such a pair of sentences. If the second is false, the first must be true, and vice versa. Looking at the actual world, both sentences are in fact true.

The last relation mentioned by Aristotle is the "subaltern" relation. If one sentence is subaltern to another (which again means that A and B are replaced by the same objects), then the subaltern sentence can be concluded from the other. Assuming sentence B is subaltern to sentence A, this means that IF sentence A is true, sentence B must be true too.

Sentences with structure (3) are subaltern to sentences with structure (1). Thus if "All politicians are tax-payers." is true, "Some politicians are tax-payers." must also be true. In other words: from "All politicians are tax-payers." it follows that "Some politicians are tax-payers.", but not the other way round. Accordingly, from "All politicians are not tax-payers." (structure 2) you can derive "Some politicians are not tax-payers." (structure 4) (see Copi, 1983, Volume 1, p.96). The best-known discovery of this approach was that you cannot conclude from "All people who lie to the public are politicians." that "All politicians lie to the public.", or in abstract terms, that

All A are B doesn’t mean that

All B are A.
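All four relations of the square can also be checked mechanically. The sketch below is my own illustration, not from the paper: it runs through every pair of non-empty subsets A and B of a tiny invented domain and asserts the relations; following the Aristotelian reading, both terms are assumed to be non-empty.

# A sketch (not from the paper) checking the relations of the square over all
# non-empty subsets A, B of a tiny domain. Terms are assumed non-empty.
from itertools import combinations

domain = {1, 2, 3}

def subsets(s):
    """All non-empty subsets of s (Aristotle assumes non-empty terms)."""
    return [set(c) for r in range(1, len(s) + 1) for c in combinations(s, r)]

def all_ab(A, B):   return A <= B         # structure (1): "All A are B."
def no_ab(A, B):    return not (A & B)    # structure (2): "All A are not B."
def some_ab(A, B):  return bool(A & B)    # structure (3): "Some A are B."
def some_not(A, B): return bool(A - B)    # structure (4): "Some A are not B."

for A in subsets(domain):
    for B in subsets(domain):
        assert all_ab(A, B) != some_not(A, B)       # (1) and (4) are contradictory
        assert no_ab(A, B) != some_ab(A, B)         # (2) and (3) are contradictory
        assert not (all_ab(A, B) and no_ab(A, B))   # (1) and (2) are contrary
        assert some_ab(A, B) or some_not(A, B)      # (3) and (4) are subcontrary
        assert (not all_ab(A, B)) or some_ab(A, B)  # (3) is subaltern to (1)
print("All relations of the square hold on this domain.")

If empty terms were allowed, the contrary, subcontrary and subaltern relations would fail on the modern reading, which is why the non-emptiness assumption is built into the sketch.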

Aristotle's work also included a more powerful instrument called syllogism analysis. It was concerned with deductions from two premises to one conclusion; all of these sentences had to be quantified sentences. An example of an argument having the form of an Aristotelian syllogism is the following:

"Da alles Wissen von den Sinneseindrücken herstammt und da es keinen Sinneseindruck der Substanz selbst gibt, folgt logisch, daß es kein Wissen der Substanz gibt." (Pirsig, 1995, p.35)

(You can check its validity by intuition or look it up in the appendix.)

Since I will introduce a more powerful system that is also capable of analysing all forms of syllogisms, I won't describe Aristotle's syllogism analysis here.

2.2 Propositional logic

Later a much more versatile approach called propositional logic was established. It wasn't bound to special structures and could analyse any text that consists of sentences or clauses connected by the junctors AND, OR, BECAUSE, EITHER...OR, IF, ONLY IF and NEITHER...NOR. The central concept of this system is the proposition. "A proposition is the smallest unit of knowledge that can stand as a separate assertion." (Anderson, 2000, p.470) It is represented by a sentence that has the same content as the corresponding proposition. Examples of propositions are:

“It is raining.“

“Mary buys a bike.“

Two propositions are different only if they differ in content. So the sentences

“It’s raining.“

“Es regnet.“

“Il pleut.“

all stand for the same proposition. Sentences can also include more than one proposition, as the following example shows.

“The old man goes into the cinema and his wife into the theatre.“

The sentence can be divided into:

1. “The old man goes into the cinema.“
2. “His wife goes into the theatre.“

One can subdivide the first part into smaller units, so the propositions are:

1a. “The man goes into the cinema.“
1b. “The man is old.“

2. “His wife goes into the theatre.“

In propositional logic, arguments are examined by dividing them into such smaller parts. Propositions cannot be subdivided and are therefore the smallest possible units.

Following Wessel's description, we need the following set of symbols:

1. Proposition variables: p, q, r, p', q', r', p'', ...
2. Junctors: ~ (not), | (not ... or not ...), ↓ (neither ... nor ...), ∧ (and), ∨ (or), ⊃ (if ... then), ≡ (if and only if)
3. Auxiliary symbols: ), ( (see Wessel, 1998, p.36)

To inspect the structure of an argument it is necessary to formalize it. This means creating formulas that represent the structure of the argument. We separate the propositions of the text and replace each new proposition by a new variable, so that the same proposition gets the same proposition variable each time it occurs in the argument.

Accordingly:

1st If Kennedy dies, then Nixon mourns or Jackie O. is sad.

2nd Kennedy dies and Nixon doesn’t mourn.

So Jackie O. is sad.

is converted into:

1st p ⊃ (q ∨ r) with p = Kennedy dies; q = Nixon mourns; r = Jackie O. is sad

2nd p ∧ ~q

=> r (=> indicates that this is the conclusion of the argument)

If the argument is complete then all necessary information is given in its premises.[6]

Now, we have got a complete representation of an easily understandable argument in complicated formulas. What is that good for?

There are two benefits we have gained by generating the formulas:

Firstly, we got rid of the content of the argument and therefore can think about its validity without any prejudice.

Secondly, there are procedures for analysing the structure of such formulas. With the help of those procedures it is possible to tell whether and why an argument of any complexity is structurally valid or invalid.
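One such procedure, used here only as an illustration, is the truth-table method: list every possible assignment of truth values to the proposition variables and check that the conclusion is true in every row in which all premises are true. The sketch below is my own, not part of the paper; the function name valid and the premise names are invented.

# A minimal sketch (not from the paper): a brute-force truth-table check of the
# Kennedy argument formalized above. The argument is valid iff the conclusion is
# true in every row in which all premises are true.
from itertools import product

def valid(premises, conclusion):
    """True if no truth assignment makes all premises true and the conclusion false."""
    for p, q, r in product([True, False], repeat=3):
        row = {"p": p, "q": q, "r": r}
        if all(prem(row) for prem in premises) and not conclusion(row):
            return False   # counterexample row found: the argument is invalid
    return True

premise1   = lambda v: (not v["p"]) or (v["q"] or v["r"])  # p ⊃ (q ∨ r)
premise2   = lambda v: v["p"] and not v["q"]               # p ∧ ~q
conclusion = lambda v: v["r"]                              # r

print(valid([premise1, premise2], conclusion))             # True

Since no assignment makes both premises true and the conclusion false, the Kennedy argument is structurally valid, no matter what its propositions are about.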

2.2.1 The analysis of the validity of arguments in propositional logic

The analysis of arguments in propositional logic is based on the assumption that certain basic argument structures are valid independently of their content. One example of such a structure is the so-called Modus Ponens (MP). When A and B stand for any two assertions, MP says that if we know

that A implies B and we also know

that A is true, then we can infer

that B is true.

In formulas this argument is expressed in the following way:

Modus Ponens (MP): 1st A ⊃ B

2nd A

=> B[7]

So if we are given only the first two lines as true, we can add the third line and infer that it is true as well. This conclusion is valid by definition. The following example has the Modus Ponens structure:

"[1st] Hätte Pluto, so wie dies aus Hallidays Berechnungen hervorgeht, einen Durchmesser von mehr als 4200 Meilen, denn hätte man vom McDonald- Observatorium (in Fort Davis, Texas) aus eine Verfinsterung beobachtet; [2nd] die Aufzeichnungen weisen eindeutig darauf hin, daß eine solche nicht beobachtet wurde. [=>]Daher muß Pluto entweder von dieser Größe oder kleiner sein; er kann nicht größer sein." (Nicholson, 1967, the squared brackets don’t occur in the original.)

[...]


[1] Solutions to all these examples can be found in the appendix.

[2] I will come back to this type of sentence later.

[3] To save space for more interesting approaches than Aristotle's ancient basics, I won't introduce all the relations and transformations found by Aristotle.

[4] This rule is called obversion (see Copi, 1983, Volume 1, p.104).

[5] Sometimes I will use the "No"-form, because it is then easier to grasp the meaning of the sentence.

[6] If there's a lack of information, the argument can't be logically cogent; to claim the truth of the conclusion is then merely a clever guess. All known information about the matter at hand can be inserted in the form of premises. The one in the second line could be such a premise: it states that we have the additional information that Kennedy dies and that Nixon doesn't mourn.

[7] A ⊃ B means "If A then B"; => means that this assertion is derived from the lines above.
