Componential Analysis Method


Research Paper, 2002

21 pages, Grade: 3




Contents

1. Introduction
1.1 The definition and aims of componential analysis

2. Different schools of componential analysis
2.1 Semantic primes and the use property
2.2 Katz's theory and the criticism it has drawn

3. Decomposition of verbs

4. Summary

Bibliography

1. Introduction

“Linguistic semantics as a theoretical discipline concerned with the content of linguistic signs is not just the study of meaning, but the study of meaning as structuralized by language and intended to be used in the process of communication. This means that the units of the symbolic levels – particularly those of the lexicon – perform not only the functions of signification and classification of reality. They are meant to be used in actual speech events to convey and interpret experience of the world, to express the speaker’s attitudes and his/her intention to influence the attitudes of the hearer, and to construct written or spoken instantiations of linguistic messages.” (cf. Igor 99)

The systematic productivity of linguistic meaning is its most striking feature and it distinguishes human languages from many other semiotic systems.

In traditional linguistic analysis, the keen semantic intuitions of the linguist are the only test of whether the significance attached to an abstract element really remains constant wherever that element is used. Such judgements are tricky, however, especially when an analysis contains more than one abstract element, because it may then be difficult to know just what "part" of the meaning of real words is being attributed to each abstract element. Moreover, word meanings cannot be studied in isolation from the sentences in which they occur: word meanings must be able to provide an appropriate finite base for an adequate recursive theory of indefinitely many sentential meanings.

Whether written or spoken, language is used to communicate messages and to convey the ideas of the writer or speaker. In order to capture the sense of these ideas more precisely, the method of Componential Analysis (C.A.) has been introduced into linguistic semantics.

1.1 The definition and aims of Componential Analysis

Lexical decomposition is an alternative term for C.A.: "analysis into meaning components is called decomposition." A good C.A. should pursue the following aims:

Meaning: providing models for word meanings

Basic meanings: reducing the vast variety of lexical meanings to a limited number of basic meanings

Precision: providing a means of presentation that allows a precise interpretation of lexical items

Meaning relations: explaining meaning relations within and across lexical fields

Composition: explaining the compositional properties of lexical items

Language comparison: determining the semantic relations between expressions of different languages

Since a professed goal of modern linguistics is to characterise the class of possible human languages as narrowly as possible, part of this goal is to characterise the class of possible word meanings of natural languages equally narrowly.

Linguists have traditionally viewed decompositional analyses as a step towards this goal, since decompositional analysis is supposed to reveal a universal set of fundamental "units of meaning". On such a constructional view, a possible word meaning is anything that can be constructed out of these fundamental units by some specified method of putting them together (cf. Dowty 1979, p. 33).

One of the aims of lexical decomposition is thus reduction.

2. Different schools of Componential Analysis

Let us now look at the historical background of componential analysis and at the various schools of linguistic thought (structural and non-structural, for example) that have shaped it:

The Swiss linguist Ferdinand de Saussure tried to establish 'a simplest and most elegant explanation' of canalisation, and 'introduced a new method, a structural method, into general linguistics.'

The Danish linguist Louis Hjelmslev, a representative of early European structuralism, was the first to make a definite proposal for a componential semantics (cf. Cruse 2000). Hjelmslev-style procedures begin with complex meanings and reduce them to simpler ones, guided by the meanings of other words.

This process goes back to Leibniz: when reduction could go no further, Leibniz thought, one would have arrived at the fundamental units of thought. He was the first to propose an 'alphabet of thought' obtained by reducing complex meanings to combinations of simple ones.

Remarkably, lexical decomposition was first used by anthropologists as a technical method for describing and comparing the vocabulary of kinship in different languages. Only some years later did scholars such as Lamb, Weinreich, and Katz and Fodor take it up and develop it into a general theory of semantic structure (cf. Lyons 1977).

Lyons is one representative of this modern tradition. He defines C.A. as "one way of formalising, or making absolutely precise, the sense-relations that hold among lexemes."

There are three kinds of formulations:

a. Sense-components (the sense of a word can be represented as the product of sense-factors):

“MOTHER” = “parent” x “female”

b. Negation-operator (as defined in standard propositional logic: '~'):

“MIDDLE-SIZED” = [~small] ∧ [~big]

c. Conjunction (& for and)

“CHILD” = HUMAN & ~ADULT (cf. Lyons 1996, pp. 107-109)

To sum up, each lexical item is entered in the dictionary with a complex of semantic components. A set of redundancy rules applies automatically to reduce the number of components stated for each item. Lexical relations can then be stated in terms of the components (cf. Saeed 1997, p. 234).
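To make this concrete, the idea of stating lexical relations over components can be sketched in a few lines of Python. This is only an illustrative toy, not Lyons' or Saeed's own formalism: the lexicon entries, the "~" notation for negated components, and the helper functions are all invented for the example.

```python
# A toy componential lexicon, assuming a notation in which each lexeme is a set
# of components and "~X" marks the negation of component X. All entries are
# invented for illustration.

LEXICON = {
    "woman": {"HUMAN", "ADULT", "FEMALE"},
    "child": {"HUMAN", "~ADULT"},
    "girl":  {"HUMAN", "~ADULT", "FEMALE"},
}

def hyponym_of(specific: str, general: str) -> bool:
    """A hyponym carries all the components of the more general term, plus more."""
    return LEXICON[general] < LEXICON[specific]

def incompatible(a: str, b: str) -> bool:
    """Two lexemes clash if one asserts a component that the other negates."""
    comp_a, comp_b = LEXICON[a], LEXICON[b]
    return any("~" + c in comp_b for c in comp_a if not c.startswith("~")) or \
           any("~" + c in comp_a for c in comp_b if not c.startswith("~"))

print(hyponym_of("girl", "child"))     # True: CHILD = HUMAN & ~ADULT is contained in GIRL
print(incompatible("woman", "child"))  # True: ADULT clashes with ~ADULT
```

Hyponymy then comes out as component inclusion, and incompatibility as a clash between a component and its negation.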

In America, Componential Analysis (C.A.) developed independently. The followers of Sapir and Whorf (even though Whorf himself was an anti-universalist) promoted a particular kind of structuralist lexical semantics (as opposed to non-lexical semantics), one feature of which was that it operated with a set of atomic components of lexical meaning assumed to be universal. Pinker is one representative of this line of work; he considers 'the grammatically relevant subset' of meaning components to be the main focus of research into language universals and language acquisition (cf. Saeed 1997, p. 240).

2.1. Semantic primes and the use property

Another famous contemporary componentialist, Anna Wierzbicka, differs from the others in that she takes her inspiration not from structuralism but from further back in the past: she draws on Leibniz, starting with a small list of what appear to be indispensable notions and trying to express as many meanings as possible with these, adding items to the list of primitives only when forced to do so.

Her argument is that, just as all humans are born with certain innate capacities, there is surely a set of primitives expressible in all languages. She dismisses analyses of the Katz and Fodor variety as not so much genuine analyses of meaning as translations into an artificial language for which no one has any intuitions. A typical Wierzbickan analysis looks as follows:

X punished Y for Z:

(a) Y did Z.

(b) X thought something like this:

(c) Y did something bad (Z).

(d) I want Y to feel something bad because of this.

(e) It will be good if Y feels something bad because of this.

(f) It will be good if I do something to Y because of this.

(g) X did something to Y because of this.

This analysis is intended to capture, in maximally simple terms, the fact that punishment is the objectively justifiable causing of suffering for an offence. As Wierzbicka herself notes, if the role of these primitives as the foundation of all complex meanings is recognised, they can be used as an instrument for improving lexicography (cf. Pulman 1983, pp. 25, 67). Wierzbicka is opposed to abstract semantic primitives; instead she favours components that can be grasped by direct intuition. Her approach lends itself to language comparison, but it also has drawbacks: one is a lack of precision, and another is that it cannot define words involving antonymy and directional opposition (cf. Löbner 2002, pp. 148, 150).

“NSM [expressions] (Natural Semantic Metalanguage) are RELATIVELY more translatable than the vast majority of other English words.”[1]

“ ‘Lexical’ is used in a broad sense to include not only words, but also bound morphemes and fixed phrases. In many languages there are primitives that are expressed by bound morphemes, rather than by separate words.”[2]

The claim that semantic primitives are universal may be given empirical motivation of the following kinds. (Note that some meanings of morphologically complex words are not composed from the meanings of the morphological 'bits' in question: someone ≠ some + one, maybe ≠ may + be, and inside ≠ in + side.)

1. Some morphological markers
a. weaken, strengthen, widen, darken, loosen, broaden, harden
b. enable, enrich, enlarge, embitter
c. clarify, solidify, purify
d. realise, recognise, capitalise, decolonise

2. Negation

possible-impossible, happy-unhappy, cover-uncover, ever-never, nowhere, nothing, etc.

3. Repetition

renew, redevelop, reunion, etc.

4. Decline, Subtraction

careless, defrost, decompose, defeat, deduct, devalue, painless, etc.

5. Beginning vs. End

boiling - boiled, filling - filled, developing - developed

(as in German: erblühen - verblühen, erleuchten - verleuchten)

6. Back, Against

backdate, backfire, backtrack, etc.

(as in German: zurückgehen, zurückgeben, zurückführen, zurücklegen, zurücktreten, etc.)

7. Lexical items often share minimal semantic components (see the sketch below)

a. breathe, drink, eat, inhale, sniff ⇒ INGEST

b. buy, sell, give, take, steal ⇒ TRANSFER

(cf. Siemund, Componential Analysis handout)
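Point 7 above can be pictured with a small sketch that groups verbs by a shared minimal component. The component labels INGEST and TRANSFER follow the handout; the dictionary and the helper function are invented for illustration.

```python
# A toy grouping of verbs by a shared minimal semantic component (point 7 above).
# The dictionary and helper function are invented for the example.

SHARED_COMPONENT = {
    "breathe": "INGEST", "drink": "INGEST", "eat": "INGEST",
    "inhale": "INGEST", "sniff": "INGEST",
    "buy": "TRANSFER", "sell": "TRANSFER", "give": "TRANSFER",
    "take": "TRANSFER", "steal": "TRANSFER",
}

def verbs_sharing(component: str) -> list:
    """All verbs in the toy lexicon whose meaning contains the given component."""
    return sorted(v for v, c in SHARED_COMPONENT.items() if c == component)

print(verbs_sharing("INGEST"))    # ['breathe', 'drink', 'eat', 'inhale', 'sniff']
print(verbs_sharing("TRANSFER"))  # ['buy', 'give', 'sell', 'steal', 'take']
```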

2.2 Katz's theory and the criticism it has drawn

Another representative is Katz, who advocated a theory of abstract semantic primitives.

The central idea of Katz's theory is that semantic analysis must be recursive and compositional (cf. Saeed 1997, pp. 234-37). A Katzian dictionary entry looks like this:

Bachelor {N}

a. (Human) (male) [one who has never been married]
b. (Human) (male) [young knight serving under the standard of another knight]
c. (Human) [one who has the first or lowest academic degree]
d. (Animal) (male) [young fur seal without a mate in the breeding season]

An entry of this kind contains two types of semantic component: first, the elements within parentheses ( ), called semantic markers; second, the distinguishers, which are shown within square brackets [ ].

The common-sense idea behind the theory is this: part of a word's meaning is shared with other words, while part is unique to that word. The essential part of the theory is the attempt to establish a semantic metalanguage (a symbolic language) through the identification of semantic components, following a 'substance theory of semantic primes': in simple terms, a theory of decomposition. The theory cannot answer questions like Quine's: how do we decide that a marker stands for one particular concept rather than for some different but coextensive one? (I will return to this point in the next part.)
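The division of an entry into shared markers and an idiosyncratic distinguisher can be illustrated with a small sketch. This is not Katz's own formalism; the Reading class and the selection function below are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    markers: frozenset   # systematic, shared components such as (Human), (Male)
    distinguisher: str   # the idiosyncratic residue of meaning

# The four readings of "bachelor" from the entry above (markers simplified).
BACHELOR = [
    Reading(frozenset({"Human", "Male"}), "one who has never been married"),
    Reading(frozenset({"Human", "Male"}), "young knight serving under the standard of another knight"),
    Reading(frozenset({"Human"}), "one who has the first or lowest academic degree"),
    Reading(frozenset({"Animal", "Male"}), "young fur seal without a mate in the breeding season"),
]

def readings_with(required_markers: set) -> list:
    """Keep only the readings whose markers satisfy a contextual requirement,
    e.g. a predicate that selects for a (Human) argument."""
    return [r for r in BACHELOR if required_markers <= r.markers]

# A context demanding a human referent rules out the fur-seal reading:
print(len(readings_with({"Human"})))   # 3
```

The markers do the systematic work (they can be checked against contextual requirements), while the distinguisher carries the residue of meaning that is unique to each reading.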

Pulman dismissively labels Katz's theory 'Markerese'. In 'Against Semantic Primitives' he argues that Katz's approach, a substance theory of semantic primes, tries to characterise the semantic relations of hyponymy, antonymy, synonymy, contradiction, entailment, and so on, but that the resulting apparatus becomes enormously complex and remains inadequate.

Katz's theory has therefore been argued to be unnecessary: "Katz's readings express entailments from the term under analysis, but [...] they do not constitute a biconditional definition, or full analysis of the meaning of that term… It is impossible to undertake a grammatical analysis that has in no way been influenced by meaning, and it is equally impossible to undertake an analysis purely based on meaning." (cf. Palmer 1987, pp. 8, 44)

Pustejovsky also objects to semantic primitives. He opposes mere feature decomposition and turns instead to hitherto unexplored aspects of lexical meaning. What he would like to do is "to propose a new way of viewing decomposition, looking more at the generative or compositional aspects of lexical semantics, rather than decomposition into a specified number of primitives."

Pinker and Grimshaw also argue against semantic primitives of this kind; they place more emphasis on the syntactic behaviour of words, i.e. they use lexical rules instead of lexical relations.

3. Decomposition of verbs

Verbs are undoubtedly the most important lexical category of English. Every English sentence must contain at least one verb, though it need not contain a noun; at the same time, compared with nouns, there are far fewer verbs in English. Verbs are thus the most complex lexical category, and for any theory of lexical semantics, representing the meanings of verbs is especially difficult. The evidence that verbs actually are organised in lexical memory in terms of irreducible meaning components is controvertible.

There is a difference between decomposition and a relational analysis, even if certain elements of a decompositional analysis constitute the semantics of some of the relations in a relational analysis.

Another important point to be aware of is that the kinds of relations used to build networks of nouns and adjectives could not be imported directly into the verb lexicon, or at least had to be modified when they were applied to verbs. The complexity of English verb predicates also bears on the difficulty of analysing noun phrases: if we can find some framework that makes the semantic relations of verbs clear, it will help us to analyse noun phrases, and indeed whole sentences, more easily (cf. Miller & Fellbaum 1992, pp. 214-215).

How finitely many lexical meanings can support indefinitely many sentential meanings must be addressed as a central question in any theory of meaning. Context is the key element in deciding which meaning is intended; this applies especially to polysemous words.

Thus the following sentence is one such case:

Barbara threw a colourful ball. This sentence has two readings: 'she gave a party' and 'she caused a round object to move through the air'. In such a sentence, the components of meaning of 'throw', 'colourful', and 'ball' must all be compatible. 'Throw' can mean 'to sponsor with a flourish' or 'to cause something to move through the air'; 'colourful' can mean 'abounding in contrast or variety of colours' as applied to a social event or to a physical object; and 'ball' can mean 'an occasion for social dancing' or 'a small spherical object'.

The two combinations of meaning in this one clause are meaningful only because the semantic components of the words fit together properly in each case. In order to decide which of the two readings is correct, we must relate the sentence to its context, asking ourselves: What kind of person is Barbara? Who is she? Is she active in social life, or is she a mother of two children, just playing with them in the playground? If the context is taken into account, choosing between the two readings presents, I think, no real difficulty.

Making lexical contrasts and similarities explicit is thus the second aim of lexical decomposition.

Perhaps the most widely used semantic decomposition of verbs is the CAUSE sub-predicate of generative semantics, which expresses a causal semantic relation:

“die = BECOME(DEAD(x))

kill = CAUSE(x, DIE(y)); kill = CAUSE(x, BECOME(DEAD(y)))

give = CAUSE(x, HAVE(y, z))

show = CAUSE(x, BECOME(SEE(y, z)))

teach = CAUSE(x, BECOME(KNOW(y, z)))

replenish = CAUSE(x, BECOME(AGAIN(FULL(y))))”[3]

There is a large number of inchoative and causative verbs whose meanings can be decomposed in this way. Even though some of these decompositions can be taken further, this kind of approach is still powerful in explaining meaning.
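The decompositions above also lend themselves to a toy computational reading in which the entailment from kill to die falls out as sub-term containment. The following sketch makes that simplifying assumption explicit; it is not the generative semanticists' formal system, and the term encoding and containment test are invented for illustration.

```python
# A toy term representation of decompositions like kill = CAUSE(x, BECOME(DEAD(y))).
# Purely illustrative: predicates are nested tuples, and "entailment" is read,
# very crudely, as containment of one decomposition inside another.

def term(op, *args):
    return (op, *args)

DIE  = lambda y: term("BECOME", term("DEAD", y))
KILL = lambda x, y: term("CAUSE", x, DIE(y))

def contains(t, sub):
    """Does decomposition t contain sub as a sub-term?"""
    if t == sub:
        return True
    return isinstance(t, tuple) and any(contains(a, sub) for a in t[1:])

print(contains(KILL("x", "y"), DIE("y")))   # True: 'x killed y' entails 'y died'
print(contains(DIE("y"), KILL("x", "y")))   # False: 'y died' does not entail 'x killed y'
```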

We cannot simply replace kill by cause to die or cause to become not alive, nor can we say that die is equivalent to become not alive. Even if we were to accept the structuralist doctrine that we only need to isolate the primitive semantic contrasts of a language, and need not analyse these any further, we still face the problem of knowing whether the theoretical construct 'cause' used in analysing one kind of word really represents the same meaning as it does when it is used in analysing another kind of word (cf. Dowty 1979, p. 97).

Certainly to kill entails to die, but it is not a hyponym of (or synonymous with) to die: if one person killed another, this does not mean that the doer died; it is the patient who died. Nor does dead (NOT ALIVE) come out as a hyponym of alive (ALIVE). "The analytic expression cause to die suggests a looser and more indirect causal link than that conveyed by the lexical causative kill."[4] Consider, for instance, that using poison is one way of causing someone to die without using a weapon to kill them; killing is thus one way of causing somebody to die, but not the only way of bringing about that result. Of course, we can still use 'cause to die' in place of 'kill'. The key point is that "a satisfactory system of lexical decomposition must take account of the different ways in which semantic components combine together". We can only make such assertions once elements like LIKE and CAUSE are assumed by us, as investigators, to be part of a semantic metalanguage, used in the same way as the rest of English. Then 'kill' means 'cause to become not alive' on the strength of intuitive judgements about the status of sentences and about the relationships between words, phrases and sentences of native English (cf. Pulman 1983, pp. 37-38).

The word butter belongs to the same kind of use: TO BUTTER blocks the application of lexical rules or other linguistic processes, and frighten in its transitive sense can be analysed as CAUSE TO BECOME AFRAID. In each case the same rule applies: a structure containing less complex English words is used to interpret more complex ones, rather than a series of concepts represented by symbols.

The third aim of lexical decomposition is to distinguish lexical relations and entailments. Dowty's structures (see the examples above) and Jackendoff's 'generative semantic' structures are very similar to each other, although they differ in their respective theoretical foundations and objectives, and the theory is applied only to verb meanings. To demonstrate, I give some of Jackendoff's examples below:

a. John went home. This can be syntactically analysed as:

[S [NP John] [VP went [PP TO [NP home]]]]

and conceptually analysed as: [Event GO ([Thing John], [Path TO ([Place Home])])]

The conceptual decompositions for the verb go and the preposition to are given as:

go: [Event GO ([Thing]i, [Path]j)]    to: [Path TO ([Place]i)]

b. drink = [Event CAUSE ([Thing]i, [Event GO ([Thing LIQUID]j, [Path TO ([Place IN ([Thing MOUTH OF ([Thing]i)])])])])]

The index j marks an argument that may be left implicit in the sentence. This formulation is a great advance on Katz's; at least it provides a clear way of decomposing meaning. (I also do not agree with Katz's way of analysis, which is too superficial and whose structure is inflexible.)

c. Mali hosted the meeting.

host: [Event CAUSE ( [Thing_]i,[Event ACT([Thing HOST]j , [Path AS([Place FOR([Thing _])])])])]

CAUSE (Mali, ACT (HOST, AS (FOR (MEETING))))

d. Harry buttered the bread.

butter: CAUSE (HARRY, GO (BUTTER, TO (ON (BREAD))))

Butter is here used as a noun-verb (a denominal verb), because it is semantically related to the noun butter.

There are many such noun-verb pairs in English, e.g. hammer/hammer, phone/phone, bottle/bottle, package/package. These theories define a formal language in which every major syntactic constituent of a sentence (NP, AP, PP, etc.) corresponds to a major conceptual constituent (Event, Thing, Place, Path); Jackendoff's style of semantic analysis is an important contribution to the field of C.A.
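To show how the bracketed notation in these examples can be handled mechanically, the following sketch encodes conceptual constituents as nested records and prints them back in roughly the same format. The encoding and helper names are invented; this is not Jackendoff's formal machinery.

```python
# A toy encoding of Jackendoff-style conceptual constituents as nested records.
# Purely illustrative and simplified.

def cc(category, function, *args):
    """A conceptual constituent: [category FUNCTION (arg1, arg2, ...)]."""
    return {"cat": category, "fn": function, "args": list(args)}

# John went home: [Event GO ([Thing JOHN], [Path TO ([Place HOME])])]
went_home = cc("Event", "GO",
               cc("Thing", "JOHN"),
               cc("Path", "TO", cc("Place", "HOME")))

def render(c):
    """Pretty-print a constituent in the bracketed notation used in the text."""
    inner = ", ".join(render(a) for a in c["args"])
    return f"[{c['cat']} {c['fn']}" + (f" ({inner})" if inner else "") + "]"

print(render(went_home))
# [Event GO ([Thing JOHN], [Path TO ([Place HOME])])]
```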

Decomposition methods have also been used in typological studies of the lexicon, e.g. in Leonard Talmy's work on how languages 'package' lexical meaning. Talmy observed that English motion verbs typically incorporate a 'manner' component (waltz, fly, walk, etc.), whereas Spanish motion verbs incorporate a 'path' component (cf. Goddard 1998, Chapter 8).

The following set of words, used by Cruse, also illustrates the decomposition of verbs.

rise raise high

fall lower low

lengthen (1) lengthen (2) long

shorten (1) shorten (2) short

(lengthen (1) and shorten (1) are intransitive, like rise and fall; lengthen (2) and shorten (2) are transitive/causative, like raise and lower)

rise = [BECOME][MORE][HIGH]

fall = [BECOME][MORE][LOW]

raise = [CAUSE][BECOME][MORE][HIGH]

lower = [CAUSE][BECOME][MORE][LOW] or
= [CAUSE][BECOME][LESS][HIGH] (the latter giving a more economical inventory of components).

lengthen (1) = [BECOME][MORE][LONG]

shorten (1) = [BECOME][MORE][SHORT]

lengthen (2) = [CAUSE][BECOME][MORE][LONG]

shorten (2) = [CAUSE][BECOME][MORE][SHORT] or
= [CAUSE][BECOME][LESS][LONG] (again the more economical option).

The notion of a reference point related to the above is also given as:

high = [MORE][HEIGHT][REF: Average]

low = [LESS][HEIGHT][REF: Average]

long = [MORE][LENGTH][REF: Average]

short = [LESS][LENGTH][REF: Average]
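The regularity in these entries, namely that the inchoative prefixes [BECOME][MORE] to an adjectival component and the causative adds [CAUSE] on top of that, can be sketched as follows. The helper functions are invented for illustration and simply follow the notation used in the text.

```python
# A sketch of how the inchoative and causative decompositions above can be
# derived from an adjectival component. Illustrative only.

def inchoative(adj_component):      # e.g. rise = [BECOME][MORE][HIGH]
    return ["BECOME", "MORE", adj_component]

def causative(adj_component):       # e.g. raise = [CAUSE][BECOME][MORE][HIGH]
    return ["CAUSE"] + inchoative(adj_component)

print(inchoative("HIGH"))   # ['BECOME', 'MORE', 'HIGH']           ~ rise
print(causative("LONG"))    # ['CAUSE', 'BECOME', 'MORE', 'LONG']  ~ lengthen (2)
```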

As a second example, Cruse analyses some verbs as follows:

dream: process; mental; during sleep; experience unreal events

kiss (v.): action; physical; intentional; apply lips to something; functions as conventional signal

In the same way, I would interpret some further verbs like this:

sleep [ACTION][PHYSICAL NEED][TO ALL ANIMATES][MOSTLY AT NIGHT]

bite [PROCESS][USING TEETH][A KIND OF NEED]

drive [ACTION][TO MOTOR VEHICLE][USING FOUR LIMBS]

fly [ACTION][HIGH ABOVE THE GROUND][WITH INSTRUMENT] <[HUMAN]>

[ACTION][HIGH ABOVE THE GROUND][WITH WINGS] <[BIRD]>

hop (like frog) [ACTION][USING LIMBS ALTERNATELY]

think [PROCESS][MENTAL][DURING THE DAY][EXPERIENCE REAL EVENTS]

So far so good, but how can verbs that mean different things in different contexts be analysed? Cruse's examples specify the relevant selectional restrictions:

John expired. (Here 'expired' means 'died'.)

My driving licence has expired. (Here ‘expired’ means ‘has become invalid’)

expire =[BECOME][NOT][ALIVE]<[HUMAN]>

=[BECOME][NOT][VALID]<[DOCUMENT]>

Alternatively, the additional restriction can be placed before the head of each reading, thus: [HUMAN] and [DOCUMENT].

He explains that the formulation above can be understood in two ways: (a) when the subject is human, the reading NOT ALIVE is chosen, and when it is a document, the reading NOT VALID is the right choice; (b) if the subject is neither of these, the sentence is anomalous, as in '?The cup expired'.
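These two readings of the selectional restrictions can be mimicked in a small sketch in which the subject's feature picks out the matching decomposition and any other subject is flagged as anomalous. The sense inventory, the feature assignments, and the function below are toy assumptions, not Cruse's own mechanism.

```python
# An illustrative sketch of selectional restrictions driving sense selection
# for "expire". The sense inventory and subject features are toy data.

EXPIRE_SENSES = [
    {"decomposition": ["BECOME", "NOT", "ALIVE"], "selects": "HUMAN"},
    {"decomposition": ["BECOME", "NOT", "VALID"], "selects": "DOCUMENT"},
]

SUBJECT_FEATURES = {"John": "HUMAN", "driving licence": "DOCUMENT", "cup": "ARTEFACT"}

def interpret_expire(subject: str):
    """Return the matching decomposition, or flag the sentence as anomalous."""
    feature = SUBJECT_FEATURES.get(subject)
    for sense in EXPIRE_SENSES:
        if sense["selects"] == feature:
            return sense["decomposition"]
    return f"? anomalous: '{subject} expired'"

print(interpret_expire("John"))             # ['BECOME', 'NOT', 'ALIVE']
print(interpret_expire("driving licence"))  # ['BECOME', 'NOT', 'VALID']
print(interpret_expire("cup"))              # ? anomalous: 'cup expired'
```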

But what about cases where the restriction involved is not quite the same kind of thing as this; can they also be analysed? The following two sentences are related in this way:

Mary drank milk. drink [INCORPORATE][BY MOUTH], <[LIQUID]>

John drank the sulphuric acid. drink [INCORPORATE LIQUID][BY MOUTH]

Of these two cases he argues that there are good reasons for distinguishing relatively intrinsic from relatively extrinsic co-occurrence restrictions, even if it is difficult in some cases to pin down exactly what the co-occurrence constraints are. The ability to judge anomaly is thus the fourth aim of the C.A. method.

The last aim of lexical decomposition is to avoid discontinuities. The sentence 'I switch my computer on, and again I switch it off' does not implicate that it was I who first switched it off before I used the computer.

What should attract our attention is that different languages also differ in how their words decompose. When we translate or speak a foreign language, we will produce wrong or funny sentences if we choose words whose semantic components are not compatible and do not fit together properly. The problem is especially severe in translation, where one is introducing new ideas and new collocations. We cannot simply translate the English word become by the German bekommen, for example; such pairs are commonly called 'false friends'.

(In German: falsche Freunde.) English become is used for a change of state, i.e. from 'not being' to 'being' or from 'not existing' to 'existing', while German bekommen is not: bekommen is used for a change of possession, from 'not having' to 'having'. Such examples can be found in Leisi's Praxis der englischen Semantik or in Barnickel, Klaus-Dieter's False Friends: A Comparative Dictionary German-English.

4. Summary

In sum, lexical decomposition represents the sense of a word in terms of the semantic features that make up that word. As a method for characterising the sense of words, lexical decomposition has several advantages:

First, it explains our intuitions as speakers of English that the meanings of some words are more closely related to one another than to the meanings of others.

“Second, it is easy to characterise the sense of additional words by adding features.”

Finally, this method allows us, at least in principle, to characterise the senses of a potentially infinite set of words with a finite number of semantic features. “In general, the fewer the number of statements required by a theory to account for a given set of observations, the more highly valued the theory.”[5]

On the other hand, lexical decomposition has several practical limitations. First, linguists have been unable to agree on exactly how many and which features constitute the universal set of semantic properties, especially once we go beyond the handful of features already mentioned. Moreover, nouns, especially concrete nouns, seem to lend themselves to lexical decomposition more readily than other parts of speech such as verbs (cf. Parker & Riley 1994, pp. 38-39). Nevertheless, linguists have not given up on studying and using this method, e.g. in compiling dictionaries. The Componential Analysis Method is therefore not expected to be abandoned unless and until it has been explored in much greater depth and found either successful or ultimately unworkable.

Bibliography

Cruse, D. Alan (2000) Meaning in Language: An Introduction to Semantics and Pragmatics. Oxford: Oxford University Press.

Dowty, David R. (1979) Word Meaning and Montague Grammar. Dordrecht: D. Reidel Publishing Company.

Löbner, Sebastian (2002) Understanding Semantics. London/New York: Hodder/Oxford.

Lyons, John (1995) Linguistic Semantics. Cambridge: Cambridge University Press.

Miller, G. A. and Fellbaum, C. (1992) "Semantic networks of English." In: Levin, Beth and Pinker, Steven (eds.), Lexical and Conceptual Semantics. Oxford: Blackwell, pp. 214-215.

Parker, Frank & Riley, Kathryn (1994) Linguistics for Non-Linguists. Boston: Allyn and Bacon.

Peck, Charles (1984) A Survey of Grammatical Structures. Dallas: Summer Institute of Linguistics.

Pulman, S. G. (1983) Word Meaning and Belief. London & Canberra: Croom Helm.

Saeed, J. I. (1997) Semantics. Oxford: Blackwell.

Componential Analysis Method
-- Lexical Decomposition of Verbs

University of Hamburg

Zhang, Fenglei

WS 2002/03, Anglistik, Linguistics

Seminar II: Word Meaning - Sentence Meaning - Utterance Meaning

Grade: 3

[...]


[1] Cliff Goddard, Semantic Analysis (New York, 1998), pp. 59-60.

[2] Cliff Goddard, Semantic Analysis (New York, 1998), pp. 59-60.

[3] Cf. Prof. Siemund's handout.

[4] See [1], p. 260.

[5] F. Parker & K. Riley, Linguistics for Non-Linguists (Boston, 1994), pp. 33, 38.
