
Distributed morphology

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.

In generative linguistics, Distributed Morphology is a theoretical framework introduced in 1993 by Morris Halle and Alec Marantz. The central claim of Distributed Morphology is that there is no divide between the construction of words and sentences. The syntax is the single generative engine that forms sound-meaning correspondences, both complex phrases and complex words. This approach challenges the traditional notion of the Lexicon as the unit where derived words are formed and idiosyncratic word-meaning correspondences are stored. In Distributed Morphology there is no unified Lexicon as in earlier generative treatments of word-formation. Rather, the functions that other theories ascribe to the Lexicon are distributed among other components of the grammar.


The basic principle of Distributed Morphology is that there is a single generative engine for the formation of both complex words and complex phrases: there is no division between syntax and morphology, and there is no Lexicon in the sense it has in traditional generative grammar. Any operation that would occur in

A basic operation is related to the mechanism which forces movement, which is mediated by feature-checking. In its original formulation, Merge is a function that takes two objects (α and β) and merges them into an unordered set with a label, either α or β. In more recent treatments, the possibility of the derived syntactic object being unlabelled is also considered; this is called "simple Merge" (see the Label section). In

A certain domain. In some but not all versions of minimalism, projection of selectional features proceeds via feature-checking, as required by locality of selection: Selection as projection: As illustrated in the bare phrase structure tree for the sentence The girl ate the food, a notable feature is the absence of distinct labels (see Labels below). Relative to Merge, the selectional features of

A condition on agreement. This line of inquiry was initiated in Chomsky (2000), and formulated as follows: Many recent analyses assume that Agree is a basic operation, on a par with Merge and Move. This is currently a very active area of research, and there remain numerous open questions: Co-indexation as feature checking: co-indexation markers such as {k, m, o, etc.} A phase is a syntactic domain first hypothesized by Noam Chomsky in 1998. It

A framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld and Claude Lancelot in a book of the same title, dominated work in syntax: its basic premise was the assumption that language is a direct reflection of thought processes, and so there is a single most natural way to express a thought. However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize

A head and what it selects: selection must be satisfied with the projection of the head. Move arises via "internal Merge". Movement as feature-checking: The original formulation of the extended projection principle states that clauses must contain a subject in the specifier position of TP/IP. In the tree above, there is an EPP feature. This is a strong feature which forces re-Merge—which

A lexical item determine how it participates in Merge: Feature-checking: When a feature is "checked", it is removed. Locality of selection (LOS) is a principle that forces selectional features to participate in feature-checking. LOS states that a selected element must combine with the head that selects it, either as complement or specifier. Selection is local in the sense that there is a maximum distance that can occur between

A minimum the "innate" component (the genetically inherited component) of the language faculty, which has been criticized over many decades and is separate from the developmental psychology component. Intrinsic to the syntactic model (e.g. the Y/T-model) is the fact that social and other factors play no role in the computation that takes place in narrow syntax; this is what Chomsky, Hauser and Fitch refer to as the faculty of language in

A new category consisting of a head (H), which is the label, and an element being projected. Some ambiguities may arise if the features raising, in this case α, contain the entire head and the head is also X. Labeling algorithm (LA): Merge is a function that takes two objects (α and β) and merges them into an unordered set with a label (either α or β), where the label indicates the kind of phrase that

A number of topics that a syntactic theory is often designed to handle. The relation between the topics is treated differently in different theories, and some of them may not be considered to be distinct but instead to be derived from one another (e.g., word order can be seen as the result of movement rules derived from grammatical relations). One basic description of a language's syntax

A particular interface, a necessary consequence of Full Interpretation. A PF object must consist only of features that are interpretable at the articulatory-perceptual (A-P) interface; likewise, an LF object must consist of features that are interpretable at the conceptual-intentional (C-I) interface. The presence of an uninterpretable feature at either interface will cause the derivation to crash. Narrow syntax proceeds as
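The crash condition described above can be sketched as a toy check (a minimal sketch; the "u"-prefix encoding of uninterpretable features is an assumption for illustration, not part of the theory's notation):

```python
# Toy model of Full Interpretation (hypothetical "u"-prefix encoding for
# uninterpretable features): a derivation crashes if any uninterpretable
# feature reaches an interface unchecked.

def converges(features):
    """True iff every remaining feature is interpretable."""
    return not any(f.startswith("u") for f in features)

print(converges({"wh", "past"}))    # True: all features interpretable
print(converges({"wh", "uCase"}))   # False: uCase survives, so a crash
```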


A purpose. The structure of a sentence should be no larger or more complex than required to satisfy constraints on grammaticality. Within minimalism, economy—recast in terms of the strong minimalist thesis (SMT)—has acquired increased importance. The 2016 book Why Only Us, co-authored by Noam Chomsky and Robert Berwick, defines the strong minimalist thesis as follows: The optimal situation would be that UG reduces to

A semantic sense the terminal nodes of a complete syntactic derivation. For example, the adjectives compárable and cómparable are thought to represent two different structures. The first has a compositional meaning of 'being able to compare': the root combines with a verbal categorizer, and the two combine with the suffix -able. The second has an idiomatic meaning of 'equal', taken directly from the Encyclopedia: here the root combines directly with

A set of operations—Merge, Move and Agree—carried out upon a numeration (a selection of features, words, etc., from the lexicon) with the sole aim of removing all uninterpretable features before the structure is sent via Spell-Out to the A-P and C-I interfaces. The result of these operations is a hierarchical syntactic structure that captures the relationships between the component features. The exploration of minimalist questions has led to several radical changes in

A single exponent (portmanteau). An example can be found in Swahili, which has separate exponents for subject agreement (e.g., 1st plural tu-) and negation (ha-):

tu-ta-pend-a kiswahili
we-will-love Swahili

ha-tu-ta-pend-a kiswahili
NEG-we-will-love Swahili

Syntax

In linguistics, syntax (/ˈsɪntæks/ SIN-taks)

Is a categorial grammar that adds partial tree structures to the categories. Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism. Functionalist models of grammar study the form–function interaction by performing a structural and a functional analysis. Generative syntax

Is a domain where all derivational processes operate and where all features are checked. A phase consists of a phase head and a phase domain. Once any derivation reaches a phase and all the features are checked, the phase domain is sent to transfer and becomes invisible to further computations. The literature shows three trends relative to what is generally considered to be a phase: A simple sentence can be decomposed into two phases, CP and vP. Chomsky considers CP and vP to be strong phases because of their propositional content, as well as their interaction with movement and reconstruction. Propositional content: CP and vP are both propositional units, but for different reasons. CP

Is also called internal Merge—of the DP the girl. The EPP feature in the tree above is a subscript to the T head, which indicates that T needs a subject in its specifier position. This causes the movement of <the girl> to the specifier position of T. A substantial body of literature in the minimalist tradition focuses on how a phrase receives a proper label. The debate about labeling reflects

Is an example of a vocabulary item in Distributed Morphology: an affix in Russian can be exponed as follows:

/n/ ←→ [___, +participant, +speaker, +plural]

The phonological string on the left side is available for insertion into a node with the features described on the right side. Roots, i.e. formatives from the Formative List, are exponed based on their features. For example, the first-person singular pronominal paradigm in English

Is assigned to them only at spell-out, that is, after all syntactic operations are over. The Formative List in Distributed Morphology thus differs from the Lexicon in traditional generative grammar, which includes the lexical items (such as words and morphemes) in a language. As its name would suggest, the Formative List contains what are known as formatives, or roots. In Distributed Morphology, roots are proposed to be category-neutral and to undergo categorization by functional elements. Roots have no grammatical categories in and of themselves, and merely represent

Is built via Merge. But this labeling technique is too unrestricted, since the input labels make incorrect predictions about which lexical categories can merge with each other. Consequently, a different mechanism is needed to generate the correct output label for each application of Merge in order to account for how lexical categories combine; this mechanism is referred to as the labeling algorithm (LA). Recently,


Is called "external Merge". Move, by contrast, is defined as an instance of "internal Merge", and involves the re-merge of an already merged SO with another SO. How Move should be formulated remains under active debate, but the differences between current proposals are relatively minute. More recent versions of minimalism recognize three operations: Merge (i.e. external Merge), Move (i.e. internal Merge), and Agree. The emergence of Agree as

Is called reconstruction. Evidence from reconstruction is consistent with the claim that the moved phrase stops at the left edge of CP and vP phases. Chomsky theorized that syntactic operations must obey the phase impenetrability condition (PIC), which essentially requires that movement be from the left edge of a phase. The PIC has been variously formulated in the literature. The extended projection principle feature that

Is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).) There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton, sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in

Is considered a propositional unit because it is a full clause that has tense and force: example (1) shows that the complementizer that in the CP phase conditions the finiteness (here, past tense) and force (here, affirmative) of the subordinate clause. vP is considered a propositional unit because all the theta roles are assigned in vP: in (2) the verb ate in the vP phase assigns the Theme theta role to

Is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. That is notated as (NP\S)/NP, which means, "A category that searches to the right (indicated by /) for an NP (the object) and generates a function (equivalent to the VP) which is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence." Tree-adjoining grammar
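The function-application reading of these categories can be sketched in a toy encoding (a minimal sketch; the tuple representation below is a hypothetical illustration, not standard CG notation):

```python
# Toy categorial-grammar combinators (hypothetical encoding).
# A category is either a basic string ("NP", "S") or a triple
# (result, slash, argument): "/" seeks its argument to the right,
# "\" seeks it to the left, as described in the text.

NP, S = "NP", "S"
IV = (S, "\\", NP)              # intransitive verb: NP\S
TV = ((S, "\\", NP), "/", NP)   # transitive verb: (NP\S)/NP

def apply_right(fn, arg):
    """A category X/Y combined with a Y on its right yields X."""
    result, slash, argument = fn
    assert slash == "/" and argument == arg, "no match"
    return result

def apply_left(arg, fn):
    """A category Y\\X combined with a Y on its left yields X."""
    result, slash, argument = fn
    assert slash == "\\" and argument == arg, "no match"
    return result

vp = apply_right(TV, NP)   # verb + object NP -> NP\S (the VP)
s = apply_left(NP, vp)     # subject NP + VP -> S
print(s)  # S
```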

Is either contained within Z, or is Z. Adjunction: Before the introduction of bare phrase structure, adjuncts did not alter information about bar-level, category information, or the head of the target (located in the adjoined structure). An example of adjunction using the X-bar theory notation is given below for the sentence Luna bought the purse yesterday. Observe that the adverbial modifier yesterday

Is exponed as follows:

[+1 +sing +nom +prn] ←→ /aj/
[+1 +sing +prn] ←→ /mi/

The use of /mi/ does not seem infelicitous in a nominative context at first glance. If /mi/ acquired nominative case in the syntax, it would seem appropriate to use it. However, /aj/ is specified for the feature [+nom], and therefore must block the use of /mi/ in a nominative context. This is known as the Maximal Subset Condition or
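The blocking competition between /aj/ and /mi/ can be sketched as a subset comparison (a minimal sketch; the string encoding of features and the function name are assumptions for illustration):

```python
# Sketch of vocabulary insertion as feature-subset competition.
# Feature bundles are modeled as frozensets; among the vocabulary items
# whose features are a subset of the terminal's features, the one with
# the most features (the maximal subset) wins and blocks the others.

def insert(terminal, vocabulary):
    """Return the exponent of the most specific matching vocabulary item."""
    candidates = [(feats, form) for feats, form in vocabulary
                  if feats <= terminal]   # subset check
    if not candidates:
        return None
    feats, form = max(candidates, key=lambda c: len(c[0]))
    return form

# Hypothetical encoding of the two English pronoun items from the text:
# [+1 +sing +nom +prn] <-> /aj/  and  [+1 +sing +prn] <-> /mi/.
vocabulary = [
    (frozenset({"+1", "+sing", "+nom", "+prn"}), "aj"),
    (frozenset({"+1", "+sing", "+prn"}), "mi"),
]

print(insert(frozenset({"+1", "+sing", "+nom", "+prn"}), vocabulary))  # aj
print(insert(frozenset({"+1", "+sing", "+prn"}), vocabulary))          # mi
```

In a nominative terminal, both items match, but /aj/ carries the larger matching feature set and wins, exactly the blocking pattern described above.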

Is generalized as follows in Marantz (1988: 261): Morphological Merger: At any level of syntactic analysis (d-structure, s-structure, phonological structure), a relation between X and Y may be replaced by (expressed by) the affixation of the lexical head of X to the lexical head of Y. Two syntactic nodes can undergo Morphological Merger subject to morphophonological well-formedness conditions. Two nodes that have undergone Morphological Merger, or that have been adjoined through syntactic head movement, can undergo Fusion, yielding one single node for Vocabulary insertion. A many-to-one relation arises where two syntactic terminals are realized as

Is generally believed that certain operations apply before vocabulary insertion, while others apply to the vocabulary items themselves. For example, Embick and Noyer (2001) argue that Lowering applies before Vocabulary insertion, while Local Dislocation applies afterwards. Apart from the operations described above, some researchers (Embick 1997, among others) have suggested that there are morphemes that represent purely formal features and are inserted post-syntactically but before spell-out: these morphemes are called "dissociated morphemes". Morphological Merger

Is language? and Why does it have the properties it has?—but the answers to these two questions can be framed in any theory. Minimalism is an approach developed with the goal of understanding the nature of language. It models a speaker's knowledge of language as a computational system with one basic operation, namely Merge. Merge combines expressions taken from the lexicon in a successive fashion to generate representations that characterize I-language, understood to be


Is no consensus on which approach most accurately describes the structural configuration of root categorization. Vocabulary items associate phonological content with arrays of underspecified syntactic and/or semantic features—the features listed in the Lexicon—and they are the closest notion to the traditional morpheme known from generative grammar. Postsyntactic morphology posits that this operation takes place after

Is on the heads of phases triggers the intermediate movement steps to phase edges. Movement of a constituent out of a phase is (in the general case) only permitted if the constituent has first moved to the left edge of the phase (XP). The edge of a head X is defined as the residue outside of X', i.e., either the specifier of X or adjuncts to XP. English successive-cyclic wh-movement obeys the PIC. Sentence (7) has two phases: vP and CP. Relative to

Is relevant for child language acquisition, where children are observed to go through a so-called "two-word" stage. This is discussed below in the implications section.) As illustrated in the accompanying tree structure, if a new head (here γ) is merged with a previously formed syntactic object (a phrase, here {α, {α, β}}), the function has the form Merge(γ, {α, {α, β}}) → {γ, {γ, {α, {α, β}}}}. Here, γ
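The set-formation step just described can be sketched as follows (a minimal sketch; representing a labeled set as a (label, frozenset) pair is an assumption for illustration):

```python
# Toy model of labeled Merge (hypothetical encoding): a syntactic object
# is either a lexical item (a string) or a pair (label, frozenset of the
# two merged objects); frozenset keeps the set unordered, as in the text.

def merge(label, a, b):
    """Merge two syntactic objects into an unordered set with a label."""
    return (label, frozenset({a, b}))

# Build {alpha, {alpha, beta}}, then merge the new head gamma with it:
# Merge(gamma, {alpha, {alpha, beta}}) -> {gamma, {gamma, {alpha, ...}}}.
ab = merge("alpha", "alpha", "beta")
gab = merge("gamma", "gamma", ab)

print(gab[0])  # gamma -- the merged head's label projects
```

Because merge can take its own output as an argument, the encoding is recursive, which mirrors the point made below about Merge operating on already-built structures.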

Is sister to VP and dominated by VP. Thus, the addition of the modifier does not change information about the bar-level: in this case, the maximal projection VP. In the minimalist program, adjuncts are argued to exhibit a different, perhaps more simplified, structure. Chomsky (1995) proposes that adjunction forms a two-segment object/category consisting of: (i) the head of a label; (ii) a different label from

Is sometimes framed as questions relating to perfect design (Is the design of human language perfect?) and optimal computation (Is the computational system for human language optimal?). According to Chomsky, a human natural language is not optimal when judged based on how it functions, since it often contains ambiguities, garden paths, etc. However, it may be optimal for interaction with the systems that are internal to

Is the head, so the output label of the derived syntactic object is γ. Chomsky's earlier work defines each lexical item as a syntactic object that is associated with both categorial features and selectional features. Features—more precisely, formal features—participate in feature-checking, which takes as input two expressions that share the same feature and checks them off against each other in
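Feature-checking in this sense can be sketched as set difference (a toy model; the feature names are hypothetical, and real analyses distinguish interpretable from uninterpretable instances of a feature, which this sketch ignores):

```python
# Toy feature-checking: two expressions that share a formal feature check
# it off against each other; checked features are removed from both sets.

def feature_check(a, b):
    """Remove the features shared by a and b from both feature sets."""
    shared = a & b
    return a - shared, b - shared

# Hypothetical feature bundles: both carry "EPP", so it is checked off.
probe = {"EPP", "tense"}
goal = {"EPP", "D"}
print(feature_check(probe, goal))  # ({'tense'}, {'D'})
```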

Is the performance–grammar correspondence hypothesis by John A. Hawkins, who suggests that language is a non-innate adaptation to innate cognitive mechanisms. Cross-linguistic tendencies are considered as being based on language users' preference for grammars that are organized efficiently and on their avoidance of word orderings that cause processing difficulty. Some languages, however, exhibit regular inefficient patterning, such as

Is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare. In most generative theories of syntax, the surface differences arise from a more complex clausal phrase structure, and each order may be compatible with multiple derivations. However, word order can also reflect

Is the study of how words and morphemes combine to form larger units such as phrases and sentences. Central concerns of syntax include word order, grammatical relations, hierarchical sentence structure (constituency), agreement, the nature of crosslinguistic variation, and the relationship between form and meaning (semantics). There are numerous approaches to syntax that differ in their central assumptions and goals. The word syntax comes from Ancient Greek roots: σύνταξις "coordination", which consists of σύν syn, "together", and τάξις táxis, "ordering". The field of syntax contains

Is the study of syntax within the overarching framework of generative grammar. Generative theories of syntax typically propose analyses of grammatical patterns using formal tools such as phrase structure grammars augmented with additional operations such as syntactic movement. Their goal in analyzing a particular language is to specify rules which generate all and only the expressions which are well-formed in that language. In doing so, they seek to identify innate domain-specific principles of linguistic cognition, in line with


The Grammaire générale.) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject – copula – predicate". Initially, that view was adopted even by the early comparative linguists such as Franz Bopp. The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics

The 'lexicon' according to lexicalist approaches is considered too vague in Distributed Morphology, which instead distributes these operations over various steps and lists. The term Distributed Morphology is used because the morphology of an utterance is the product of operations distributed over more than one step, with content from more than one list. In contrast to lexicalist models of morphosyntax, Distributed Morphology posits three components in building an utterance. There are three relevant lists in Distributed Morphology:

The DP the cake and the Agent theta role to the DP Mary. Movement: CP and vP can be the focus of pseudo-cleft movement, showing that CP and vP form syntactic units: this is shown in (3) for the CP constituent that John is bringing the dessert, and in (4) for the vP constituent arrive tomorrow. Reconstruction: When a moved constituent is interpreted in its original position to satisfy binding principles, this

The Elsewhere Principle: if two items have a similar set of features, the one that is more specific will win. Illustrated in logical notation: f(E1) ⊂ f(T), f(E2) ⊂ f(T), and f(E1) ⊂ f(E2) → f(E2) wins. In this case, both /mi/ and /aj/ have a subset of the features f(T), but /aj/ has the maximal subset. The Encyclopedia associates syntactic units with special, non-compositional aspects of meaning. This list specifies interpretive operations that realize in

The Formative List, the Exponent List (Vocabulary Items), and the Encyclopedia. Items from these lists enter the derivation at different stages. The Formative List, sometimes called the lexicon (this term will be avoided here) in Distributed Morphology, includes all the bundles of semantic and sometimes syntactic features that can enter the syntactic computation. These are interpretable or uninterpretable features (such as [+/- animate], [+/- count], etc.) which are manipulated in syntax through syntactic operations. These bundles of features do not have any phonological content; phonological content

The VO languages Chinese, with the adpositional phrase before the verb, and Finnish, which has postpositions; but there are few other profoundly exceptional languages. More recently, it has been suggested that the left- versus right-branching patterns are cross-linguistically related only to the place of role-marking connectives (adpositions and subordinators), which links the phenomena with

The application of movement, who moves from the (lower) vP phase to the (higher) CP phase in two steps: Another example of the PIC can be observed when analyzing A'-agreement in Medumba. A'-agreement is a term used for the morphological reflex of A'-movement of an XP. In Medumba, when the moved phrase reaches a phase edge, a high-low tonal melody is added to the head of the complement of the phase head. Since A'-agreement in Medumba requires movement,

The bundle of semantic features to be exponed. The notation for roots in Distributed Morphology generally uses a square root symbol, with an arbitrary number or with the orthographic representation of the root. For example, love, without a grammatical category, could be expressed as √362 or as √LOVE. Researchers adopting the Distributed Morphology approach agree that roots must be categorized by functional elements. There are multiple ways that this can be done. The following lists four possible routes. As of 2020, there

The deeper aspirations of the minimalist program, which are to remove all redundant elements in favour of the simplest analysis possible. While earlier proposals focus on how to distinguish adjunction from substitution via labeling, more recent proposals attempt to eliminate labeling altogether, but they have not been universally accepted. Adjunction and substitution: Chomsky's 1995 monograph The Minimalist Program outlines two methods of forming structure: adjunction and substitution. The standard properties of segments, categories, adjuncts, and specifiers are easily constructed. In

The early 1990s, starting with a 1993 paper by Noam Chomsky. Following Imre Lakatos's distinction, Chomsky presents minimalism as a program, understood as a mode of inquiry that provides a conceptual framework which guides the development of linguistic theory. As such, it is characterized by a broad and diverse range of research directions. For Chomsky, there are two basic minimalist questions—What


The framework of generative grammar, which holds that syntax depends on a genetic endowment common to the human species. In that framework and in others, linguistic typology and universals have been primary explicanda. Alternative explanations, such as those by functional linguists, have been sought in language processing. It is suggested that the brain finds it easier to parse syntactic patterns that are either right- or left-branching but not mixed. The most widely held approach

The general form of a structured tree for adjunction and substitution, α is an adjunct to X, and α is substituted into SPEC, X position. α can raise to target the X position, and it builds a new position that can either be adjoined to [Y-X] or is SPEC, X, in which case it is termed the 'target'. At the bottom of the tree, the minimal domain includes SPEC Y and Z, along with a new position formed by the raising of α, which

The generative paradigm are: The Cognitive Linguistics framework stems from generative grammar but adheres to evolutionary, rather than Chomskyan, linguistics. Cognitive models often recognise the generative assumption that the object belongs to the verb phrase. Cognitive frameworks include the following:

Minimalist Program

In linguistics, the minimalist program is a major line of inquiry that has been developing inside generative grammar since

The head of the label. The label L is not considered a term in the structure that is formed because it is not identical to the head S, but it is derived from it in an irrelevant way. If α adjoins to S, and S projects, then the structure that results is L = {<H(S), H(S)>, {α, S}}, where the entire structure is replaced with the head S, as well as what the structure contains. The head is what projects, so it can itself be

The human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax to be the study of an abstract formal system. Yet others (e.g., Joseph Greenberg) consider syntax a taxonomical device to reach broad generalizations across languages. Syntacticians have attempted to explain the causes of word-order variation within individual languages and cross-linguistically. Much of such work has been done within

The internalized intensional knowledge state as represented in individual speakers. By hypothesis, I-language—also called universal grammar—corresponds to the initial state of the human language faculty in individual human development. Minimalism is reductive in that it aims to identify which aspects of human language—as well as the computational system that underlies it—are conceptually necessary. This

The label or can determine the label irrelevantly. In the new account developed in bare phrase structure, the properties of the head are no longer preserved in adjunction structures, as the attachment of an adjunct to a particular XP following adjunction is non-maximal, as shown in the figure below, which illustrates adjunction in BPS. Such an account is applicable to XPs that are related to multiple adjunction. Substitution forms

The labeling algorithm theory should be eliminated altogether and replaced by another labeling mechanism. The symmetry principle has been identified as one such mechanism, as it provides an account of labeling that assigns the correct labels even when phrases are derived through complex linguistic phenomena. Starting in the early 2000s, attention turned from feature-checking as a condition on movement to feature-checking as

The labeling algorithm violates the tenets of the minimalist program, as it departs from conceptual necessity. Other linguistic phenomena that create instances where Chomsky's labeling algorithm cannot assign labels include predicate fronting, embedded topicalization, scrambling (free movement of constituents), and stacked structures (which involve multiple specifiers). Given these criticisms of Chomsky's labeling algorithm, it has recently been argued that

The left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)." Thus, the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function word requiring an NP as an input and produces a sentence-level structure as an output. The complex category is notated as (NP\S) instead of V. The category of transitive verb


The mind. Such questions are informed by a set of background assumptions, some of which date back to the earliest stages of generative grammar: Minimalism develops the idea that human language ability is optimal in its design and exquisite in its organization, and that its inner workings conform to a very simple computation. On this view, universal grammar instantiates a perfect design in

The narrow sense (FLN), as distinct from the faculty of language in the broad sense (FLB). Thus, narrow syntax concerns itself only with interface requirements, also called legibility conditions. The SMT can be restated as follows: syntax, narrowly defined, is a product of the requirements of the interfaces and nothing else. This is what is meant by "Language is an optimal solution to legibility conditions" (Chomsky 2001: 96). Interface requirements force the deletion of features that are uninterpretable at

The phrase acts as a verb. This can be represented in a typical syntax tree as follows, with the name of the derived syntactic object (SO) determined either by the lexical item (LI) itself or by the category label of the LI: Merge can operate on already-built structures; in other words, it is a recursive operation. If Merge were not recursive, then this would predict that only two-word utterances are grammatical. (This

The phrase. It has been noted that minimal search cannot account for the following two possibilities: In each of these cases, there is no lexical item acting as a prominent element (i.e. a head). Given this, it is not possible through minimal search to extract a label for the phrase. While Chomsky has proposed solutions for these cases, it has been argued that the fact that such cases are problematic suggests that

The place of that division, he positioned the verb as the root of all clause structure. Categorial grammar is an approach in which constituents combine as function and argument, according to combinatory possibilities specified in their syntactic categories. For example, other approaches might posit a rule that combines a noun phrase (NP) and a verb phrase (VP), but CG would posit a syntactic category NP and another NP\S, read as "a category that searches to

The same type. The Aṣṭādhyāyī of Pāṇini, from c. 4th century BC in Ancient India, is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory, since works on grammar had been written long before modern syntax came about. In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax. For centuries,

The semantic mapping of sentences. Dependency grammar is an approach to sentence structure in which syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure and all the other words in the clause are either directly or indirectly dependent on this root (i.e.

The semantics or function of the ordered elements. Another description of a language considers the set of possible grammatical relations in a language or in general and how they behave in relation to one another in the morphosyntactic alignment of the language. The description of grammatical relations can also reflect transitivity, passivization, and head-dependent-marking or other agreement. Languages have different criteria for grammatical relations. For example, subjecthood criteria may have implications for how

The sense that it contains only what is necessary. Minimalism further develops the notion of economy, which came to the fore in the early 1990s, though still peripheral to transformational grammar. Economy of derivation requires that movements (i.e., transformations) occur only if necessary, and specifically to satisfy feature-checking, whereby an interpretable feature is matched with a corresponding uninterpretable feature. (See discussion of feature-checking below.) Economy of representation requires that grammatical structures exist for

The sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought and so logic could no longer be relied upon as a basis for studying the structure of language. The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of Port-Royal Logic were copied or adapted from

The simplest computational principles which operate in accord with conditions of computational efficiency. This conjecture is ... called the Strong Minimalist Thesis (SMT). Under the strong minimalist thesis, language is a product of inherited traits as developmentally enhanced through intersubjective communication and social exposure to individual languages (amongst other things). This reduces to

The subject is referred to from a relative clause or coreferential with an element in an infinitive clause. Constituency is the feature of being a constituent and how words can work together to form a constituent (or phrase). Constituents are often moved as units, and the constituent can be the domain of agreement. Some languages allow discontinuous phrases in which words belonging to the same constituent are not immediately adjacent but are broken up by other constituents. Constituents may be recursive, as they may consist of other constituents, potentially of

The suffix –able. The Y-model of Minimalism, as well as the syntactic operations postulated in Minimalism, are preserved in Distributed Morphology. The derivation of a phrase/word proceeds as follows: Distributed Morphology recognizes a number of morphology-specific operations that occur post-syntactically. There is no consensus about the order of application of these morphological operations with respect to vocabulary insertion, and it

The suitability of a labeling algorithm has been questioned, as syntacticians have identified a number of limitations associated with what Chomsky has proposed. It has been argued that two kinds of phrases pose a problem. The labeling algorithm proposes that labelling occurs via minimal search, a process where a single lexical item within a phrasal structure acts as a head and provides the label for
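The idea of labeling by minimal search, and the cases where it fails, can be sketched in a few lines of toy code (an illustration only; the representations and the search procedure here are simplified assumptions, not Chomsky's formal algorithm):

```python
# Sketch of labeling by minimal search: a syntactic object is either a
# lexical item (a string) or a pair of syntactic objects.
def label(so):
    """Return the label found by minimal search, or None if search fails."""
    if isinstance(so, str):              # a lexical item labels itself
        return so
    left, right = so
    left_is_head = isinstance(left, str)
    right_is_head = isinstance(right, str)
    if left_is_head and not right_is_head:
        return left                      # unique most prominent item found
    if right_is_head and not left_is_head:
        return right
    return None                          # {H, H} or {XP, YP}: no unique head

print(label(("drink", ("the", "water"))))         # head 'drink' labels the phrase
print(label((("the", "girl"), ("ate", "food"))))  # two phrases merged: None
```

The `None` cases correspond to the problematic configurations the text describes: when two phrases (or two bare heads) are merged, no single lexical item is most prominent, so minimal search cannot extract a label.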

The syntax itself has occurred. Vocabulary items are also known as the Exponent List. In Distributed Morphology, after the syntax of a given utterance is complete, the Exponent List must be consulted to provide phonological content. This is known as 'exponing' an item. In other words, a vocabulary item is a relation between a phonological string (which could also be zero or null) and the context in which this string may be inserted. Vocabulary items compete for insertion to syntactic nodes at spell-out, i.e. after syntactic operations are complete. The following
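Competition for insertion can be sketched as a toy model in which each vocabulary item pairs an exponent with the features it realizes, and the most specific matching item wins — a rough rendering of the Subset Principle familiar from the Distributed Morphology literature. The feature labels below are invented for illustration.

```python
# Toy Exponent List: (phonological exponent, features it realizes).
# An item is a candidate only if its features are a subset of the
# features on the syntactic node; the most specific candidate wins.
VOCABULARY = [
    ("-z",  frozenset({"plural"})),              # regular plural, e.g. cat-s
    ("-en", frozenset({"plural", "class_en"})),  # irregular, e.g. ox-en
    ("",    frozenset()),                        # zero exponent, e.g. sheep
]

def insert(node_features):
    """Return the exponent of the most specific matching vocabulary item."""
    candidates = [(exp, feats) for exp, feats in VOCABULARY
                  if feats <= node_features]     # subset condition
    exp, _ = max(candidates, key=lambda c: len(c[1]))
    return exp

print(insert(frozenset({"plural"})))              # → -z
print(insert(frozenset({"plural", "class_en"})))  # → -en
```

The point of the sketch is the timing: `insert` runs only after the feature bundles exist, mirroring the claim that vocabulary insertion happens at spell-out, once syntactic operations are complete.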

The technical apparatus of transformational generative grammatical theory. Some of the most important are: Early versions of minimalism posit two basic operations: Merge and Move. Earlier theories of grammar—as well as early minimalist analyses—treat phrasal and movement dependencies differently than current minimalist analyses. In the latter, Merge and Move are different outputs of a single operation. Merge of two syntactic objects (SOs)

The verb). Some prominent dependency-based theories of syntax are the following: Lucien Tesnière (1893–1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued strongly against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S → NP VP) and remains at the core of most phrase structure grammars. In

The version of Merge which generates a label, the label identifies the properties of the phrase. Merge will always occur between two syntactic objects: a head and a non-head. For example, Merge can combine the two lexical items drink and water to generate drink water. In the Minimalist Program, the phrase is identified with a label. In the case of drink water, the label is drink since

The wider goals of the generative enterprise. Generative syntax is among the approaches that adopt the principle of the autonomy of syntax by assuming that meaning and communicative intent are determined by the syntax, rather than the other way around. Generative syntax was proposed in the late 1950s by Noam Chomsky, building on earlier work by Zellig Harris, Louis Hjelmslev, and others. Since then, numerous theories have been proposed under its umbrella: Other theories that find their origin in
