In linguistics, syntax is the set of rules, principles, and processes that govern the structure of sentences in a given language, usually including word order. The term syntax is also used to refer to the study of such principles and processes. The goal of many syntacticians is to discover the syntactic rules common to all languages.
In mathematics, syntax refers to rules governing the behavior of mathematical systems, such as the formal language used in logic. (See logical syntax.)
Etymology
The word syntax comes from Ancient Greek: σύνταξις "coordination", which consists of σύν syn, "together", and τάξις táxis, "ordering".
The order of subjects, verbs, and objects
A basic feature of a language's syntax is the sequence in which the subject (S), verb (V), and object (O) usually appear in sentences. Over 85% of languages usually place the subject first, either in the sequence SVO or the sequence SOV. The other possible sequences are VSO, VOS, OVS, and OSV, the last three of which are rare.
Initial history
Works on grammar were written long before modern syntax came about; in Ancient India, the Aṣṭādhyāyī of Pāṇini (c. 4th century BC) is often cited as an example of a premodern work that approaches the sophistication of a modern syntactic theory. In the West, the school of thought that came to be known as "traditional grammar" began with the work of Dionysius Thrax.
For centuries, work in syntax was dominated by a framework known as grammaire générale, first expounded in 1660 by Antoine Arnauld in a book of the same title. This system took as its basic premise the assumption that language is a direct reflection of thought processes and that therefore there is a single, most natural way to express a thought.
However, in the 19th century, with the development of historical-comparative linguistics, linguists began to realize the sheer diversity of human language and to question fundamental assumptions about the relationship between language and logic. It became apparent that there was no such thing as the most natural way to express a thought, and therefore logic could no longer be relied upon as a basis for studying the structure of language.
The Port-Royal grammar modeled the study of syntax upon that of logic. (Indeed, large parts of the Port-Royal Logic were copied or adapted from the Grammaire générale.) Syntactic categories were identified with logical ones, and all sentences were analyzed in terms of "subject - copula - predicate". Initially, this view was adopted even by the early comparative linguists such as Franz Bopp.
The central role of syntax within theoretical linguistics became clear only in the 20th century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. (For a detailed and critical survey of the history of syntax in the last two centuries, see the monumental work by Giorgio Graffi (2001).)
Theory
There are a number of theoretical approaches to the discipline of syntax. One school of thought, founded in the works of Derek Bickerton, sees syntax as a branch of biology, since it conceives of syntax as the study of linguistic knowledge as embodied in the human mind. Other linguists (e.g., Gerald Gazdar) take a more Platonistic view, since they regard syntax to be the study of an abstract formal system. Yet others (e.g., Joseph Greenberg) consider syntax a taxonomic device to reach broad generalizations across languages.
Generative grammar
The hypothesis of generative grammar is that language is a structure of the human mind. The goal of generative grammar is to make a complete model of this inner language (known as i-language). This model could be used to describe all human language and to predict the grammaticality of any given utterance (that is, to predict whether the utterance would sound correct to native speakers of the language). This approach to language was pioneered by Noam Chomsky. Most generative theories (although not all of them) assume that syntax is based upon the constituent structure of sentences. Generative grammars are among the theories that focus primarily on the form of a sentence, rather than its communicative function.
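The idea that a grammar generates the set of grammatical sentences, and that grammaticality is membership in that set, can be sketched with a toy phrase structure grammar. The rules and words below are illustrative assumptions, not a model of any actual i-language:

```python
from itertools import product

# A toy phrase structure grammar. Symbols with rules are nonterminals;
# anything else is treated as a word.
rules = {
    "S":  [("NP", "VP")],
    "NP": [("they",)],
    "VP": [("sleeps",), ("see", "NP")],
}

def expand(symbol, depth=5):
    """Return the set of word strings derivable from `symbol`."""
    if symbol not in rules:
        return {(symbol,)}          # terminal word
    if depth == 0:
        return set()                # cut off unbounded recursion
    results = set()
    for rhs in rules[symbol]:
        # combine the expansions of each right-hand-side symbol in order
        for parts in product(*(expand(s, depth - 1) for s in rhs)):
            results.add(tuple(w for part in parts for w in part))
    return results

sentences = expand("S")
print(("they", "sleeps") in sentences)   # True: generated, so "grammatical"
print(("sleeps", "they") in sentences)   # False: not generated
```

The membership test at the end is the toy analogue of predicting whether an utterance sounds correct to a native speaker: the grammar either generates the string or it does not.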
Among many generative linguistic theories, Chomsky's theory is:
- Transformational grammar (TG) (the original theory of generative syntax laid out by Chomsky in Syntactic Structures in 1957)
- Government and binding theory (GB) (a revised theory in the tradition of TG developed mainly by Chomsky in the 1970s and 1980s)
- The Minimalist Program (MP) (a reworking of the theory out of the GB framework published by Chomsky in 1995)
Other theories that find their origin in the generative paradigm are:
- Arc pair grammar
- Generalized phrase structure grammar (GPSG; now largely out of date)
- Generative semantics (now mostly outdated)
- Head-driven phrase structure grammar (HPSG)
- Lexical functional grammar (LFG)
- Nanosyntax
- Relational grammar (RG) (now mostly outdated)
Dependency grammar
Dependency grammar is an approach to sentence structure in which syntactic units are arranged according to the dependency relation, as opposed to the constituency relation of phrase structure grammars. Dependencies are directed links between words. The (finite) verb is seen as the root of all clause structure, and all the other words in the clause are either directly or indirectly dependent on this root. Some prominent dependency-based theories of syntax are:
- Recursive categorical syntax, also known as algebraic syntax
- Functional generative description
- Meaning–text theory
- Operator grammar
- Word grammar
Lucien Tesnière (1893-1954) is widely seen as the father of modern dependency-based theories of syntax and grammar. He argued strongly against the binary division of the clause into subject and predicate that is associated with the grammars of his day (S -> NP VP) and that remains at the core of most phrase structure grammars. In the place of this division, he positioned the verb as the root of all clause structure.
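A dependency analysis of the kind described above can be sketched as a list of head pointers, with the verb as the root. The example sentence and the flat head assignment are illustrative assumptions, not an analysis from any particular dependency theory:

```python
# "Sam quickly ate apples": each word records the index of its head;
# -1 marks the root of the clause, which is the finite verb.
words = ["Sam", "quickly", "ate", "apples"]
heads = [2, 2, -1, 2]   # every other word depends on the verb "ate"

def root(heads):
    """Index of the root word (the one whose head is -1)."""
    return heads.index(-1)

def dependents(heads, i):
    """Indices of the words directly dependent on word i."""
    return [j for j, h in enumerate(heads) if h == i]

r = root(heads)
print(words[r])                                   # ate
print([words[j] for j in dependents(heads, r)])   # ['Sam', 'quickly', 'apples']
```

Note that there is no subject-predicate split here: the subject "Sam" is just one more dependent of the verb, which is exactly the move Tesnière argued for.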
Categorial grammar
Categorial grammar is an approach that attributes the syntactic structure not to rules of grammar, but to the properties of the syntactic categories themselves. For example, rather than asserting that sentences are constructed by a rule that combines a noun phrase (NP) and a verb phrase (VP) (e.g., the phrase structure rule S -> NP VP), in categorial grammar such principles are embodied in the category of the head word itself. So the syntactic category for an intransitive verb is a complex formula representing the fact that the verb acts as a function word requiring an NP as an input and producing a sentence-level structure as an output. This complex category is notated as (NP\S) instead of V. (NP\S) is read as "a category that searches to the left (indicated by \) for an NP (the element on the left) and outputs a sentence (the element on the right)". The category of a transitive verb is defined as an element that requires two NPs (its subject and its direct object) to form a sentence. This is notated as (NP/(NP\S)), which means "a category that searches to the right (indicated by /) for an NP (the object), and generates a function (equivalent to the VP) that is (NP\S), which in turn represents a function that searches to the left for an NP and produces a sentence".
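The function application described above can be sketched directly in code. Following the notation in the text, an atomic category is a string, and a function category is a triple (slash, argument, result), where "/" searches to the right and "\" searches to the left; the tiny lexicon is an illustrative assumption:

```python
# Categories, encoded as (slash, argument, result) triples.
IV = ("\\", "NP", "S")    # NP\S : intransitive verb
TV = ("/", "NP", IV)      # NP/(NP\S) : transitive verb, as notated in the text

def apply_right(functor, arg):
    """Forward application: a rightward-searching functor consumes the
    category to its right and yields its result."""
    if isinstance(functor, tuple) and functor[0] == "/" and functor[1] == arg:
        return functor[2]
    return None

def apply_left(arg, functor):
    """Backward application: a leftward-searching functor consumes the
    category to its left and yields its result."""
    if isinstance(functor, tuple) and functor[0] == "\\" and functor[1] == arg:
        return functor[2]
    return None

# Derive "Sam eats apples": the transitive verb first consumes the
# object NP to its right, yielding NP\S (the VP-equivalent); the
# subject NP then combines from the left, yielding S.
vp = apply_right(TV, "NP")
s = apply_left("NP", vp)
print(s)   # S
```

No phrase structure rule like S -> NP VP appears anywhere: the whole derivation is driven by the categories stored with the words, which is the defining move of categorial grammar.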
Tree-adjoining grammar is a categorial grammar that adds in partial tree structures to the categories.
Stochastic/probabilistic grammars/network theories
Theoretical approaches to syntax that are based upon probability theory are known as stochastic grammars. One common implementation of such an approach makes use of a neural network or connectionism.
Functional grammar