A subfield of linguistics, syntax is the study of the rules, or "patterned relations," that govern the way the words in a sentence come together. It concerns how words, categorized into parts of speech such as nouns, adjectives, and verbs (a classification that goes back to Dionysius Thrax), combine into clauses, which in turn combine into sentences.
In the framework of transformational-generative grammar, the structure of a sentence is represented by phrase structure trees, otherwise known as phrase markers or tree diagrams. Such trees provide three types of information about the sentences they represent: the linear order of the words, their grouping into constituents, and the syntactic category of each constituent.
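As an illustration only, a phrase marker can be sketched as nested tuples in Python; the sentence and the node labels below are invented examples, and the `leaves` helper simply reads the words back off in linear order:

```python
# A phrase marker for "the dog chased the cat", written as nested
# (label, children...) tuples: S -> NP VP, NP -> Det N, VP -> V NP.
# The nesting captures constituency; the tuple order captures linear order;
# the labels capture the syntactic category of each constituent.
tree = (
    "S",
    ("NP", ("Det", "the"), ("N", "dog")),
    ("VP",
        ("V", "chased"),
        ("NP", ("Det", "the"), ("N", "cat"))),
)

def leaves(node):
    """Recover the words of the sentence in linear order."""
    if isinstance(node, str):       # a leaf is a bare word
        return [node]
    label, *children = node         # an interior node carries a category label
    words = []
    for child in children:
        words.extend(leaves(child))
    return words

print(" ".join(leaves(tree)))       # the dog chased the cat
```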
In computer science, the term syntax is used to denote the literal text of something written in a formal language or programming language, as opposed to its semantics or meaning.
The analysis of programming language syntax usually entails the transformation of a linear sequence of tokens (a token is akin to an individual word or punctuation mark in a natural language) into a hierarchical syntax tree (abstract syntax trees are one convenient form of syntax tree). This process, called parsing, is in some respects analogous to syntactic analysis in linguistics; in fact, certain concepts, such as the Chomsky hierarchy and context-free grammars, are common to the study of syntax in both linguistics and computer science. However, the applications of these concepts vary widely between the two fields, and the practical resemblances are small.
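The token-to-tree process described above can be sketched with a toy example: a hand-written tokenizer and recursive-descent parser for a small context-free grammar of arithmetic expressions. The grammar, the function names, and the tuple-based tree shape are all invented for illustration, not taken from any particular compiler:

```python
import re

# A token is the programming-language analogue of a word or punctuation
# mark: here, an integer literal, '+', '*', or a parenthesis.
TOKEN_RE = re.compile(r"\s*(\d+|[+*()])")

def tokenize(src):
    """Turn source text into a linear sequence of tokens."""
    tokens, pos = [], 0
    while pos < len(src):
        m = TOKEN_RE.match(src, pos)
        if not m:
            raise SyntaxError(f"unexpected character at position {pos}")
        tokens.append(m.group(1))
        pos = m.end()
    return tokens

# Recursive-descent parser for the context-free grammar
#   expr -> term ('+' term)*
#   term -> atom ('*' atom)*
#   atom -> NUMBER | '(' expr ')'
# Each nonterminal becomes one function; the result is an abstract
# syntax tree built from nested tuples.
def parse(tokens):
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def eat(expected=None):
        nonlocal pos
        tok = peek()
        if tok is None or (expected is not None and tok != expected):
            raise SyntaxError(f"expected {expected!r}, got {tok!r}")
        pos += 1
        return tok

    def expr():
        node = term()
        while peek() == "+":
            eat("+")
            node = ("+", node, term())
        return node

    def term():
        node = atom()
        while peek() == "*":
            eat("*")
            node = ("*", node, atom())
        return node

    def atom():
        if peek() == "(":
            eat("(")
            node = expr()
            eat(")")
            return node
        return ("num", int(eat()))

    tree = expr()
    if pos != len(tokens):
        raise SyntaxError("trailing input")
    return tree

# '*' binds tighter than '+' because term() sits below expr() in the grammar.
print(parse(tokenize("1 + 2 * 3")))
```

Running it on `"1 + 2 * 3"` yields the tree `("+", ("num", 1), ("*", ("num", 2), ("num", 3)))`: the hierarchy, not the flat token order, records that the multiplication is grouped under the addition.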
|copyright © 2004 FactsAbout.com|