
Modelling Semantics as a Linguistic Interface System

Heike Wiese, Humboldt University Berlin, [email protected]

Abstract

I present an account of the semantic system as an interface level that makes conceptual structures accessible for language. The model I propose integrates semantic and conceptual structures into a broader architecture of the language faculty, placing semantics on a par with phonology and the conceptual system (CS) on a par with phonetics (PHON). It is based on a definition of interface levels as relational structures that are generated by linguistic view functions operating on PHON, CS, and the syntactic system SYN. I show that the definition of semantics as the linguistic interface level of CS allows us to account for dissociations between linguistic and non-linguistic aspects of meaning, and provides a framework to make explicit the relations between grammatical semantics and conceptual knowledge. As I demonstrate for English and Kurdish mass and count nominals, the model supports a distinction of language-specific and language-independent phenomena that can account, among others, for the interaction of conceptual, semantic, and syntactic aspects of grammatical alternations.

1 Overview

This paper is concerned with the correlation of linguistic and non-linguistic structures. In particular I discuss the correlation of linguistic and conceptual representations, as organised by the interface between language and the conceptual system CS. However, the analysis I will put forward does not treat the make-up of this correlation as idiosyncratic, but subsumes it under a unified notion of linguistic interfaces that allows us to understand core aspects of the linguistic-conceptual interface as an instance of a general pattern underlying the correlation of linguistic and non-linguistic structures.

Based on a definition of linguistic interfaces and the view functions generating them, our model will identify semantics and phonology as the interface levels that provide gateways to language for conceptual and phonetic information, respectively. This account places semantics on a par with phonology, and the conceptual system CS on a par with the phonetic system PHON. As the discussion will show, this association is supported by a number of crucial parallels between the respective (sub-)systems. The definition of semantics as a specialised, linguistically determined subsystem within CS accounts for linguistic aspects of meaning and allows us to make a clear distinction between linguistic and non-linguistic phenomena.

In the following sections, I will first give a general characterisation of the conceptual-linguistic interface and describe a semantic level SEM as introduced by Two-Level approaches to semantics. On this basis I will discuss the motivation to introduce SEM as a system separate from CS, rather than modelling semantic structures as a non-distinct, integral part of a general conceptual system. In this context I will refer to linguistic phenomena and to evidence from cognitive neuroscience suggesting a dissociation of grammatical semantics and general conceptual knowledge.

The results from this discussion will support a model that accounts for semantic representations as distinct from general conceptual structures, but integrates the semantic system into CS as a subsystem. The model I propose can be described within the framework of a Tripartite Parallel Architecture (‘TPA’), as described by Jackendoff (1997); it assumes three autonomous derivational systems: PHON (Phonetics), SYN (Syntax), and CS (Conceptual System). It deviates from the standard TPA-approach by recognising an independent level of grammatical semantics SEM as the linguistic interface level of CS. I will introduce SEM into the TPA-framework by working out the notion of interface levels that TPA provides.


The definition of SEM, and of the linguistic view functions generating SEM as a part of CS, will be integrated into a broader model of linguistic and non-linguistic interfaces; in particular I will relate it to the definition of phonology as the interface level of PHON. I will illustrate the operation of language-specific view functions in the generation of semantic entries with the example of count and mass nominals in English and Kurdish. I show that the model developed here supports an analysis that can account, among others, for the distinction and interaction of language-dependent (semantic and syntactic) and language-independent (conceptual) aspects of phenomena like count-mass coercions.

2 The conceptual-linguistic interface

If we regard the language faculty as part of the cognitive human capacities, there are two interfaces with non-linguistic mental systems that are relevant in language production and comprehension: an interface with acoustic structures as computed by phonetic representations (and linked to articulatory-perceptual systems), and an interface with the conceptual system that provides the meaning for linguistic representations. In Chomsky (1995), for instance, these linguistic interfaces are identified as the articulatory-perceptual and the conceptual-intentional interface (cf. also Bierwisch 1996, and the discussion in Jackendoff 1997, ch. 2).

The conceptual system CS is an autonomous, extra-linguistic system that interacts with the linguistic system. By providing a particular representation of the world, concepts mediate the reference of linguistic items, but conceptual structures are not necessarily connected to language (cf. Keil 1985).

To distinguish between linguistic and conceptual structures, Two-Level models of semantics introduce a lexical-semantic system SEM (cf. Bierwisch 1983; 1989; Bierwisch & Schreuder 1992; Lang 1994). SEM accounts for those aspects of meaning that have reflexes in the linguistic system and is part of language, whereas CS is non-linguistic: “CS represents conditions based on general, extralinguistic knowledge, involving encyclopedic and situational aspects of various sorts” (Bierwisch 1989, 5).

In these models, SEM is connected with the syntactic system SYN on the one hand, and with CS on the other hand; semantic representations mediate syntactic and conceptual structures. To this end, the semantic representation of a lexical item LI identifies

(a) the combinatorial potential of LI: λ-operators in the semantic representation mark those positions that are occupied by other linguistic items in the derivation of complex expressions (→ correlation of SEM and SYN), and

(b) the referential potential of LI: semantic constants are mapped onto CS entities by an interpretation function Int (→ correlation of SEM and CS). Semantic representations are underspecified in terms of conceptual interpretations; the interpretation function Int is sensitive to (linguistic and non-linguistic) contextual information.

Hence, according to Two-Level models of semantics, two systems are involved in the representation of meaning: (i) a lexical semantic system SEM, and (ii) a non-linguistic conceptual system CS that is set apart from the linguistic modules Phonology, Syntax and Semantics. Links between conceptual and linguistic representations are established by the interpretation function (Int) that connects semantic constants with CS entities.
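The division of labour between the two potentials of a lexical entry can be pictured in a few lines of code: a lambda-bound position marks the combinatorial slot, while an underspecified constant waits for Int to map it onto a CS entity. The sketch below is my own toy illustration, not Two-Level notation; the dictionary `CS_ENTITIES`, the function `Int`, and the `time_scale` context cue are hypothetical stand-ins.

```python
# Toy sketch of a Two-Level lexical entry (illustrative only):
# (a) combinatorial potential: a lambda-bound argument position;
# (b) referential potential: an underspecified constant mapped onto
#     CS entities by a context-sensitive interpretation function Int.

# Hypothetical CS entities available as interpretations of one constant.
CS_ENTITIES = {"INSTITUTE": {"building", "organisation"}}

def Int(constant, context):
    """Map a semantic constant onto a CS entity, sensitive to
    (linguistic and non-linguistic) contextual information."""
    candidates = CS_ENTITIES[constant]
    if context.get("time_scale") == "hours":
        return "building"       # change-of-place reading
    if context.get("time_scale") == "years":
        return "organisation"   # change-of-affiliation reading
    return sorted(candidates)[0]  # default: remains underspecified

def leave(obj_constant):
    # The lambda marks the open position filled in the derivation.
    return lambda context: ("LEAVE", Int(obj_constant, context))

leave_the_institute = leave("INSTITUTE")
print(leave_the_institute({"time_scale": "hours"}))
print(leave_the_institute({"time_scale": "years"}))
```

The point of the sketch is only the architecture: the semantic representation itself stays constant, and the conceptual specification is supplied at interpretation time.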

Levelt et al. (1999) introduced a level of ‘lexical concepts’ into their model of language production that can be regarded as a counterpart of SEM. Lexical concepts as defined by Levelt et al. are activated in a process of ‘conceptual preparation’, and connected with lemmata that relate the meaning of lexical items to their morpho-syntactic features and phonological representations. Lexical concepts are language-specific and integrate different conceptual representations with respect to lexical constraints. For instance for English, where we have a lexical item mare in addition to horse, but not a single lexical item for ‘female elephant’, Levelt et al. assume a lexical concept MARE that integrates the concepts FEMALE and HORSE, but do not assume a unitary lexical concept integrating the concepts FEMALE and ELEPHANT:

“If a speaker intends to refer to a female horse, he may effectively do so by producing the word “mare,” which involves the activation of the lexical concept MARE(x). But if the intended referent is a female elephant, the English speaker will resort to a phrase, such as “female elephant,” because there is no unitary lexical concept available for the expression of that notion.” (Levelt et al., 1999)
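A minimal way to picture this language-specific lexicalisation is as a partial mapping from concept bundles to words, with a phrasal fallback where no unitary lexical concept exists. The sketch below is my own illustration, not Levelt et al.’s model; the dictionary and the fallback rule are assumptions made for the example.

```python
# Toy sketch of language-specific lexical concepts (illustrative only):
# English lexicalises FEMALE+HORSE as "mare", but has no unitary
# lexical concept for FEMALE+ELEPHANT, so production falls back
# to a phrase.

LEXICAL_CONCEPTS_EN = {
    frozenset({"FEMALE", "HORSE"}): "mare",
    frozenset({"HORSE"}): "horse",
    frozenset({"ELEPHANT"}): "elephant",
}

HEADS = {"HORSE", "ELEPHANT"}  # hypothetical head-concept inventory

def express(concepts):
    """Return a word if a unitary lexical concept exists, else a phrase."""
    key = frozenset(concepts)
    if key in LEXICAL_CONCEPTS_EN:
        return LEXICAL_CONCEPTS_EN[key]
    # No unitary concept: realise each conceptual component separately.
    mods = sorted(c.lower() for c in concepts - HEADS)
    head = (concepts & HEADS).pop().lower()
    return " ".join(mods + [head])

print(express({"FEMALE", "HORSE"}))
print(express({"FEMALE", "ELEPHANT"}))
```

The asymmetry between the two calls mirrors the quotation above: the same conceptual material is packaged differently depending on what the language happens to lexicalise.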

3 Motivation of a linguistic level SEM

Do we need a specific semantic system, distinct from CS, to account for the meaning of linguistic expressions? What is the purpose of SEM in Two-Level models, and what are the benefits of distinguishing semantic and conceptual structures?

Within Two-Level models, the semantic level SEM is crucial to distinguish linguistic and non-linguistic aspects of meaning. SEM is posited to account for the linguistic organisation of meaning; “SF [Semantic Form, H.W.] represents the linguistically specified conditions on Conceptual Structure (CS), in terms of which linguistic expressions are interpreted.” (Bierwisch 1989, 4). In particular, SEM represents those aspects of meaning that are reflected in the morpho-syntactic system.

In the following paragraphs I describe the empirical phenomena that motivate such a semantic level SEM. A number of linguistic phenomena supporting such a level has been discussed in the literature; in what follows I provide some examples to illustrate the kind of data SEM has to account for. In a nutshell, the motivation for defining a separate system for linguistic aspects of meaning comes from two sources: (i) the meaning of lexical items is systematically underspecified, and (ii) it is based on language-specific classifications of conceptual representations. In addition, (iii) the distinction of SEM and CS might be supported by neurological and neurolinguistic evidence suggesting a dissociation of grammatical semantics, and conceptual representations that are invisible to the syntactic system.

3.1 Underspecification of meaning

The meaning of linguistic representations is systematically underspecified; their semantic contribution can be specified by different conceptual representations, depending on the linguistic and extra-linguistic context. As an extreme (and somewhat exaggerated) case, consider the following explication of the German noun Zug, quoted from Mark Twain’s ‘The Awful German Language’:1

“Zug means Pull, Tug, Draught, Procession, March, Progress, Flight, Direction, Expedition, Train, Caravan, Passage, Stroke, Touch, Line, Flourish, Trait of Character, Feature, Lineament, Chess-move, Organ-stop, Team, Whiff, Bias, Drawer, Propensity, Inhalation, Disposition.”

Similarly, lexical items like number (#) or phrases like leave the institute allow a range of different, related interpretations, as illustrated in (1) and (2):

(1a) You are the #1 in my life. → numerical rank (ordinal number assignment)
(1b) The #1 bus leaves from Porter Square. → numerical label (nominal number assignment)

(2a) He left the institute an hour ago. → institute as a building: change of place

1 Twain, Mark (1880): The Awful German Language; quoted after the Penguin edition (= A Tramp Abroad, New York et al.: Penguin, 1997; Appendix D, p. 396).

(2b) He left the institute a year ago. → institute as an organisation: change of affiliation2

Two-Level models account for this interpretational range by semantic representations that contain underspecified semantic constants. These constants can be interpreted and specified by different conceptual representations in the course of derivation; the interpretation takes into account linguistic and non-linguistic contextual information. Along these lines, we can give a semantic representation for the #1 as outlined in (1c): NU is a semantic constant identifying two related kinds of number assignments; its interpretational range covers (at least) two functions in CS (namely NR and NL) that carry out ordinal and nominal assignments of numbers to objects, representing numerical rank (NR) and numerical label (NL).3

(1c) the #1: λx (NU (x, 1)); Int(NU) ⊇ {NR, NL}
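The underspecification in (1c) can be simulated in code: one constant NU whose interpretation selects, depending on context, either an ordinal (NR) or a nominal (NL) number assignment. The implementations of `NR` and `NL` and the context cues below are hypothetical stand-ins; only the division of labour follows (1c).

```python
# Toy simulation of (1c): NU is underspecified between two CS
# number-assignment functions, NR (numerical rank, ordinal) and
# NL (numerical label, nominal); context disambiguates.

def NR(x, n):
    return f"{x} occupies rank {n}"   # ordinal: 'the #1 in my life'

def NL(x, n):
    return f"{x} carries label {n}"   # nominal: 'the #1 bus'

def Int_NU(context):
    """Int(NU) ranges over {NR, NL}; the context cue is illustrative."""
    return NL if context == "identification" else NR

# The lambda mirrors the bound position in (1c).
the_number_1 = lambda context: (lambda x: Int_NU(context)(x, 1))

print(the_number_1("evaluation")("you"))       # rank reading, cf. (1a)
print(the_number_1("identification")("bus"))   # label reading, cf. (1b)
```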

3.2 Language-specific classifications of conceptual representations

A second argument for a separate level of lexical semantics is the observation that, although semantic representations are interpreted in CS, semantic classes can be conceptually arbitrary; they are language-specific, and not necessarily based on salient conceptual features. This phenomenon, which has so far received less attention in the discussion of Two-Level models of semantics, leads to language-specific classifications and configurations of conceptual entities. The following examples summarise some of the evidence.

The animate/inanimate distinction of nouns

The differentiation of animate and inanimate objects is extra-linguistic. Yet the degree to which this differentiation is relevant for the behaviour of nouns is language-specific. On the one hand, the conceptualisation of a noun’s referent as animate or as inanimate can be reflected by a wide range of morpho-syntactic phenomena in different languages (cf. Comrie 1989, ch. 9; Dahl & Fraurud 1996). On the other hand, the boundaries between [animate] and [inanimate] nouns differ across languages and can be influenced by linguistic factors like diachronic and phonological phenomena.

For instance in Persian, the [animate] category encompasses nouns referring to human beings and some animals; these nouns are pluralised more regularly than others,4 and can take a plural suffix -ān that is not used with [inanimate] nouns. However, derakht (‘tree’) belongs to the [animate] category, i.e., the noun is treated on a par with nouns like zan (‘woman’), but not with nouns like gol (‘flower’). Yet one would not assume that speakers of Persian conceive a tree as more personified than, say, speakers of English. Hence even though conceptual classes are presumably the same across languages, their elements can enter the corresponding classes in the grammatical system in a different way.

This does not mean that there is no conceptual basis for the grammatical [±animate] classification: the distinction of animate and inanimate entities does have a conceptual reality. As Gelman & Gottfried (1996) showed, children as young as three years are aware of the animate/inanimate distinction, and, for instance, interpret the movement of animals and artifacts differently: they are more likely to attribute immanent cause to animals than to artifacts and more likely to attribute human cause to artifacts than to animals, suggesting a conceptualisation of animacy as a relevant object feature.

2 Cf. Bierwisch & Schreuder (1992, 31f).
3 (1c) is an abbreviation of the actual semantic representation, ignoring, among others, a nominal predicate like BUS in constructions like (1b). For detailed definitions of conceptual representations for number assignments cf. Wiese (1997b, ch. 4.3.5 and 9.2).
4 This is in accordance with a ‘plurality hierarchy’ suggested by Smith-Stark (1974); cf. the diachronic discussion in Wiese (1997a) of nominal number in Persian.

Hence, while the distinction of [animate] and [inanimate] nouns is rooted in the conceptualisation of nominal referents in terms of animacy, the domain in which conceptual differences are linguistically operative is determined by language-specific, ‘semantic’ classifications; the linguistic distinction is not a direct reflex of a conceptual taxonomy.
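The dissociation between the grammatical and the conceptual class can be made concrete in a few lines: membership in the Persian [animate] class licenses the plural suffix -ān, and that membership is stipulated in the grammar rather than read off conceptual animacy. The sets below reflect only the examples given above (zan, derakht, gol); the pluralisation function is a deliberate simplification.

```python
# Toy sketch of the Persian animacy dissociation (illustrative only):
# the grammatical [animate] class licenses the -ān plural, but its
# membership need not match the conceptual animate class.

GRAMMATICAL_ANIMATE_FA = {"zan", "derakht"}  # 'woman', 'tree'
CONCEPTUALLY_ANIMATE = {"zan"}               # a tree is conceptually inanimate

def plural_an(noun):
    """-ān pluralisation is licensed by the grammatical class only."""
    return noun + "-ān" if noun in GRAMMATICAL_ANIMATE_FA else None

print(plural_an("zan"))      # licensed
print(plural_an("derakht"))  # licensed, despite conceptual inanimacy
print(plural_an("gol"))      # 'flower': not licensed
```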

Semantic constraints on lexical rules that govern verb alternations

Lexical rules that govern verb alternations are sensitive to phonological and morphological properties, and to semantic and thematic properties of verbs. In first language acquisition the child uses these properties to determine the range of alternation rules and to apply them productively to new verbs.

As Pinker (1989) shows, the relevant semantic constraints are based on verb classifications that do not depend on characteristic features of the events a verb can refer to, but rather on those aspects of the event that are focused in its semantic representation. For instance the rule that governs the conative alternations in (3) and (4) picks out verbs whose semantic representations define a type of motion resulting in a type of contact. It converts a specific thematic core ‘X acts-on Y’ into ‘X goes toward acting-on Y’.

(3) Mary cut the bread. / Mary cut at the bread.
(4) Bill hit the dog. / Bill hit at the dog.

However, although motion and subsequent contact are also typically involved in the events of breaking and kissing, (5) and (6) do not allow such an alternation:

(5) Mary broke the bread. / *Mary broke at the bread.
(6) Bill kissed the child. / *Bill kissed at the child.

What is crucial here is that the alternation rule applies to a certain class of verbs that is defined by linguistically relevant semantic features. Motion and contact are semantic constituents of cut and hit, but they are not specified in the semantic representations of break and kiss, although they are typically involved in the conceptual representation of breaking and kissing events.

Hence the semantic classification does not necessarily reflect a general cognitive similarity or typicality of the referents, “it’s not what possibly or typically goes on in an event that matters; it’s what the verb’s semantic representation is choosy about in that event that matters” (Pinker 1989, 108). According to this analysis, the semantic constraints on alternation rules like these are linguistically, not conceptually determined.
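As a schematic rendering of this constraint, one can encode for each verb only what its semantic representation specifies, and let the alternation rule test those features. The feature labels below are my own shorthand for Pinker’s analysis, not an established feature inventory, and the entries follow the examples in (3) through (6).

```python
# Toy sketch of the conative-alternation constraint (illustrative only):
# the rule applies to verbs whose SEMANTIC representation specifies
# motion resulting in contact, not to verbs whose events merely
# typically involve them.

SF = {
    "cut":   {"motion", "contact"},   # specified in the verb's SF
    "hit":   {"motion", "contact"},
    "break": {"change-of-state"},     # motion/contact only conceptual
    "kiss":  {"contact-typical"},     # likewise not an SF constituent
}

def conative_ok(verb):
    """'X V at Y' is licensed iff SF defines motion resulting in contact."""
    return {"motion", "contact"} <= SF[verb]

for v in ("cut", "hit", "break", "kiss"):
    print(v, "allows the conative alternation:", conative_ok(v))
```

The design choice worth noting is that the conceptual typicality of motion and contact for breaking and kissing is simply absent from the representation the rule consults, which is the point of the argument.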

Semantic features that govern the distribution of numeral classifiers and nouns

In languages with a rich classifier system, numeral classifiers have a taxonomic effect on nouns; they are combined with classes of nouns that share certain aspects of their meaning. This classification is productive, and the distribution of novel nouns in cardinal classifier constructions can be determined by their meaning (cf. Carpenter 1991).

However, this nominal classification does not necessarily reflect a conceptual taxonomy. The combination of nouns and classifiers is conventional and to a large extent language-specific; it focuses on different aspects of the nominals’ referents and does not take into account conceptual features in a systematic way.5

In the same language, the classification can, among others, refer to different physical attributes of the nominal referent (shape, surface, size, …), to function, and to instrumental criteria; yielding taxonomies like ‘[round object] / [small object] / [pet] / [food] / …’ that do not make much sense in the conceptual system. Whether, for instance, a word for ‘dumpling’ in a classifier language belongs to the class [round object], [small object], or [food], is governed by linguistic criteria. Moreover, while conceptual classifications arguably remain the same, the semantic taxonomy underlying the distribution of nouns and classifiers can change diachronically. Among others, this can lead to conceptually incoherent classes like [animal or clothing or furniture], as is the case for the Thai classifier tua.6 The fact that these classes are not considered eccentric within the semantic system suggests an autonomy of grammatical semantics.

5 Cf. Wiese (2001) for a detailed discussion.

So, although the taxonomic effect of numeral classifiers relates to conceptual features of the nominal referents, the selection of those conceptual features that are relevant for the distribution of numeral classifiers and nouns is lexically, not conceptually governed. As a result the distribution of numeral classifiers and nouns is based on grammatically determined classifications that are dissociated from conceptual taxonomies.

‘Transnumerality’ and ‘individuation’ of nominals

Nominals can be differentiated by a feature [±tn] (‘transnumeral’) that accounts, among others, for their distribution in cardinal constructions.7 [+tn] nominals like cattle require a numeral classifier in counting constructions and are not marked for number, whereas plural [−tn] nominals like cows can be combined with a cardinal numeral directly; cf. (7) versus (8):

(7) six head of cattle (classifier construction: numeral – classifier – nominal [+tn]);
(8) six cows (plural construction: numeral – plural nominal [−tn]).

The syntactic [−tn] feature of plural nominals can be linked to a semantic ‘individuation’ that triggers a focus shift from a whole set to its individual elements: the semantic representation of nominals like cows, unlike that of [+tn] nominals (cattle), contains an individuation function. This individuation function makes nominal referents accessible for discrete quantification. Accordingly, cows can be combined with a cardinal directly. In contrast to this, in cardinal constructions with [+tn] nominals (like cattle) the individuation must be contributed by a classifier (for instance, head in (7)).8

However, this individuation does not add any lexical content. The grammatical distinction between cows and cattle, between noodles and spaghetti, or – cross-linguistically – between furniture and meubles does not reflect salient conceptual differences. Likewise, the classifier head in (7) does not add any additional content. The individuation in these cases is a linguistic, ‘semantic’ phenomenon, rather than a conceptual one.
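The complementary distribution in (7) and (8) can be sketched as follows. The lexicon entries are illustrative simplifications: a full account would place the individuation function in the nominal’s semantic representation rather than reduce it to a boolean flag.

```python
# Toy sketch of the [+/-tn] distinction in cardinal constructions:
# [+tn] (transnumeral) nominals such as 'cattle' need a classifier
# to supply individuation; [-tn] plurals such as 'cows' carry the
# individuation function in their own semantic representation.

LEXICON = {
    "cattle": {"tn": True,  "classifier": "head"},
    "cows":   {"tn": False, "classifier": None},
}

def count_phrase(n, noun):
    entry = LEXICON[noun]
    if entry["tn"]:
        # Individuation contributed by the classifier, as in (7).
        return f"{n} {entry['classifier']} of {noun}"
    # Individuation already present in the nominal, as in (8).
    return f"{n} {noun}"

print(count_phrase(6, "cattle"))
print(count_phrase(6, "cows"))
```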

3.3 Neurological evidence supporting a distinction of linguistic and non-linguistic aspects of meaning

Recent results from cognitive neuroscience support the discrimination of two systems for the representation of linguistic and non-linguistic aspects of meaning. In particular a series of studies due to David Kemmerer presents neurological and neurolinguistic evidence for dissociations of (i) grammatical semantics and (ii) conceptual representations that are invisible to the grammatical system (Kemmerer 1999; 2000a; 2000b). As Kemmerer shows, access to grammatically relevant semantic features and grammatically irrelevant conceptual features can be selectively impaired in brain-damaged patients; a dissociation that suggests distinct neurological underpinnings for the two levels.

Kemmerer (2000a) investigated the comprehension of conceptual and grammatical features of adjectives, testing subjects who suffered from left-hemisphere lesions. His results suggest that the patients had problems with grammatically relevant aspects of adjective meaning, but not with grammatically irrelevant conceptual features. They could not utilise functional distinctions like ‘descriptive’ versus ‘classifying’ and semantic categories like ‘size’ versus ‘dimension’ for the ordering of pre-nominal adjectives in English, whilst they had no problems interpreting conceptual features like ‘hot’ versus ‘cold’ that are invisible to syntax.

6 DeLancey (1986) gives a diachronic analysis of tua.
7 Cf. Greenberg (1974) for the cross-linguistic discussion of transnumerality in nouns.
8 For the discussion of individuation functions cf. for instance Krifka (1989), Wiese (1997a, 2001).

At the same time, they did well on grammaticality judgments requiring access to purely syntactic features like the position of adjectives in relation to the determiner and the noun (i.e., ‘Det–adjectives–Noun’ in English), suggesting that their problems with adjective order were not due to a syntactic impairment. Rather, their performance seems to indicate a selective impairment of linguistically relevant aspects of meaning, that is, of those structures that constitute the semantic system (Kemmerer 2000a).

Further support for this interpretation comes from a study with three brain-damaged patients whose performance suggests selective deficits relating to semantic versus conceptual features of verbs (Kemmerer 2000b). In this study, subjects performed two tasks: (i) a picture-matching task where they had to distinguish between conceptually related verbs such as pour, drip, and slosh, and (ii) a grammaticality judgment task where they had to assess sentences like (9):

(9) Sam poured/sloshed/*filled water into the glass.

The picture-matching task tested the comprehension of grammatically irrelevant conceptual distinctions (like manner of motion, as distinguished by pour versus slosh). In contrast to this, grammaticality judgments for sentences like (9) require access to semantic constraints: constraints similar to the ones discussed in 3.2, which govern verb alternations like ‘Mary cut/broke the bread’ vs. ‘Mary cut/*broke at the bread.’ (cf. Pinker 1989).

For sentences like (9) that have the form ‘NP1 V NP2 into NP3’, the constraint can be described as follows: a verb can only occur in these contexts if its semantic representation satisfies the schema ‘X causes Y to go to Z’ (where X is denoted by NP1, Y by NP2, and Z by NP3). This is the case for pour, but not for fill. Although both verbs refer to events that result in an increase of liquid within a container, only the semantic representation of pour specifies that the liquid is caused to move from one location to another location (the container), whereas the semantic representation of fill indicates a change of state for the container (namely from not full to full), but does not specify how this change is brought about. These semantic representations (and the language-specific constraints that draw upon them) have to be accessed for grammaticality judgments on sentences like (9), but not for the distinction of verbs like pour and slosh that makes use of grammatically irrelevant conceptual differences.
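The licensing condition can be caricatured in code: each verb’s semantic representation either contains the caused-motion schema ‘X causes Y to go to Z’ or it does not, and only the former verbs are grammatical in the into-frame of (9). The feature encoding is my own illustration of the pour/fill contrast described above, not an implementation of Kemmerer’s or Pinker’s formalism.

```python
# Toy sketch of the constraint on 'NP1 V NP2 into NP3' (illustrative):
# a verb is licensed iff its semantic representation satisfies
# 'X causes Y to go to Z'.

SF = {
    "pour":  {"caused-motion"},            # liquid caused to move to Z
    "slosh": {"caused-motion", "manner"},  # manner is grammatically inert
    "fill":  {"change-of-state"},          # endpoint state, no path specified
}

def into_frame_ok(verb):
    """License 'NP1 V NP2 into NP3' iff SF contains the caused-motion schema."""
    return "caused-motion" in SF[verb]

for v in ("pour", "slosh", "fill"):
    mark = "" if into_frame_ok(v) else "*"
    print(f"{mark}Sam {v}ed water into the glass.")
```

Note that the manner component distinguishing pour from slosh plays no role in the licensing function, mirroring the claim that it is conceptually real but grammatically irrelevant.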

Interestingly, the results of the study indicate a double dissociation of semantic versus conceptual features. Two of the subjects were able to distinguish conceptually related verbs in the picture-matching task, but were impaired in their judgments for sentences involving the semantic constraints illustrated in (9). The third patient performed poorly in the picture-matching task, but did well in the grammaticality judgments. As Kemmerer (2000b) points out, this suggests a dissociation of grammatically irrelevant conceptual distinctions and semantic features that establish grammatically relevant verb classes.

In view of this data, Kemmerer argues that grammatically relevant and grammatically irrelevant components of meaning can be segregated in the mind/brain; he interprets this as evidence for neurological underpinnings of a dissociation of (a) grammatical semantics and (b) general conceptualisations that are not visible for syntax (Kemmerer 2000a; 2000b).


4 Semantics as a linguistic interface for the conceptual system

To sum up the results from the preceding section, the differentiation of semantic and conceptual representations – a distinction that might be supported neurologically – accounts for the fact that languages access a specific view of the conceptual system, determining a linguistic structure of meaning. On the one hand the meaning of linguistic expressions is underspecified; their interpretation takes into account linguistic and extralinguistic contextual information. On the other hand, the meaning of a lexical item integrates different elements of CS and by doing so, focuses on semantic aspects that need not be conceptually salient. The lexicon generates linguistically relevant conceptual ensembles and establishes linguistically relevant classifications and configurations.

Hence semantic representations, as the elements of a linguistic level of meaning, have a somewhat dual status. On the one hand, they are grounded in conceptual representations: since semantic constants are defined by their interpretational value, that is, in terms of CS elements, SEM and CS do not constitute ontologically distinct entities. On the other hand, semantic representations are part of language: SEM represents exactly those aspects of meaning that are visible for the linguistic system; elements of SEM and classifications within SEM account for linguistically, but not necessarily conceptually, relevant structures.

In a model that does not provide a separate level for linguistic aspects of meaning, the burden to account for semantic phenomena lies on the links between CS and the linguistic system, in particular on links from CS to syntactic structures and the lexicon. As these links need to access linguistically relevant classes of CS entities, this means that we have to define classifications and configurations in CS that are governed linguistically, that is, we have to posit certain language-specific conceptual structures. As we have seen above, these structures need not be salient in terms of conceptual representations; linguistic classifications of meaning are essentially independent of conceptual taxonomies. It might hence be desirable to have a sharper distinction between linguistic and genuinely conceptual phenomena.

A way to distinguish linguistic aspects of meaning without positing a separate module for semantic structures is to integrate the semantic level into the conceptual system CS. This way we can treat semantics as a system in its own right, without neglecting the close correlation of semantic and conceptual representations.

This proposal is explored in the following section. My model can be described within the framework of a Tripartite Parallel Architecture (short: ‘TPA’) for the language faculty, as described in Jackendoff (1997). In accordance with this framework I assume three autonomous derivational systems for the generation of (a) phonological and phonetic, (b) syntactic, and (c) semantic and conceptual representations. Unlike the standard TPA approach, the model I put forward recognises an independent subsystem of grammatical semantics within the conceptual system. In particular, I give an account of SEM as an interface level of CS that prepares conceptual representations for linguistic structures.

My notion of interface levels is based on a definition of view functions as functions that operate on phonetic, syntactic and conceptual representations and generate linguistic interfaces in accordance with language-specific constraints; these interfaces are defined as relational structures. By introducing a class of view functions that generate semantics as a linguistic interface level of the conceptual system, the model I will put forward makes explicit the relationship between linguistic aspects of meaning and their conceptual basis, between semantically underspecified and flexible lexical items and their contextual specification.9 A function generating a semantic subsystem of CS is a linguistic view function that prepares CS entities for language; it determines which conceptual representations and configurations enter the lexicon, and how they are accessed by linguistic structures.

9 According to this view, in a dynamic approach to semantics it would be these linguistic-conceptual links, as organised by semantic view functions (the links between elements of the semantic level and conceptual knowledge), that provide the basis to update information states.

In a Generative Lexicon framework as introduced by Pustejovsky (1995), the semantic flexibility of lexical items is accounted for by enriched lexical representations involving ‘qualia structures’. These structures integrate aspects of conceptual information that are relevant for the generation and adjustment of meaning in complex constructions. From a Generative Lexicon point of view, the definition of semantic view functions I will put forward in the following paragraphs provides a framework to make explicit the generation of qualia structures from general conceptual structures: semantic view functions, as functions from conceptual to semantic representations, identify those elements that enter qualia structures in the representation of lexical items, and determine which conceptual processes can be accessed to integrate related concepts into the linguistic representation in the course of semantic composition.

I will spell out the operation of view functions for the example of nominal mass/count alternations in section 6 below, illustrating the way the model proposed here can account for the distinction and interaction of conceptual, semantic, and syntactic aspects involved in grammatical alternations.

4.1 Tripartite Parallel Architecture

The theory of a Tripartite Parallel Architecture (TPA) for the language faculty has been introduced in Jackendoff (1997). Following this approach, three modules, PS, SS and CS, are involved in the representation of linguistic structures, providing phonological, syntactic, and conceptual information, respectively. PS, SS and CS are autonomous derivational systems that are correlated by interface modules. An interface module consists of two (or more) interface levels and a set of correspondence rules. Interface levels are sets of representations within PS, SS and CS.

Correspondence rules govern exactly the interface level entities and establish (partial) homomorphisms between them. The interface level of the syntactic system to PS and CS is identified with S-structure. In addition to their correlation with the syntactic system, CS and PS interface with non-linguistic systems like auditory and motor systems, spatial information, and emotion.

Under this account the meaning of a linguistic item is represented within the conceptual system CS; correspondence rules correlate conceptual structures with linguistic structures directly. The lexicon is defined as a subset of the correspondence rules; a lexical item is a triple ⟨PS, SS, CS⟩. Hence the lexicon combines linguistic, namely syntactic and phonological, information (SS, PS) and conceptual information (CS).

It is in this respect that the model I put forward deviates from the standard TPA-model: based on the results of our discussion so far, the approach I present in the following sections introduces a level of grammatical semantics SEM that acts as a go-between for CS and the linguistic system. Under this account, conceptual representations do not enter lexical information directly, but only in the form of their semantic ‘proxies’. I will formalise this by working out the notion of interface levels that the TPA-framework provides.

4.2 Linguistic interface levels

Within the TPA framework, two interface levels are involved in the correlation of the syntactic and the conceptual system: (i) S-structure, the interface level of the syntactic system, and (ii) the interface level of CS to the linguistic system. The elements of the interface levels, and only these elements, are subject to correspondence rules between the two systems. Hence the interface level of CS is that part of the conceptual system that makes CS entities accessible for language.


Remember now that this is also the pivotal function of the semantic level SEM as discussed above; SEM mediates conceptual and linguistic structures. Hence, if we can identify SEM as the linguistic interface level of CS, we can capture crucial aspects of Two-Level Semantics within the TPA-model. This will provide us with a systematic distinction of linguistic and non-linguistic aspects of meaning, without committing us to the postulation of two modules for semantic and conceptual structures (cf. Wiese 1999). In the present section, I introduce a model that is based on this idea.

In accordance with a TPA framework I assume three modules for the representation of linguistic and conceptual structures. These modules are identified as PHON, SYN and CS; they represent (a) phonological and phonetic structures, (b) syntactic structures, and (c) conceptual-semantic structures, respectively. Each module m contains a linguistic interface level IL_m. The elements of two (or more) interface levels are connected by correspondence rules. Following the TPA approach, I regard the lexicon as a subset of these correspondence rules. I define a lexical entry as a triple ⟨π, σ, κ⟩, where π ∈ IL_PHON, σ ∈ IL_SYN, and κ ∈ IL_CS.

The correspondence rules establish homomorphisms between interface levels. A homomorphism f is a mapping between two relational structures s1 and s2. f maps the elements of s1 onto those of s2 and preserves the relations defined between them. The purpose of an interface level is now to make the elements of a module accessible for these homomorphisms. Accordingly I introduce interface levels as relational structures. These relational structures are generated by specific view functions that operate on the modules PHON, SYN and CS. A view function yields interface representations and focuses on specific relations between them. These relations are preserved by the homomorphisms that constitute links between the different modules.

Definition 1: View functions and interface levels

For every module m, where m ∈ {PHON, SYN, CS}, there is an identified view function ν_m^L whose target is IL_m^L, the interface level of m for a given language L, such that
• ν_m: m' → IL_m, where m' ⊆ ℘(m) [℘(m) is the power set of m], and
• IL_m is a relational structure, hence IL_m = ⟨A, R⟩, where A is a non-empty set of entities, and R is a non-empty set of relations in A.

According to this definition, a view function ν operates on a subset m' of ℘(m), the power set of a module m. ℘(m) contains all sets of elements of m. ν takes some of these sets (namely those that are elements of m') and maps them onto interface level representations (A), generating specific relations between them (R). This way, ν constitutes a relational structure ⟨A, R⟩.
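The mapping in Definition 1 can be sketched in code. The following is a toy illustration only, not part of the paper's formalism: the names (InterfaceLevel, view_function) are my own, and the MARE/HORSE fragment anticipates the Levelt et al. (1999) example discussed in section 4.3.

```python
# Illustrative sketch of Definition 1: a view function maps selected
# subsets of a module's elements (m' ⊆ ℘(m)) onto interface-level
# representations and generates relations between them.

class InterfaceLevel:
    """A relational structure <A, R>: entities plus named relations."""
    def __init__(self, entities, relations):
        self.entities = set(entities)   # A: non-empty set of entities
        self.relations = relations      # R: dict name -> set of tuples

def view_function(module_subsets, to_representation, relation_rules):
    """Map selected subsets of a module onto interface representations
    and generate relations over them (Definition 1)."""
    entities = {to_representation(frozenset(s)) for s in module_subsets}
    relations = {name: set(rule(entities))
                 for name, rule in relation_rules.items()}
    return InterfaceLevel(entities, relations)

# Toy module CS: two linguistically relevant conceptual ensembles,
# each mapped onto a semantic constant.
cs_subsets = [{"female", "horse"}, {"horse"}]
to_rep = lambda s: "MARE" if s == frozenset({"female", "horse"}) else "HORSE"
rules = {"animate": lambda ents: {(e,) for e in ents}}  # both are [+animate]

sem = view_function(cs_subsets, to_rep, rules)
print(sorted(sem.entities))   # ['HORSE', 'MARE']
```

The point of the sketch is that the same conceptual material can enter the interface level either as a complex ensemble (female horse → MARE) or not at all, depending on what the language-specific view function selects.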

The elements and relations of this relational structure enter a homomorphism that connects them with interface level representations from another module. The homomorphism is established by correspondence rules:

Definition 2: Correspondence rules

For given interface levels IL_m and IL_n, where m, n ∈ {PHON, SYN, CS}, and IL_m = ⟨A, {R1, ..., Ri}⟩, and IL_n = ⟨B, {S1, ..., Si}⟩:
f is a set of correspondence rules between IL_m and IL_n iff f is a homomorphism of IL_m into IL_n, such that
• for all a ∈ A: f(a) ∈ B, and
• for each i: if Ri is an n-ary relation and a1, ..., an are in A, then Ri(a1, ..., an) → Si(f(a1), ..., f(an)).


In accordance with Definition 1, the interface levels IL_m and IL_n in Definition 2 are given as relational structures, i.e., as ordered pairs consisting of a set of elements and a set of relations. The sets of elements are referred to as A and B, for IL_m and IL_n, respectively. The correspondence rules between the two interface levels are defined as the elements of a homomorphism f of IL_m into IL_n.10 Being a homomorphism, f maps each element of A onto an element of B, such that the relations that hold in A are preserved in B.
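The homomorphism condition of Definition 2 can be checked mechanically. The sketch below is my own illustration (the identifiers is_homomorphism, R_head_of etc. are invented), under the simplifying assumption that relations are given as finite sets of tuples.

```python
# Illustrative check of Definition 2: f is a set of correspondence rules
# iff it maps A into B and, for every paired relation (R, S),
# R(a1..an) implies S(f(a1)..f(an)).

def is_homomorphism(f, A, B, paired_relations):
    """f: dict from A-elements to B-elements.
    paired_relations: list of (R, S) pairs; R holds over A, S over B."""
    if any(f.get(a) not in B for a in A):        # every a in A maps into B
        return False
    for R, S in paired_relations:
        for tup in R:                            # relation preservation
            if tuple(f[a] for a in tup) not in S:
                return False
    return True

# Toy example: a hierarchical relation on SEM preserved in SYN.
A = {"EAT", "APPLE"}
B = {"V_eat", "N_apple"}
f = {"EAT": "V_eat", "APPLE": "N_apple"}
R_head_of = {("EAT", "APPLE")}            # EAT takes APPLE as argument
S_head_of = {("V_eat", "N_apple")}

print(is_homomorphism(f, A, B, [(R_head_of, S_head_of)]))   # True
```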

4.3 Definition of SEM as IL_CS

Within this framework, we can now identify SEM as the linguistic interface level of CS. In order to do so, we introduce a view function ν_SEM^L that operates on sets of CS elements and generates language-specific interface representations (for a given language L). By doing so, ν_SEM^L makes CS entities accessible for language; SEM provides sets of CS entities as potential meanings for linguistic expressions. This way, SEM entities serve as ‘linguistic proxies’ for CS representations.

Definition 3: Generation of SEM as the linguistic interface level of CS

For a given language L, ν_SEM^L is an identified view function that generates the interface level IL_CS^L of the conceptual system CS, and IL_CS = SEM, such that
• ν_SEM: CS' → SEM, where CS' ⊆ ℘(CS) [℘(CS) is the power set of CS], and
• SEM is a relational structure, SEM = ⟨A_SEM, R_SEM⟩, where
• A_SEM is a set of typed semantic representations, and for each α ∈ A_SEM, there is a C ∈ CS' such that ν_SEM(C) = α, and for each x ∈ C: there is a context CT such that Int(α, CT) = x [Hence α ∈ SEM, x ∈ CS. Int is a context-sensitive interpretation function from SEM to CS.];
• R_SEM is a set of relations in A_SEM.

ν_SEM in Definition 3 operates on a subset CS' of ℘(CS), where ℘(CS) is the set of all sets of CS elements. As CS' is a proper subset of ℘(CS), ν_SEM does not operate on all possible sets of CS elements; furthermore it does not necessarily take into account all elements of CS: not all concepts and ensembles of concepts have to be linked to linguistic expressions. The relevant elements of CS can be primitive as well as complex representations, depending on language-specific lexical patterns. Hence, taking the above example from Levelt et al. (1999), we can think of a view function ν_SEM^E for English that maps the complex conceptual representation female horse onto a semantic constant MARE, but does not provide a unitary element of SEM for the conceptual representation female elephant.

A SEM entry as generated by Definition 3 consists of a semantic representation (generated in accordance with the logical calculus) and its type. For each element α of SEM, ν_SEM identifies a set C of conceptual representations; C encompasses the possible interpretations for α. For instance, for the representation of the #1 from (1) above, ν_SEM^E provides a set C = {NL, NR} (the conceptual representations of numerical label and numerical rank) as possible interpretations for ‘#’, as represented by the semantic constant NU. For a given context CT, an interpretation function Int maps α onto a specific element x of C; for instance, Int maps NU onto NL in ‘the #1 bus’, and onto NR in ‘the #1 in my life’ (cf. (1a) and (1b) in 3.1 above). Hence, as in Two-Level models, Int correlates underspecified semantic representations with conceptual interpretations; Int specifies the information in C, taking into account contextual information.

10 Links between two interface levels are bi-directional, of course. The definition of correspondence rules above focuses on the mapping from IL_m into IL_n, leaving it open whether the same or a different homomorphism is to be employed for correspondences in the other direction (that is, from IL_n into IL_m).


On the other hand, semantic representations are correlated with linguistic (in particular syntactic) representations. ν_SEM lays the grounds for this correlation by defining a set R_SEM of relations that hold between the elements of SEM. R_SEM accounts for semantic classifications. Note that the distinctions that define the relevant classifications are linguistically motivated. As the examples above illustrated, distinctions like [± animate] that are reflected by grammatical phenomena do not necessarily observe conceptual taxonomies: certain conceptual differentiations, but not others, are linguistically relevant, and the relevant features need not be conceptually salient.

Hence R_SEM^P, as established by a semantic view function ν_SEM^P for Persian, would compile semantic constants relating to humans and animals, like WOMAN and HORSE, together with TREE as [+ animate], but would exclude constants like FLOWER, which are classified as [– animate] together with HOUSE etc., in accordance with the grammatical constraints discussed above. Table 1 illustrates the dissociation of conceptual and grammatical classifications, and the definition of a semantic [± animate] taxonomy by ν_SEM^P in Persian (in the original graphic, pictures stand for conceptual representations, while capitalised words stand for semantic constants):

SEM:  [– animate]: HOUSE, FLOWER         [+ animate]: TREE, HORSE, WOMAN
CS:   inanimate:   house, flower, tree   animate:     horse, woman

Table 1: Access to conceptual features for a semantic taxonomy: [± animate] in Persian
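The dissociation in the table can be made concrete in code. This is an illustrative fragment only: the class assignments follow the Persian table above, but the dictionary encoding and function name are my own.

```python
# Sketch: a fragment of a Persian-style semantic view function whose
# [± animate] classification diverges from the conceptual taxonomy
# (TREE is grammatically [+animate] although conceptually inanimate).

CONCEPTUALLY_ANIMATE = {"horse", "woman"}       # CS taxonomy

SEM_ANIMATE_P = {"tree", "horse", "woman"}      # linguistically governed class

def nu_sem_P(concept):
    """Map a concept onto a semantic constant with its [animate] feature."""
    feature = "+animate" if concept in SEM_ANIMATE_P else "-animate"
    return (concept.upper(), feature)

for c in ["house", "flower", "tree", "horse", "woman"]:
    print(nu_sem_P(c))
# 'tree' yields ('TREE', '+animate') although it is not in
# CONCEPTUALLY_ANIMATE: the semantic classification is linguistically,
# not conceptually, motivated.
```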

The relations that constitute R_SEM identify the argument structure of lexical items. Conceptual interpretations restrict the upper number of arguments; the actual number is specified in accordance with the item's syntactic combinatorial potential, and is indicated by λ-bound positions in the semantic representation (and is reflected by its type). Correspondence rules between SEM and SYN constitute a homomorphism f_SEM of ⟨A_SEM, R_SEM⟩ into IL_SYN, the interface level of SYN, that preserves the hierarchical order defined by R_SEM (in accordance with Definition 2).

A view function as defined in Definition 3 operates on a language-independent conceptual system CS and generates language-specific representations that constitute an interface level IL_CS^L for a particular language L. As the Persian and English examples illustrated, this implies that there can be different view functions ν_SEM^L1, ν_SEM^L2 etc. for different languages L1, L2, which operate on the same conceptual system CS. It also means that semantic systems are not only specific for language per se, but that they can also have idiosyncratic features that are specific for a particular language or for a particular language family.

However, Definition 3 does not exclude the possibility that there are also universal features of view functions ν_SEM. The definition can account for the fact that there are universal semantic structures as well as idiosyncratic ones (as is also the case for syntactic and phonological structures). Take contiguity constraints operating on colour terms as an example. These constraints are presumably universal; they have the effect that only contiguous sectors of the colour spectrum are lexicalised. As a result, there is for instance no colour term for red and green that does not include yellow.11 Under the account put forward here, such a phenomenon can now be identified as a universal constraint on view functions to the effect that for any language L, the view function ν_SEM^L discounts those conceptual configurations that represent discontiguous sectors of the colour spectrum.

By defining SEM as a system in its own standing, the definition is in accordance with the neurological evidence discussed in section 3.3 above; it can account for dissociations of conceptual and grammatical knowledge: according to the model proposed here, SEM is a particular relational structure that – while being part of CS and generated by a view function that operates on CS – is generated in accordance with linguistic constraints, that is, constraints that are autonomous from conceptual phenomena. As a result, SEM and CS proper constitute autonomous subsystems, with independent and possibly divergent organisations.

5 Implications: Semantics on a par with phonology

Based on the notion of interface levels and view functions provided by Definition 1, we can define Phonology as the linguistic interface level of PHON. I call this interface level ‘PHOL’. PHOL is generated by a view function ν_PHOL from phonetic to phonological representations, as formalised in Definition 4.

Definition 4: Generation of PHOL as the linguistic interface level of PHON

For a given language L, ν_PHOL^L is an identified view function that generates the interface level IL_PHON^L of the phonetic system PHON, and IL_PHON = PHOL, such that
• ν_PHOL: PHON' → PHOL, where PHON' ⊆ ℘(PHON), and
• PHOL is a relational structure, PHOL = ⟨A_PHOL, R_PHOL⟩, where
• A_PHOL is a set of phonological representations, and for each α ∈ A_PHOL, there is a C ∈ PHON' such that ν_PHOL(C) = α, and for each x ∈ C: there is a context CT such that ρ_PHON(α, CT) = x [Hence α ∈ PHOL, x ∈ PHON. ρ_PHON is a context-sensitive function from phonological to phonetic representations.];
• R_PHOL is a set of relations in A_PHOL.

PHOL is derived from PHON in a way parallel to the derivation of SEM from CS: PHOL is generated by a view function ν_PHOL that operates on a subset PHON' of ℘(PHON) whose elements are sets of PHON entities. ν_PHOL yields phonological representations. On the phoneme level, ν_PHOL operates on sets of allophones. The choice of a specific allophone in a given context is governed by rules that derive phonetic representations from phonological representations. I refer to the set of these rules as ‘ρ_PHON’ in Definition 4.

ρ_PHON is a counterpart of the interpretation function Int from Definition 3. In order to indicate this parallelism, we can refer to Int as ‘ρ_CS’, the set of (context-sensitive) rules that derive conceptual from semantic representations. The view functions ν_PHOL and ν_SEM generate underspecified phonological and semantic representations as part of lexical entries; ρ_PHON and ρ_CS specify this information by mapping it onto phonetic or conceptual representations, respectively.
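The parallelism can be sketched for the phonological side. The English /t/ allophony used below is a standard textbook illustration, not an example from this paper, and all function names are my own.

```python
# Sketch: a view function collapses a set of allophones into one
# underspecified phoneme; a context-sensitive rule (counterpart of Int)
# then picks the phonetic realisation.

ALLOPHONES = {"/t/": {"[th]", "[t]", "[ʔ]"}}   # aspirated, plain, glottalised

def nu_phol(phones):
    """View function: map a set of allophones onto a phoneme (here /t/)."""
    assert phones == ALLOPHONES["/t/"]
    return "/t/"

def rho_phon(phoneme, context):
    """Context-sensitive choice of an allophone."""
    if context == "syllable-initial, stressed":
        return "[th]"
    if context == "after /s/":
        return "[t]"
    return "[ʔ]"

phoneme = nu_phol({"[th]", "[t]", "[ʔ]"})
print(rho_phon(phoneme, "syllable-initial, stressed"))   # [th] (as in 'top')
print(rho_phon(phoneme, "after /s/"))                    # [t]  (as in 'stop')
```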

Table 2 illustrates these relationships: the stars labelled ‘α’ are interface representations in PHOL or SEM. They are linked to a set C = {x1, x2, x3} of PHON or CS representations by the view functions ν_PHOL and ν_SEM, respectively. For a given context, ρ_PHON and ρ_CS map α onto an element of C (in the example in Table 2, α is mapped onto x2, as indicated by a bold arrow; dotted arrows indicate possible specifications in other contexts).

11 For a discussion of lexical and conceptual aspects of contiguity constraints cf. Bickerton (1990).


[Graphic: for each of PHOL and SEM, a star labelled α is linked to representations x1, x2, x3; a bold arrow singles out x2, dotted arrows mark the alternatives]

Table 2: Generation of underspecified lexical information (ν), and contextual specification (ρ)

Like semantic information, phonological information is underspecified in terms of phonetic representations. And like semantic information, phonological information is part of lexical entries, and is language-specific. The view function ν_PHOL prepares phonetic representations for the grammatical system, just like ν_SEM prepares conceptual representations for the grammatical system.

Due to this intermediary function, ν_SEM and ν_PHOL observe both language-specific constraints and universal constraints that can be grounded in the systems feeding CS and PHON. Language-specific constraints are evidenced in the generation of semantic or phonological representations of particular lexical entries, say, MARE in SEM_E and its phonological representation in PHOL_E for English, or in language-specific classifications like [± animate] or [± aspirated] that can have a different impact and different boundaries within SEM_L1 or SEM_L2 and PHOL_L1 or PHOL_L2 for different languages L1 and L2. Examples of universal constraints are the contiguity constraints on colour terms in SEM (which are grounded in our conceptualisation of the colour continuum as represented by the visual system), and constraints in PHOL that reflect anatomical limitations and rule out phonemes that are based on, say, pharyngeal nasals.

Both PHON and CS interface with non-linguistic systems: CS interacts with mental modules that represent spatial and visual information, emotion, and others.12 The phonetic system has interfaces to auditory and motor systems: on the one hand phonetic representations provide an analysis for acoustic events (in the case of sign languages: visual events), on the other hand they serve as a basis for the motoric plan in speech production.

Another feature that sets phonetics on a par with the conceptual system, and phonology on a par with semantics, is the gradience vs. non-gradience of rules. Phonetic rules are gradient, while phonological rules are not. This is paralleled in CS: conceptual features are typically based on prototypes or ‘best examples’, whereas semantic classifications are presumably non-gradient and govern grammaticality judgments like those discussed for (9) above.

Hence the definition of SEM and PHOL as parallel linguistic interface levels for CS and PHON, respectively, is supported by a number of shared substantial features. Table 3 summarises some of the parallels between the systems:

12 For a discussion of the interaction of CS with non-linguistic modules see Jackendoff (1992; 1997).



                                         PHOL  PHON  SEM  CS
underspecified, in lexical information     +     –     +    –
language-specific                          +     –     +    –
interfaces with non-linguistic systems     –     +     –    +
gradient rules                             –     +     –    +

Table 3: Parallels between PHOL/PHON and SEM/CS

While semantic representations mediate the correlation of conceptual and syntactic representations, phonological representations mediate the correlation of phonetic and syntactic representations. The syntactic system is essential for the translation of hierarchical order into linear order, and vice versa: the homomorphism correlating syntax and semantics focuses on hierarchical order, whereas the one that correlates syntax and phonology preserves the linear order between the elements:

PHOL  ←– linear order –→  SYN  ←– hierarchical order –→  SEM

Table 4: Correlation of linear order and hierarchical order by SYN

I have not specified the interface level of syntax, IL_SYN, in this graphic. Within the present approach, we can think of IL_SYN as an enriched version of S-structure. In order to provide a basis for the correlation with SEM, IL_SYN includes functional information, in particular notions like ‘subject’, ‘object’, and ‘modifier’. This information can be defined in terms of syntactic f-structure representations along the lines of Bresnan (2000).

If we follow a view of symbolic cognition as proposed by Deacon (1997), the interface levels that our definition identifies are those elements of the linguistic system that establish a basis for symbolic reference, as a key feature of the human language faculty. Deacon argues that the main step in the emergence of human language (as opposed to animal communication systems) is the development of a symbolic system; in a process of co-evolution of language and the brain, the adaptation of our brain to symbolic thinking gave rise to the development of the linguistic capacity. According to this view, symbolic reference – as opposed to iconic and indexical reference – is mediated by relations between signs: symbols refer to objects not as individual tokens, but with respect to their position in a system.

Under this account symbolic reference, as the basis of human language, is crucially a link between relations (sign-sign and object-object), not between individuals (signs and objects); it is hence based on the association of relational structures. It is this kind of association that our account of interface levels provides: the systematic translation between elements of relational structures that serve as interfaces for modules of different formats.

6 Illustration: The operation of language-specific view functions in some mass/count alternations in English and Kurdish

In the present section, I illustrate the operation of view functions and the distinction of conceptual and linguistic phenomena within the present approach, with the example of ‘grinder constructions’, a kind of mass/count alternation. A well-known phenomenon from the domain of mass and count nominals is that, cross-linguistically, nouns that usually refer to objects, like chicken, can occur in constructions where they behave like mass nouns and denote food, namely the substance that the (edible parts of the) objects in question consist of; cf. the English examples in (10) vs. (11):

(10) There {is a chicken / are chickens} in the yard.   count noun: object(s)
(11) There is chicken in the soup.                      mass noun: substance

The nominal transition can be blocked in cases like English cow or pig, where lexical items exist – namely beef and pork – that denote the corresponding substances (namely the meat of cows or pigs, respectively).

This nominal reference shift from object (‘a chicken / chickens’) to substance (‘chicken’) can not only be observed in singular-plural languages like English, but also in languages where nominal plural marking is absent or not compulsory, in other words: in languages where all nominals are transnumeral. As an instance of a language with predominantly transnumeral nominals, I discuss Kurdish (Sorānī).

As the examples in (10) and (11) illustrate, in singular-plural languages the transition in meaning is accompanied by a morpho-syntactic alternation: whereas in (10) chicken, as an object-denoting noun, is marked for plural or accompanied by an indefinite article, in (11), when referring to a substance, it is used as a bare noun, without number marking. The shift from object to substance is reflected by a morpho-syntactic shift from [– tn] (‘not transnumeral’: the nominal is marked for number or accompanied by an article) to [+ tn] (‘transnumeral’: no number marking or indefinite article).

As (12) and (13) show, no such correlation is evident in Kurdish: the conceptual shift from object to substance is not reflected in the grammatical system, i.e., object- and substance-denoting nouns alike are transnumeral:

(12) masi-m kri                                transnumeral noun: object(s)
     fish-1.SG.ERG bought
     ‘I bought {a fish / fishes}.’

(13) xordn-aka be masi-a                       transnumeral noun: substance
     eat-DEF without fish-is
     ‘The food is without fish.’

A singular-plural language like English can also have some nouns, like cattle, that do not refer to a substance, but are not marked for number either. These nouns are lexically transnumeral, similarly to Kurdish nouns.

In sum, an analysis of count/mass alternations has to account for the following phenomena: (a) there are conceptual relations between objects and certain substances, namely the substances that constitute their edible parts (‘foodstuff’); (b) across languages, some lexical items can denote both an object and the corresponding foodstuff (English: chicken; Kurdish: masi), while others cannot (English: cow, beef); (c) the distinction of object and substance can correspond to a morpho-syntactic distinction [± tn] in some languages for some nouns (English: chicken / chickens), but not for all nouns (English: cattle), and not in all languages (Kurdish: masi).

To account for these phenomena within the model outlined above, I introduce lexical entries for the English noun chicken (E 1) and the Kurdish noun masi (K 1), as examples for nouns that undergo the reference shift from object to substance, and entries for cow (E 2) and beef (E 3) as examples for nouns that do not. Each lexical entry is a triple consisting of a phonological representation, a syntactic category, and a semantic representation:


(I) Lexical entries for some English and Kurdish nouns:
(E 1) ‘chicken’: ⟨/chicken/, N, CHICKEN⟩
(E 2) ‘cow’: ⟨/cow/, N, COW⟩
(E 3) ‘beef’: ⟨/beef/, N, BEEF⟩
(K 1) ‘masi’: ⟨/masi/, N, FISH⟩

In accordance with the definitions above, the semantic constants (CHICKEN, COW, BEEF, FISH) are correlated with conceptual representations via view functions that operate on the conceptual system CS and yield language-specific semantic representations for English and for Kurdish. I call these view functions ν_SEM^E and ν_SEM^K, for English and Kurdish respectively. For our examples, the relevant arguments of ν_SEM^E and ν_SEM^K are the concepts chicken, cow, and fish and the corresponding food concepts (representing the ‘foodstuff’, namely the substances that the edible parts of the respective objects consist of).

To account for the relations between objects and foodstuff, I distinguish two domains within CS: A, the domain of objects, and M, the domain of substances. On this basis, we can define a CS-function u_g (‘Universal Grinder’) that maps elements of A onto elements of M. For an element x of A, u_g(x) is the substance the edible parts of x consist of; u_g(x) is an element of the subdomain M_food of M that contains representations of food. u_g as a CS-function is language-independent.

(II) CS-function (language-independent):
u_g: {x ∈ A | u_g(x) ∈ M_food; u_g(x) = s(e(x))}; such that:
  for every object x, e(x) represents the edible parts of x, and
  for every entity y: s(y) represents the substance y consists of.

Realisations of the concepts chicken, cow, and fish belong to the domain of objects, A. According to (II), they are related to conceptual representations in M_food, namely conceptualisations of the substances their edible parts consist of: u_g(chicken), u_g(cow) and u_g(fish).

The view functions ν_SEM^E and ν_SEM^K now take the concepts chicken, cow and fish, as well as their counterparts u_g(chicken), u_g(cow) and u_g(fish), as arguments, and map them onto the semantic constants CHICKEN, COW, BEEF, and FISH. In the case of English chicken, both chicken and u_g(chicken) are possible interpretations for CHICKEN, instantiating a general rule of count/mass meaning alternations. The same holds for the lexical entry masi (‘fish’) in Kurdish. In the case of English cow, the transition is blocked; ν_SEM^E relates the semantic constant COW only to cow, mapping u_g(cow) onto BEEF, the semantic representation for beef.

(III) Language-specific view functions on CS, generating SEM representations for English and Kurdish:

ϑSEM-E: CS' → SEM-E; ϑSEM-K: CS' → SEM-K

ϑSEM-E({chicken, u_g(chicken)}) = CHICKEN
ϑSEM-K({fish, u_g(fish)}) = FISH
ϑSEM-E({cow}) = COW; ϑSEM-E({u_g(cow)}) = BEEF

instantiating a cross-linguistic default rule:

{ϑSEM(x) = X, α ∈ x | u_g(α) ∈ x} by default
(iff u_g(α) is provided by CS)
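
Assuming concepts can be represented as string tokens, the language-specific mappings in (III) can be sketched as finite functions from sets of conceptual interpretations to semantic constants. The dictionary encoding below is an illustrative device, not the author's formalism.

```python
# Sketch of the view functions in (III): finite maps from sets of conceptual
# interpretations to semantic constants. Frozensets encode the argument sets;
# tokens like 'u_g(chicken)' are illustrative labels for ground concepts.

SEM_E = {  # English view function
    frozenset({"chicken", "u_g(chicken)"}): "CHICKEN",
    frozenset({"cow"}): "COW",
    frozenset({"u_g(cow)"}): "BEEF",
}

SEM_K = {  # Kurdish view function
    frozenset({"fish", "u_g(fish)"}): "FISH",
}

def view(sem, concepts):
    """Apply a view function to a set of conceptual interpretations;
    returns None if the language provides no semantic constant for it."""
    return sem.get(frozenset(concepts))

# English lumps the animal and its ground counterpart under CHICKEN,
# but splits cow vs. beef; Kurdish lumps fish and its ground counterpart.
print(view(SEM_E, {"chicken", "u_g(chicken)"}))  # CHICKEN
print(view(SEM_E, {"u_g(cow)"}))                 # BEEF
print(view(SEM_K, {"fish", "u_g(fish)"}))        # FISH
```

Note how the encoding makes the blocked transition visible: {cow, u_g(cow)} is not in the domain of the English map, so cow and its ground counterpart never share one semantic constant.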


Both view functions access the CS distinction of domains for objects (A) and substances (M): semantic representations in English and Kurdish are classified as [+ mn] if they are interpreted (in a given context) by an element of M, and as [– mn] if they are interpreted by an element of A. This classification yields semantic sorts in the sense of Dölling (1995).

(IV) Classification of SEM representations (relating to the domain of their conceptual interpretations):
{Int(x) ∈ A ∪ M | Int(x) ∈ M ↔ x [+ mn]} for SEM-E and SEM-K

The classification can be employed by correspondence rules between the semantic level and the syntactic system SYN. In a singular-plural language like English, the default correspondence of semantic and morpho-syntactic features is [+ mn] ↔ [+ tn] and [– mn] ↔ [– tn]: by default, nominals that refer to substances are transnumeral, whereas object-denoting nominals are not.

(V) SYN/SEM correspondence rule for English:
{x ∈ SEM-E, y = SemSyn-E(x) | x [+ mn] ↔ y [+ tn]} by default

[SemSyn-E links elements of SEM-E to their counterparts in SYN]

This correlation is, among others, used as a clue to determine reference in first language acquisition (cf. Bloom 1994). However, the two features are still differentiated, and some collective nouns, like cattle, deviate from this default rule without posing serious problems in acquisition (cf. Gordon 1985). In contrast to the correspondence rule formalised above, cattle refers to objects [– mn], but is not marked for number [+ tn]. Accordingly, the SYN/SEM correspondence is defined as a default relation; it does not apply if the noun is already marked for [+ tn, – mn] in its lexical entry. This can now be utilised in the case of cattle:

(E 4) cattle: </cattle/, N[+ tn], COW>

According to (E 4), cattle is specified as transnumeral as part of the syntactic information in its lexical entry, which allows it to receive a semantic representation [– mn], deviating from the default correspondence stated in (V): COW is interpreted by an element from A, namely cow (cf. the definition of ϑSEM-E in (III)), hence it is [– mn] according to the classification in (IV).
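
The interaction between the default correspondence in (V) and the lexical override in (E 4) can be sketched as a small decision rule. The entry format and the boolean feature encoding below are simplified assumptions for illustration, not the lexical representation the model itself posits.

```python
# Sketch of the default [+mn] <-> [+tn] correspondence in (V) together with
# the lexical override in (E 4). An entry is a simplified triple:
# (phonology, lexical tn-specification or None, mn-feature of its SEM constant).

def tn(entry):
    """[±tn] for a noun: a lexical specification (as for cattle) preempts
    the default correspondence with the semantic feature [±mn]."""
    phon, lexical_tn, mn = entry
    if lexical_tn is not None:
        return lexical_tn          # override, e.g. cattle: N[+tn]
    return mn                      # default: [+mn] <-> [+tn]

chicken_mass = ("/chicken/", None, True)   # [+mn] substance reading -> [+tn]
cow          = ("/cow/",     None, False)  # [-mn] object reading    -> [-tn]
cattle       = ("/cattle/",  True, False)  # lexically [+tn] although [-mn]

for entry in (chicken_mass, cow, cattle):
    print(entry[0], "[+tn]" if tn(entry) else "[-tn]")
```

The defeasibility of (V) is captured by the order of the two clauses: the lexical specification is checked first, so cattle comes out [+ tn] even though its semantic representation is [– mn].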

The default correspondence of a semantic feature [mn] and a syntactic feature [tn] is relevant in the derivation of semantic representations for full noun phrases in English. A nominal like chickens in (10) above is syntactically [– tn] (it is marked for plural). Correspondingly, the semantic representation is classified as [– mn], and as a result (according to the correlation of [± mn] and conceptual domains defined in (IV)) CHICKEN is interpreted by chicken, an element of the CS-domain A. If, on the other hand, the nominal is syntactically [+ tn], like chicken in (11) above, we get a [+ mn] representation, and CHICKEN is interpreted by the complex concept u_g(chicken), an element of Mfood.

In Kurdish, by contrast, no such correspondence between [± tn] and [± mn] holds: all nouns are [+ tn]. The classification of a nominal as [+ mn] or [– mn], i.e. the interpretation of a semantic constant like FISH by fish (CS-domain A) or u_g(fish) (CS-domain M), is not reflected by morpho-syntactic features. This has the effect of making a nominal like masi ('fish') potentially ambiguous. Accordingly, in Kurdish constructions where reference to the food is intended, a specification like gosht-i, 'meat of', is often (but not necessarily) added, cf. (14):

(14) xordn-aka be (gosht-i) merishk-a
     eat-DEF without meat-of chicken-is
     'The food is without (the meat of) chicken.'

These examples from the nominal mass/count domain illustrate the way view functions operate on CS and generate language-specific representations. View functions have access to language-independent conceptual domains (like A, the domain of objects, and M, the domain of substances) and to conceptual processes and functions like u_g that operate on the elements of these domains. The semantic representations they generate for nouns like chicken, cow, cattle, beef, and masi are language-specific and can be underspecified in terms of conceptual representations.

View functions like ϑSEM-E (for English) and ϑSEM-K (for Kurdish) establish links between semantic constants (like CHICKEN, COW, BEEF, and FISH) and their possible conceptual interpretations, taking into account language-specific aspects of meaning (like differential access to conceptual representations of objects and their foodstuff counterparts), and laying the grounds for correspondences between semantic and syntactic features (like [± tn] and [± mn]) in the derivation of linguistic representations.

This way, we can think of SEM as a gateway to language for the conceptual system. Via SEM, conceptual representations enter linguistic structures; SEM prepares CS entities for language and organises the access of lexical items to conceptual processes (like the 'Universal Grinder') and the correlation of syntactic distinctions (like [± tn]) with conceptual distinctions (like A versus M).

7 Summary: SEM as a gateway to language

The model I presented in the preceding sections has the following elements:

Tripartite Parallel Architecture (Jackendoff 1997): Three autonomous generative systems, the modules PHON, SYN and CS, are involved in the derivation of linguistic representations. The modules are connected by way of their interfaces: interface level representations are subject to correspondence rules between PHON, SYN and CS. The set of correspondence rules between two interface levels constitutes a homomorphism. The lexicon is a subset of these correspondence rules.

View functions and linguistic interface levels: Linguistic interface levels are specific relational structures within the modules; they constitute systems with an autonomous, linguistically determined organisation. These interface levels are generated in accordance with language-specific constraints by view functions that operate on the modules.

Semantics as the linguistic interface level of CS: The semantic system SEM-L of a given language L is a relational structure generated by a view function ϑSEM-L that operates on (sets of) conceptual representations. ϑSEM-L yields linguistically relevant conceptual ensembles and those relations between them that are visible to the grammatical system. By doing so, the view function also organises access of linguistic structures to CS processes in the course of semantic alternations (like mass/count alternations).

Phonology as the linguistic interface level of PHON: The phonological system PHOL-L of a given language L is a relational structure generated by a view function ϑPHOL-L that operates on phonetic representations.

Syntax as a mediating system: The homomorphism correlating phonology and syntax focuses on linear order; the one correlating semantics and syntax focuses on hierarchical order. SYN translates between linear and hierarchical order.

Within this approach, semantic considerations are integrated into a broader model of linguistic subsystems and their association with non-linguistic mental systems. Subsuming semantics under a unified notion of linguistic interfaces, this model aims to provide a new perspective on the distinction and interaction of conceptual and linguistic processes and the correlation of semantic and syntactic structures.

Table 5 illustrates this view of the different modules and interfaces (the grey part of each module indicates its interface level IL; ϑPHOL and ϑSEM represent the view functions generating phonology and semantics as the linguistic interface levels of PHON and CS):


Table 5: Semantics as the linguistic interface level of CS within a TPA framework

As the graphic illustrates, in this model SEM does not constitute a separate module, in accordance with a TPA framework. However, and consistent with the assumptions in Two-Level models, SEM does constitute a specific system, namely the linguistic interface level of CS. SEM is that part of CS that makes CS entities accessible for correspondence rules to the syntactic module, designed to account for the underspecification of lexical items and for linguistic classifications of meaning.

Hence, in the model advocated here, conceptual information does not enter the lexicon directly, but via linguistically motivated semantic representations. As a consequence, the lexicon does not contain non-linguistic information. On the other hand, the location of SEM within the conceptual system CS accounts for the close interaction between conceptual structures and lexical semantic structures in language acquisition and representation. In addition, this '2 in 1' approach reflects the fact that SEM and CS do not consist of ontologically distinct entities; we do not posit a semantic system that on the one hand constitutes a separate module, but on the other hand would have to be defined essentially in terms of CS entities.

The notion of interface levels and the definition of SEM as the linguistic interface level of CS make explicit the way conceptual structures enter language. This gives us a handle on the relations between semantic and conceptual structures and the fine-tuning of meaning by conceptual interpretations. By doing so, the model accounts for dissociations of grammatical semantics and general conceptual knowledge. It acknowledges linguistic aspects of meaning as grounded in conceptual representations, but characterises them as forming a separate system in its own right, with an organisation that does not necessarily reflect conceptual structures.

This allows us to discuss conceptual processes and configurations separately from linguistic structures, and to analyse the interaction between them. As I have shown with the example of mass/count alternations, the model enables us to distinguish conceptual, semantic, and syntactic aspects involved in the generation of meaning, and to capture the differential access to CS-processes that different lexical items or different languages have.

To sum up my results, I hope to have shown that the approach developed here provides an explicit framework (a) to distinguish conceptual and linguistic processes involved in the generation of meaning and to analyse the relations holding between them, (b) to account for dissociations of linguistic aspects of meaning and those conceptual structures that are invisible to grammar, and (c) to acknowledge semantics and phonology as parallel systems within the architecture of the language faculty, namely as interface systems that provide gateways to language for conceptual and phonetic representations, respectively.

References

Bickerton, Derek (1990). Language and Species. Chicago, London: University of Chicago Press.
Bierwisch, Manfred & Robert Schreuder (1992). From concepts to lexical items. Cognition 42, 23-60.
Bierwisch, Manfred (1983). Semantische und konzeptuelle Repräsentation lexikalischer Einheiten. In: R. Ruzicka & W. Motsch, eds., Untersuchungen zur Semantik [studia grammatica XXII], 61-99. Berlin: Akademie.
Bierwisch, Manfred (1989). Focussing on dimensional adjectives: Introductory remarks. In: M. Bierwisch & E. Lang, eds., Dimensional Adjectives: Grammatical Structure and Conceptual Interpretation, 1-11. Berlin et al.: Springer.
Bierwisch, Manfred (1996). Lexical information from a minimalist point of view. In: C. Wilder, H.-M. Gärtner & M. Bierwisch, eds., The Role of Economy Principles in Linguistic Theory, 227-266. Berlin: Akademie.
Bloom, Paul (1994). Syntax-semantics mappings as an explanation for some transitions in language development. In: Y. Levy, ed., Other Children, Other Languages: Issues in the Theory of Language Acquisition, 41-75. Hillsdale, NJ: Erlbaum.
Bresnan, Joan (2000). Lexical-Functional Syntax. Oxford: Blackwell.
Carpenter, Kathie (1991). Later rather than sooner: Extralinguistic categories in the acquisition of Thai classifiers. Journal of Child Language 18(1), 93-113.
Chomsky, Noam (1995). The Minimalist Program. Cambridge, Mass.: MIT Press.
Comrie, Bernard (1989). Language Universals and Linguistic Typology: Syntax and Morphology. 2nd edition. Chicago: University of Chicago Press.
Dahl, Östen & Kari Fraurud (1996). Animacy in grammar and discourse. In: T. Fretheim & J. K. Gundel, eds., Reference and Referent Accessibility, 47-64. Amsterdam: Benjamins.
Deacon, Terrence W. (1997). The Symbolic Species: The Co-Evolution of Language and the Brain. New York: Norton & Co.
Dölling, Johannes (1995). Ontological domains, semantic sorts and systematic ambiguity. International Journal of Human-Computer Studies 43, 785-807.
Gelman, Susan A. & Gail M. Gottfried (1996). Children's causal explanations of animate and inanimate motion. Child Development 67(5), 1970-1987.
Gordon, Peter (1985). Evaluating the semantic categories hypothesis: The case of the count/mass distinction. Cognition 20, 209-242.
Greenberg, Joseph H. (1974). Numeral classifiers and substantival number: Problems in the genesis of a linguistic type. In: L. Heilmann, ed., Proceedings of the 11th International Congress of Linguists, Bologna-Florence, Aug. 28 - Sept. 2, 1972, 17-37. Bologna: Mulino.
Jackendoff, Ray S. (1992). Languages of the Mind. Cambridge, Mass.: MIT Press.
Jackendoff, Ray S. (1997). The Architecture of the Language Faculty. Cambridge, Mass.: MIT Press.
Keil, Frank C. (1985). Concepts, Kinds, and Cognitive Development. Cambridge, Mass.: MIT Press.
Kemmerer, David (1999). "Near" and "far" in language and perception. Cognition 73(1), 35-63.
Kemmerer, David (2000a). Selective impairment of knowledge underlying prenominal adjective order: Evidence for the autonomy of grammatical semantics. Journal of Neurolinguistics 13(1), 57-82.
Kemmerer, David (2000b). Grammatically relevant and grammatically irrelevant features of verb meaning can be independently impaired. Aphasiology 14(10), 997-1020.
Krifka, Manfred (1989). Nominal reference, temporal constitution and quantification in event semantics. In: R. Bartsch, J. van Benthem & P. van Emde Boas, eds., Semantics and Contextual Expressions, 75-115. Dordrecht: Foris.
Lang, Ewald (1994). Semantische vs. konzeptuelle Struktur: Unterscheidung und Überschneidung. In: M. Schwarz, ed., Kognitive Semantik, 25-41. Tübingen.
Levelt, Willem J. M., Ardi Roelofs & Antje S. Meyer (1999). A theory of lexical access in speech production. Behavioral and Brain Sciences 22.
Pinker, Steven (1989). Learnability and Cognition: The Acquisition of Argument Structure. Cambridge, Mass.: MIT Press.
Pustejovsky, James (1995). The Generative Lexicon. Cambridge, Mass.: MIT Press.
Smith-Stark, T. Cedric (1974). The plurality split. In: Papers from the Tenth Regional Meeting of the Chicago Linguistic Society, 657-671. Chicago, Illinois.
Wiese, Heike (1997a). Semantics of Nouns and Nominal Number. ZAS Papers in Linguistics 8, 136-163.
Wiese, Heike (1997b). Zahl und Numerale: Eine Untersuchung zur Korrelation konzeptueller und sprachlicher Strukturen. Berlin: Akademie.
Wiese, Heike (1999). Die Verknüpfung sprachlichen und konzeptuellen Wissens: Eine Diskussion mentaler Module. In: I. Wachsmuth & B. Jung, eds., KogWis99: Proceedings der 4. Fachtagung der Gesellschaft für Kognitionswissenschaft, Bielefeld, 28.9.-1.10.1999, 92-97. St. Augustin: Infix.
Wiese, Heike (2001). Numeral-Klassifikatoren und die Distribution von Nomen: Konzeptuelle, semantische und syntaktische Aspekte. To appear in: N. Fries & W. Kürschner, eds., Akten des III. Ost-West-Kolloquiums für Sprachwissenschaft. Tübingen: Narr.