
From Santa Fe Institute Events Wiki


Revision as of 21:02, 4 February 2010

Working Group Abstract

Mark Johnson and Katherine Demuth

The Role of Entropy in Language and Communication

The focus of this workshop will be on developing an understanding of human language that is firmly grounded in modern information theory and that exploits recent theoretical developments such as maximum entropy models. The key questions we wish to pursue in this workshop are:

* What aspects of human language and communication can be profitably understood in terms of information theory and entropy?
* Are there aspects of human language and communication that cannot be described in information-theoretic terms, and do these suggest ways in which the notion of information might need to be developed or refined?
* Does information theory primarily describe the use of language for communication, or does it also provide insights into the way that languages are acquired and change over time?
* What are the consequences of the broad-tailed Zipfian distributions that characterize natural languages?
* Are information theory and entropy relevant only to linguistic processes such as acquisition, processing, and language change, or do they also influence the fundamental "competence" structure of language itself?
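The Zipfian distributions mentioned above can be made concrete with a small numerical sketch (an illustration added here, not part of the original abstract). Under a pure power law p(r) ∝ r<sup>−s</sup> over word ranks, the Shannon entropy of the unigram distribution grows far more slowly with vocabulary size than the log₂ V bits a uniform vocabulary would require; the vocabulary sizes and exponent below are arbitrary illustrative choices.

```python
import math

def zipf_entropy(vocab_size, s=1.0):
    """Shannon entropy (in bits) of a Zipfian unigram distribution
    p(r) proportional to 1 / r**s over ranks r = 1..vocab_size."""
    weights = [1.0 / r**s for r in range(1, vocab_size + 1)]
    z = sum(weights)                       # normalizing constant
    probs = [w / z for w in weights]
    return -sum(p * math.log2(p) for p in probs)

# Compare the Zipfian entropy with the uniform-distribution bound log2(V):
for v in (1_000, 10_000, 100_000):
    print(v, round(zipf_entropy(v), 2), round(math.log2(v), 2))
```

The gap between the two columns is one way of quantifying the redundancy that a Zipfian word-frequency law builds into a language's lexicon.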