


Introducing Functional Grammar, third edition, provides a user-friendly overview of the theoretical and practical aspects of the systemic functional grammar (SFG) model.

A glossary of terms, more exercises and an additional chapter are available on the companion website at: www. Introducing Functional Grammar remains the essential entry guide to Hallidayan functional grammar, for undergraduate and postgraduate students of language and linguistics.


Like any model that attempts to offer a global view of how language works, Functional Grammar is complex, and students may be understandably daunted not only by the seemingly abstruse explanations but simply by the amount of new terminology. What I have tried to do is to set out the approach from the point of view of readers who are not familiar with this way of looking at language, and who may, indeed, have little background in linguistic analysis generally.

This involves describing the theoretical and practical aspects of the Functional Grammar model in as accessible a way as possible; but it also involves trying to make clear the reasons why the model is as it is, at all levels — from why a functional approach is adopted to why one particular analysis of a wording is preferable to another.

The constant aim is, without underestimating the initial difficulties, to encourage readers to realize that the fundamental assumptions of the model have an appealing simplicity and an intuitive validity. Once that step is achieved, it becomes easier to cope with the inevitable complexity of the details, and to see beyond the terminology to the important and useful insights offered by the approach.

The book is consciously modelled on the Introduction, covering much of the same ground, though not necessarily in the same order or from exactly the same perspective.

Many of the major revisions in this third edition are designed to reflect the changes in the fourth edition of IFG; others, particularly the choice of texts to analyse, derive from my own teaching of the subject and the ways in which my understanding of the concepts has developed. In addition to the intellectual inspiration provided by Michael Halliday, the book naturally owes a great deal to many other people, among whom I am particularly grateful to the following.

To my past and present colleagues in the former Applied English Language Studies Unit at Liverpool — above all, Flo Davies, who first encouraged me to start teaching Functional Grammar, and who was a constant source of ideas, insights and argument during our time as colleagues.

To my students at the University of Liverpool, especially those on the MA programmes in Applied Linguistics and TESOL, and on the undergraduate Grammar in Discourse module; and to students and staff in universities in Argentina, Austria, Brazil, China, Colombia, Germany, Norway, Sweden, Venezuela and Wales, who at various times kindly allowed me to indulge my enthusiasm for SFG: they all had different parts of the material in the book tried out on them, and their difficulties, comments and insights helped me to think through and clarify ideas that I had sometimes taken for granted.

To Naomi Meredith, Christina Wipf Perry and Eva Martinez at Arnold, who provided encouragement and advice for the two previous editions of the book; to Lucy Winder and Lavinia Porter at Hodder Education, who were very patient with me as I missed several deadlines; and to Sophie Jaques and Louisa Semlyen at Routledge, who had the unenviable task of taking over the publication of the book at a late stage.

I owe an unusual debt to those colleagues in the School of English at Liverpool who made early retirement an attractive option, leading to the situation in which I had time to devote to this new edition. And, above all, I am grateful to Susan Thompson, who is, happily for me, always available to argue over interpretations and explanations, to identify confusions and evasions, and to suggest alternative ways of understanding or expressing the ideas; and who puts up with my endless hours in my study working on this book and other projects.

As before, the completion of this edition owes a great deal to her.

Extracts are reproduced from the following sources: Young, 8th edition; Mayne et al.; Bex and R. Watts (eds), Standard English: the widening debate (London: Routledge); Lassig et al.; McRae, Clinical orthopaedic examination, 3rd edition. Every effort has been made to trace the copyright holders, but if any have been inadvertently overlooked, the publishers will be pleased to make the necessary arrangements at the first opportunity.

The author is also grateful to Sultan Al-Sharief and Angela Reid for kindly providing textual data and allowing it to be used in this book.

In the second half of the last century, an immensely influential view of what the study of language should involve built up, one which insists that there is only one proper place to start: from a view of language as an abstract set of generalized rules detached from any particular context of use.

It would be possible to ignore this view and simply start with the approach that I will be setting out in the book — based on a view of how language functions as a system of human communication. However, a comparison of different possible approaches will help us to understand better not only the destinations that each approach allows us to head for but also the reasons why we might choose one of the approaches in preference to another.

Therefore, in this chapter I will briefly outline the approach that was dominant, attempting to show why it was so attractive but also showing why an increasing number of linguists have come to feel that it does not make it easy for us to talk about many of the most central features of language.

I will then go on to introduce an alternative approach which takes full account of those features, and which offers a more appropriate place to start from if we are interested in language in use.

This comes from an advertisement aimed at attracting people to take up nursing as a career. Before reading on, can you decide what aspects of the sentence you might want to consider in providing a linguistic description of it? When I have asked students to do this kind of preliminary analysis, some (often those who have learnt English as a foreign language and therefore have more background in traditional grammatical parsing) break it up into its components as far as they can (this is in fact trickier than it might look).

They label the parts of the sentence using terms like Subject and Verb, or non-finite verb and prepositional phrase. What they are essentially focusing on is what the different parts of the sentence are and how they fit together — in other words, the form. Underlying the points, though not usually made explicit, is also the idea of choice: that there are potentially identifiable reasons why the writer is expressing the message in this particular way rather than in other possible ways.

Both of these ways of looking at the sentence tell us something useful about it, and, in the informal descriptions given here at least, there is a good deal of potential overlap.

Any full analysis of the sentence will inevitably need to take account of both the meaning and the form and of the links between them. However, in order to make the analysis fairly rigorous rather than just an unordered list of points about the sentence, we need to decide on a reasonably systematic method; and in practice this involves choosing between form and meaning as our starting point.

This may at first seem simply a difference in emphasis, but, if carried through consistently, each approach in fact ends up with a strikingly different kind of description of language.

Chomsky insisted that linguistics should go beyond merely describing syntactic structures, and aim to explain why language is structured in the way it is — which includes explaining why other kinds of structures are not found. He argued that, in order to do this adequately, it was essential to make language description absolutely explicit.

Although the aim of TG was not to produce a computer program that could generate language, it was computers that provided the driving metaphor behind the approach. A computer is wonderfully literal: it cannot interpret what you mean, and will do exactly — and only — what you tell it to do. Therefore instructions to the computer have to be explicit and unambiguous: this includes giving them in exactly the right order, so that each step in an operation has the required input from preceding steps, and formulating them so as to avoid triggering any unwanted operations by mistake.

TG set out to provide rules of this kind for the formation of grammatically correct sentences. Note that the following outline describes TG in its early form.

The theory has changed radically since then, becoming more abstract and more powerful in its explanatory force; but the basic concerns, and the kind of facts about language that it attempts to explain, have remained essentially the same. In setting up its rules, TG started from another deceptively simple insight: that every verb has a Subject, and that understanding a sentence means above all identifying the Subject for each verb. In English, Subjects normally appear in front of the verb, so it might be thought that identifying them would be too easy to be interesting.

We are so skilled at understanding who does what in a sentence that we typically do not even notice that in such cases we have to interpret something that is not explicitly said. One well-known example used by Chomsky was the pair of sentences: John is eager to please. John is easy to please. These appear, on the surface, to have the same structure; but in fact we understand that in the first case it is John who does the pleasing (i.e. John pleases someone), whereas in the second it is John who is pleased. And how can the linguist show, in an explicit way, what it is that we actually understand?

That means that our description is not in fact fully explicit. We need to work with labels that tell us what each constituent is in itself, not what it does in the sentence. At the same time, we also need to show where each constituent fits in the basic structure. Translated into over-simple functional terms, it means in effect that every clause must have a verb and every verb must have a Subject. As the final S above suggests, the VP element includes not only the verb but also any other elements that depend on the verb.

We can therefore go on splitting the clause elements into their component parts until we reach the basic constituents essentially words, though with some exceptions. This splitting up must, however, be done in the correct sequence in order to show the dependencies between different parts of the clause correctly.
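The recursive splitting of constituents described above can be illustrated with a toy program. The rule set and lexicon below are invented for illustration (they are not the actual TG rules under discussion): each rule rewrites a constituent as its component parts, and applying the rules recursively until only words remain yields a bracketed constituent structure.

```python
# Toy phrase-structure rewrite rules (a hypothetical mini-grammar,
# invented for illustration; not the actual TG rule set).
RULES = {
    "S":  ["NP", "VP"],   # every clause has a Subject (NP) and a VP
    "VP": ["V", "NP"],    # the VP contains the verb plus its dependents
}

# A minimal lexicon mapping basic constituents to words.
LEXICON = {
    "NP": "the burglar",
    "V":  "shot",
}

def expand(symbol):
    """Recursively rewrite a symbol until only words remain,
    returning a labelled bracketing of the constituent structure."""
    if symbol in RULES:
        parts = [expand(child) for child in RULES[symbol]]
        return "[" + symbol + " " + " ".join(parts) + "]"
    return LEXICON[symbol]

print(expand("S"))
# → [S the burglar [VP shot the burglar]]
```

The order of expansion matters in the way the text describes: S must be rewritten before VP, so that the dependency of the object NP on the verb is represented inside the VP bracket rather than at clause level.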

However, we have not yet dealt with the VP in S1 or S2. This will allow us to show how S1—3 combine into the sentence as we actually see it. Although the operation is immensely complex in practice, it is simple in theory: it turns out that we can identify not only a finite set of explicit rules governing the possible combinations (the complexity comes especially from the interaction between the rules), but, more crucially, an even more restricted set of underlying regularities in the type of rules that are possible.

What are the S1—3 underlying this version of the example? Which burglar did the policeman say Mary told him she had shot? It is perhaps surprising that, using such apparently marginal examples, the approach should have thrown so much light on how sentences are structured; and yet the insights gained have been extensive and in some ways revolutionary.

For our present purposes, however, it is less important to look at these discoveries in any detail than to consider where the approach leads us. The following two sentences have exactly the same propositional content and therefore the same analysis in terms of Ss: The burglar had shot himself.

Had the burglar shot himself? On the other hand, the fact that a statement and a question serve entirely different functions in communication is regarded as irrelevant in the grammatical analysis — it is taken into account in a different part of the linguistic description (though there was relatively little interest in developing that part within the approach).
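The relationship between the statement and the question here can be sketched as a single operation in the early-TG spirit: subject-auxiliary inversion. The tiny clause representation below (Subject, auxiliary, remainder) is invented for illustration; the point is only that the question is derived from the statement without any change to the propositional content.

```python
# Hypothetical sketch of one "transformation": subject-auxiliary
# inversion, relating a statement to the corresponding yes/no
# question. The clause representation is invented for illustration.

def invert(clause):
    """Move the auxiliary in front of the Subject to form a question."""
    subject, aux, rest = clause
    words = [aux, subject] + rest
    sentence = " ".join(words) + "?"
    return sentence[0].upper() + sentence[1:]

statement = ("the burglar", "had", ["shot", "himself"])
print(invert(statement))
# → Had the burglar shot himself?
```

Note that the operation leaves the Subject-verb relationship (who shot whom) untouched, which is exactly why the two sentences receive the same analysis in terms of Ss.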

Chomsky made a principled decision to exclude how we use sentences in communication e. The aim is to discover the rules that govern how constituents can be put together to form grammatically correct sentences, and to formulate these rules in as general a way as possible (ideally, so that they apply to all human language rather than just individual languages); therefore each sentence is analysed in complete isolation, both from other sentences and from the situations in which it might be used.

The ways in which language is used are thought to be, unfortunately, too messy and are therefore ignored, at least until someone can find a way of describing them according to scientific general laws. But if the road towards an examination of use is blocked off, where else can we go from this starting point? The answer is inwards, into the brain. At the same time, the fact that we do not need to be explicitly taught how to do this means that we must in some way be born with the required mental capacities.

Thus a rigorously formal approach to the description of language leads us towards neurology and genetics. Clearly, these are fascinating and worthwhile areas, but they do involve giving up any idea of looking at language in use.

It is a system for expressing thought, something quite different. More importantly, there is little doubt that it does not reflect how the users themselves view language.

They respond above all to the meanings that are expressed and the ways in which those meanings are expressed. Do colds last seven days on average?

The syntactic underpinning in the examples above is of course essential in expressing the different meanings, but only as a tool that enables what most people see as the primary function of language — communicating meanings in particular contexts — to be carried out.

As always, the exact nature of the tool used depends on the task in hand. In linguistic terms, we can express this as the assumption that, if we start from the premise that language has evolved for the function of communication, this must have a direct and controlling effect on its design features — in other words, the form of language can be substantially explained by examining its functions.

Generative approaches provide a possible way of investigating those characteristics (though their validity has been increasingly questioned).


Introducing Functional Grammar

Introducing Functional Grammar. Geoff Thompson. Closely based on Michael Halliday's 'Introduction to Functional Grammar', this book is an accessible introduction to the most fully developed functional approach to grammar currently available. It can be used in its own right or to prepare students for the more theoretical presentation of grammar in Halliday's book. It clearly explains why the functional approach is necessary in order to investigate how grammar is used as a resource for making meaning, and it describes each of the major grammatical systems in terms of the meaning that they contribute to messages. Starting with simple procedures for identifying the choices in a particular system, each chapter discusses the function of the system in context. This involves analysing what it means to make one choice from the system rather than another, e.


Introducing Functional Grammar, third edition, provides a user-friendly overview of the theoretical and practical aspects of the systemic functional grammar (SFG) model. An opening chapter on the purpose of linguistic analysis, which outlines the differences between the two major approaches to grammar - functional and formal. Advice and practice on identifying elements of language structure such as clauses and clause constituents. Numerous examples of text analysis using the categories introduced, and discussion about what the analysis shows. The third edition is updated throughout, and is based closely on the fourth edition of Halliday and Matthiessen's Introduction to Functional Grammar.




