Mantex

Tutorials, Study Guides & More


Dust or Magic

June 27, 2009 by Roy Johnson

secrets of successful multimedia design

Dust or Magic is a book for people who want to know about or work in the new media. It takes the line of revealing the truth about how multimedia projects really work – pointing to both successes and complete turkeys. Bob Hughes has been active in the field for the last decade, and he discusses a fascinating range of examples – from websites and CD-ROMs to kiosk programs and interactive video.

He starts with an account of digital technology from Alan Turing onwards – but the chronology darts backwards and forwards from Russian constructivists to Greek theatre and back again to Richard Wagner. Later, he settles down to a slightly smoother chronology, but without sacrificing his wide range of reference. He offers Vannevar Bush, Douglas Engelbart, and Ted Nelson as key pioneers and presents excellent accounts of their work.

This is followed by detailed sketches of the pioneers of Virtual Reality, Interactive Video, and early hypertext programs such as Guide, Toolbook, and Hypercard – including developments which have been passed by, but which he claims could be revived as new technology emerges.

There’s something of an intellectual dip in the middle of the book when he compares English revolutionaries of the seventeenth century with the Guerilla Girls, and he celebrates web sites and Hyperstacks which are not much more than collections of idiosyncratic enthusiasms. Fortunately, the level rises again with a whole chapter devoted to Voyager, which he claims made innovations with the bare tools [Hypercard] available at the time.

The latter parts of the book are devoted to accounts of working on multimedia projects – one for the Nationwide Building Society, of all people – and he covers the disaster of the Microsoft ‘Sendak’ project, before passing on to discuss theories of ‘creativity’ and report on forays into the world of advertising. He discusses the psychology of idea-generation, its relation to programming and the world of computer games, the advantages of motion and sounds on screen, and there are some interesting observations on the need for visual ‘transitions’ between one screen of information and another.

Reading all this, you get an invigorating sense of intellectual excitement, the downside of which is that no single idea is pursued to any depth. This is a weakness occasionally reinforced by a surprisingly cavalier attitude towards his readers – ‘sorry – I’ve lost the URL’.

And yet he’s actually gone to the trouble of locating the original authors of some of these programs – an admirable trait in an age when a lot of software has a lifespan of five years or less. He’s very fond of using metaphors to explain his arguments, and there are lots of interesting historical anecdotes woven as side-bars into the text. At its best, he throws up novel connections from different media and sources of technology; at its weakest, he flits from one unexamined generalisation to another.

Apart from concluding that projects are best carried out by small teams, he never seems to get round to explaining the ‘secret’ in his sub-title, but this is a lively and stimulating introduction to the history of software development which should go onto the reading list of anyone who wants to know what happens on real-life projects. It’s a revelation of the costly disasters as well as a celebration of the often unsung heroes of new technology during the last thirty years.

© Roy Johnson 2000

Dust or Magic   Buy the book at Amazon UK

Dust or Magic   Buy the book at Amazon US


Bob Hughes, Dust or Magic: Secrets of Successful Multimedia Design, London: Addison-Wesley, 2000, pp.264, ISBN: 0201360713


More on online learning
More on technology
More on digital media
More on web design
More on computers


Filed Under: Information Design, Media, Online Learning Tagged With: Communication, Media, Multimedia, Online learning, Technology

E-Learning in FE

July 24, 2009 by Roy Johnson

practical guide and resources for the e-tutor

This is written from the perspective of practising FE teachers – and healthily sceptical ones at that – well aware of the resistance to and pitfalls in e-learning. And it covers all the possibilities – from simple Word documents to Moodle and other advanced courseware. They start off by looking at all the very common objections made to the use of IT in teaching. ‘Computers can’t replace teachers’; ‘It might be OK in other subjects, but not mine’; and ‘Not in my back yard’.

You’ll have heard them all. These are firmly refuted, whilst at the same time they acknowledge the sceptics and the pressures of daily life in FE. Then come some simple suggestions for interactive eLearning without any advanced IT skills – largely based on using the tools available within Microsoft Word and PowerPoint – to which many (if not all) are likely to have access.

This includes the inventive suggestion of using ‘comments’ to attach audio files giving feedback on pieces of submitted work – which shows what’s possible with these relatively simple and widely available features. This technique is not complex and is within the technical skills of most tutors. Moreover, it can be used in both ‘directions’. Students in art and design can supplement their submitted work with critical commentaries on their choice of materials via attached podcasts.

There are also examples of audio recordings used in PowerPoint for language lessons – and as they point out, these techniques can easily be repeated with new materials. Once an item of interactivity has been created, it can act as a ‘learning object’ – a small, independent and re-usable unit of learning.

Next comes a tour of the free and nearly-free software programs which allow tutors to create course tests and exercises: Hot Potatoes (quizzes), Action Mazes (choice actions), mind mapping, course management tools, and web quests. The main problem here is that many of these programs merely encourage users to link up existing Word files to create a spurious sense of interactivity – which isn’t real eLearning.

The new digital classroom can make use of cameras, audio-recording devices, and video recorders – all of which are now regularly combined in mobile phones. There’s also a discussion of interactive whiteboards (which I personally recommend you practise using thoroughly before embarrassing yourself in front of a class).

And if you don’t want to make your own eLearning materials, there are lots of ready-made options available for free or licensed download. They include maps, images, encyclopedias, and mini-courses endorsed by BECTA and NLN (National Learning Network).

This leads naturally into a discussion of how these materials are made available to students. The answer is via VLEs (Virtual Learning Environments). These can be intimidating for teachers – but at the same time their salvation. What they offer is a central repository for documents, exercises, student work, learning plans, and interactive courses – as well as facilities such as email, chat rooms, and discussion forums.

There’s an interesting chapter on mobile learning devices – laptops, PDAs, phones, and tablets. What emerges here as the unsung hero is the flash disk (or pen drive) – up to 2 GB of complete portability which can store information and even executable programs and fits in your shirt pocket.

They end with a comprehensive review of the support organisations and sources of help for the aspirant eTutor. My only reservation was that there might have been more practical examples and illustrative screenshots. But apart from that, I would say that this was the best guide to eLearning I have come across.

© Roy Johnson 2006

Buy the book at Amazon UK

Buy the book at Amazon US


John Whalley, Theresa Welch, Lee Williamson, E-Learning in FE, London: Continuum, 2006, pp.118, ISBN 0826488625


More on online learning
More on technology
More on digital media
More on web design
More on computers


Filed Under: Online Learning Tagged With: Education, eLearning, Further Education, Online learning, Technology

e-Learning in the 21st Century

June 19, 2009 by Roy Johnson

theory and practice of designing online learning

e-Learning is education’s Big Thing at the moment. After all, it makes sense. If courses are put on line, students can study where and when they wish, tutors are freed from lecturing and classroom drudgery, and the institution can offer its courses to customers worldwide. That’s the theory anyway, and many institutions have thrown their text-based materials onto web sites, hoping to keep up with the rush. But of course, there’s a lot more to it than that.

Garrison and Anderson take a gung-ho line on e-Learning, arguing that it will transform education in the coming century – but they point out from the start that a lot of careful planning is required. As far as educational theory is concerned, their approach is ‘collaborative constructivist’. That is, it’s based on the idea that individuals create meaning for themselves which is then related to society. A great deal of their emphasis is placed on ‘community’:

A critical community of learners … is composed of teachers and students transacting with the specific purposes of facilitating, constructing, and validating understanding, and of developing capabilities that will lead to further learning.

Almost all their observations in the first half of the book are posited in terms of educational theory. But when, in the second half, they come to give practical advice, most of it confirms my own experience of online tuition and course design. For instance, they emphasise the need to establish as rapidly as possible what they call ‘social presence’ – some sense of rapport between members of the learning community.

There are also some useful tips on course design – such as not overloading students with too much content, and placing more emphasis on cognitive skills and critical thinking. They are also good on how to promote and guide online conferences. Open University tutors please take note.

They cover evaluation and assessment, problem-based learning, and the organizational problems created for institutions, plus repositories of free learning objects which might help designers overcome them.

The authors are unashamed enthusiasts, and they cover in detail how the skills and facilities of successful online learning can be harnessed to overcome the apparent weaknesses of asynchronous communication in a networked community.

It’s a pity there are no practical examples of online courses or reviews of software, but anyone involved in the development of online courses who needs theoretical justification for their enterprise will find plenty of it here.

© Roy Johnson 2003

Buy the book at Amazon UK

Buy the book at Amazon US


D. R. Garrison and Terry Anderson, E-Learning in the 21st Century: A Framework for Research and Practice, London: Routledge, 2003, pp.167, ISBN 0415263468


More on online learning
More on technology
More on digital media
More on web design
More on computers


Filed Under: Online Learning Tagged With: Education, eLearning, Online learning, Technology

eBook Readers – compared

May 15, 2010 by Roy Johnson

a comparison chart of ebook reader features and prices

Kindle eBook Reader
Main features: 6″ diagonal E Ink® electronic paper display – weight 10 ounces (290 grams) – 1.5 GB storage – USB 2.0 port – supports multiple ebook formats – download via free built-in WiFi – 2 weeks battery life (reading) – holds up to 1,500 books

UK=£176
US=$259


Sony Reader
Main features: 6″ diagonal E Ink® electronic paper display – weight 260 grams – 192 MB storage expandable via MemoryStick or SD Card – supports multiple ebook formats – download with USB connection to PC via broadband – rechargeable battery – holds up to 160 ebooks

UK=£275
US=$148


Bookeen Cybook Gen 3 eBook Reader
Main features: 5″ diagonal E Ink® electronic paper display – weight 260 grams – 512 MB storage (1,000 books) with optional SD card – supports multiple ebook formats – download with USB connection to PC via broadband – rechargeable battery – very light – mixed reviews

UK=£180
US=$219


iRiver eBook Reader
Main features: 6″ diagonal E Ink® electronic paper display – eBook reader, Office Viewer, MP3 Player, Voice recorder, Personal organizer – weight 500 grams – 2.0 GB storage with optional SD card – supports multiple ebook formats – download with USB connection to PC via broadband – rechargeable battery – full QWERTY keyboard

UK=£195


BeBook eBook Reader
Main features: 6″ diagonal E Ink® electronic paper display – weight 220 grams – 512MB storage (1,000 books) expandable to 4GB via SD slot – supports multiple ebook formats – download with USB connection to PC via broadband – rechargeable battery – preloaded with 150 free eBooks

UK=£239

eBooks on Writing and Study Skills


More on technology
More on digital media
More on online learning
More on computers


Filed Under: Computers Tagged With: BeBook, Computers, Cybook, eBook readers, iRiver, Kindle, Media, Technology

ECDL: The Complete Coursebook

June 27, 2009 by Roy Johnson

coursebook for ECDL, or for improving computing skills

The European Computer Driving Licence (ECDL) is an internationally recognised certificate of computing skills. In a climate where employers are increasingly keen to employ staff with proven IT skills, the ECDL is highly regarded and provides proof of competence in the most common software applications. The licence is awarded to candidates who pass tests in seven modules, which together make up the ECDL syllabus.

  • Basic Concepts of IT
  • Using the Computer and Managing Files
  • Word Processing
  • Spreadsheets
  • Databases
  • Presentations
  • Information and Communication

This coursebook has been fully approved by the ECDL Foundation. If you can grasp these basic skills, you are well on your way to computer proficiency.

It devotes a chapter to each of the modules and provides a comprehensive guide to some of the most common business applications. It’s written in clear, easy-to-follow language. It’s also jargon-free and assumes little or no previous knowledge of the applications that it covers. Both the tests and the book are based on Microsoft software – and in particular Windows XP®, Internet Explorer 5® and Outlook Express 5®.

Although the ECDL modules are numbered, the tests can be taken in any order. However, the authors here assume readers will work through the sections in order. The earliest sections are aimed at the complete beginner and they explain basic computer terms and concepts. The later chapters provide less explanation of the basics.

Much of the material covered will be familiar to a regular computer user, but there are very clear explanations of, for example, the difference between ROM and RAM, and the meaning of bits and bytes. This is the only section of the coursebook that is entirely theoretical. Its aim is to prepare the reader for a multiple-choice test on the key concepts of Information Technology.

The rest of the book contains over 280 easy-to-follow exercises, which guide you through the various features of the relevant applications. Starting with the simplest of tasks, the exercises enable you to become familiar with the software before introducing its more advanced features. There are over 700 screen shots which show what the results should look like.

There are also plenty of hints and shortcuts, and it’s likely that even the most confident of computer users will pick up the odd little gem.

I used this book as preparation for my own ECDL tests. Since gaining the licence I’ve referred back to it many times to refresh my memory on various points and have found many of its hints and tips to be invaluable.

For anyone interested in taking the ECDL, this book contains everything you will need to pass. And it wouldn’t be wasted on those who simply want to improve their knowledge and skills in popular software applications.

More information about the ECDL is available from the official website at http://www.ecdl.com/

© Kathryn Abram 2003

Buy the book at Amazon UK

Buy the book at Amazon US


Paul Holden and Brendan Munnelly, ECDL4: The Complete Coursebook, Prentice Hall, new edition 2003, pp.640, ISBN 0130399175


More on technology
More on digital media
More on online learning
More on computers


Filed Under: Computers Tagged With: Computers, ECDL, ECDL4: The Complete Coursebook, Education, Technology

Electronic Texts – a bibliographic essay

September 30, 2009 by Roy Johnson

text, editing, and bibliography in the electronic age

Electronic textuality is a relatively recent concept, yet one that has already had a significant impact upon the practice of scholarly editing. Scholars have debated the subject of textual bibliography and the issue of copy-text throughout the twentieth century without, it seems, reaching any firm conclusions. The term ‘copy-text’ was first coined by Ronald McKerrow almost one hundred years ago (McKerrow and Nash 1904), but there still remains disagreement about how a text can or should be established. The arrival of electronic texts makes this problem even more complex.

Advances in information technology have meant that scholars now have access to new and ever more sophisticated tools to assist them in the preparation of traditional codex editions and to aid textual analysis. Increasingly however, some editors are choosing to exploit the potential of digitised material and the advantages of hypertext to produce texts in an electronic format in either editions or archives. This raises various issues including the role of the editor and the relationship of the reader to the text.

One of the most influential and oft-quoted theses on the subject of textual scholarship, and one which has provoked a significant amount of debate, is W. W. Greg’s paper entitled ‘The Rationale of Copy-Text’ (1950). In this article, Greg highlights the difficulties of the prevailing editorial practice of attempting to select whichever extant text is the closest to the words that an author originally wrote and using this as the copy-text. In the case of printed books, this is generally considered the first edition, but Greg argues that an over-reliance on this one text, to the exclusion of all others, is problematic.

He advocates that a distinction be made between ‘substantive’ and ‘accidental’ readings of a text and suggests that the two should be treated differently. (He uses these terms to distinguish between readings that affect the meaning and readings that only affect the formal presentation of a text.) Greg asserts that it is only when dealing with accidentals that editors should adhere rigidly to their chosen copy-text and that in the case of possible variants in substantive readings there is a case to be made for exercising editorial choice and judgement. He argues that it is only through being allowed to exercise judgement that editors can be freed from what he terms ‘the tyranny of the copy-text’ (p. 26).

This argument is taken up and developed by Fredson Bowers (1964; 1970). Bowers takes issue with some of the finer points of Greg’s argument, but agrees that whilst rules and theories are necessary, the very nature of editing means that a certain amount of editorial judgement will always be needed. G. Thomas Tanselle (1975) examines the arguments of both scholars, expands them and looks at how their work and theories have affected the practice of scholarly editing. Greg, Bowers, Tanselle and others have slight differences of emphasis. They are, however, all in broad agreement with the principle of synthesizing two or more variant editions into one text that represents as closely as possible an author’s intention.

The debate about copy-text and its role in scholarly editing rests largely on the status of authorial intention and the extent to which this is possible to discern and represent in a text. Michel Foucault, in his paper entitled ‘What is an author?’ (1984), argues that even when there is little question about the identity of the author of a text, there remains the problem of determining whether everything that was written or left behind by him should be considered part of a work. Do notes in a margin represent an authorial addition or amendment, or did the author simply scribble in the margins a sudden thought that he wanted to remember and refer to later? Such issues remain a subject of debate and are some of the many problems with which editors are faced.

The practice of editing will always generate problems that scholars need to address, and this is the basis for David Greetham’s (1994) and Peter Robinson’s (1997) assertions that, to a certain extent, all editing must be seen as conjectural. However, in his examination of the history of textual criticism, Greetham finds that there has been a fluctuation between two equally extreme schools of thought.

The first, he suggests, maintains that a correct reading of a text is discoverable ‘given enough information about the texts and enough intelligence and inspiration on the part of the editor’ (p. 352). The opposing position is one that claims that any speculation on the part of an editor is likely to result in a move away from authorial intention. Because of this, scholars who hold this belief argue that documentary evidence should be given priority over editorial judgement and, wherever possible, this documentary evidence should be in the form of only one document – that chosen as the copy-text.

Yet scholars have found that it is sometimes impossible to establish one ‘correct’ text. Jerome McGann (1983; 1996; 1997) believes the very notion to be a falsity and Peter Donaldson (1997) argues that traditional scholarly editions can be misleading as their very nature suggests that a text is fixed and authoritative when the reality is often very different.

Taking the plays of Shakespeare as an example, he suggests that the collaborative nature of life surrounding the London theatres in the Renaissance combined with the fact that the author did not intend his work to be published, means that variants cannot and should not be ignored. Moreover, he contends, in some cases a single original text may never have existed. Donaldson argues that technology can be used to create new forms of text that incorporate variants in a way that is not practical in a codex edition. Donaldson is himself involved in a project that seeks to do this and he refers to his own experiences in assembling an electronic archive of the works of Shakespeare.

Electronic texts provide some solutions to the problems of editing, but they also raise new issues and opinions are divided about the way in which they can best be used. Some scholars welcome digital texts as a tool to aid the preparation and production of traditional scholarly editions whilst others prefer to look to electronic textuality as a medium for the publication of a different type of edition – an electronic edition.

Several authors (Donaldson 1997; Greetham 1997; Hockey 2000; Robinson 1997) examine the way in which new developments in information technology affect the traditional process of scholarly editing. Robinson, for example, examines the analytic functions of electronic text and provides examples of instances in which computer-aided collation has assisted in the preparation of scholarly editions. He cites his own experiences in the production of Chaucer’s The Wife of Bath’s Prologue on CD-ROM and explains how he used the particular techniques of computerised cladistic analysis as a method of textual criticism. Further information about computerised collation can be found in Hockey (1980) and Robinson (1994).

(Cladistic analysis has been developed from systematic biology. Susan Hockey (2000) describes it as ‘software that takes a collection of data and attempts to produce the trees of descent or history for which the fewest changes are required, basing this on comparisons between the descendents’. Cladistics is particularly useful in cases where manuscripts are lost or damaged.)
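As a purely illustrative aside (not drawn from any of the works discussed here), the ‘fewest changes’ criterion Hockey describes can be sketched in a few lines of Python: Fitch’s parsimony count scores a candidate tree of descent by the minimum number of changes it needs to explain the readings at a single point of variation. The witnesses, readings, and trees below are invented for the example; real cladistic software applies the same count across many variant sites and searches a large space of possible trees.

    # Fitch's parsimony count: the minimum number of changes a rooted tree
    # of descent needs in order to explain the readings found in the
    # surviving witnesses at one point of variation.
    def fitch_score(tree, readings):
        changes = 0

        def states(node):
            nonlocal changes
            if isinstance(node, str):           # a leaf: a surviving witness
                return {readings[node]}
            left, right = node                  # an internal node: (left, right)
            a, b = states(left), states(right)
            if a & b:                           # children can agree: no change needed
                return a & b
            changes += 1                        # children disagree: one change is forced
            return a | b

        states(tree)
        return changes

    # Hypothetical witnesses A-D and their readings at a single variant site.
    readings = {"A": "dust", "B": "dust", "C": "magic", "D": "magic"}

    tree1 = (("A", "B"), ("C", "D"))   # groups the agreeing witnesses together
    tree2 = (("A", "C"), ("B", "D"))   # splits them apart

    print(fitch_score(tree1, readings))   # 1 change: the preferred tree
    print(fitch_score(tree2, readings))   # 2 changes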

In addition, however, Greetham (1994) and Robinson (1993; 1997) discuss the way in which, in an electronic edition, hypertext can be used to solve the problem of textual variants. The term ‘hypertext’ was coined in the 1960s by Ted Nelson (Landow 1992) and it refers to a means of linking documents or sections of documents and allowing a reader to navigate his or her own way through a series of paths in a non-linear fashion. Bolter (1991), Landow (1992) and McGann (1997) all write in detail about the technology behind hypertext, its functions and the theories that surround it.

Greetham suggests that decisions that were once the responsibility of the editor can be largely transferred to the reader as hypertext allows all possible variants to be included and linked in an electronic edition. This means that editors do not have to wrestle with the problem of authorial intention or give priority to one text but can incorporate several variants, allowing readers to select the most appropriate text for their particular needs.

This type of editing is, as Greetham argues, distinct from the methods of either establishing a text or accurately reproducing a particular version of a work in a critical edition. The desired result with electronic editing is not, according to McGann (1983) and others, a single conflated text as advocated by the Greg / Bowers school of editing, but one containing multiple variants.

McGann believes that this type of edition frees the reader from the controlling influence of editors, and George Landow (1992) suggests that it facilitates a greater degree of interaction between the reader and the text.

Kathryn Sutherland (1997), however, warns that this type of text places greater demands on a reader than a traditional codex edition. A hypertext edition that contains multiple variants necessarily requires a reader to select material, choose from amongst the possible variants and, therefore, exercise discrimination. She also points out, in an allusion to Barthesian distinctions, that a hypertext edition offering choice amongst variants is, in effect, offering the reader the ‘disassembled texts’ rather than the ‘reassembled work’ (p. 9).

McGann (1996; 1997) suggests that scholarly editions in codex form have limitations because their structure is too close to that of the material that they analyse. He asserts that hypermedia projects such as the Rossetti Hypermedia Archive with which he is involved, offer a different type of focus that does not rely on one central document. He argues that hyperediting allows for greater freedom and has the added advantage of giving readers access to more than just the semantic content of a primary text.

Moreover, McGann believes that hypertext is functioning at its optimum level when it is used to create hypermedia editions that incorporate visual and audio documents. Robinson (1997), however, warns that editors working on major electronic editions are producing material that will not be used to its full potential until there are further developments in the field of textual encoding and until software is more widely available.

P. Aaron Potter (c1997) takes issue with McGann and Landow’s ideas. He argues that a Web page editor controls the material that appears on the screen to an even greater extent than does an editor working on a traditional codex edition. A hypertext document is not a genuinely non-sequential document, because an editor has inserted the links and chosen what he considers the most suitable places for those links to be. A reader can, therefore, only navigate to a part of a document to which an editor has chosen to offer a path.

Hypertext links, asserts Potter, are ‘no more transparent than any reasonable index’ and whilst offering a choice amongst variants, and allowing readers to share some of the editorial functions, electronic editions are far from being either authorless or editorless texts. Moreover, he refers to Foucault’s theories and suggests that, as is often the case, hypertext is an example of a concept that purports to offer greater freedom, when in reality it is just more successful at hiding the mechanisms by which it exerts control – in this instance, control of the reader.

Susan Hockey (2000) warns that whilst editors working on electronic editions are freed from many of the limitations of printed books, and the need to rely on one particular text or reading, there is a danger of such projects becoming overly ambitious. She asserts that the inclusion of too much source material can result in editions that have little scholarly value. She maintains that source material should not replace the critical material that makes scholarly editions valuable. Similarly, Sutherland (1997) suggests that a balance needs to be struck between the quantity and the quality of the material that electronic editors choose to include.

Claire Lamont (1997) examines the specific problems of annotation and compares how they differ in a codex and an electronic edition. Hypertext provides the promise of annotations which are easier to access and which, conceivably, can contain greater quantities of material.

Lamont draws attention to the fact that hypertext editions also have an advantage over traditional editions in that they can be updated whenever necessary, without the need to prepare an entire new edition and without the cost and time that this inevitably involves. However, rather than solving the problems of annotation, such as where, what, and how much to annotate, Lamont concludes that hypertext has simply resulted in ‘another arena in which the debate may continue’ (p. 63). Sutherland (1997) sums up the feelings of many less fervent supporters of electronic textuality by suggesting that the electronic environment is perhaps best thought of as ‘a set of supplementary possibilities’ (p. 7). These possibilities will be debated by editors, theorists and scholars in a way comparable to that in which they have debated, and continue to consider, the medium of the book.

Contrary to the optimistic note struck by writers such as McGann (1997), Landow (1992), Lanham (1993) and others concerning an electronic text’s facility to empower the reader, Sven Birkerts (1995) expresses concern at the effect of electronic texts in a book that is pessimistically entitled The Gutenberg Elegies: The Fate of Reading in an Electronic Age. Birkerts suggests that methods of electronic storage and retrieval have a detrimental effect upon a reader’s acquisition of knowledge. Information in an electronic medium, he believes, remains external – something to be stored and manipulated rather than absorbed.

Without claiming to support Birkerts’ theories, Sutherland (1997) suggests that if they do prove to be correct then the implications will be wide-ranging. The scholar who works for years to become an expert in his chosen field, for example, could conceivably be transformed by the computer into little more than a technician – able to locate and manipulate information, but without having any real understanding of it.

Rapid advances in information technology are increasingly becoming the source of debate amongst scholars who seek to determine both the best way of taking advantage of technology and the implications of so doing. Greetham (1997) rightly points out that digitisation is only one small stage in the evolution of texts and Sutherland (1997) remarks that computers, like books, are simply ‘containers of knowledge, information [and] ideas’ (p. 8).

However, as electronic textuality continues to emerge as a force to which the academic community will have to adapt there will, no doubt, be a continued explosion in the literature that addresses the issues that it raises. Jerome McGann is seen by more conservative scholars as too messianic in his endorsement of the electronic medium, and it is possible that some of his predictions may prove to have been extreme. However, in claiming that hyperediting is ‘what scholars will be doing for a long time’ (1997), he is likely, ultimately, to be proved right.

© Kathryn Abram 2002


Bibliography

Birkerts, Sven. 1995. The Gutenberg Elegies: The Fate of Reading in an Electronic Age. New York: Fawcett Columbine.

Bolter, Jay David. 2001. Writing Space: The Computer, Hypertext, and the History of Writing. Hillsdale, N.J. : L. Erlbaum Associates.

Bowers, Fredson. 1964. Bibliography and Textual Criticism: The Lyell Lectures, Oxford, Trinity Term 1959. Oxford: Clarendon Press.

______. 1970. ‘Greg’s “Rationale of Copy-Text” Revisited’. Studies in Bibliography Volume 31, pp. 90-161.

Chaucer, Geoffrey. 1996. The Wife of Bath’s Prologue on CD-ROM. ed. Peter M. W. Robinson. Cambridge: Cambridge University Press.

Donaldson, Peter. 1997. ‘Digital Archive as Expanded Text: Shakespeare and Electronic Textuality’, in Electronic Text: Investigations in Method and Theory. ed. Kathryn Sutherland, pp. 173-97. Oxford: Clarendon Press.

Foucault, Michel. 1984. ‘What is an author?’, in The Foucault Reader. ed. Paul Rabinow, translated by Josue V. Harari, pp. 101-20. New York: Pantheon Books.

Greetham, D. C. 1994. Textual Scholarship: An Introduction. New York and London: Garland.

______. 1997. ‘Coda: Is It Morphin Time?’, in Electronic Text: Investigations in Method and Theory. ed. Kathryn Sutherland, pp. 199-226. Oxford: Clarendon Press.

Greg, W. W. 1950. ‘The Rationale of Copy-Text’. Studies in Bibliography Vol. 3 (1950-1951), pp. 19-36.

Hockey, Susan M. 1980. A Guide to Computer Applications in the Humanities. London: Duckworth.

______. 2000. Electronic Texts in the Humanities: Principles and Practice. Oxford: Oxford University Press.

Lamont, Claire. 1997. ‘Annotating a Text: Literary Theory and Electronic Hypertext’, in Electronic Text: Investigations in Method and Theory. ed. Kathryn Sutherland, pp. 47-66. Oxford: Clarendon Press.

Landow, George P. 1992. Hypertext: The Convergence of Contemporary Critical Theory and Technology. Baltimore: Johns Hopkins University Press.

Lanham, Richard A. 1993. The Electronic Word: Democracy, Technology, and the Arts. Chicago; London: University of Chicago Press.

McGann, Jerome J. 1983. A Critique of Modern Textual Criticism. Chicago: University of Chicago Press.

______. 1996. ‘Radiant Textuality’. Accessed on 19 February 2002.

______. 1997. ‘The Rationale of Hypertext’, in Electronic Text: Investigations in Method and Theory. ed. Kathryn Sutherland, pp. 19-47. Oxford: Clarendon Press.

______. 1997. ‘The Rossetti Hypermedia Archive’ [Web page]. Accessed on 19 March 2002.

McKerrow, Ronald B., and Thomas Nash. 1904. The Works of Thomas Nashe. Vol. 1. London: A.H. Bullen.

Potter, P. Aaron. c1997. ‘Centripetal Textuality’. Accessed on 19 February 2002.

Robinson, Peter M. W. 1993. The Digitization of Primary Textual Sources. Oxford: Office for Humanities Communication Publications.

______. 1994. ‘Collate: A Program for Interactive Collation of Large Textual Traditions’, in Research in Humanities Computing 3. eds. Susan Hockey and N. Ide, pp. 32-45. Oxford: Oxford University Press.

______. 1997. ‘New Directions in Critical Editing’, in Electronic Text: Investigations in Method and Theory. ed. Kathryn Sutherland, pp. 145-71. Oxford: Clarendon Press.

Sutherland, Kathryn, ed. 1997. Electronic Text: Investigations in Method and Theory. Oxford: Clarendon Press.

Tanselle, G. Thomas. 1975. ‘Greg’s Theory of Copy-Text and the Editing of American Literature’. Studies in Bibliography Volume 28, pp. 167-231.

 


Filed Under: 19C Literature, 20C Literature Tagged With: Bibliography, David Greetham, Electronic Texts, Kathryn Sutherland, Literary studies, Susan Hockey, Technology, textual scholarship

Enterprise 2.0

June 12, 2009 by Roy Johnson

how social media will change the future of work

The title of this book combines two coded terms – Web 2.0 and ‘The Enterprise’ – for which read ‘social media software’ and ‘Big Business’. And the purpose is to show how the techniques and concepts behind Web 2.0 applications (blogs, wikis, tagging, RSS, and social bookmarking) can be used to encourage collaboration in what were previously thought of as secretive, competitive businesses.

It’s an argument which is fast becoming quite familiar. To succeed in modern business, managers and directors must learn to listen and talk to their customers and staff. They need to be more agile in their thinking, less monolithic in their practices, and they need to catch up with new Internet-based activities which can sweep away unwary traditionalists overnight [look what happened to Encyclopedia Britannica] and create multi-billion pound enterprises almost as quickly [Amazon, Google].

Niall Cook realises that there will be problems and resistance to such suggestions from within orthodox business communities. But he also points to their existing weaknesses.

Companies spend millions of dollars installing information and knowledge management systems, yet still struggle with the most basic challenges of persuading their employees to use them.

Will it be difficult to persuade large organisations to adopt these very democratic tools? He offers case studies from companies such as the BBC, IBM, Microsoft, and BUPA to show that it might. He even makes a case for the use of instant messaging and social presence software (MSN and Twitter).

He also has an example of the US Defence Intelligence Agency using mashups to provide simultaneous streams of information through a single interface (because that’s what its users want), and a multinational software company using Facebook as an alternative to its own Intranet (because its employees use it more).

He gives a very convincing example of the creation of a wiki running alongside the company Intranet in a German bank. The IT staff started using the wiki to generate documentation, and within six months use of the Intranet was down 50%, email was down 75%, and meeting times had been cut in half.

In fact he misses the opportunity to point out that one of the biggest incentives for companies to embrace Web 2.0 software is that much of it is completely free. Almost all major programs are now available in Open Source versions – including such fundamentals as operating systems (Linux), content management systems (Joomla), and virtual learning environments (Moodle).

In the UK, government institutions have invested and wasted billions of pounds after being bamboozled by software vendors. In the education sector alone, VLEs such as Blackboard and WebCT have proved costly mistakes for many colleges and universities. They are now locked in to proprietary systems, whilst OSS programs such as Moodle run rings round them – and are free.

Is the embracing of social software solutions likely to take place any time soon? Well, Cook has some interesting answers. His argument is that these developments are already taking place. Smart companies will catch on, and obstructors will fall behind with no competitive edge.

Bear in mind that within just five years, members of the MySpace generation are going to be entering the workforce, bringing their collaborative tools with them. If you don’t have the software that allows them to search, link, author, tag, mashup, and subscribe to business information in the ways they want to, they are going to do one of three things: use third party software that does; leave to join a competitor that does; not want to work for you in the first place.

Even the software solutions in this radical, indeed revolutionary, development must be fast, light, and quick to implement.

Speed and flexibility. Oracle’s IdeaFactory took just a few days to build. Janssen-Cilag’s wiki-based Intranet was purchased, customised, and launched within two weeks.

This is all part of what Peter Merholz in his recent Subject to Change calls agile technology. Cook provides strategies for those who wish to implement these ideas within their own company – and it has to be said that he assumes a certain degree of subversiveness might be necessary.

The book ends with a review of the literature on social software and a comprehensive bibliography – so anyone who wants to pursue these matters at a theoretical level has all the tools to do so. But I suspect that anybody who is taken with these new ideas – if they have any blood in their veins – will immediately want to go away and put them into practice.

This is a truly inspirational book which should be required reading for managers, IT leaders, systems analysts, developers, and business strategists in any enterprise, small, medium, and especially large. I can think of two organisations I am working with right now (one a university, the other a large city college) which ought to be implementing these ideas but which are doing just the contrary – stifling innovation. One, following its culture of ‘no change’, has just been swallowed up by its rival. The other is running onto the financial rocks precisely because it refuses to learn from its users and its own staff – whilst claiming to do just the opposite.

© Roy Johnson 2008

Enterprise 2.0   Buy the book at Amazon UK

Enterprise 2.0   Buy the book at Amazon US


Niall Cook, Enterprise 2.0: how social software will change the future of work, London: Gower, 2008, pp.164, ISBN: 0566088002


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce Tagged With: e-Commerce, Enterprise, Media, Social media, Technology

Evaluating online sources for essays

August 22, 2009 by Roy Johnson

sample from HTML program and PDF book

1. The Internet is the biggest library in the world, and tens of thousands of documents are added to it – every day. Evaluating the online sources of any information you download is a vital part of making sure it is relevant to your needs.

2. You also need to be sure about the accuracy, reliability, and value of any information you use. For instance, there is a big difference between a web site run by an amateur enthusiast and the official site of a big organisation.

3. Even the world’s largest encyclopedia — WIKIPEDIA — has its limitations. It is written by amateur volunteers and then edited by self-appointed experts, but it may still contain mistakes or information coloured by personal bias.

4. The following articles are designed to help you in the task of evaluating the information you retrieve from online sources.

5. You can print out this page for reference, or if you are connected to the Internet, just click the URL to go straight to the site named.

6. Documents are sometimes moved from one location to another on the Internet. If you receive a ‘Document not found’ message, try progressively removing the last section of the URL [ / this-bit] in your browser. Re-submit your search each time. (A short scripted version of this tip is sketched after the list of articles below.)

7. If you are reading this whilst connected to the Internet, click any of the addresses below and you will be taken directly to the document.

  • How to Evaluate a Web Page
  • Evaluating Web Pages
  • Evaluating Web Pages: Questions to Ask
  • Evaluating Web Resouces
  • Ten C’s for Evaluating Internet Resources
  • Evaluating Web Resources
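
For readers who would rather script the tip in item 6 above than edit the address bar by hand, here is a minimal sketch in Python of trimming a URL one path section at a time (the example address is invented):

    # Strip the last section of a URL's path repeatedly, as suggested in
    # item 6 above, yielding each shorter address in turn.
    from urllib.parse import urlsplit, urlunsplit

    def trimmed_urls(url):
        parts = urlsplit(url)
        segments = [s for s in parts.path.split("/") if s]
        while segments:
            segments.pop()                      # drop the last path section
            path = "/" + "/".join(segments)
            yield urlunsplit((parts.scheme, parts.netloc, path, "", ""))

    for candidate in trimmed_urls("http://www.example.com/articles/2003/evaluating.html"):
        print(candidate)
    # http://www.example.com/articles/2003
    # http://www.example.com/articles
    # http://www.example.com/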

© Roy Johnson 2003

Buy Writing Essays — eBook in PDF format
Buy Writing Essays 3.0 — eBook in HTML format


More on writing essays
More on How-To
More on writing skills


Filed Under: Writing Essays Tagged With: Academic writing, Essays, Evaluation, Online sources, Study skills, Technology, Web sources, Writing skills

Facebook the missing manual

March 17, 2010 by Roy Johnson

complete guide to social networking

Facebook is flavour of the year in social networking terms right now. It didn’t start until 2004, and it already boasts a billion subscribers, with a user base which is claimed to be slightly more adult than that of MySpace. But when you’ve posted all those snaps of yourself getting drunk at the parties – did you know that it’s quite difficult to take them down again just before that vital job interview? If you’re going to use Facebook and take it seriously, you need a guidance manual, and there hasn’t been one – until now. Facebook: the missing manual takes you through the whole process, step by step, from registering and creating your profile to joining networks and finding friends. And every one of those steps is spelled out in a commendably clear manner.

Author Emily Vander Veer also reminds you at every stage that the attraction of being able to see the private details of other people’s lives means conversely that they can see yours. You should therefore think carefully about the information you make public.

Once you’ve made or located your friends, there’s a number of different ways of contacting them which are more subtle than a simple email message. You can ‘poke’ people (nudge them), ‘write on walls’ (make public statements inviting a response), and even send gifts. News feeds and blogs are built into the system, and you can participate in ‘groups’.

These groups can be based on a shared interest or hobby (physical astronomy or knitting) something you have in common (your old school), or even the locality where you live. Interestingly however, you are only allowed to join one group based on geographical location – so tough luck for second home owners.

Those are the main Facebook elements: next come the extensions to these basic functions. There’s a system of listing social (real world) events where you can arrange to meet friends. Then there’s a market place where you can place ads (which Facebook calls ‘listings’) so you can sell unwanted items (as on eBay) or buy from other people – all the while checking their credentials via what they post about themselves.

There’s also a system for job-finding and hiring people, or you can use Facebook’s bulletin boards and ‘notes’ feature to work on collaborative projects. And as on many other popular software systems, there are now free add-on applications (widgets and plug-ins) which can add functionality to the basic set-up.

The last section of the book returns, very responsibly, to the issue of privacy. Apart from showing you how to configure the advanced settings of your account, Vander Veer recommends applying a simple rule: ‘Don’t put anything on view which you wouldn’t want your mother or your boss to know about you.’ And remember that although at the time of writing Facebook is going through a re-design, it’s still very difficult to remove anything, once it’s up there.

Facebook   Buy the book at Amazon UK

Facebook   Buy the book at Amazon US

© Roy Johnson 2010


E.A. Vander Veer, Facebook: the missing manual, Sebastopol (CA): O’Reilly, 2nd edition 2010, pp.272, ISBN: 144938014X


More on publishing
More on journalism
More on creative writing
More on writing skills


Filed Under: Computers, Publishing Tagged With: Computers, Facebook, Missing Manual, Publishing, Technology, Writing skills

Flickr Hacks

May 22, 2009 by Roy Johnson

tips and tools for sharing photos online

Photo blogging is one of the fastest-growing parts of the Internet and online media just at the moment. You take a picture with your digital camera or your mobile phone, and blog it straight onto a public site. Flickr is owned by Yahoo! It allows you to upload your photos into a web space, and you are given 20MB per month, which is quite generous. Instead of keeping your snaps just for yourself and family members on your hard disk, you can store them, share them with the world, tag them, and make them available for worldwide consumption. You can even make money out of them if you play your cards right.

Although your photos are posted in a public space, you can control who is allowed to see them. There are full instructions here for setting your privacy options. Tagging and meta-data are fully explained (that’s giving titles, categories, and links to your photos) and there are also tips on resizing photos to save on your allotted storage space.

When extra information in the form of meta-tags is added to the images, all sorts of new possibilities are created. Paul Bausch shows games involving comparisons with similarly tagged photos, and he demonstrates how geo-tagged images can be mapped.

With so many of these images being viewed and shared across the web, it’s good that he also explains issues of copyright and licensing, including the relatively new Creative Commons licences.

He also shows how you can subscribe to a news feed which will notify you when other people upload new images. Then the later part of the book offers some fairly simple scripts for constructing screensavers, tracking your friends’ favourites, and even plotting your personal contacts using Google Maps.

Assuming you eventually end up with a large collection of photos, the next more advanced level shows you how to back up the collection, then how to store and sort them.

Finally, for those who might wish to interact with Flickr and operate at an administrator level, there are some advanced scripts which allow you to act as a moderator, create custom mosaics and collages, and mash up your photos to produce all sorts of special effects.

© Roy Johnson 2006

Flickr Hacks   Buy the book at Amazon UK

Flickr Hacks   Buy the book at Amazon US


Paul Bausch, Flickr Hacks, Sebastopol (CA): O’Reilly, 2006, pp.335, ISBN: 0596102453


More on digital media
More on technology
More on theory


Filed Under: Media, Technology Tagged With: Flickr, Media, Social media, Technology
