
Remediation: Understanding New Media

May 26, 2009 by Roy Johnson

new media – theoretical and practical studies

Jay Bolter is the author of Writing Space, one of the most important books on hypertext of the early 1990s. Here, with co-author Richard Grusin, he continues his theoretical and practical consideration of digitisation into the realm of ‘new media’: that is, computer graphics, streamed video, and virtual reality. They begin by looking at the ways in which various media – from Renaissance paintings to modern 3D digital images – have tried to render convincing images of the real world.

The first part of the book discusses the interesting distinctions to be made between immediacy and hypermediacy. Their definition of remediation is ‘the representation of one medium in another’ and they argue that this is ‘a defining characteristic of the new digital media’.

Their premise is that all new media take over and re-use existing media. Thus the Web grabs aspects of television and printed books. A good Web homepage presents all its most important information visible ‘above the fold’, in the same way as newspapers are designed.

This doesn’t prevent some of the ‘old’ media fighting back. So, for instance, television news programs begin to show separate ‘windows’ on screen – in a clear imitation of the multitasking environment of the computer monitor to which we are now all accustomed.

There’s some heavy-duty language to get through in this section – but the tone lightens in the second part of the book. This deals with studies of a dozen contemporary forms of digital media – computer games, virtual reality, digital art and photography, television, and the Web.

This includes an extended analysis of the game Myst, which is seen in terms of a remediation and critique of film. They chase the differences between photo-realistic painting and digital art all around the houses. There are also interesting analyses of Hitchcock’s Vertigo, of Disney theme parks, and of web cam sites.

The analysis does include some dubious assumptions – such as the idea that the difference between film and TV is that we watch films in public and television in private. The truth is that not many people go to the cinema any more, and lots of people watch television films with friends.

The last part of the book returns to a theoretical consideration of the effect of all this on the individual. We are taken into issues such as ‘gender problems in MUDs’ and what it is like, in a VR experience, to be a molecule or a virtual gorilla.

It might be hard work to read, but it’s stimulating stuff. There may be further theoretical questions posed on many of these issues – but anybody interested in the uses of new media will not wish to miss what Bolter and Grusin have to say about them.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Jay David Bolter and Richard Grusin, Remediation: Understanding New Media, Cambridge (MA): MIT Press, 2000, pp.295, ISBN: 0262522799


More on digital media
More on technology
More on theory


Filed Under: Media, Theory Tagged With: Jay Bolter, Media, New media, Remediation, Technology, Theory

Smartphones for Internet Access

April 11, 2012 by Roy Johnson

Today’s Internet users are relying on their smartphones or tablets for quick and easy Internet access, rather than laptops and desktops. And no wonder. It’s simply more convenient and portable. However, those with tight finances haven’t always been able to enjoy the benefits of 4G connectivity. But this will all change with the announcement of T-Mobile’s reinvigorated challenger strategy. This offers more affordable options to cash-strapped customers. When subscribers to T-Mobile compare cell phone plans with those of other service providers, T-Mobile comes out on top as one of the most affordable cell phone carriers with reliable service.

With mobile devices increasingly imitating each other’s features, it’s the quality and cost of the service that will determine user choice. T-Mobile’s challenger strategy, outlined by CEO and President Philipp Humm recently, focuses on making 4G services affordable and establishing growth for the business by investing $4 billion on network modernization and 4G evolution.

"We want to be known for delivering the best value in wireless because of the advanced technology we deliver at an affordable price," Humm said. "Over the next two years, we’re prioritizing and investing in initiatives designed to get T-Mobile back to growth in the years ahead—beginning with the transformation of our network."

Over 90 percent of T-Mobile device sales in the fourth quarter were from 3G and 4G smartphones, and data usage as well as smartphone adoption continue to accelerate. This has prompted the telecommunications giant to improve its data services to keep loyal customers happy as well as attract new subscribers.

"Today we operate America’s Largest 4G Network delivering a fast and reliable 4G data experience with Evolved High Speed Packet Access (HSPA+)" T-Mobile Chief Technology Officer Neville Ray said. "Launching Long Term Evolution (LTE) next year lets us take advantage of technology infrastructure advancements and benefit from a more mature LTE device ecosystem, while continuing to meet the growing demand for data with a powerful 4G experience."

T-Mobile plans to deliver better performance and coverage to its customers by improving its 4G network infrastructure with "new antenna integrated radios on many of its cell towers." The company may even be the first carrier in North America to accomplish this feat.

These technological developments should give users access to much higher rates of data transfer, and a smoother user experience. For instance, they promise significant improvements in battery life and quicker wake-from-idle times – approaching an always-on connection. That’s the sort of service mobile device users increasingly expect in a fully-connected environment.


More on technology
More on digital media
More on online learning
More on computers


Filed Under: Computers, Media Tagged With: Communication, Media, Mobile phones, Smartphones, Technology

Test Driving Linux

May 31, 2009 by Roy Johnson

try out Linux without installing it on your hard disk

Linux is the completely free operating system which has been developed as part of the Open Source Software movement. It offers a powerful and safer alternative to Windows XP – which many people cannot afford, particularly where multiple installations are concerned. But switching from a well known operating system to something new can be a scary experience. Wouldn’t it be good if you could take a test drive first? That’s where the clever idea behind this book comes in. It comes with a CD that lets you run Linux straight from the disc, without installing it onto your hard drive.

It will run more slowly off the CD, but you get to try Linux without taking any risks. David Brickner even supplies the neat wrinkle of saving your settings onto a USB flash disk so that you don’t need to reconfigure Linux each time you boot up. The CD also comes with a full copy of OpenOffice, the free alternative to Microsoft Office, as well as free browsers, graphics editing software, and hundreds of other applications for every type of daily computer task. The desktop interface which comes on the CD is called K Desktop Environment (KDE). He explains how this works and shows how a variety of applications run in it. KDE also has a number of key features of its own – such as the ability to run virtual desktops.

After guiding you through KDE (which works in a similar way to Windows XP) he offers a tour of the free Linux software applications which come on the CD – Konqueror web browser and file manager, Totem multimedia player, Kontact personal information manager, GIMP image editor, and OpenOffice.org. This is a fully featured suite of word processor, spreadsheet, and presentations programs which can open any files from Microsoft Office and save back into that format.

If you don’t like these particular applications, you can just as easily download the Firefox browser or the Thunderbird email client. Almost all these programs are more bug-free and more secure than their Windows counterparts.

The book’s subtitle is “from Windows to Linux in 60 seconds”. You might boot up from the CD in that time, but you’ll want to spend a while using all the programs and trying out the software provided.

If, when you’ve finished, you feel like making the switch to Linux or running it alongside Windows on a dual-boot system, then he has a very useful conclusion which explains the advantages and shortcomings of the many versions of Linux available.

© Roy Johnson 2005

Buy the book at Amazon UK

Buy the book at Amazon US


David Brickner, Test Driving Linux, Sebastopol, CA: O’Reilly, 2005, pp. 341, ISBN: 059600754X


Filed Under: Technology Tagged With: Linux, Open Sources, Operating Systems, Technology

The Art and Science of Web Design

July 22, 2009 by Roy Johnson

web design strategies – for the present and the future

Jeffrey Veen is a webmaster who put the pages of avant-garde magazine WIRED onto the Net, and he went on to have something of a best-seller with his first book – HotWired Style: Principles of Building Smart Web Sites. The Art and Science of Web Design is his follow-up, and if anything, it’s even better. Be warned though: it’s not a practical HTML manual, with lots of advice about how to write web pages. It’s a study of how the Web works, why it works the way it does, and what strategies serious users ought to adopt to make it work for their benefit.

The Art & Science of Web Design He starts from a simple premise. Any graphic designers creating attractive printed pages know that they must have a comprehensive knowledge of their tradition. This means they should know the principles of good layout, modern print technology, paper, inks, and the full range of resources for translating their ideas from one medium into another. They are in fact drawing on a design tradition which is nearly six hundred years old.

If that is the case, web designers need to know how their own (new) medium works – and he sets out to make it clear. He starts with a history of the Web, and how its content has always been separated from its appearance. We all know about the constraints and limitations of HTML coding – but now there are style sheets to give us more control. And he shows how this can be done.

In fact he is what might be called a radical traditionalist. He believes that you must respect the fact that web visitors bring notions of navigation and structure from the other sites they have visited. You can introduce novelties onto your site – but these should be subtle, and they absolutely must keep the user’s needs in mind at every click of the mouse.

There are some wonderful in-depth analyses of major sites, showing how they have managed to keep users’ needs in mind, even when building their information from huge databases. Yahoo and Amazon come out well – because they create fast-downloading pages which give visitors what they want.

On browsers and speed he is quite uncompromising. You must check your pages in as many browsers as possible, and you must eliminate all superfluous coding so that users get what they are looking for immediately. There’s a fascinating page of screenshots from a competition for high quality homepages – all created within a maximum size of 5KB.

The other central feature of his argument is an interesting notion of what he calls ‘liquid’ pages. He argues that designers ought to stop worrying about the exact appearance of the layout and graphic features of their pages. Instead they should design so that the page will flow into any browser, on any screen size, set at any resolution.

And he shows how it can be done. There is a wonderful example, reduced to only a few lines of code, showing how style sheets and a clever use of font specifications can be used to create paragraphs which will look good in any browser.

Finally, he presents the idea of what he calls ‘object oriented publishing’. This is creating dynamic websites using templates, stylesheets, and information stored in databases so that the work of the designer is minimised. This part is more technically demanding, but he keeps jargon and coding to a minimum.

It’s written in an engaging, accessible style. You can try out his ideas immediately, and he gives an account of the way modern web technology works which is inspiring and enthusiastic. This is a very impressive follow-up to his earlier best-seller HotWired Style, and even though it will be of most use to those who already have web sites, anybody with an interest in web design will find something of interest in what he says. It’s going straight onto the Highly Recommended list at this site, and then after fiddling with some code, the first thing I’m going to do is read the book again.

© Roy Johnson 2002

Buy the book at Amazon UK

Buy the book at Amazon US


Jeffrey Veen, The Art & Science of Web Design, Indianapolis (IN): New Riders, 2001, pp.259, ISBN: 0789723700


More on web design
More on technology
More on digital media


Filed Under: Web design Tagged With: Technology, The Art & Science of Web Design, Web design

The Art of SEO

October 31, 2010 by Roy Johnson

Mastering Search Engine Optimization

The Art of SEO seeks to explain an arcane issue. Search Engine Optimization is the art of getting more visitors to a web site. You can do this in a number of ways: by making it look more attractive, advertising its existence, or persuading more people to make links to it from their own sites. But the number one method, which beats all of these put together, is to make it come higher in Google search results. If somebody types ‘washing machines’ into a Google search box and your site Wash-o-Matic comes up first, the chances are you will get more visitors. All you need to do is construct pages that Google will rank more highly than all your competitors’ – and this five-hundred-page compendium explains the equally large number of things you need to know to achieve it.

The book starts with a complete explanation of how search engines work, how they spider sites, and what they do with the information they gather. The same principles apply to all search engines, but the authors can be forgiven for concentrating almost all of their attention on Google, so predominant has it become. Quite apart from all the very technical matters of keywords and search algorithms, there’s a splendid chapter on creating a search engine friendly web site. This covers sitemaps, information architecture, site structure and navigation – all aimed at maximizing the effectiveness of every single page on a site. And you probably do need to start thinking of your site in this way – because that’s how your visitors will arrive, via a single page.

There are lots of free tools available – the best being at webmasters.google.com – but be prepared to go into a lot of technical detail if you wish to optimize your pages. I sat down and went through a number of the recommended steps, and after a while felt like scrapping my site and starting again from scratch. But in fact it’s very unlikely that any site starts out in a state of complete efficiency: sites need to be tweaked and evolved to reach that condition. Fortunately on the issue of information architecture, many sites are now run from a content management system that will do the spadework for you. But it still pays to be aware of the underlying principles.

There are lots of subtle and complex issues – ‘keyword cannibalization’, ‘long tail of search’, and ‘thin affiliates’ – and something that had not occurred to me before – ‘self plagiarism’. Two versions of the same page, even if they are on different parts of a site performing different functions, are dangerous as far as your rankings are concerned, for two reasons. The first is that they are regarded by Google as duplicate material and are therefore given a lower rating. The second is that the two pages are competing against each other for visitors, and Google has no way of knowing which should be given priority.
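Google’s actual duplicate-detection methods are proprietary, but the underlying idea can be illustrated with a standard textbook technique. Here is a minimal, purely hypothetical Python sketch that flags near-duplicate pages by comparing their overlapping word runs (‘shingles’):

    # Illustration only: Google's real duplicate detection is proprietary.
    # Shingling plus Jaccard similarity is a standard textbook technique
    # for spotting near-duplicate documents.

    def shingles(text, k=5):
        """Return the set of overlapping k-word runs ('shingles') in a text."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

    def similarity(page_a, page_b, k=5):
        """Jaccard similarity of two pages' shingle sets (1.0 = identical)."""
        a, b = shingles(page_a, k), shingles(page_b, k)
        return len(a & b) / len(a | b) if (a | b) else 0.0

    page_one = "the quick brown fox jumps over the lazy dog and runs away"
    page_two = "the quick brown fox jumps over the lazy cat and runs away"
    # About 0.33: one changed word out of twelve breaks half the shingles.
    print(similarity(page_one, page_two))

Two pages scoring close to 1.0 would look like duplicate material to a crawler, and could then end up competing against each other in the rankings – exactly the ‘self plagiarism’ trap the book describes.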

The issue of creating, exchanging, and marketing links is complex almost beyond belief – but the principles on which the page ranking algorithms work are well explained. However, be warned that they are always ‘evolving’ – that is, changing. There’s also a warning on dubious promotional practices and an explanation of why many ‘guaranteed ranking improvement’ schemes aren’t worth a bean. The advice is to ignore all gimmicks, shortcuts, and sharp practice. Concentrate instead on producing lots of good quality content:

Content is at the heart of achieving link building nirvana

There’s an interesting discussion of how ‘link juice’ is generated, and some rather hair-raising warnings about link marketing. To stay on the safe side of Google acceptability policies, you are advised to run an extremely tight and clean ship indeed. Even some of the most innocent-seeming strategies for boosting the popularity or ranking of your pages can result in search engines doing the exact opposite, downgrading your page rankings behind the scenes – unbeknown to you.

In terms of promotion, every avenue imaginable is examined – Google vertical search, local search, image, product and news search, plus all the well known social media services – Twitter, MySpace, Flickr, YouTube, and so on. To do this as thoroughly as suggested would become a full time job for most site owners, but it’s possible to pick and mix, choosing those opportunities that will best suit your own business.

This leads to the art of SEO ‘campaigns’ in which goals and objectives are closely specified, then the results tracked, measured, and analysed. At this point you are dealing with the sharp end of analytics, and you need a combination of IT skills and commercial single-mindedness to stay the course.

The scariest part of all comes last. What do you do if somebody steals your site’s content? Or even worse, if a competitor reports you to Google and asks for your site to be de-listed? Both of these things can easily happen. Fortunately there’s guidance on how to deal with such situations – plus enormously long lists of things to avoid in order to stay out of trouble. These are all the seemingly innocuous tricks people use to increase their site rankings, such as ‘repurposing’ material from other people’s sites, embedding keywords in hidden text, buying popular keywords that are not related to the publisher’s site, using ‘entry pages’, and so forth. The advice – as ever – is to avoid these easily detectable tricks and stick to producing rich original content.

This is one of O’Reilly’s masterful publications that covers a single but enormously complex subject in a thorough and authoritative manner. It’s written by experts in the field of site promotion, and even though several authors are involved it has a consistent tone and approach that makes it both clear and surprisingly readable.

Buy the book at Amazon UK

Buy the book at Amazon US

© Roy Johnson 2010


Eric Enge et al., The Art of SEO: Mastering Search Engine Optimization, Sebastopol (CA): O’Reilly, 2010, pp.574, ISBN: 0596518862


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce Tagged With: e-Commerce, Publishing, SEO, search engine optimization, Technology, Web page rankings

The Cathedral and the Bazaar

July 15, 2009 by Roy Johnson

socio-political manifesto of the free software movement

Forget the enigmatic title of The Cathedral and the Bazaar for a moment. This is essentially four long, polemical essays on the open source movement, written by one of its prime movers in the period between 1992 and summer 1999. ‘Open Source’ is a term used to describe the idealistic notion of freely sharing technological development – particularly the software code written by computer programmers. The first and earliest essay sets out the principles of the open source movement. The second inspects the attitudes and moral codes of its members (the hackers), who submit their work to peer review within what Eric Raymond claims is a ‘gift culture’. The third looks at the economic conundrum of how the open source movement sustains itself without a regular income. The last essay is an account of activism relating to the Microsoft anti-trust case.

Basically, it’s an impassioned argument in favour of a new strategy in software development which has arisen from the decision by Linus Torvalds to release the source code of his operating system Linux. He released it not only for free use, but also invited volunteers to help him develop it further. Raymond argues that this represents – dare one say it? – a paradigm shift – a democratic sharing of ideas and repeated testing rather than the development of a product in commercial secrecy.

This is where the title comes in. The ‘cathedral’ is a metaphor for work ‘carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time’. The ‘bazaar’ represents an open free-for-all of ‘differing agendas and approaches…out of which a coherent and stable system [can] seemingly emerge only by a succession of miracles’.

He inspects the arguments which have been made in criticism of the open source movement, and whilst I wouldn’t say that he demolishes them exactly, he does come up with some interesting points about a system which he is presenting as a revolutionary alternative to the common commercial model. ‘It is often cheaper and more efficient to recruit self-selected volunteers from the Internet than it is to manage buildings full of people who would rather be doing something else’. If the principles of the open source movement really do work in the long term, this will stand a lot of MBA wisdom on its head.

However, his arguments for the advantages of releasing open source on Netscape (in autumn 1998) seem to evade the issue that NS was under intense pressure from Microsoft. He’s making an argument from technological altruism, when deep down the motive might have been economic. But he does explain how a company such as Red Hat can sell open source code (Linux) for a profit, when it’s free for anyone who wants it. They sell – ‘a brand/service/support relationship with people who are freely willing to pay for that’ – and other companies are free to do the same thing if they wish.

As the book reaches its breathless conclusion, the fourth essay becomes a rather personal and excited account of how the open source movement was established in 1998/9 – largely to support Netscape in its fight against Microsoft. No doubt there will be updates to this statement issued at the appropriate web site [www.opensource.org] following each stage of the fight in court.

Some of the anthropological parallels and excursions into political economy seem slightly fanciful, and at times his polemic becomes a sociological study of hackers’ motives – a trap which in literary studies is known as the ‘intentional fallacy’. That is, we shouldn’t judge outcomes on the strength of what we perceive to be the author’s intent. It’s also very idealistic – though the latest edition of WIRED carries an article about open source warriors selling their services on the open market, and Raymond argues that there is no necessary contradiction in this.

It’s the first book on high-tech developments I’ve come across which provided the slightly bizarre experience of a text printed with double line spacing and one-sentence paragraphs. This I imagine reflects the influence of the email originals written for reading on screen. Another interesting feature is that the majority of the bibliographical references are to articles on the Net, not to printed books – though I still think he should have tried to produce an index and bibliography.

He claims that even this book is in a state of evolution via updates following peer review – and that’s exactly as it should be for such a subject. It’s written in a concise, deeply compacted style, with few concessions to an average reader’s technical knowledge, and he’s occasionally cryptic to the point of obscurity: ‘Before taxonomising open-source business models, we should deal with exclusion payoffs in general’.

This is a crusading text, and anyone concerned with the sharp end of software development and the battles of operating systems will be fascinated by his arguments. This revised and expanded paperback edition includes new material on recent technological developments which has made it one of the essential texts on Open Source.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Eric S. Raymond, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary, Sebastopol (CA): O’Reilly, 1999, pp.268, ISBN: 0596001088


Filed Under: Open Sources Tagged With: Linux, Open Sources, Technology, The Cathedral and the Bazaar, Theory

The Computer and the Information Revolution

June 28, 2009 by Roy Johnson

the history of mathematics + technology = computers

This is a book which gets mentioned in any serious history of computers. It’s a study of the mathematical, mechanical, and then the electronic developments which led to the creation of modern computers. The first part of The Computer and the Information Revolution offers an account of the development of mathematical systems, ending with the creation of binary notation in the nineteenth century. This paves the way for part two, which is a history of automatic calculation – first by mechanical devices, then by electronic means. It’s a book dense with a sense of history, and Ifrah’s span reaches effortlessly from 3500 BCE (Before the Common Era) to the maths underlying computer technology in the post-war years.

His approach can sometimes be a little disconcerting. One minute we’re in ancient Greece, the next in the eighteenth century. A more smoothly integrated chronological narrative would have strengthened his case, just as more pictures and diagrams would have spared him page-length descriptions of the machines he discusses. This is a book which is crying out for illustrations.

However, he more than makes up for this in his wide-ranging inclusiveness. Even small-scale and failed inventors are mentioned. He is particularly good at explaining the relationship between mathematical theory and what was technologically possible at any given point. He points out that there are big gaps in the development of information technology – very often caused by the absence of a zero in the numbering system.

It’s an odd book, because the translator and editor fills in what he clearly regards as important gaps in the author’s knowledge, and the chronology is patchy too. There’s a lot of back-tracking to make up for a lack of continuous narrative.

However, his account gains a great deal of impetus as all strands converge for the creation of the first modern computers. His description of Alan Turing’s conceptual breakthrough in 1936, and of its relationship to John von Neumann’s idea for a program stored in memory, becomes positively gripping.

In fact it’s a shame he doesn’t stick with his theme once the first computers have been built, because the latter part of the book spins off into cosmology, genetics, and a mosaic of reflections on culture, science, and ‘the future of mankind’. Nevertheless, for anyone remotely interested in the development of information technology, this is a book which should not be missed.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Georges Ifrah, The Computer and the Information Revolution, trans. E.F. Harding, London: Harvill, 2000, pp.410, ISBN: 1860467385


More on computers
More on technology
More on digital media


Filed Under: Techno-history Tagged With: Computers, Information architecture, Mathematics, Technological history, Technology, The Computer and the Information Revolution, Theory

the eBay survival guide

May 22, 2009 by Roy Johnson

how to make money on eBay – and avoid losing your shirt

I rarely talk about computers when socialising – otherwise you easily get branded an IT bore. But I was at a dinner party recently with neighbours where it suddenly turned out that half the table were trading on eBay! There are amazing bargains to be had. It turned out that we were eating off antique plates the hostess had bagged to match up with family heirlooms. But for tender mortals like you and me, there needs to be some hand-holding through the jungle of bids, deadlines, and prospective bidding. That’s where guidance manuals such as eBay the smart way, eBay Hacks and the eBay survival guide come in.

Michael Banks is a trader with twenty years of eCommerce experience, and he talks you through the basics in a friendly and encouraging manner. First he gives a clear account of the huge variety of services, downloads, and support materials at the eBay site, then explains how the auctions (and the sales) actually work. There are lots of different ways of trading, and he covers them all.

Then he shows you how to find things using eBay’s powerful search engine. This includes neat tricks such as including plurals and deliberate mis-spellings in your search terms.

He deals with the central issue of ‘How much is it worth?’ – which is a much easier question to ask than to answer. His advice is that you need to cross-check with other auctions of the same object; look into price guides; and track what other people are searching for and buying.

Selling items is a more complex business than buying – not because of eBay, but because more of your own time is tied up in handling and posting stock to customers. There are also lots of different ways to set a selling price: you can have a minimum, a reserve, and a buy-it-now price.
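As a rough illustration of how those three price levels interact – a simplified model only, not eBay’s actual rules – consider this short Python sketch:

    # Simplified model of the three price levels; the real rules are more
    # involved (for example, buy-it-now often disappears once bidding starts).

    def auction_outcome(high_bid, minimum, reserve=None, buy_it_now=None):
        """Decide what happens when an auction closes, given the highest bid."""
        if buy_it_now is not None and high_bid >= buy_it_now:
            return f"sold at the buy-it-now price of {buy_it_now}"
        if high_bid < minimum:
            return "no sale: bidding never reached the minimum (starting) price"
        if reserve is not None and high_bid < reserve:
            return "no sale: the hidden reserve price was not met"
        return f"sold to the highest bidder for {high_bid}"

    print(auction_outcome(42, minimum=10, reserve=50))  # reserve not met
    print(auction_outcome(55, minimum=10, reserve=50))  # sold for 55

The minimum sets where bidding starts, the reserve is a hidden floor below which the seller need not sell, and buy-it-now lets an impatient buyer short-circuit the auction altogether.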

He shows you how to describe, display, and illustrate the goods you want to sell. This might sound fairly simple – but you’ve got to remember that you need to stand out from thousands of other sellers, and you’ve got to be completely accurate, otherwise you might get negative feedback.

eBay has a fairly detailed system of resolving complaints and offering protection for both buyers and sellers. If you’re worried about getting into difficulties, he explains quite clearly how to solve problems.

As a buyer, if you really have your heart set on securing a bargain, you might need to get into the skills of bidding at the last possible minute – or ‘sniping’ as it is known in the trade. Once again, he shows you how to do it, and even how to outwit other people who may be doing the same thing.

He finishes by showing you how to recognise scams and misleading descriptions of products for sale. Thanks to eBay’s gigantic database of information on its buyers, sellers, and the history of all their transactions, it’s possible to locate all the information you need to protect yourself.

It’s quite true that some people make a full time living just buying and selling on eBay. If you fancy putting your toes into the waters of eCommerce, this would be an excellent place to start.

© Roy Johnson 2005

Buy the book at Amazon UK

Buy the book at Amazon US


Michael Banks, the eBay survival guide, San Francisco: No Starch Press, 2005, ISBN: 1593270631


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce Tagged With: Business, e-Commerce, eBay, Technology, the eBay survival guide

The Essence of Computing Projects

June 14, 2009 by Roy Johnson

project writing skills for higher education

Projects are now a major part of most undergraduate and postgraduate courses – especially in sciences, business studies, and information technology. Students are required to draw on a number of different but important skills to complete their projects, and it’s not easy to know what’s involved. The Essence of Computing Projects is designed to explain what’s required. It covers surveying the literature, project writing skills, documenting software, time management, project management, and presentation skills.

The chapters follow the logical sequence of undertaking a project, starting from defining the nature of research itself, choosing a project and writing a proposal, then planning what you are going to write – including timing and scheduling.

When it comes to the process of searching and reviewing the literature, Christian Dawson makes sensible distinctions between what is required at undergraduate and postgraduate level. The chapter which deals with actually writing the project confronts some of the most common problems – and how to overcome them: running out of time; dealing with interruptions and computer crashes; dealing with your supervisor; and working in teams.

The latter part of the book deals with the presentation of your report in written form. Here he stresses the importance of abstracts and structure; presenting data in graphs, pie charts, and bar charts; academic referencing; and two items of special interest – commenting on program code and writing user guides.

Finally he deals with the oral presentation skills required to present your project. The book also looks forward to what follows in academic terms – publishing your work, funding, and issues of intellectual ownership and copyright.

If you have a project as part of the next stage in your studies, this guide will give you an excellent account of what’s required. You will have to flesh out the details – but that’s exactly as it should be, isn’t it?

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Christian W. Dawson, The Essence of Computing Projects – A Student’s Guide, London: Prentice Hall, 2000, pp.176, ISBN: 013021972X


More on study skills
More on writing skills
More on online learning


Filed Under: Computers, Study skills, Writing Skills Tagged With: Computers, Computing projects, Project management, Technology

The Long Tail

May 21, 2009 by Roy Johnson

how endless choice is creating unlimited demand

Chris Anderson is the editor of WIRED magazine. This book started as an article there, took off, and was expanded via seminars, speeches, and further research. It has now become one of the most influential essays on the new eCommerce. Anderson’s notion is relatively simple, but its implications profound. He argues that because the digitisation of commerce allows more people into the trading arena, and because minority goods can be made available alongside best-sellers, the consumer therefore has a much wider choice and cheaper prices. This gives rise to a new phenomenon – niche markets – also known in marketing-speak as the ‘long tail’. This is the part of the commercial results graph where returns begin to flatten out and slope towards zero.

But – and this is a very big BUT – in the new digital world they don’t slope off completely. And if you add up all the income from these many tail end transactions, it can be more than the total sales from the Short Head part of the graph.
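The arithmetic is easy to reproduce. The following toy Python sketch assumes, purely for illustration, that sales fall off in Zipf fashion – the title at popularity rank r sells in proportion to 1/r:

    # Toy model only: assume (hypothetically) Zipf-like sales, where the
    # title at popularity rank r sells in proportion to 1/r.

    def revenue(first_rank, last_rank):
        """Total relative sales across a range of popularity ranks."""
        return sum(1 / r for r in range(first_rank, last_rank + 1))

    head = revenue(1, 100)           # the Short Head: the 100 best-sellers
    tail = revenue(101, 1_000_000)   # the Long Tail: everything else

    print(f"head {head:.1f} vs tail {tail:.1f}")  # head 5.2 vs tail 9.2
    # The million slow sellers out-earn the hits combined, but only if the
    # cost of keeping each one 'in stock' is close to zero: bits, not atoms.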

Once you have grasped these basic issues, the lessons are clear. The profit is to be made in shifting bits, not atoms, and lower overheads mean more profit, because you can sell more. Much of this is possible because the price of electronic storage has now dropped almost to zero, and digital distribution has removed transport costs – as well as making delivery immediate. A physical bricks-and-mortar store has limited shelf space to stock goods, but Peer-2-Peer file-sharers make the downloaders’ options almost limitless.

The only way to reach all the way down the Tail—from the biggest hits down to all the garage bands of past and present—is to abandon atoms entirely and base all transactions, from beginning to end, in the world of bits.

Much of the new digital economy is amazingly counter-intuitive. Amazon for instance has allowed its own competitors to sell their goods on its site. The net result – more profit for Amazon, and the rise of the small second-hand book trader – the very businesses people thought would be put out of work by online trading.

Other positive elements in the new digital economy include the rise of reader reviews and recommendations, the renewed value of the back catalogue, and the opening of niche markets to more buyers.

Anderson looks at the technological history which has made the long tail possible, using a typical Amazon purchase as a model: postal delivery service, standard ISBNs, credit cards, relational databases, and barcodes. Of course Amazon’s genius in its latest phase is that it gets other people to hold all the stock and fulfil the orders.

He’s a great believer in reputations and taste being formed by social media – the YouTube and MySpace worlds in which personal recommendations and fan reviews help forge best-sellers more than any amount of advertising hype.

There are lots of interesting nuggets thrown out as he makes his way through the socio-economic implications of all this – such as the fact that Google searches counteract the tyranny of the New over the well-established. That’s because pages are ranked by the number of incoming links, which favours those which have had the time to acquire them.
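This is the intuition behind Google’s original PageRank algorithm. The production ranking system is far more elaborate, but a bare-bones sketch of link-based scoring, in Python, looks something like this:

    # A bare-bones version of the link-based scoring idea behind the
    # original PageRank; Google's production ranking uses many more signals.

    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1 / len(pages) for p in pages}
        for _ in range(iterations):
            new = {p: (1 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
            rank = new
        return rank

    # Hypothetical four-page web: 'veteran' has had time to accumulate
    # inbound links, 'newcomer' has not, so 'veteran' scores highest.
    web = {
        "veteran":  ["newcomer"],
        "fan_site": ["veteran"],
        "blog":     ["veteran", "fan_site"],
        "newcomer": ["veteran"],
    }
    print(pagerank(web))

A new page, however good, starts with no inbound links and therefore a low score – which is exactly why, as Anderson notes, established pages enjoy a built-in advantage.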

Even though he goes into some economic theory, the study remains accessible and readable throughout – largely because he uses everyday examples with which most readers will be able to identify: the purchase of music CDs, DVDs of films, and supermarket food purchases.

This is a really inspiring book, and a must for anyone remotely connected with the online world. Even if some of his estimations and predictions might be overstated, it offers a glimpse into processes taking place that will change the way we think about business and technology. Time and time again, I thought “Yes! I’ve already started doing that!” – ordering more books from Amazon’s marketplace traders, buying out-of-print titles at knockdown prices, exploring new music, and looking out for recommendations on the new social media. I would rank this book alongside Nicholas Negroponte’s 1996 study Being Digital as a seminal influence for the decade in which it is published.

© Roy Johnson 2007

Buy the book at Amazon UK

Buy the book at Amazon US


Chris Anderson, The Long Tail, London: Random House, 2006, pp.238, ISBN: 184413850X


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce Tagged With: Business, e-Commerce, Media, Technology, The Long Tail, Theory

