Mantex

Tutorials, Study Guides & More


The Computer and the Information Revolution

June 28, 2009 by Roy Johnson

the history of mathematics + technology = computers

This is a book which gets mentioned in any serious history of computers. It's a study of the mathematical, mechanical, and then the electronic developments which led to the creation of modern computers. The first part of The Computer and the Information Revolution offers an account of the development of mathematical systems, ending with the creation of binary notation in the nineteenth century. This paves the way for part two, which is a history of automatic calculation – first by mechanical devices, then by electronic means. It's a book dense with a sense of history, and Ifrah's span reaches effortlessly from 3500 BCE (Before the Common Era) to the maths underlying computer technology in the post-war years.

His approach can sometimes be a little disconcerting. One minute we're in ancient Greece, the next in the eighteenth century. A more smoothly integrated chronological narrative would have strengthened his case, just as more pictures and diagrams would have spared him page-length descriptions of the machines he discusses. This is a book which is crying out for illustrations.

However, he more than makes up for this in his wide-ranging inclusiveness. Even small-scale and failed inventors are mentioned. He is particularly good at explaining the relationship between mathematical theory and what was technologically possible at any given point. He points out that there are big gaps in the development of information technology – very often caused by the absence of nought/null in the numbering system.

It’s an odd book, because the translator and editor fills in what he clearly regards as important gaps in the author’s knowledge, and the chronology is patchy too. There’s a lot of back-tracking to make up for a lack of continuous narrative.

However, his account gains a great deal of impetus as all the strands converge for the creation of the first modern computers. His description of Alan Turing's conceptual breakthrough in 1936, and of its relationship to John von Neumann's idea for a program stored in memory, becomes positively gripping.

In fact it's a shame he doesn't stick with his theme once computers had been built, because the latter part of the book spins off into cosmology, genetics, and a mosaic of reflections on culture, science, and 'the future of mankind'. Nevertheless, for anyone remotely interested in the development of information technology, this is a book which should not be missed.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Georges Ifrah, The Computer and the Information Revolution, trans. E.F. Harding, London: Harvill, 2000, pp.410, ISBN: 1860467385


More on computers
More on technology
More on digital media


Filed Under: Techno-history Tagged With: Computers, Information architecture, Mathematics, Technological history, Technology, The Computer and the Information Revolution, Theory

The Essence of Computing Projects

June 14, 2009 by Roy Johnson

project writing skills for higher education

Projects are now a major part of most undergraduate and postgraduate courses – especially in sciences, business studies, and information technology. Students are required to draw on a number of different but important skills to complete their projects, and it’s not easy to know what’s involved. The Essence of Computing Projects is designed to explain what’s required. It covers surveying the literature, project writing skills, documenting software, time management, project management, and presentation skills.

The chapters follow the logical sequence of undertaking a project, starting from defining the nature of research itself, choosing a project and writing a proposal, then planning what you are going to write – including timing and scheduling.

When it comes to the process of searching and reviewing the literature, Christian Dawson makes sensible distinctions between what is required at undergraduate and postgraduate level. The chapter which deals with actually writing the project confronts some of the most common problems – and how to overcome them: running out of time; dealing with interruptions and computer crashes; dealing with your supervisor; and working in teams.

The latter part of the book deals with the presentation of your report in written form. Here he stresses the importance of abstracts and structure, presenting data in graphs, pie charts, and bar charts, academic referencing, and two items of special interest – commenting on program code and writing user guides.

Finally he deals with the oral presentation skills required to present your project, and looks forward to what follows in academic terms – publishing your work, funding, and intellectual ownership and copyright issues.

If you have a project as part of the next stage in your studies, this guide will give you an excellent account of what's required. You will have to flesh out the details – but that's exactly as it should be, isn't it?

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Christian W. Dawson, The Essence of Computing Projects – A Student’s Guide, London: Prentice Hall, 2000, pp.176, ISBN: 013021972X


More on study skills
More on writing skills
More on online learning


Filed Under: Computers, Study skills, Writing Skills Tagged With: Computers, Computing projects, Project management, Technology

The Renaissance Computer

July 16, 2009 by Roy Johnson

information architecture in early print technology

The Renaissance Computer is a collection of essays which seek to explore the similarities, connections, and lessons to be drawn from a comparison of the advent of digital technology with the age of print in the immediate post-Gutenberg period. In the 15th century the printing press was the ‘new technology’. The first ever information revolution began with the advent of the printed book, enabling Renaissance scholars to formulate new ways of organizing and disseminating knowledge.

The basic argument is that the proliferation of printed texts was just as revolutionary, and presented similar problems of information architecture, storage, and retrieval, as those we feel we have now in our digital age. The earliest attempts at memory and storage systems were remarkably similar to the Windows operating system, though the fact that they were made physically manifest made them cumbersome and non-portable. Nevertheless, it would have been wonderful to visit Giulio Camillo's memory theatre, where a visitor occupied the stage, and all the knowledge of mankind was stored on the tiered rows of what would normally be seats.

Editor Jonathan Sawday looks at precursors of the modern computer in the work of Milton, Hobbes, Pascal, Leibniz, and Descartes. There's a chapter on the role of illustrations in early modern books; another looks at the role of the index, title page, marginalia, and contents page as early examples of hypertext and navigation.

The authors also point to the amazing persistence of some outmoded technological forms:

Recent work on the circulation of manuscript collections of poetry in the seventeenth century…has demonstrated that this form of publication survived for two centuries after the invention of the printing press. The modern researcher who, seated in the rare book rooms of the Huntington Library or the British Library, laboriously copies out passages from an early printed book is participating in an ancient tradition.

There is a very interesting (and more readable) chapter on Thomas Heywood's Gunaikeion (1624), an encyclopedia on women. The link with computers is no more than the suggestion that it's a cut and paste composition, but the content sounds so interesting it made me want to read a copy.

These chapters are scholarly academic conference papers – and they have both their strengths and weaknesses. They are wide-ranging and well informed, but often look for connections where none exist, or find them to little purpose.

The 'Renaissance computer' itself is only a catchy conceit. These studies are of how information was organised in text form, how it was understood and retrieved, and how the Renaissance book tackled issues of information architecture which many people now think of as something new.

© Roy Johnson 2002

Buy the book at Amazon UK

Buy the book at Amazon US


Neil Rhodes and Jonathan Sawday (eds), The Renaissance Computer: knowledge technology in the first age of print, London: Routledge, 2000, pp.212, ISBN: 0415220645


More on information design
More on design
More on media
More on web design


Filed Under: Information Design, Literary Studies, Media Tagged With: Computers, Cultural history, Information architecture, Information design, The Renaissance Computer

The Whole Internet

July 2, 2009 by Roy Johnson

updated version of first complete Internet guide

The Whole Internet was one of the earliest-ever computer books to become a best-seller. That was in 1992, when the first major wave of Net users needed information, and there was very little of it about. Ed Krol produced a manual which was well informed, comprehensive, and examined the technology in detail. However, it wasn't very easy to read, and you needed to grapple with an arcane command-line interface which assumed you had grown up with Unix as a second language.

This new version is an update and complete re-write, based on the big changes which have come over the Net and the way it has been used in the last eight years. The number one development of course is the Web, which moves up from a subsidiary chapter in the original to occupy the centre of this edition. Former features such as Gopher, Archie, and Veronica, on the other hand, are relegated to a footnote section called 'Archaic Search Technologies'.

But this difference also makes the manual easier to read and understand. The emphasis has shifted from how the Net works to how it can be used. There is far less impenetrable code cluttering the pages. Instead we get clean screen shots and nice photographs of what the Net looks like on screen, not at the DOS prompt. Ed Krol has been very fortunate in choosing his co-author, and their co-operation has produced a far more readable book.

They cover all the basics which someone new to the Net would need to know: how to send email and follow the conventions of netiquette; what to do with attachments; how to behave on mailing lists; understanding newsgroups; and how to deal with security, privacy, and spam. They explain how to choose from a variety of Web browsers (including even one for the Palm Pilot). I was struck by how much more accessible all this technology has become in the short time since I struggled through the first edition.

This radical shift in user-centred design is also reflected by the inclusion of completely new chapters on Net commerce, banking, gaming, and personal finance. After a chapter on how to create your own Web pages, there is an introduction to what are called 'esoteric and emerging technologies' – conferencing, streaming audio and video, and electronic books. This is a very successful attempt to cover the full range of the Net and its activities in a non-snobbish manner. They end with practical information – maximising the effectiveness of your Internet connection, searching techniques – and they offer a thick index of recommended resources.

The original Whole Internet may have been a more striking phenomenon because of its originality at the time, but this new edition has the potential to reach even more readers, largely because it explains the Net and shows how it can be used in a way which is much more attractive and accessible. It has gone straight onto my bibliography of essential Net reading, and I will certainly be recommending it to all my students.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Kiersten Connor-Sax and Ed Krol, The Whole Internet: The Next Generation, Sebastopol: O'Reilly, 1999, pp.542, ISBN: 1565924282


More on technology
More on digital media
More on online learning
More on computers


Filed Under: Computers, Techno-history Tagged With: Computers, Media, Techno-history, Technology, The Internet, The Whole Internet

Upgrading and Repairing PCs

July 10, 2009 by Roy Johnson

best-selling comprehensive guide to computer hardware

Scott Mueller’s title here is too modest. Upgrading and Repairing PCs is not just a repair guide – it’s a major encyclopaedia of computer components, their specifications, and a workshop manual on all aspects of dealing with PC hardware. His approach is very simple – and extremely thorough. He describes each major component of a PC in separate chapters, explains how it works, what it does, and even how it is made. You can use this manual for either an explanation of how things function, or for an up-to-date account of technical component specifications. It covers building, maintaining, and repairing all parts of a PC. It’s an approach which works – which is what has made this book a best-seller.

All the major manufacturers' chips, motherboards, memory, hard drives, and peripherals are covered – so this is a valuable resource if you want to make comparisons before ordering new equipment. There's even a comprehensive list of suppliers, plus advice on making choices.

The book also comes with a CD containing two hours of video tutorials, in fairly plain video files. The process of installing components is described well enough in the book, but it's made infinitely clearer when shown on screen.

He even shows you how to assemble your own PC – delivering the information in a fluent and cheerful manner. It occurred to me that these clips are also excellent tutorials for those who would like to know what's inside their PC, but who don't want to go through the heart-stopping experience of opening up the box.

The majority of the data here is very technical. This is a serious, heavy-duty book which has proved itself in the best-seller lists. It is now in its nineteenth edition and is just about as up-to-date as it's possible to be. This is somebody who knows his subject inside-out.

© Roy Johnson 2009

Buy the book at Amazon UK

Buy the book at Amazon US


Scott Mueller, Upgrading and Repairing PCs, Indianapolis (IN): Que, 19th edition, 2009, pp.1176, ISBN: 0789739542


More on technology
More on digital media
More on online learning
More on computers


Filed Under: Computers Tagged With: Computers, Technology, Upgrading and Repairing PCs

Visual Language for the Web

June 27, 2009 by Roy Johnson

Visual Language for the Web is a book about the language of icons, buttons, and navigational aids used in the design of graphical interfaces for computer software. The first chapter deals with Mayan hieroglyphs and Chinese ideograms – writing with pictures. This establishes how much information can be conveyed semiotically. Paul Honeywill then looks at how graphical icons are used in interface design – and how well we understand them, particularly on a multi-national level. Some, like the folder icon, have been successful and are now widely used.

Others seem to be understandable only within the context of the program for which they are designed. Next comes an explanation of the design of icons, taking account of the psychology of visual perception and the technology of rendering images on screen. He explains for instance why colours and font sizes are rendered differently on PCs and Macs.

He offers an introduction to digital font technology which will be useful for anyone who doesn’t already know how serif and sans-serif fonts are used for quite different purposes.

To illustrate the principles on which graphic icons best operate, he presents two case studies of designing business logos. He considers pictographic languages ranging from Mayan hieroglyphs and Sumerian cuneiform to recent experiments such as Elephant's Memory. But he seems reluctant to acknowledge their limitations in telling anything but simple narratives.

However, the very absence of any individual authority on the Internet means that any graphic icons which become generally accepted will be those which are commonly understood.

The last part of the book looks at testing recognition of icons – and comes to the unsurprising conclusion that the most effective and best known are those such as the magnifying glass ‘Search’ icon which appears in lots of different programs.

It has to be said that all this is sometimes discussed at a very theoretical level:

the day sign for Manik when it appears without the day sign cartouche in a non-calendrical context is chi

But this will be of interest to anybody concerned with the study of writing systems, as well as graphic designers, usability experts, and information architects.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Paul Honeywill, Visual Language for the World Wide Web, Exeter: Intellect, 1999, pp.192, ISBN: 187151696X


More on information design
More on design
More on media
More on web design


Filed Under: Information Design, Theory, Web design Tagged With: Computers, Product design, Theory, Web design

Weaving the Web

July 3, 2009 by Roy Johnson

the history of the Web – by the man who invented it

Everybody knows that Tim Berners-Lee is the man who invented the World Wide Web – and that he hasn't become a millionaire. Weaving the Web explains the reasons why. It's his own account of one of the most profound developments in twentieth century technology – almost as important as the invention of the Net itself. His story begins in one of the spiritual homes of computing – Manchester UK – where both his parents worked on the first commercial mainframes made by Ferranti in the 1950s.

He wrote his first program to link information in 1980 with "no loftier reason than to help me remember the connections amongst the various people, computers and projects at the [CERN] lab". He was concerned to share information amongst a community of scientists who were equipped with different languages, different computers, and different operating systems – and it's interesting to note the persistence of this altruism as the development unfolds.

His narrative is the now-familiar one of noble intentions battling against indifference, resistance, and outright opposition. There is a wonderful sense of intellectual excitement in following the step-by-step struggle to convince people that information could be linked and shared. And all this is as recent as 1990.

There were also conceptual difficulties. Ten years ago, people on the Net were regularly 'lost in Cyberspace' – an expression you don't see used much any more. How difficult it was for most of us to conceptualize all this back in the early 1990s. It was not easy to grasp that when you clicked on a link you were 'going' to a computer on the other side of the world. Worse still – when the connection dropped, you felt as though you had fallen out of an aeroplane in mid-Atlantic. We've learned since not to worry when something disappears off the screen.

He discusses the competing systems such as Gopher and WAIS [remember those?] and the strategic advantage of making SGML the base for hypertext markup language [HTML], the lingua franca of the Web. He is also forthright enough to admit his own failings, and even describes a conference paper which was rejected, as well as a rather sadly uneventful meeting with Ted Nelson in 1992. There's also an explanation of how the rather clumsy term URL came about, though he continues to use URI [Identifier] throughout the book.

Once the Web takes off in the early 1990s, people such as Jim Clark and Marc Andreessen start to come into the picture. But whilst they make their fortunes turning Mosaic into Netscape, Berners-Lee selflessly devotes his energies to keeping the Web universal, out of the control of individual interests.

It has to be said that the story begins to dip a little at this point, with important but less dramatic decisions to be taken about protocols and standards.

By 1996 we're deep into the details of the Web Consortium [W3C] and its workings. The story picks up again as he covers the Netscape-Microsoft squabble and the move towards extensible markup language [XML]. He goes on to discuss the problems of encryption, privacy, censorship, domain name registration, and policies which should be in place to protect the individual. He also indulges in a little futurism, speculating about the consequences of permanent online connections [Yes please!] and the benefits of XML, which he strongly endorses. His account culminates with a prediction that the Web will evolve into what he calls the 'Semantic Web' – a system whereby information will be more intelligently and qualitatively structured.

It’s a relatively easy book to read. I had the impression that it’s a transcript of interviews. And he ends, rather surprisingly, by revealing his belief in parallels between the Web and a ‘non-religious’ faith he has taken up in the US. But one thing remains constant throughout – his passionate desire to keep the Web an open, international standard – and for that we can all be grateful.

© Roy Johnson 2000

Buy the book at Amazon UK

Buy the book at Amazon US


Tim Berners-Lee, Weaving the Web, London: Orion Books, 1999, pp.244, ISBN: 0752820907


More on computers
More on technology
More on digital media


Filed Under: Techno-history Tagged With: Computers, Cultural history, Technology, Tim Berners-Lee, Weaving the Web, World Wide Web

Web Design in a Nutshell

July 12, 2009 by Roy Johnson

comprehensive manual, plus tips and explanations

Web site design manuals are often all screen shots and little substance. These can be quite useful for beginners, who might be intimidated by too many technicalities. At the other extreme there are the dense catalogues of coding definitions issued by the standards authorities which only an expert would ever need to consult. In between are all the rest, which need to present something original or at least interesting to distinguish themselves from the mass. Web Design in a Nutshell manages to combine the best of the intermediate and advanced worlds.

O'Reilly feature their usual compressed mixture of instruction and reference which cuts out all dross, and offer excellent value. Jennifer Niederst explains that she felt the urge to produce yet another book on Web design for the simplest of motives – her own use.

I was becoming frustrated with the time I was spending on the Web tracking down the answers to little questions: ‘Which tag does that attribute go in?’, ‘Does this browser support that technology?’, ‘What’s the best way to put audio on the Web?’ And I’m not ashamed to admit that I’ve been reduced to tears after hours of battling a table that mysteriously refused to behave, despite my meticulous and earnest efforts. You just can’t keep all this stuff in your head any more.

Niederst is one of their former staff writers and designers [see her recent Learning Web Design]. She explains HTML in a clear and sensible manner, starting with what she calls ‘the web environment’ – how it all works, why you should keep different browsers in mind, and what ‘screen resolution’ really means.

Then there is a very thorough coverage of all the basic elements: HTML coding, text formatting, links and images, tables, frames, and forms; then graphics in .gif, .jpg and .png formats; colours, audio, video, and javascript. The latter part of the book is devoted to what she calls 'the emergent technologies' – cascading style sheets, dynamic HTML, XML, and font embedding.

All the way through, she throws out tips, hints, and warnings which give you confidence that she knows whereof she speaks, and as you would expect in a work of this kind, there is a full range of reference tables – the complete HTML 4.0 specification, 'deprecated' and proprietary tags, a glossary of terms, and even an extended table of the latest support for style sheets in a wide range of browsers.

The latest edition has been substantially revamped and extended. Additions include more on printing pages from the Web, using Flash and Shockwave, using SMIL for multimedia presentations, and designing for the wireless web using WML.

At the risk of sounding like an O’Reilly groupie, I have to say that their productions are almost always a bibliographic joy to behold. They are well written, elegantly designed, meticulously edited, and flawlessly printed. This one is no exception.

© Roy Johnson 2001

Buy the book at Amazon UK

Buy the book at Amazon US


Jennifer Niederst, Web Design in a Nutshell: A Desktop Quick Reference, 2nd edition, Sebastopol: O’Reilly & Associates, 2001, pp.640, ISBN: 0596001967


More on web design
More on digital media
More on technology


Filed Under: Web design Tagged With: Computers, CSS, HTML, Web design, Web Design in a Nutshell

Web Site Measurement Hacks

July 14, 2009 by Roy Johnson

tips and tools to help optimise your online business

If you take a serious interest in your web site, once you've got over the obsession with how it looks, you'll want to know how it performs. And if it includes any element of e-commerce, you'll undoubtedly want to know how to improve that performance. Eric Peterson's Web Site Measurement Hacks is a technical guide to doing just that by measuring what is going on – and that means hard figures: the number of visitors you get, and what they do when they arrive at your site.

The first and most important thing is to know the definition of terms in this arcane world – to know the difference between 'hits', 'visitors', and 'unique page views' for instance. He explains these issues really well, and emphasises that you need to understand the technical details if you want to increase your site traffic. Although some of his suggestions are aimed at businesses with big money to spend on web site optimisation, I was glad to see that he included the cheap and even free options available for small and start-up entrepreneurs. This includes programs such as Analog, which I have used myself in the past.

He explains how to understand and analyse web logfiles, and how to get a more accurate picture of which human beings are visiting your site by excluding from the results robot visits and requests served from cache. For those who are really technologically ambitious, there are instructions on how to build your own web measurement application, along with the necessary core code and the location of free downloadable add-ons.
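The kind of logfile filtering described above can be sketched in a few lines. This is a minimal illustration, not code from the book: it assumes the common Apache/NCSA 'combined' log format, and the list of bot keywords is an illustrative assumption.

```python
import re

# Parse Apache/NCSA "combined" log lines and drop requests whose
# User-Agent looks like a robot crawl, before counting visitors.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)
BOT_HINTS = ("bot", "crawler", "spider", "slurp")

def human_hits(lines):
    """Return parsed entries, excluding apparent robot traffic."""
    hits = []
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue  # malformed line - skip rather than guess
        agent = m.group("agent").lower()
        if any(hint in agent for hint in BOT_HINTS):
            continue  # robot traffic - not a human visitor
        hits.append(m.groupdict())
    return hits

sample = [
    '1.2.3.4 - - [14/Jul/2009:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [14/Jul/2009:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
print(len(human_hits(sample)))  # → 1 (the Googlebot request is excluded)
```

Real measurement tools go much further (session reconstruction, cache detection), but the underlying idea is exactly this filter-then-count pass over the log.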

As the book progresses it becomes more technical. First he deals with JavaScript page tags, then how to use one-pixel hidden graphic ‘bugs’ to learn more about what visitors do on your site. He also covers learning from errors – that is, understanding (and rectifying) the broken links and the pages which are not delivered on request to your visitors.

After that, he switches to explaining the details of online marketing. This involves a close examination of terms such as ‘click through rate’ and ‘cost per conversion’, as well as how to measure the effectiveness of banner advertising.
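The marketing terms mentioned above reduce to simple ratios. This sketch uses the common definitions (click-through rate as clicks over impressions, cost per conversion as spend over conversions) rather than formulas quoted from the book, and the figures are invented for illustration:

```python
def click_through_rate(clicks, impressions):
    # Fraction of ad impressions that resulted in a click.
    return clicks / impressions if impressions else 0.0

def cost_per_conversion(spend, conversions):
    # Money spent for each completed sale or sign-up.
    return spend / conversions if conversions else float("inf")

# A banner shown 20,000 times, clicked 300 times, at a spend of $150,
# producing 25 sales:
ctr = click_through_rate(300, 20_000)   # 0.015, i.e. 1.5%
cpc = cost_per_conversion(150.0, 25)    # $6.00 per conversion
print(f"CTR {ctr:.1%}, cost per conversion ${cpc:.2f}")
```

Measuring banner effectiveness is then a matter of comparing these ratios across campaigns rather than looking at raw click counts.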

Most of his recommendations are sound. On the optimization of web page size he mentions the free service offered by Andy King (author of Speed Up Your Site). I ran a few pages from the site you are visiting now through his analyzer and learned a lot about possible improvements.

The later hacks become more and more complex. They are for people with serious e-commerce ambitions who are prepared to spend time and money on making their site(s) more effective. They include features such as measuring the demographics of your site visitors, analysing their behaviour patterns, and gathering data on their engagement with the retail process.

This is a book which deals with both the technical issues of maintaining your web site's infrastructure and the business implications of interpreting the data it generates. It's a technology companion that any serious web entrepreneur will welcome.

© Roy Johnson 2005

Buy the book at Amazon UK

Buy the book at Amazon US


Eric T. Peterson, Web Site Measurement Hacks, Sebastopol, CA: O’Reilly, 2005, pp.405, ISBN: 0596009887


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce, Web design Tagged With: Computers, Optimization, Technology, Web design, Web Site Measurement Hacks

Website Optimization

July 1, 2009 by Roy Johnson

speed, search engine, and conversion rate secrets

Andy King scored a big hit in 2003 with his first book Speed Up Your Site. It’s a guide which still has its own live web site where you can analyse the effectiveness of your web pages. His latest magnum opus Website Optimization goes way beyond that in scope and depth. It’s a guide to maximising every aspect of a website and its performance. It’s an amazingly practical manual, with page after page of ideas, suggestions, and strategies for getting your pages more widely known and read.

On the whole, it's not too technical, and he supplies snippets of code only when necessary. All the tips are within the grasp of anyone who is used to running a web site, and along the way he explains the principles of search engine optimization (SEO) as well as briefing you on how search engines treat your site. This is an up-to-date account of how search engines such as Yahoo and Google rank your pages and deal with search requests. He also presents real-life case studies in which he shows 'before and after' makeovers of professional sites. These are most instructive in that the 'before' pages look attractive and professional enough – until their underlying weaknesses are analysed and rectified. The improvements are claimed to deliver up to fifty times more site visitors per day – and in the case of one cosmetic dentist, the need to employ more staff and move to bigger offices in Philadelphia.

The first half of the book deals with search engine marketing optimization, which can be expensive once you enter the world of paid advertising. But the second half concentrates on things which anyone can do and afford – making pages smaller, lighter, and faster by trimming off the surplus fat. Even in an age of faster and faster broadband connections, web users are simply not prepared to wait more than a couple of seconds for a page to appear – so you've got to make important pages lean and speedy:

Web page optimization streamlines your content to maximise display speed. Fast display speed is the key to success with your website. It increases profits, decreases costs, and improves customer satisfaction (not to mention search engine rankings, accessibility, and maintainability).

All of these issues are dealt with in detail – and I particularly liked the fact that he was prepared to repeat some of the techniques when they occurred in different contexts. It's not always easy to grasp some of these technologies in one simple pass – especially as, in the case of optimizing images, he explains no fewer than sixteen possibilities for cutting file size and speeding up downloads.

He's also keen on the optimization of style sheets and shows an amazing variety of techniques for creating what he calls 'CSS Architecture'. Here too there are no fewer than ten strategies explained, offering cleaner, tighter coding and the use of structural markup to beat browser peculiarities and rendering delays.

Most of his explanations are clearly articulated, but occasionally he lapses into less than elegant repetition and jargon, which could deter the inexperienced:

By converting old-style nonsemantic markup into semantic markup, you can more easily target noncontiguous elements with descendant selectors.

Fortunately, this sort of thing only happens occasionally.

There are some very nifty tricks for creating buttons and rollover effects using style sheets, which saves downloading a graphic button file, and thus once again speeds up page rendering.

He puts in two chapters on advanced web performance and on optimizing JavaScript and Ajax on your site, which I have to admit went beyond my technical competence. But then it's back to terra firma with understanding the metrics of your site's performance – that is, knowing how to analyse the statistical data returned by website analysers such as Google's Analytics and WebTrends.

I'd never been able to understand what page 'bounce rate' was until it was explained here – and I was astonished when I saw the results from some of my own pages!
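For readers puzzled by the same term: bounce rate is commonly defined as the share of visits that viewed exactly one page before leaving. A minimal sketch, with made-up visit data, under that common definition:

```python
def bounce_rate(sessions):
    """sessions: list of page-view counts, one entry per visit.
    Returns the fraction of visits that 'bounced' (saw one page)."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages in sessions if pages == 1)
    return bounces / len(sessions)

# Six visits: three left after a single page, three browsed further.
visits = [1, 4, 1, 2, 1, 3]
print(f"{bounce_rate(visits):.0%} of visits bounced")  # → 50% of visits bounced
```

A high bounce rate on a landing page is usually the signal the analytics packages mentioned above are asking you to investigate.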

As the search for more detailed information and for planning campaigns goes on – so the process becomes more like a science. There are graphs and formulae scattered around these pages to prove this. It’s the same for Pay Per Click advertising (PPC). All I can say is that if you are in this league, Andy King is your friend, and his advice is here thick on the ground to help you.

© Roy Johnson 2008

Buy the book at Amazon UK

Buy the book at Amazon US


Andrew King, Website Optimization, Sebastopol, CA: O’Reilly, 2008, pp.367, ISBN: 0596515081


More on eCommerce
More on media
More on publishing
More on technology


Filed Under: e-Commerce, Web design Tagged With: Computers, e-Commerce, Optimization, SEO, Web design, Website Optimization

Content © Mantex 2016