Distinguished Lecture Series Archive (1969 to June 2011)

Current details on our Distinguished Lecture Series can be found in the student handbook.

Details of the series from 1969 to the middle of 2011 can be found below.

Details on Distinguished Lecture Series events since 2011 are available on the School blog.

Date  Title
1969  Algol '68 (by W van der Poel)
1970  Simula '67 (by O J Dahl)
1971  Graphics (by A van Dam)
1972  Computational Complexity (by W Burkhardt)
1973  APL (by K Simillie, W Adams)
1974  The Semantics of Databases (by J J Abrial)
1975  System Structuring and the Utilisation of Microprogrammable Computers (by H Lawson)
1976  Program Production by Successive Transformations (by M Griffiths)
1977  The Semantics of Programming Languages with Special Reference to Denotational Semantics (by J Stoy, R D Tennant)
1978  Some Thoughts on Computer Systems (by F Sumner, P Kornerup)
1980  Database Construction and a Structured Approach to Application Systems (by D McGregor, J Lynn)
1981  An Introduction to VLSI and an Application to String Processing (by P Brown)
1981  Computer Typesetting with TEX (by H Brown)
1982  Applicative Programming (by D Turner, J Darlington)
1983  Computer Algebra (by A Norman, J Fitch)
1984  Logic Programming (by P Hammond, F Kriwaczek, M Sergot)
1985  Functional Programming and Formal Specification (by P Henderson)
1986-05-07  Language Paradigms for Programming in the Large (by Peter Wegner, Robin Milner, Malcolm Atkinson, Robert Harper)
1987-04-14  Reduction Methods in Programming System Design and Implementation (by Simon Peyton-Jones, J Alan Robinson)
1988-04-13  Object-Oriented Languages, Databases and Integrated Project (by Stan Zdonik, Steve Cook, Ian Sommerville)
1989-04-11  Database Programming Languages (by Malcolm Atkinson, Dave Stemple)
1990-04-12  Operating Systems and Computer Architecture to Support Object Oriented Programming (by John Rosenberg, Roy Campbell)
1991-04-18  Neural Networks (by Professor Geoffrey Hinton, Dr David Willshaw)
1992-04-09  General Purpose Parallel Computing (by Professor John Gurd, Professor Arvind)
1993-04-15  Computational Geometry (by Dr Ralph Martin, Professor Jörg-Rüdiger Sack)
1994-04-21  Fault-Tolerant Distributed Systems (by Professor Ozalp Babaoglu, Professor Santosh Shrivastava)
1995-04-20  Distributed Multimedia Communications (by Dr Jon Crowcroft)
1996-04-16  Genetic Algorithms (by Dr Peter Ross, Dr Colin Reeves)
1997-04-21  Distributed Systems Technologies (by Dr Andy Hopper, Dr Andrew Herbert)
1998-04-15  Information Retrieval: its models, its evaluation and its multimedia applications (by Keith van Rijsbergen, Yves Chiaramella)
1998-12-01  The Software Engineering Process (by Professor Brian Warboys)
1999-02-18  Social Analysis and Software Systems Design (by Professor Ian Sommerville)
1999-11-24  People and Computers (by Professor Alan Newell)
2000-03-01  Computer Storage Systems (by John Wilkes)
2000-11-28  Pervasive Computing: StarTrek? Hogwarts? Reality? (by Dr Dirk Husemann)
2001-04-19  Games and Entertainment - picture of the future, novel technologies and usability aspects (by Peter Astheimer, Tim Taylor, Lucy Joyner)
2001-12-12  XML - a data standard for well-behaved programmers? (by Professor Richard Connor)
2002-03-14  Some Futures in Broadband Communications (by Professor Derek McAuley)
2002-11-21  Ubiquitous Computing Environments (by Professor Hans Gellersen)
2003-04-23  Where The Hard Problems Are (by Toby Walsh)
2003-12-09  Towards Automated Management of Large-Scale Distributed Systems (by Professor Joe Sventek)
2004-04-14  Autonomic Computing as a Unifying Framework for Self-Managing Systems (by Ric Telford)
2004-11-29  Computational Finance (by Professor Edward Tsang)
2005-03-21  The Unreasonable Effectiveness of Logic (by Professor Philip Wadler)
2005-12-02  Modern Cryptography (by Dr Matt Robshaw)
2006-05-02  Thinking Out of the Computer Science Cargo Cult Box (by Professor Harold Thimbleby)
2006-12-01  If Software is the Solution, What is the Problem? (by Professor Bashar Nuseibeh)
2007-05-02  Model-Driven, Component Engineering (by Professor Colin Atkinson)
2007-11-22  Scheduling Real-time Systems (by Professor Alan Burns)
2008-03-05  Market-Based Systems (by Professor Dave Cliff)
2008-11-06  Human-Computer Interaction: as it was, as it is, and as it may be (by Professor Alan Dix)
2009-02-23  Delay Tolerant and Opportunistic Networks (by Prof. Jon Crowcroft)
2009-11-18  Cryptography: From Black Art to Popular Science (by Prof Fred Piper, Prof Peter Wild)
2010-04-29  Parallelism and the Exascale Challenge (by Prof Arthur Trew)
2010-11-15  Machines Reasoning about Machines (by J Strother Moore)
2011-04-22  From Recommendation to Reputation: Information Discovery Gets Personal (by Barry Smyth)

Algol '68

W van der Poel

1969

Speaker:
W van der Poel

Affiliation:
University of Amsterdam

Attachments:
vanderpoel_picture.jpg (92.19 KB)

Simula '67

1970

Speaker:
O J Dahl

Affiliation:
University of Oslo

Further information:
http://heim.ifi.uio.no/~olejohan/


Graphics

1971

Speaker:
A van Dam

Affiliation:
Brown University

Further information:
http://www.cs.brown.edu/~avd/


Computational Complexity

1972

Speaker:
W Burkhardt


APL

1973

Speaker 1:
K Simillie

Speaker 2:
W Adams


The Semantics of Databases

1974

Speaker:
J J Abrial


System Structuring and the Utilisation of Microprogrammable Computers

1975

Speaker:
H Lawson


Program Production by Successive Transformations

1976

Speaker:
M Griffiths


The Semantics of Programming Languages with Special Reference to Denotational Semantics

1977

Speaker 1:
J Stoy

Speaker 2:
R D Tennant


Some Thoughts on Computer Systems

1978

Speaker 1:
F Sumner

Speaker 2:
P Kornerup


Database Construction and a Structured Approach to Application Systems

1980

Speaker 1:
D McGregor

Speaker 2:
J Lynn


An Introduction to VLSI and an Application to String Processing

1981

Speaker:
P Brown


Computer Typesetting with TEX

1981

Speaker:
H Brown


Applicative Programming

1982

Speaker 1:
D Turner

Affiliation (Speaker 1):
University of Kent

Further information (Speaker 1):
http://www.cs.kent.ac.uk/people/staff/dat/

Speaker 2:
J Darlington


Computer Algebra

1983

Speaker 1:
A Norman

Speaker 2:
J Fitch


Logic Programming

1984

Speaker 1:
P Hammond

Speaker 2:
F Kriwaczek

Speaker 3:
M Sergot


Functional Programming and Formal Specification

1985

Speaker:
P Henderson

Affiliation:
University of Southampton

Further information:
http://pmh-systems.co.uk/phAcademic/


Language Paradigms for Programming in the Large

Jack Cole, Peter Wegner, Malcolm Atkinson, Robert Harper, Robin Milner, Ron Morrison

Wed, 07 May 1986

Speaker 1:
Peter Wegner

Affiliation (Speaker 1):
Brown University

Further information (Speaker 1):
http://www.cs.brown.edu/~pw/

Speaker 2:
Robin Milner

Affiliation (Speaker 2):
University of Edinburgh

Further information (Speaker 2):
http://www.fairdene.com/picalculus/robinmilner.html

Speaker 3:
Malcolm Atkinson

Affiliation (Speaker 3):
Glasgow University

Speaker 4:
Robert Harper

Affiliation (Speaker 4):
Massachusetts Institute of Technology

Further information (Speaker 4):
http://www.cs.cmu.edu/~rwh/

Attachments:
1986 Picture.JPG (743.96 KB)

Reduction Methods in Programming System Design and Implementation

Simon Peyton-Jones, Ron Morrison, Tony Davie, Alan Robinson

Tue, 14 Apr 1987

Speaker 1:
Simon Peyton-Jones

Affiliation (Speaker 1):
University of Glasgow

Further information (Speaker 1):
http://research.microsoft.com/Users/simonpj/

Speaker 2:
J Alan Robinson

Affiliation (Speaker 2):
Syracuse University

Further information (Speaker 2):
http://en.wikipedia.org/wiki/J._Alan_Robinson

Attachments:
1987Picture.JPG (503.12 KB)

Object-Oriented Languages, Databases and Integrated Project

Jack Cole, Steve Cook, Stan Zdonik, Ian Sommerville, Ron Morrison

Wed, 13 Apr 1988

Speaker 1:
Stan Zdonik

Affiliation (Speaker 1):
Brown University

Further information (Speaker 1):
http://www.cs.brown.edu/people/faculty/sbz.html

Speaker 2:
Steve Cook

Further information (Speaker 2):
http://www.domainspecificdevelopment.com/Bios/SteveCook.aspx

Speaker 3:
Ian Sommerville

Affiliation (Speaker 3):
Lancaster University

Further information (Speaker 3):
http://www.cs.st-andrews.ac.uk/~ifs/

Attachments:
Sommerville Picture.jpg (1014.74 KB)

Database Programming Languages

Sasha Zamulin (Russian Academy of Sciences), Malcolm Atkinson, Dave Stemple, Ron Morrison

Tue, 11 Apr 1989

Speaker 1:
Malcolm Atkinson

Affiliation (Speaker 1):
University of Glasgow

Speaker 2:
Dave Stemple

Affiliation (Speaker 2):
University of Massachusetts

Further information (Speaker 2):
http://www.cs.umass.edu/~stemple/

Attachments:
zamlulin Picture.JPG (480.73 KB)

Operating Systems and Computer Architecture to Support Object Oriented Programming

John Rosenberg, Ron Morrison, Roy Campbell, Graham Pratten (ICL)

Thu, 12 Apr 1990

Speaker 1:
John Rosenberg

Affiliation (Speaker 1):
University of Sydney

Further information (Speaker 1):
http://www.deakin.edu.au/vc/dvc-academic.php

Speaker 2:
Roy Campbell

Affiliation (Speaker 2):
University of Illinois

Further information (Speaker 2):
http://www.cs.uiuc.edu/directory/directory.php?name=campbell

Attachments:
rosenburg&prattenPicture.JPG (536.33 KB)

Neural Networks

Thu, 18 Apr 1991

Speaker 1:
Professor Geoffrey Hinton

Affiliation (Speaker 1):
University of Toronto

Further information (Speaker 1):
http://www.provost.utoronto.ca/Awards/uprofessors/current/hinton.htm

Speaker 2:
Dr David Willshaw

Affiliation (Speaker 2):
University of Edinburgh

Further information (Speaker 2):
http://homepages.inf.ed.ac.uk/willshaw/

Venue:
Physics Building, University of St Andrews

Programme:
April 18th

9.00-9.50 Introduction: Why we are interested in Neural Nets plus simple computational considerations - G. Hinton and D. Willshaw

9.50-10.40 Perceptron theory - D. Willshaw

10.40-11.00 Coffee

11.00-11.50 Basic Backpropagation with some simple examples - G. Hinton

11.50-12.40 Associative nets (including Hopfield nets) - D. Willshaw

14.00-15.00 The theoretical basis of backpropagation and a more complex example - G. Hinton

15.00-15.30 Tea

15.30-16.30 Biological and computational extensions of associative nets - D. Willshaw

April 19th

9.00-9.50 Mean Field Nets - G. Hinton

9.50-10.40 Neurobiology and Computation: Tea-trade model, Kohonen net, Elastic net - D. Willshaw

10.40-11.00 Coffee

11.00-11.50 Adaptive elastic nets for character recognition. Communities of competing expert networks - G. Hinton

11.50-12.40 Development of neuromuscular connections - D. Willshaw

14.00-15.00 Discovering the causes of the sensory input - G. Hinton

15.00-15.30 Tea

15.30-16.30 Summary and the Future - G. Hinton and D. Willshaw

Attachments:
1991 Programme 1.pdf (73.28 KB)

General Purpose Parallel Computing

Mike Livesey, Professor Arvind, John Gurd, Ron Morrison

Thu, 09 Apr 1992

Speaker 1:
Professor John Gurd

Affiliation (Speaker 1):
University of Manchester

Further information (Speaker 1):
http://intranet.cs.man.ac.uk/cnc/staff/john/home.html

Speaker 2:
Professor Arvind

Affiliation (Speaker 2):
Massachusetts Institute of Technology

Further information (Speaker 2):
http://csg.csail.mit.edu/Users/arvind/

Venue:
Mathematical Institute, University of St Andrews

Programme:
April 9th
9.00 - 9.50 Why Most Machines in Use Today are Not Parallel Machines - Professor Arvind

10.00 - 10.40 “Parallel Programming Standards” - Professor J. Gurd

10.40 - 11.00 Coffee

11.00 - 11.50 Implicit Parallel Programming in Id - Professor Arvind

12.00 - 12.40 Fine-grain Parallelism and Non-deterministic Programs - Professor Arvind

14.00 - 15.00 Evolution of Parallel Architectures I - Professor J Gurd

15.00 - 15.30 Tea

15.30 - 16.30 Evolution of Parallel Architectures II - Professor J. Gurd

April 10th

9.00 - 9.50 “Compilation” and Performance - Professor J. Gurd

10.00 - 10.40 Dataflow or Multithreaded Architectures - Professor Arvind

10.40 - 11.00 Coffee

11.00 - 11.50 Compiling Threaded Code from Id - Professor Arvind

12.00 - 12.40 Applications and Algorithms - Professor J. Gurd

14.00 - 15.00 The Monsoon Project - Professor Arvind

15.00 - 15.30 Tea

15.30 - 16.30 The Centre for Novel Computing (CNC) - Professor J. Gurd

Attachments:
1992 - 1 Programme.pdf (77.16 KB)
gurd&arvind Picture.jpg (421.41 KB)

Computational Geometry

Ursula Martin, Ron Morrison, Ralph Martin, Jörg-Rüdiger Sack, Michael Atkinson, David Milner (The Dean)

Thu, 15 Apr 1993

Speaker 1:
Dr Ralph Martin

Affiliation (Speaker 1):
University of Wales College Cardiff

Biography (Speaker 1):
Ralph Martin has been working in the field of CADCAM since 1979. He obtained his PhD in 1983 from Cambridge University for a dissertation on “Principal Patches”, and since then has been a Lecturer at the University of Wales College of Cardiff. He has published over 35 papers and 3 books covering such topics as surface modelling, intelligent sketch input, vision based geometric inspection and geometric reasoning. He is a Fellow of the Institute of Mathematics and its Applications, and a Member of the British Computer Society.

Further information (Speaker 1):
http://www.cs.cf.ac.uk/school/staffpage.php?emailname=ralph.martin

Speaker 2:
Professor Jörg-Rüdiger Sack

Affiliation (Speaker 2):
Carleton University, Ottawa

Biography (Speaker 2):
Dr Jörg-Rüdiger Sack received his first degrees from the Rheinische Friedrich-Wilhelms University, Bonn, Germany, and obtained his doctorate in computational geometry from McGill University, Montreal in 1984. In 1983 he joined the School of Computer Science at Carleton University and assumed various functions including Director of the Ottawa-Carleton Institute for Computer Science. In addition to computational geometry, his research includes data structures and algorithms, and parallel and sequential computation.

He is currently studying computational geometry on neural nets, parallel and sequential link distance problems and vector dominance. He is also directing a research project to build an object-oriented workbench for computational geometry. He has been on various program committees for international conferences and has edited several conference proceedings. In 1991 he took the position of editor-in-chief for the new journal Computational Geometry: Theory and Applications published by North Holland.

Further information (Speaker 2):
http://www.scs.carleton.ca/~sack/

Venue:
Mathematical Institute, University of St Andrews

Programme:
April 15th
9.00-9.50
10.00-10.40
10.40-11.00 Coffee
11.00-11.50
12.00-12.40
14.00-15.00
15.00-15.30 Tea
15.30-16.30

April 16th
9.00-9.50
10.00-10.40
10.40-11.00 Coffee
11.00-11.50
12.00-12.40
14.00-15.00
15.00-15.30 Tea
15.30-16.30

Attachments:
1993 -1 Programme.pdf (85.88 KB)
Martin&sack Picture.JPG (492.72 KB)

Fault-Tolerant Distributed Systems

Ron Morrison, Santosh Shrivastava, Ozalp Babaoglu, Mike Livesey

Thu, 21 Apr 1994

Speaker 1:
Professor Ozalp Babaoglu

Affiliation (Speaker 1):
University of Bologna

Biography (Speaker 1):
Ozalp Babaoglu received his PhD. in Computer Science from the University of California, Berkeley in 1981. Since 1988, he has been a Professor at the University of Bologna and a founding member of the Laboratory for Computer Science. Before moving to Italy, he was an Associate Professor of Computer Science at Cornell University, Ithaca, New York (USA) where he had been a faculty member since 1981. While at Berkeley, Professor Babaoglu was a principal designer and implementer of Berkeley UNIX for which he was awarded the 1982 Sakrison Memorial Award together with Bill Joy. He has been Principal Investigator on a number of research grants in the areas of distributed computing, fault tolerance and operating systems. He is conducting research on large-scale distributed systems for the ESPRIT basic research project BROADCAST. Professor Babaoglu has been chair and member of program committees for numerous conferences. He serves as an editor for the Springer-Verlag Distributed Computing journal and is a consultant to the CEC for Strategy in Distributed Systems. He is the author of over 40 scientific publications.

Further information (Speaker 1):
http://www.cs.unibo.it/~babaoglu/index.html

Speaker 2:
Professor Santosh Shrivastava

Affiliation (Speaker 2):
University of Newcastle upon Tyne

Biography (Speaker 2):
Santosh Shrivastava obtained his PhD in Computer Science from Cambridge in 1975. After several years in industry, he joined the Computing Science Department of the University of Newcastle in 1975 where his present position is Professor of Computing Science. He is currently leading Arjuna and Voltan research groups. The Arjuna group has developed the Arjuna object-oriented fault-tolerant distributed system which supports atomic transactions on persistent objects. Arjuna is forming the basis for further research on flexible transaction processing in large scale distributed systems. He directs ESPRIT Basic Research project BROADCAST on large scale distributed systems. The Voltan group is undertaking research into high integrity real-time systems, which involves investigation of agreement protocols, failure detection and reconfiguration, communication primitives, clock synchronization and real-time scheduling. Some of these ideas have been incorporated in Voltan "fail-controlled" nodes. He has over 50 publications in the areas of fault-tolerance and distributed computing.

Further information (Speaker 2):
http://www.cs.ncl.ac.uk/people/santosh.shrivastava

Venue:
Physics Building, University of St Andrews

Programme:
Thursday, April 21st

10.00-10.30 Registration & Coffee

10.30-11.20 Consistent global states of distributed systems: fundamental concepts and mechanisms - I Professor Babaoglu

11.20-12.10 Structuring fault-tolerant persistent object systems for modularity - Professor Shrivastava

12.10-13.00 Consistent global states of distributed systems: fundamental concepts and mechanisms - II Professor Babaoglu

13.00-14.10 Lunch

14.10-15.00 Design and implementation of Arjuna distributed programming system - Professor Shrivastava

15.00-15.50 Fault-tolerant broadcasts and related problems - I Professor Babaoglu

15.50-16.10 Tea

16.10-17.00 An application of Arjuna - Professor Shrivastava

Friday, April 22nd

09.00-9.50 Fault-tolerant broadcasts and related problems - II Professor Babaoglu

09.50-10.40 Replication management using the state-machine approach - Professor Babaoglu

10.40-11.00 Coffee

11.00-11.50 Active replication of non-deterministic programs - Professor Shrivastava

11.50-12.40 The primary-backup approach - Professor Babaoglu

12.40-14.00 Lunch

14.00-14.50 Object replication in Arjuna - Professor Shrivastava

14.50-15.40 Implementing fail-silent nodes for distributed systems - Professor Shrivastava


Attachments:
1994-1 Programme.pdf (79.67 KB)
babaoglu&shrivastavaPicture.JPG (510.83 KB)

Distributed Multimedia Communications

Mike Livesey, Jon Crowcroft, Tom Blyth (The Dean), Ron Morrison

Thu, 20 Apr 1995

Speaker:
Dr Jon Crowcroft

Affiliation:
University College London

Biography:
Jon Crowcroft is a Senior Lecturer in the Department of Computer Science, University College London, where he is responsible for a number of European and US funded research projects in Multi-media Communications. He has been working in these areas for over 14 years. He graduated in Physics from Trinity College, Cambridge University in 1979, and gained his MSc in Computing in 1981 and his PhD in 1993. He is a member of the ACM, the British Computer Society and the IEE. He was General Chair for the ACM SIGCOMM 94 symposium. He is also on the editorial teams for the Transactions on Networks and the Journal of Internetworking. He is co-author of the forthcoming book "WWW: Beneath the Surf" and author of the forthcoming Open Distributed Systems, both to be published by UCL Press.

Further information:
http://www.cl.cam.ac.uk/~jac22/

Venue:
Physics Building, University of St Andrews

Abstract:
Distributed Multimedia Communications will cover the following topic areas: Multimedia -- what is it?; Input Media Formats; Data compression; Hardware; Multimedia conferencing; Network support and protocols; Operating system; Synchronisation; Storage and Retrieval; Ergonomics; Evaluation.

Programme:
Thursday, April 20th
10.30-11.00 Registration & Coffee (in Philip Lee Laboratory)
11.00-12.30 What is Multimedia - why is it different?
12.30-14.00 Lunch
14.00-15.30 Video and Audio Characteristics - nitty gritty
15.30-15.45 Tea/coffee
15.45-16.45 Circuits, CODECs, Packets and Workstations
17.00-18.00 Reception in Staff Common Room

Friday, April 21st

10.30-12.00 CCCP - UCLs Model of Distributed Multimedia Control
12.00-14.00 Lunch
14.00-15.30 WWW - Hyper Media
15.30-15.45 Tea/coffee
15.45-16.45 Psychology, Ergonomics, Platitudes etc

Attachments:
1995-1 Programme.pdf (94.16 KB)
crowcroft Picture.JPG (395.16 KB)

Genetic Algorithms

Peter Ross, Mike Livesey, Colin Reeves, Ron Morrison

Tue, 16 Apr 1996

Speaker 1:
Dr Peter Ross

Affiliation (Speaker 1):
Department of Artificial Intelligence, University of Edinburgh

Biography (Speaker 1):
Dr Peter Ross was originally a mathematician but joined the Department of AI at the University of Edinburgh in 1978, where he is now a senior lecturer. His research interests include genetic algorithms and neural nets (separately and together) and he heads a very active group researching a broad range of topics in evolutionary computation. He is the author of four books and many papers, and is also the current chairman of SSAISB, the UK's AI society.

Further information (Speaker 1):
http://www.dcs.napier.ac.uk/~peter/

Speaker 2:
Dr Colin Reeves

Affiliation (Speaker 2):
School of Mathematical and Information Sciences, University of Coventry

Biography (Speaker 2):
Dr Colin Reeves is a senior lecturer in Operational Research in the School of Mathematical and Information Sciences at Coventry University. His main research interests have been in applications of neural networks to pattern recognition problems, and in heuristic methods for combinatorial optimization (particularly genetic algorithms), on which he has published several papers. He was joint program-chair of the 1993 International Conference on Artificial Neural Networks and Genetic Algorithms, and has participated in many international events on these topics. His current research focuses on non-biological perspectives on genetic algorithms.

Further information (Speaker 2):
http://www.coventry.ac.uk/ec/~colinr/

Venue:
Mathematical Institute, University of St Andrews

Abstract:
Genetic algorithms have stimulated great interest recently, and are the subject of much research activity. Genetic algorithms arose out of work by Holland on the behaviour of evolutionary systems, and they represent the application of these ideas to optimisation problems, whereby solutions are “evolved” computationally. Research is particularly concerned with the convergence characteristics of genetic algorithms, and studies the impact on convergence of design issues such as the genetic “operators” that effect the evolution, the problem space representation, population sizes and breeding policies.
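The loop the abstract describes (a population of candidate solutions repeatedly passed through fitness evaluation, selection, crossover and mutation) can be sketched in a few lines. This is an illustrative sketch only, not material from the lectures: the bit-string encoding, the "onemax" fitness function and all parameter values below are invented for the example.

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=30, generations=100,
                      crossover_rate=0.7, mutation_rate=0.01, seed=0):
    """Minimal generational GA over fixed-length bit strings."""
    rng = random.Random(seed)
    # Initial population: random bit strings.
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]

    def select(pop):
        # Tournament selection: the fitter of two random individuals survives.
        a, b = rng.choice(pop), rng.choice(pop)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = select(pop), select(pop)
            if rng.random() < crossover_rate:
                # One-point crossover: splice the parents at a random cut.
                cut = rng.randrange(1, length)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bitwise mutation: flip each bit with small probability.
            child = [bit ^ 1 if rng.random() < mutation_rate else bit
                     for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# "Onemax" toy objective: fitness is simply the number of 1-bits,
# so the optimum is the all-ones string.
best = genetic_algorithm(fitness=sum)
print(sum(best))
```

Tournament selection and one-point crossover are just two of the "GA options" the course covers; swapping in other selection methods, codings or replacement strategies changes only the corresponding steps of this loop.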

Programme:
Tuesday, 16th April
10.30-11.00 Registration & Coffee (in Philip Lee Laboratory)

11.00-12.30 GA basics: simple population genetics, genetic operators (crossover, etc), fitness, selection. GA options: different selection methods, operators, codings, replacement strategies etc. Dr Peter Ross

12.30-14.00 Lunch

14.00-15.30 GA theory: schemata, deception, Walsh functions and experimental design, epistasis, Markov model. GA applications: representations, forma analysis (examples for continuous, discrete and sequence-coded functions).
Dr Colin Reeves

15.30-15.45 Tea/coffee

15.45-16.45 Research Topics I: examples of other non-binary encodings, such as applications to neural nets and facility layout and genetic programming.
Dr Peter Ross

Wednesday, 17th April
10.30-12.00 Research Topics II: epistasis analysis and connections to statistics.
Dr Colin Reeves

12.00-14.00 Lunch

14.00-15.30 Research Topics III: examples of applications to scheduling and timetabling.
Dr Peter Ross

15.30-15.45 Tea/coffee

15.45-16.45 Research Topics IV: links to neighbourhood search and landscape analysis.
Dr Colin Reeves

Attachments:
1996-1 Programme.pdf (114.45 KB)
reeves&ross Picture.JPG (538.84 KB)

Distributed Systems Technologies

Mon, 21 Apr 1997

Speaker 1:
Dr Andy Hopper

Affiliation (Speaker 1):
Computer Laboratory, University of Cambridge

Biography (Speaker 1):
Andy Hopper received the B.Sc. degree from University of Wales in 1974 and the PhD degree from the University of Cambridge in 1978. He was elected a Fellow of the Royal Academy of Engineering in 1996. He is the Reader in Computer Technology at the University of Cambridge and a Fellow of Corpus Christi College. He is Vice President of Research of Ing. C. Olivetti & C. SpA, Italy, Director of the Olivetti & Oracle Research Laboratory (ORL) in Cambridge, Chief Technical Officer of Advanced Telecommunications Modules Limited, Chairman of Telemedia Systems Ltd, and a Director of Acorn Computer Group plc.

His research interests include networking, multimedia, and mobile systems.

Further information (Speaker 1):
http://www.cl.cam.ac.uk/research/dtg/~ah12/

Speaker 2:
Dr Andrew Herbert

Affiliation (Speaker 2):
Technical Director, APM Ltd

Biography (Speaker 2):
Andrew Herbert is Technical Director of APM and chief architect of ANSA. ANSA is an industry-sponsored program of research and advanced development into the use of distributed systems technology to support applications integration in enterprise-wide systems. The current focus of the ANSA work includes support for interactive multi-media services, object technology for World Wide Web applications, distributed systems management and security for electronic commerce.

Andrew is a member of the UK Information Technology and Electronics Foresight panel. He has served on numerous conference programme committees and is a project reviewer and strategy advisor for the European Commission and UK EPSRC. He interacts regularly with senior technical staff in the organizations which have sponsored the ANSA programme, including HP, ICL, BT and Bellcore, amongst others. He maintains strong links with the academic research community and holds a Visiting Professorship in the Computer Science Department at the University of Essex, Colchester, England.

Prior to ANSA, Andrew was a lecturer in the Computer Laboratory at the University of Cambridge during the pioneering days of local area networks and before that a research student active in the fields of operating systems and security. He spent a sabbatical at the MIT Laboratory for Computer Science in 1983 at the inception of project Athena and the X-Window system. He wrote his first operating system in 1976 and sent his first RPC in 1978. His first degree, awarded in 1975, was in Computational Science from the University of Leeds. Andrew is a member of Wolfson College Cambridge, BCS, ACM, IEEE, and a liveryman of the City of London Worshipful Company of Information Technologists. His interests beyond ANSA include pyrotechnics and steam railways.

Further information (Speaker 2):
http://research.microsoft.com/~aherbert/

Venue:
Mathematical Institute, University of St Andrews

Abstract:
Technology has moved to a point where both the network systems and the information appliances at the end points can handle many different media types. For example, high quality video and audio are beginning to be used routinely. Such use will be extended to situations where information streams are not only watched by humans but also simultaneously by computers. This will require both systems and algorithmic research support. Such network based audio and video processing agents will augment applications by making or suggesting choices for better use of system resources, or for the convenience of the end user. The emphasis will move away from undersupply of information so as to remain within bandwidth and processing constraints, to cache-based oversupply so as always to have available data for every demand an information consumer might make.

Personalisation of interfaces to computer and communications systems will need to develop as the frequency of use of computer equipment explodes. In a connected world, personalisation information can at all times be obtained through the network. Personalisation can encompass the control part of an interface only, or it can include the data as well. It can be extended to live applications which in effect become 'follow-me' and are never shut down. This advance has begun with text and data and will be extended to multimedia information streams. Perpetuating applications and rerouting the sessions may be manageable in situations where the terminal devices are similar, but it is much more difficult if the end points vary in complexity, and research is needed in this area. Furthermore, where many different terminal types are available and the personalisation takes advantage of agents, the scalability of applications is a major research issue.

One of the fruits of the age of technology is the proliferation of diverse business structures, operational processes and information systems. The increasing need for enterprises to inter-operate, often globally, places immense challenges upon organisations and their information systems in managing this diversity. It is only recently that enterprises have been discovering that standardisation of procedures and systems is only a temporary and incomplete solution to change, the growth of diversity and the need to inter-operate. Increasingly, enterprises are turning to forms of federation, in order to maintain control, when forced into distributing management, processes, businesses and systems. The ANSA Architecture is founded upon a set of simple principles for building software which can both interoperate across and be managed within federated business structures. This briefing note outlines those principles and their use.

Programme:
Monday, 21st April
10.00-11.15 Andy Hopper: Multimedia and Network Computing
11.15-11.30 Registration & Coffee
11.30-12.45 Andy Hopper: Smart Personalisation
12.45-14.30 Lunch
14.30-15.45 Andrew Herbert: Distributed Object Systems 1
15.45-16.00 Tea/coffee
16.00-17.15 Andrew Herbert: Distributed Object Systems 2

Tuesday, 22nd April
10.00-11.15 Andrew Herbert: Topics in Secure Electronic Commerce 1
11.15-11.30 Tea/coffee
11.30-12.45 Andrew Herbert: Topics in Secure Electronic Commerce 2

Attachments:
1997-1 Programme.pdf (107.91 KB)
1997-1 Lecture1.pdf (2.67 MB)

Information Retrieval: its models, its evaluation and its multimedia applications

Mike Weir, CJ (Keith) van Rijsbergen, Ron Morrison, Yves Chiaramella

Wed, 15 Apr 1998

Speaker 1:
Keith van Rijsbergen

Affiliation (Speaker 1):
Computing Science Department, University of Glasgow

Biography (Speaker 1):
Keith van Rijsbergen is Head of the Computing Science, Mathematics and Statistics Planning Unit at Glasgow University. He has been involved in Information Retrieval research since the late sixties. He currently manages an IR group within the CS department whose major concerns are:

· supporting information seeking and improving system usability
· representing and managing the content of text, images, video and speech
· building new theoretical models to capture the relevance of objects
· implementing and evaluating models in experimental settings
· including users and interaction in the evaluation

He is the author of a popular book on IR: http://www.dcs.gla.ac.uk/ir/

Further information (Speaker 1):
http://www.dcs.gla.ac.uk/~keith/

Speaker 2:
Yves Chiaramella

Affiliation (Speaker 2):
CLIPS-IMAG, Grenoble

Biography (Speaker 2):
Yves Chiaramella is Professor in Computer Science at the Université Joseph Fourier, Grenoble. He is Head of CLIPS-IMAG, a Computer Science Laboratory dedicated to Man-Machine Communication. He has also been involved in Information Retrieval for 15 years, and currently manages a group on Multimedia Information Retrieval within the CLIPS Laboratory. His main interests in the field are:

· indexing models for text, images and video
· indexing models for complex, structured objects
· retrieval models and systems for multimedia data
· logic-based approaches for IR models

Further information (Speaker 2):
http://www-clips.imag.fr/mrim/User/yves.chiaramella/

Venue:
Mathematical Institute, University of St Andrews

Abstract:
Information Retrieval aims to provide techniques and tools to allow fast, effective and efficient access to large amounts of stored information. The need for these has become more apparent with the growth of the world wide web. To sustain this growth and enhance it, many of the early and more recent ideas in information retrieval are applicable. This course will give an introduction to some of the earlier basic research ideas as well as to some of the more recent and pressing problems. The lectures will span the theory, experiment and practice of IR.

Programme:
Wednesday, 15th April

10.00-11.00 Introduction K van Rijsbergen

11.00-11.15 Tea/coffee

11.20-12.30 Two basic models: vector space and probabilistic - K van Rijsbergen

12.30-14.00 Lunch

14.00-15.30 Indexing multimedia information: signal indexing vs symbolic indexing - Y Chiaramella

15.30-15.45 Tea/coffee

15.45-16.45 Indexing multimedia information: the case of structured objects - Y Chiaramella

Thursday, 16th April

10.00-11.00 Cluster-based retrieval and visualisation - K van Rijsbergen

11.00-11.15 Tea/coffee

11.20-12.30 Interactive Retrieval: querying vs browsing hypermedia bases - Y Chiaramella

12.30-14.00 Lunch

14.00-15.30 Advanced models and evaluation - K van Rijsbergen

15.30-15.45 Tea/coffee

15.45-16.45 Multimedia Information Retrieval: the evaluation issue - Y Chiaramella
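The "vector space" model listed in the second morning session can be sketched in a few lines of Python. This is an illustrative toy, not material from the lectures: documents and queries become term-frequency vectors, and relevance is scored by cosine similarity (real IR systems add IDF weighting, stemming and inverted indexes).

```python
import math
from collections import Counter

def vectorise(text):
    """Map a text to a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = {
    "d1": "information retrieval models and evaluation",
    "d2": "indexing multimedia video and images",
}
query = vectorise("retrieval models")
ranked = sorted(docs, key=lambda d: cosine(vectorise(docs[d]), query),
                reverse=True)
print(ranked)   # d1 outranks d2: it shares both query terms
```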

Attachments:
1998-1 Programme.pdf (93.23 KB)
Picture.jpg (428.14 KB)

The Software Engineering Process

Ron Morrison, Brian Warboys, Kevin Hammond

Tue, 01 Dec 1998

Speaker:
Professor Brian Warboys

Affiliation:
Department of Computer Science, University of Manchester

Further information:
http://intranet.cs.man.ac.uk/ipg/people/brianPage.php

Venue:
Mathematical Institute, University of St Andrews

Abstract:
Software Engineering is concerned with the development of large software systems. It is thus a complex group process requiring a disciplined approach to both System Development and Project Management. These lectures will review the measures that have been taken to establish and use sound engineering principles in order to obtain large software systems with some measure of certainty. Modern approaches based on using the notion of exploiting models of the software development process will be the main focus of attention.

Programme:
10.00 - 11.00 The Software Engineering Lifecycle - why is it so difficult?
11.00 - 11.30 Coffee
11.30 - 12.30 The Software Paradigm - is there really one?
14.00 - 15.00 Software Process Modelling - a solution or another false dawn?

Attachments:
1998-9-1 Programme.pdf (75.72 KB)
WarboysPicture.jpg (527.86 KB)

Social Analysis and Software Systems Design

Ron Morrison, Ian Sommerville, Ursula Martin

Thu, 18 Feb 1999

Speaker:
Professor Ian Sommerville

Affiliation:
Computing Department, University of Lancaster

Further information:
http://www.cs.st-andrews.ac.uk/~ifs/

Venue:
Mathematical Institute, University of St Andrews

Abstract:
Most existing approaches to software systems design are techno-centric and focus on technical aspects of the problem to be solved and the solution to be developed. However, we know that many software systems that are delivered are either never used at all or require extensive modifications after delivery to make them usable. We argue that one reason for this is that the designers of these systems have not taken account of the organisational environment in which these systems are used nor of the work practices that they must support.

These lectures will discuss work that has been going on at Lancaster since 1990 to address these issues. Its goal is to develop and integrate organisational and social analysis with approaches such as object-oriented analysis so that we have an improved understanding of the real requirements for organisational software systems. The work has been interdisciplinary and has involved cooperation between social scientists and computer scientists.

I will discuss the evolution of our work from initial ethnographic studies that were used to inform the design of an air traffic control system through to our most recent work on an integrated method of social and object-oriented analysis. I will illustrate how methods from the social sciences have been adapted to be practically useful for software systems design and reflect on the advantages and disadvantages of inter-disciplinary working.

Programme:
10.00 - 11.00 Learning from ethnography: surprises from a study of air traffic control
11.00 - 11.30 Coffee
11.30 - 12.30 Viewpoints and concerns: structuring the analysis of complex systems
14.00 - 15.00 Coherence: integrating social and object oriented analysis

Attachments:
1998-9-2 Programme.pdf (82.7 KB)
SommervillePicture.jpg (661.88 KB)

People and Computers

Ron Morrison, Alan Newell, Alan Dearle

Wed, 24 Nov 1999

Speaker:
Professor Alan Newell

Affiliation:
Department of Applied Computing, University of Dundee

Further information:
http://www.computing.dundee.ac.uk/ac_staff/staffdetails.asp?1

Venue:
Physical Sciences, University of St Andrews

Abstract:
Most computer systems are designed to be used by people, and thus the characteristics and requirements of people should be a central focus for the design of computer software and systems. The lecture series will include a consideration of the importance of the human computer interface and user centred design. This will be followed by a discussion of the important characteristics of users and the changing nature of the population of potential users of computer software and systems. The concept of ordinary and extra-ordinary human computer interface design, developed within the Department of Applied Computing, will be described, and illustrated by descriptions of some research projects. These include computer systems which have been developed for a wide range of potential users including those with severe disabilities.

Programme:
11.00 - 12.00 HCI and User Centred Design
14.00 – 15.00 Ordinary and Extra-ordinary Human Computer Interaction
15.00 - 15.30 Coffee
15.30 - 16.30 Research Frontiers and Demography

Attachments:
1999-0-1 Programme.pdf (60.68 KB)
1999-0-1 Lecture1.pdf (358.47 KB)
NewellPicture.jpg (329.18 KB)

Computer Storage Systems

Colin Allison, Joe Sventek, John Wilkes, Ron Morrison

Wed, 01 Mar 2000

Speaker:
John Wilkes

Affiliation:
Hewlett Packard Laboratories, Palo Alto, California

Biography:
John Wilkes is the director of the Storage Systems Program at Hewlett-Packard Labs. His main research interest is in the design and management of fast, highly available, distributed-storage systems; he has also dabbled in network architectures (the Hamlyn sender-based message model), OS design (most recently in the Brevix project), and in learning about early Renaissance art and architecture. He earned a BA and MA in physics and a Diploma and PhD in computer science from the University of Cambridge. He has been at Hewlett-Packard Labs since 1982, where he is now a Laboratory Scientist.

Further information:
http://www.hpl.hp.com/personal/John_Wilkes/

Venue:
Mathematical Institute, University of St Andrews

Programme:
10.00 – 11.15 Introduction to online storage devices
An overview of current storage devices, with an emphasis on disk drives and their technology trends. Introduction to performance issues, including workload analysis - how file systems and databases use storage. Case study: request scheduling for disk-drives.

11.15 – 11.45 Coffee

11.45 – 13.00 Disk arrays
High reliability systems for data storage: the design, properties, and some of the pitfalls of redundant storage in a box. Case study: one or two novel disk array designs.

14.00 – 15.00 Storage area networks
Block, storage-object, and file level interfaces. Storage Area Networks versus Network Attached Storage (and other false dichotomies). Case study: CMU NASD.

15.00 – 15.30 Coffee

15.30 – 16.45 Storage management
The design and configuration of large storage systems. Quality of service guarantees. Goal-directed, self-managing, attribute-based storage systems. Case study: HP's Minerva system.
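The "redundant storage in a box" idea from the Disk arrays session can be illustrated with a small sketch (my own toy example, not material from the lectures): a single parity block, computed as the byte-wise XOR of the data blocks, allows any one failed block to be rebuilt from the survivors. Real arrays stripe data and rotate parity across disks; the block contents here are invented.

```python
from functools import reduce

def xor_blocks(blocks):
    """Byte-wise XOR of equal-length blocks."""
    return bytes(reduce(lambda x, y: x ^ y, byte_tuple)
                 for byte_tuple in zip(*blocks))

data = [b"AAAA", b"BBBB", b"CCCC"]      # three data disks
parity = xor_blocks(data)               # the parity disk

# Disk 1 fails: rebuild its block from the survivors plus parity.
rebuilt = xor_blocks([data[0], data[2], parity])
assert rebuilt == data[1]
```

The same identity (XOR of everything except the lost block equals the lost block) is what RAID-4/5 controllers exploit during degraded reads and rebuilds.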

Attachments:
1999-0-2 Programme.pdf (57.2 KB)
1999-0-2 Lecture1.pdf (5.16 MB)
1999-0-2 Lecture2.pdf (1014.12 KB)
1999-0-2 Lecture3.pdf (5.84 MB)
WilkesPicture.jpg (519.22 KB)

Pervasive Computing: StarTrek? Hogwarts? Reality?

Professors Ron Morrison, Dirk Husemann, Colin Allison

Tue, 28 Nov 2000

Speaker:
Dr Dirk Husemann

Affiliation:
IBM Research, Zurich

Biography:
Dirk Husemann has been with the IBM Research Division since 1996 and is currently a member of research staff at IBM's Zurich Research Lab in Switzerland. Since 1998 he has been leading IBM's DEAPspace research project on transient ad-hoc networking of pervasive computing devices. Currently he is working on data casting over digital audio broadcast radio channels. He holds both a master's degree (Dipl.-Inf., 1991) and a PhD degree (Dr.-Ing., 1995) in computer science from the University of Erlangen-Nürnberg, Germany, and is co-inventor of a number of submitted patents. His research interests include operating systems, distributed systems, and pervasive/ubiquitous computing. Currently he is a member of IEEE Computer, Usenix, TUG, and the German Gesellschaft für Informatik (GI).

Further information:
http://www.zurich.ibm.com/~hud/

Venue:
Mathematical Institute, University of St Andrews

Programme:
10.00 – 11.00 What is Pervasive Computing?
Pervasive computing in many ways addresses age-old dreams and visions of human beings. It is relatively difficult to find a concise and well-delimited definition of pervasive computing (or its synonym, ubiquitous computing). This first lecture shall provide an introduction to and an overview of pervasive computing and its enabling technologies.

11.00 – 11.30 Coffee

11.30 – 12.30 Ad-Hoc Pervasive Computing
A rather interesting part of pervasive computing focuses on ad-hoc networking, using both traditional wired networks and, increasingly, wireless technology. This lecture will take a look at issues such as wireless technologies, service discovery and applications.

14.15 – 15.15 Current Research in Pervasive Computing
Concluding this mini-series on pervasive computing we shall take a look at a sample of interesting pervasive computing research projects; for example, PEN, CoolTown, Fabric Area Networks, Pollen, and others.

Attachments:
2001-1-1 Programme.pdf (61.65 KB)
2001-1-1 Lecture1.pdf (521.81 KB)
2001-1-1 Lecture2.pdf (273.25 KB)
2001-1-1 Lecture3.pdf (711.55 KB)
HusemannPicture.jpg (480.94 KB)

Games and Entertainment - picture of the future, novel technologies and usability aspects

Alan Dearle, Peter Astheimer, Lucy Joyner, Ron Morrison, Tim Taylor

Thu, 19 Apr 2001

Speaker 1:
Peter Astheimer

Affiliation (Speaker 1):
IC CAVE, University of Abertay

Biography (Speaker 1):
Peter Astheimer received a PhD (Dr.-Ing.) for his thesis "Sonification of numerical data for visualization and virtual reality" from the Technical University of Darmstadt in 1995. From 1987 to 1996 he worked for the Fraunhofer Institute for Computer Graphics (FhG-IGD) in Darmstadt, mainly on visualisation and virtual reality technologies and applications. From 1996 to 2000 he worked for Siemens AG, Corporate Technology, Innovationfield Information & Communications in Munich, where his task was to analyse the future of information and communication and to identify and implement business opportunities within Siemens business units. Since 2000 Peter Astheimer has been founding Director and Professor of Virtual Reality of the International Centre for Computer Games and Virtual Entertainment (IC CAVE) at the University of Abertay, Dundee. The centre's goal is to enable new business by applying innovative technology in industry and funded projects. Peter Astheimer has published and co-authored more than 50 papers and held numerous tutorials and lectures worldwide. He is on the programme committee for a number of conferences and workshops, a member of IEEE, and holds several patents. He is a lifelong member of Historic Scotland.

Speaker 2:
Tim Taylor

Affiliation (Speaker 2):
IC CAVE, University of Abertay

Biography (Speaker 2):
Tim Taylor holds an MA in Natural Sciences from Trinity College, Cambridge, where he specialized in Experimental Psychology and also studied a variety of other subjects from the biological and physical sciences. On graduating from Cambridge he moved to Edinburgh University, where he was awarded an MSc (with distinction) in Artificial Intelligence. After working as a professional computer programmer in London for a couple of years, he returned to Edinburgh University to pursue a PhD in Artificial Life, which he was awarded in 1999. He has subsequently worked in artificial life research for MathEngine PLC in Oxford, and has been a research associate in IC CAVE at the University of Abertay Dundee since March 2000.

Further information (Speaker 2):
http://www.tim-taylor.com/

Speaker 3:
Lucy Joyner

Affiliation (Speaker 3):
IC CAVE, University of Abertay

Biography (Speaker 3):
Lucy Joyner holds a BSc (Hons) in Psychology with Sociology from the University of Bath. As an undergraduate she worked at the University of Dundee on the development of a comprehensive three-factor model of stress states and studied stress responses during simulated driving. After graduating she returned to the University of Dundee specialising in mood state and stress research, including the validation of a comprehensive stress state questionnaire and a questionnaire measure of driver stress and affect. She spent a year working on a project studying the development of social and group identity, and contextual variability in in-group stereotypes, while studying for, and gaining, a COSCA certificate in counselling skills. Lucy has been a research associate in IC CAVE at the University of Abertay Dundee since June 2000.

Venue:
Physical Sciences, University of St Andrews

Abstract:
This Distinguished Lecture looks at the current and expected developments and outcomes in the games and entertainment industry. It further highlights in more detail promising technologies like modular augmented computing and artificial life and motivates the importance of human factors covered by the discipline of usability engineering.

Programme:
11.15 - 12.30 The future of Games and Entertainment
Peter Astheimer
Entertainment is a vital part of life at leisure, home and work - and there are 6 billion people on earth! The leisure system is analysed from a futurist's holistic viewpoint, and significant trends and shaping factors are identified. Games and entertainment industry revenues have long surpassed box office and home video retail revenues, and the production of a games title can consume a larger budget than a Hollywood movie. Despite maturation and consolidation, the industry still enjoys respectable double-digit growth rates. Development of products is quite demanding, with vivid competition, bringing new technologies to a variety of platforms and making it enjoyable and affordable for virtually everyone. Many virtual reality technologies are now available on a consumer scale, and games are incorporating an increasing number of sensory channels. The combination of virtual worlds, augmented reality technology and modular wearable computing gear opens a wealth of novel applications. Currently explored in industry for hands-free operation and multi-vendor/multi-product service and maintenance, it is expected to make a significant impact in conjunction with 3G mobile technologies for personal purposes in the future.

14.00 - 15.15 Evolving Creatures in Virtual Worlds
Tim Taylor
The realistic physical modelling of creatures in games and virtual worlds is becoming a viable alternative to more traditional animation techniques. Physical modelling can enhance realism and allow users to interact with the world much more freely. However, designing controllers to move physically modeled creatures (e.g. to make a human character walk) is generally a difficult task. Artificial life techniques can be useful in automating this task. For example, artificial evolution can generate suitable controllers for simple behaviours, given only a high level description of that behaviour in terms of a fitness function. In this talk, the state of the art in evolving controllers, and also in evolving the creatures' body shapes, will be described and demonstrated. It will then be suggested that current approaches are unlikely to scale up to more complicated behaviours. A number of possible solutions to this problem will be discussed. The talk will conclude with some predictions of practical applications that are likely, and unlikely, to arise from artificial life research over the next 25 years.

15.15 - 15.45 Coffee

15.45 - 17.00 Applications of Usability Engineering in Video Game Research
Lucy Joyner
Usability engineering has long been established as a discipline dedicated to supporting the process of developing electronic applications, maximising ease of use and robustness for their intended use. By implementing user-centred design, the discipline offers developers the opportunity to assess the quality and potential success of their application from the viewpoint of intended users. When considering the process of making video games, developers need to consider not only how usable the game is, but how likely it is that intended users will be attracted to playing the game, enjoy it and purchase the final product. By supplementing usability engineering with cognitive and social research methodologies, we can take video game research beyond observation of behaviour, to study cognition, mood states and individual differences associated with game playing. This talk will show the complexity of the gaming situation, the beneficial applications of usability engineering, and its limitations. Ways to overcome these shortfalls will be demonstrated through the practical application of cognitive and social methodologies. Implications for game developers and the future of this research will be discussed in relation to cross-cultural markets and expectations.
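The evolutionary technique described in the "Evolving Creatures" session (a controller improved against nothing more than a high-level fitness function) can be sketched with a minimal (1+1) evolution strategy. This is a toy of my own: the controller is just a parameter vector and the fitness function is an invented stand-in (distance from a target vector), not a physical simulation.

```python
import random

def fitness(params):
    """Toy fitness: higher is better, with its peak at a target vector."""
    target = [0.5, -0.2, 0.8]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(generations=500, mutation=0.05, seed=1):
    """(1+1) evolution strategy: mutate, keep the child only if fitter."""
    rng = random.Random(seed)
    best = [rng.uniform(-1, 1) for _ in range(3)]
    for _ in range(generations):
        child = [p + rng.gauss(0, mutation) for p in best]
        if fitness(child) > fitness(best):
            best = child
    return best

controller = evolve()
print(fitness(controller))   # far closer to the optimum (0) than a random start
```

In the work described in the talk, the fitness function would instead score a physically simulated behaviour (e.g. distance walked), which is what makes evaluation expensive and scaling difficult.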

Generic information:
https://staffres.cs.st-andrews.ac.uk/Admin/School_Business/Distinguished Lecture Series/DLS/2000-1/2-Astheimer/Notes/2000-1-2 Lecture1.pdf

Attachments:
2000-1-2 Programme.pdf (98.57 KB)
AstheimerPicture.jpg (569.83 KB)

XML - a data standard for well-behaved programmers?

Richard Connor, Al Dearle

Wed, 12 Dec 2001

Speaker:
Professor Richard Connor

Affiliation:
University of Strathclyde

Biography:
Richard Connor is a Professor of Computer Science at the University of Strathclyde. The interests of his research group include querying semi-structured data resources, autonomous information provision, and highly distributed data stores. His work on distributed information is currently funded by EPSRC, BBSRC, SHEFC, and a global supplier of financial information systems.

Further information:
http://www.cis.strath.ac.uk/cis/staff/index.php

Venue:
Mathematical Institute, University of St Andrews

Abstract:
"XML" is a term generally used to refer to many rapidly emerging standards and technologies. While XML 1.0 is stable, much of the associated set of standards and software is in a very early stage of development. The talks will examine emerging XML-related technology from the perspective of data and applications. While the WWW is the largest high-quality data resource ever to have been available to mankind, it lacks the inherent discipline thought to be necessary for a coherent programming framework. This lack is not necessarily a case of poor design, but rather reflects the inherent autonomy necessary for such a large collection to exist.

One lesson that many computer scientists have drawn from the emergence of the Web is that groups of humans can be sufficiently well-behaved to create major systems that work by convention, rather than by enforcement. XML captures a data model that can be viewed as a database standard, so long as generators and programmers stick to the conventions. The series of talks will examine this viewpoint.

The talks will be accessible to anyone who understands the basics of computer programming and simple type systems; no background knowledge of XML is necessary.

Programme:
10.00 – 11.00 When is a Document not a Document? When it's XML.
The first talk gives a background to the XML standard set, examining the re-emergence of the semi-structured data model and how that fits with the historical coincidence of XML. The value of XML as a data standard, despite its humble origins as a document standard, is highlighted. The value of such a data standard in a context subject to autonomy and evolution is explained.

11.00 – 11.30 Coffee

11.30 – 12.30 When is a Language not a Language? When it’s well-typed.
A background to the differing emerging paradigms is given, using a classification based upon program safety. XML does not have a strong type model; one of the most interesting questions is how programmers can write applications that will not fail in unexpected ways at run-time. The class of failure examined in detail is that which could be detected at compile-time using more standard database technology.

14.15 – 15.15 When is a SNAQue not a Snake? When it's a type projection.
Based on the classification given previously, we consider how standard data types may be projected onto XML collections, and show how for some purposes this approach is more promising than others in terms of program safety. The approach outlined is that of the Strathclyde Novel Architecture for Querying eXtensible Markup Language.
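The general idea of projecting a standard data type onto XML can be sketched as follows. This is a rough illustration of the concept, not the SNAQue system itself: we declare the record type we expect, extract the matching fields, and fail cleanly when the data does not fit the type (the document and field names are invented).

```python
import xml.etree.ElementTree as ET

def project(xml_text, schema):
    """Project an XML element onto a record typed by (field -> type)."""
    root = ET.fromstring(xml_text)
    record = {}
    for field, typ in schema.items():
        node = root.find(field)
        if node is None:
            raise TypeError(f"missing field: {field}")
        record[field] = typ(node.text)   # may raise if the text doesn't fit
    return record

doc = "<book><title>XML in Practice</title><year>2001</year></book>"
print(project(doc, {"title": str, "year": int}))
# {'title': 'XML in Practice', 'year': 2001}
```

The point of the talk's classification is where such failures surface: here they surface at projection time, rather than scattered unpredictably through later program execution.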

Generic information:
https://Staffres.cs.st-andrews.ac.uk/Admin/School_Business/Distinguished Lecture Series/DLS/2001-2/1-Connor/Notes/IMGP0170.AVI

Attachments:
2001-2-1 Programme.pdf (33.8 KB)
2001-2-1 Lecture1.pdf (83.77 KB)
2001-2-1 Lecture2.pdf (2.01 MB)
2001-2-1 Lecture3.pdf (149.24 KB)
2001-2-1 Lecture4.pdf (564.45 KB)
2001-2-1 Lecture5.pdf (96.95 KB)
IMGP0253.jpg (543.15 KB)
IMGP0254.jpg (420.43 KB)
ConnorPicture.jpg (255.75 KB)

Some Futures in Broadband Communications

Colin Allison, Derek McAuley, Alan Ruddle

Thu, 14 Mar 2002

Speaker:
Professor Derek McAuley

Affiliation:
Marconi Laboratories, Cambridge

Biography:
Professor Derek McAuley joined Marconi in January 2001 to establish the new Marconi Labs in Cambridge. He obtained his B.A. in Mathematics from the University of Cambridge in 1982 and his Ph.D., addressing issues in interconnecting heterogeneous ATM networks, in 1989. After a further five years at the University of Cambridge Computer Laboratory as a lecturer, he moved in 1995 to a chair at the University of Glasgow Department of Computing Science. He returned to Cambridge in July 1997 to help found the Cambridge Microsoft Research facility.

His research interests include networking, distributed systems and operating systems. Recent work has concentrated on the support of time dependent mixed media types in both networks and operating systems.

Further information:
http://www.cl.cam.ac.uk/~drm10/

Venue:
Mathematical Institute, University of St Andrews

Programme:
10.00 – 11.00 Networking with the Reverend Bayes.
The realization that, in any large-scale deployed system, something is broken all the time requires the use of defensive coding techniques. Taken together with an understanding of Bayesian statistics, this means we need to address more than coding alone; rather, we should revisit the design of our distributed control algorithms. This talk introduces the theory and describes some examples of how we might use it to address current networking problems.

11.00 – 11.30 Coffee

11.30 – 12.30 An introduction to Optical Switching.
Conversion between optical and electrical signals within high speed communications systems is an expensive business, and consumes a lot of power. The goal of optical switching is to leave the communications signals in optical form as far as is possible. This presents challenges and opportunities to network and switch designers to overcome and exploit the interesting features of optical components.

14.15 – 15.15 Quality of Service - what's going to make it pay?
The type of QoS we see in the form of Service Level Agreements is a big deal for ISPs. However, this is a long way from the traditional multimedia view of QoS where it is expected to be specified for each instance of an application. Likewise, many Operating Systems and Middleware platforms take one of two views on QoS: either that machines are getting faster so who cares, or why not simply use a real time scheduling class. Are we ever going to drive QoS into the mainstream?
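The Bayesian viewpoint in the first session can be illustrated with a toy monitoring example of my own (all numbers invented): a request times out, and Bayes' rule tells us how much that observation should shift our belief that a link is actually down rather than merely slow.

```python
def posterior(prior, p_obs_given_h, p_obs_given_not_h):
    """Bayes' rule: P(hypothesis | observation)."""
    evidence = prior * p_obs_given_h + (1 - prior) * p_obs_given_not_h
    return prior * p_obs_given_h / evidence

p = posterior(prior=0.01,              # links are rarely down
              p_obs_given_h=0.95,      # timeouts are very likely if down
              p_obs_given_not_h=0.05)  # but transient timeouts also happen
print(round(p, 3))   # → 0.161
```

A single timeout leaves P(down) well below one half, which is exactly the argument for control algorithms that accumulate evidence before reacting, rather than failing over on the first error.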

Attachments:
2001-2-2 Programme.pdf (34.67 KB)
2001-2-2 Lecture1.pdf (1.62 MB)
McAuleyPicture.jpg (296.05 KB)

Ubiquitous Computing Environments

Alan Dearle, H Gellersen, Graham Kirby, Ron Morrison

Thu, 21 Nov 2002

Speaker:
Professor Hans Gellersen

Affiliation:
Computing Department, Lancaster University

Biography:
Hans Gellersen is Professor of Interactive Systems in the Computing Department at Lancaster University. He obtained both his M.Sc. in Computing and his Ph.D. from the University of Karlsruhe, Germany, in 1992 and 1996 respectively. He continued to be affiliated with Karlsruhe as Director of the Telecooperation Office (TecO) for another five years, and moved to a chair at Lancaster in March 2001.

His research interest is in ubiquitous computing and novel interactions between people, their physical environment, and computing. Recent work includes research on distributed sensing and context capture, platforms for ubiquitous computing prototyping, and embedding of interactive technologies in everyday objects. He is actively involved in the formation of the ubiquitous computing research community, has initiated the HUC/Ubicomp conference series, and serves as editor for Personal and Ubiquitous Computing.

Further information:
http://www.comp.lancs.ac.uk/~hwg/

Venue:
Mathematical Institute, University of St Andrews

Programme:
10.00 – 11.00 An Introduction to Ubiquitous Computing.
Ubiquitous computing envisions that we will interact with many computers around us while devoting not much explicit attention to them. An important concept to make this possible is to integrate computers with the context in which they are used. This may involve embedding of computers in familiar artefacts, communication with their immediate environment, and use of sensors and actuators that replace traditional interfaces. This initial talk will introduce the ubiquitous computing vision and discuss ways of integrating computing systems with their physical environment.

11.00 – 11.30 Coffee

11.30 – 12.30 Examples of the Disappearing Computer.
An intriguing idea behind physical integration of computing is that familiar artifacts can be made interactive while computers move into the background and virtually disappear. In this talk we discuss three examples that explore the potential of this idea: a wall that is also a network; hallway posters that double as output medium; and a coffee table that is also a sensor and input device.

14.15 – 15.15 Smart-Its: Prototyping the Disappearing Computer
Unfortunately, when computing systems become integrated with their physical environment, they also become more difficult to build, to try out, and to evaluate. This talk analyzes the challenges associated with investigation of ubiquitous computing environments and introduces the Smart-Its platform designed to make prototyping easier. Smart-Its are small computing devices with wireless radio that interact with their environment through a configurable collection of sensors and actuators. Attached to physical objects, they can turn these into smart artefacts with digital identity,contextual awareness, and wireless communication.

Attachments:
2002-3-1 Programme.pdf (38.29 KB)
2002-3-1 Lecture1.pdf (3.12 MB)
2002-3-1 Lecture2.pdf (896.26 KB)
2002-3-1 Lecture3.pdf (5.33 MB)
GellersenPicture.jpg (640.19 KB)

Where The Hard Problems Are

Steve Linton, Ron Morrison, Toby Walsh, Ian Gent

Wed, 23 Apr 2003

Speaker:
Toby Walsh

Affiliation:
University College Cork

Biography:
Toby Walsh is an SFI Research Professor at University College Cork and Deputy Director of the Cork Constraint Computation Centre. He was previously an EPSRC advanced research fellow at the University of York, a Marie Curie postdoctoral fellow in Italy and France, and a postdoctoral fellow at Edinburgh University. He was program chair of the 6th International Conference on the Principles and Practice of Constraint Programming (CP-2001), and is poster chair of this year's International Joint Conference on AI (IJCAI-2003). He is an associate editor of the Journal of Artificial Intelligence Research, and the Journal of Automated Reasoning. He is on the editorial boards of AI Communications, and the Constraints journal. He was recently elected a trustee of the Conference on Automated Deduction, and will chair the next International Joint Conference on Automated Reasoning (IJCAR-2004).

Further information:
http://www.cse.unsw.edu.au/~tw/index.html

Venue:
Mathematical Institute, University of St Andrews

Programme:
10.00 – 11.00 Where the hard problems are
Some computational problems are easy. Others are hard. In these lectures, I will survey some recent research that helps us understand where hard computational problems are to be found, and throws some light on what makes some problems hard and others easy. Computational complexity gives us some tools (for example, big O analysis and complexity classes) but much of my talk will come from an unusual direction - statistical mechanics. Surprisingly, phase transition behaviour observed in nature is also observed in computation, and problem hardness can often be found at phase boundaries. The talk will be richly illustrated with many examples of such behaviour.

11.00 – 11.30 Coffee

11.30 – 12.30 The Interface between P and NP
The transition from P (problems that can be solved in polynomial time) to NP-complete (problems where our best algorithms are currently exponential) is generally thought to mark the onset of computational intractability. What happens if we look more closely at this transition? I will describe recent research that does exactly that. As in the first lecture, surprising insights again come from statistical mechanics.

14.15 – 15.15 The Impact of Structure
Structure in computational problems is both our friend and our foe. We are often able to exploit structure to solve problems that are, at least in the worst case, computationally intractable. But, as I will show in this talk, structure can also be our enemy, and make problems harder. I will describe two important structural features identified and studied in the last few years. Both are structural features of graphs: small-world topologies and high-degree graphs. I will describe how such structures turn up in real-world problems and describe their impact on problem hardness.
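As an illustration of the first of these features (hypothetical code, not from the lecture), the Watts-Strogatz construction below builds a ring lattice and rewires each edge with probability p; even a little rewiring collapses the average path length, which is the "small-world" effect that turns up in real problem instances.

```python
import random
from collections import deque

def watts_strogatz(n, k, p, rng):
    """Ring lattice: n nodes, each joined to its k nearest neighbours,
    then every edge rewired with probability p (Watts & Strogatz, 1998)."""
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add(tuple(sorted((i, (i + j) % n))))
    rewired = set()
    for (u, v) in sorted(edges):
        if rng.random() < p:
            w = rng.randrange(n)        # pick a fresh endpoint for u
            while w == u or tuple(sorted((u, w))) in rewired:
                w = rng.randrange(n)
            rewired.add(tuple(sorted((u, w))))
        else:
            rewired.add((u, v))
    return rewired

def mean_path_length(n, edges):
    """Average shortest-path length over connected node pairs, via BFS."""
    adj = {i: [] for i in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    total = pairs = 0
    for s in range(n):
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs
```

Comparing mean_path_length at p = 0 and at a small p such as 0.1 shows the characteristic collapse in path length while the graph stays locally clustered.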

AttachmentSize
2002-3-2 Programme.pdf347.61 KB
2002-3-2 Lecture1.pdf347.61 KB
2002-3-2 Lecture2.pdf438.39 KB
2002-3-2 Lecture3.pdf934.99 KB
WalshPicture.jpg525.15 KB

Towards Automated Management of Large-Scale Distributed Systems

Alan Dearle, Joe Sventek, Ron Morrison

Tue, 09 Dec 2003

Speaker:
Professor Joe Sventek

Affiliation:
University of Glasgow

Biography:
Professor Sventek obtained his B.A. in Mathematics from the University of Rochester and his PhD in Nuclear Chemistry from the University of California. He is currently the Professor of Communication Systems in the Department of Computing Science at the University of Glasgow. Prior to joining Glasgow, he had a distinguished career pursuing research into networked and distributed systems and managing research teams at Lawrence Berkeley Laboratory (1979-1986), Hewlett-Packard (1987-1999), and Agilent Technologies (1999-2002). His research interests include programmable networks, embedded systems, closed-loop network management, and distributed system architectures. He has published widely on these topics and holds four patents (with three others pending) in these areas. Professor Sventek was the principal author of the original OMG CORBA specification as well as several of the Common Object Services (Trading, Events, Naming); he was also the rapporteur for the TeleManagement Forum’s most recent release of the Technology Neutral Architecture document. He has been general chair for TINA99 and Middleware 2001, programme chair for COOTS98, TINA99, and Middleware 2000, and a member of programme committees too numerous to mention. He is an advisor to the TeleManagement Forum Board, an adviser to the Wiley Series in Communications Networking and Distributed Systems, and was on the editorial board of the IEE/BCS/IOP Distributed System Engineering Journal.

Further information:
http://www.dcs.gla.ac.uk/people/personal/joe/

Venue:
Mathematical Institute

Programme:
10.15 – 11.15 Lecture 1: Traditional Network Management Systems

Network management systems have been with us for many years, for both telephone and data networks. In order to understand what innovations are needed for network management systems to automatically manage large-scale environments, it is necessary to understand how traditional network management systems (called Operational Support Systems, or OSS’s for short) are constructed and used.

This lecture will describe the traditional structure of OSS’s, how they are typically used, and their scaling characteristics. It will conclude with a discussion of a network management pattern that is observed at several levels of abstraction in modern network management systems, and how these pattern instances must be related to automate management of the system.

11.15 – 11.45 Coffee in John Honey Building

11.45 – 13.00 Lecture 2: A Scalable Control Plane Architecture

As described in the 1st lecture, one aspect of traditional OSS’s that scales poorly is the control plane, since it assumes centralized control AND explicit communication between OSS Central and the components of the network when configurations need to change. As the number of components making up the network increases, this centralized structure simply implodes.

This lecture describes one out-of-the-box approach to re-architecting the control plane to eliminate this problem. It delegates responsibility for being in the appropriate configuration to the individual components, and focuses on making sure that the system asymptotically approaches the correct configuration within a bounded time span. As such, this control plane architecture takes advantage of emergent behaviour.
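A toy simulation of this delegated style (purely illustrative; the class names and failure model are invented, not Professor Sventek's design): each component independently reconciles itself towards the desired configuration, failures merely delay convergence, and no central controller pushes state.

```python
import random

class Component:
    """A managed component that is itself responsible for reaching the
    desired configuration; nothing is pushed from a central OSS."""
    def __init__(self, name):
        self.name = name
        self.version = 0

    def reconcile(self, desired_version, rng, p_success=0.7):
        # an unreliable step: the component may fail and retry next round
        if self.version != desired_version and rng.random() < p_success:
            self.version = desired_version

def converge(components, desired_version, rng, max_rounds=100):
    """Run reconciliation rounds until every component reaches the target;
    the system approaches the correct configuration asymptotically."""
    for round_no in range(1, max_rounds + 1):
        for c in components:
            c.reconcile(desired_version, rng)
        if all(c.version == desired_version for c in components):
            return round_no
    return None
```

Because each component retries on its own, the expected number of stragglers shrinks geometrically each round; adding components barely changes the convergence time, which is the scaling property the centralized push model lacks.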

14.30 – 15.30 Lecture 3: An Always-On Active Measurement Approach for IP Networks

As described in the 1st lecture, another aspect of traditional OSS’s that causes problems in large-scale IP configurations is the lack of built-in diagnostic measurement information that relates to a user’s traffic. The measurements available to the OSS are low-level and primitive; additionally, they are usually taken after a problem has been detected, usually by out-of-band means.

This lecture describes an active measurement mechanism that we have devised to permit continuous measurement of the behaviour of a user’s flows, such that the measurement load is bounded. It works in IPv4 networks, but is a natural fit for IPv6 networks, since the measurement payloads can be embedded in IPv6 extension headers. Such measurements are introduced in programmable network elements, and can vary in complexity and sophistication to suit measurement needs.

AttachmentSize
2003-4-1 Programme.pdf49.31 KB
2003-4-1 Lecture1.pdf275.44 KB
2003-4-1 Lecture2.pdf467.3 KB
2003-4-1 Lecture3.pdf436.53 KB

Autonomic Computing as a Unifying Framework for Self-Managing Systems

Alan Dearle, Ric Telford, Ron Morrison

Wed, 14 Apr 2004

Speaker:
Ric Telford

Affiliation:
Director of Autonomic Computing Architecture and Standards IBM

Biography:
Ric Telford's professional career spans 20 years of software development experience and is noted for bringing innovative approaches to the design and development of key software technologies. Telford joined IBM in 1983 as a developer for PROFS in their software lab in Dallas, Texas. Prior to the acquisition of Lotus, Telford led much of the office systems development for IBM, including distributed calendaring and groupware products. During his tenure at IBM, Telford has played a number of key roles in various software initiatives for IBM, including the imaging products unit, networking and security software, and software mobility products. Ric tends to be at the forefront of emerging technologies at IBM. He served as Director of Technology for the IBM CIO, responsible for the development, implementation and adoption of technologies that hastened the transformation of IBM into an e-business. Ric was the Director of Technology for Intelligent Infrastructure, the precursor in IBM to “e-business on demand”. Most recently, Ric was responsible for defining and delivering software solutions for the service provider market, also known as "xSPs". In his current assignment, Ric is responsible for defining and delivering the architecture, technology and standards for "Autonomic Computing." Autonomic Computing is the set of capabilities required to make a computing system more self-managing, much like the human autonomic system. Ric works across IBM (including servers, software and storage) and the industry to develop an end-to-end, open architecture solution for self-managing systems. Ric holds a Bachelor of Science degree in Computer Science from Trinity University in San Antonio, Texas, graduating magna cum laude and Phi Beta Kappa. He holds several U.S. Patents.

Further information:
http://www.ibm.com

Venue:
School of Computer Science, John Honey Building

Abstract:
Computing technology has progressed rapidly over the last several decades, with implementations and applications that were unthinkable a decade ago now commonplace. The rate of progress, however, has brought its own cost. As large IT infrastructures grow more complex, the cost of managing these systems has increased rapidly. As a result, a greater percentage of the IT budget goes toward maintenance of the infrastructure rather than improving its benefit to the business. The complexity of such a computing infrastructure requires that the environment become more “autonomic” -- that is, self-managing.

Developing self-managing computing resources is not a new problem for computer scientists. For decades system components and software have been evolving to deal with the increased complexity of system control, resource sharing, and operational management. The advent of the Internet and dramatically increased price performance of information technology in the last few years has led to a huge growth in the scale and complexity of computing systems. Autonomic computing is the next logical evolution of these past trends to address the increasingly complex and distributed computing environments of today.

This series of lectures will describe IBM’s vision for autonomic computing and the plan to apply the model of autonomic systems to self-managing systems with a focus on such topics as problem determination, configuration, and optimization.

Programme:
10.15 – 11.15 Lecture 1: The Autonomic Computing Vision
The concept of “Autonomic Computing” takes its name from the human autonomic nervous system. Several years ago, Paul Horn, Sr VP of Research at IBM, made a “call to arms” to the IT industry to start focusing on the complexities of IT by borrowing from the human “self-managing system.” At the current rate of growth, the cost of maintaining and managing IT infrastructure will soon become unaffordable. Autonomic Computing offers an approach to address the costs associated with managing IT. This lecture presents the concept of Autonomic Computing, the business drivers behind it, and the value that a self-managing infrastructure provides. Included in this is the set of constructs and tools that will hasten the adoption of autonomic computing technologies.

11.15 – 11.45 Coffee

11.45 – 13.00 Lecture 2: The Autonomic Computing Architecture
In order to fulfill the promise of self-managing systems, a comprehensive architecture is required. The architecture needs to be abstract enough to allow for adaptation to various environments, but prescriptive enough to ensure interoperability across heterogeneous systems. Finally, the architecture must lend itself to standardization across the industry to ensure broad-based adoption. This lecture describes the overarching architecture for Autonomic Computing being put forth by IBM. The architecture defines the basic constructs of an autonomic computing system, and the interactions of these constructs. The architecture presumes the existence of some “core technologies”, which will also be discussed. Finally, the topic of open standards will be covered, with an overview of the emerging standards in the autonomic computing space.
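The central construct in IBM's architecture is a control loop pairing an autonomic manager with a managed resource through sensors and effectors, often summarised as monitor, analyse, plan, execute (MAPE). The sketch below is a minimal, hypothetical rendering of that loop; the Server class and the utilisation thresholds are invented for illustration.

```python
class Server:
    """Hypothetical managed resource exposing a sensor (get_load)
    and an effector (set_capacity)."""
    def __init__(self, capacity=1):
        self.capacity = capacity
        self.load = 0

    def get_load(self):
        return self.load

    def set_capacity(self, c):
        self.capacity = c

class AutonomicManager:
    """One pass of a monitor-analyse-plan-execute (MAPE) loop."""
    def __init__(self, resource, target_utilisation=0.7):
        self.resource = resource
        self.target = target_utilisation

    def step(self):
        load = self.resource.get_load()                  # Monitor
        utilisation = load / self.resource.capacity      # Analyse
        if utilisation > self.target:                    # Plan
            plan = self.resource.capacity + 1
        elif utilisation < self.target / 2 and self.resource.capacity > 1:
            plan = self.resource.capacity - 1
        else:
            plan = self.resource.capacity
        self.resource.set_capacity(plan)                 # Execute
```

Driving step() repeatedly lets the pair self-optimise: capacity grows or shrinks until utilisation sits inside the target band, with no operator in the loop.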

14.30 – 15.30 Lecture 3: Autonomic Computing in Action
Lecture 2 covered an abstract view of an autonomic computing system, describing the architectural elements, formats, protocols and interfaces. Although it is important to understand this as background, it is much more interesting to examine Autonomic Computing using real scenarios. This lecture describes some examples of autonomic computing “in action” at the system level. It will discuss how the architectural elements are instantiated and give examples of possible flows between elements. The scenarios discussed will cover a broad range of autonomic capabilities, including examples of self-healing and self-optimizing systems.

AttachmentSize
2003-4-2 Programme.pdf54.6 KB
2003-4-2 Lecture1.pdf3.48 MB
2003-4-2 Lecture2.pdf3 MB
2003-4-2 Lecture3.pdf4.36 MB
TelfordPicture.jpg594.79 KB

Computational Finance

Ian Miguel, Ian Gent, Edward Tsang, Ron Morrison, Tom Kelsey

Mon, 29 Nov 2004

Speaker:
Professor Edward Tsang

Affiliation:
Computing Department, University of Essex

Biography:
Edward Tsang holds a first degree in Business Administration (major in Finance) and a PhD in Computer Science. He is currently a Professor in Computer Science at the University of Essex. He is also Deputy Director of the Centre for Computational Finance and Economic Agents (CCFEA, http://www.cfea-labs.net). CCFEA is an interdisciplinary research centre which applies artificial intelligence methods to problems in finance and economics. It is supported by City Associates, which is led by HSBC.

Edward Tsang has broad interests in artificial intelligence, including heuristic search, computational finance, economic agents, constraint satisfaction, combinatorial optimisation, scheduling, evolutionary computation and automated bargaining. He created and leads the Constraint Satisfaction and Optimisation Research Group and the Computational Finance research group at the University of Essex.

Edward Tsang is an editor of the Constraints journal, the Scheduling journal, IEEE Transactions on Evolutionary Computation and The Journal of Management and Economics. He has been a member of the Computing College of the Engineering and Physical Sciences Research Council (EPSRC, UK) since 1997. He chairs the IEEE Computational Intelligence Society's Technical Committee on Computational Finance and Economics. He has served on committees and panels for many major international conferences and workshops.

Further information:
http://www.bracil.net/edward/

Venue:
School of Computer Science, Jack Cole Building

Programme:
10.00 – 11.00 Computational Finance – An Overview
Advances in hardware and software enable research in finance and economics that was not possible before. For example, today's hardware allows us to examine more complex economic models and to run larger simulations in less time. Advances in evolutionary computation enable us to search the space of models more efficiently. Some research in computational finance challenges the fundamentals of economics; other work attempts to gain insight into financial markets, or to explore business opportunities. This talk briefly outlines the scope and agenda of computational finance research.

Reference:
E.P.K. Tsang & S. Martinez-Jaramillo, Computational Finance, IEEE Computational Intelligence Society Newsletter, August 2004, 3-8

11.00 – 11.30 Coffee

11.30 – 12.30 EDDIE Beats the Stock Market
In this talk, I shall describe the EDDIE (Evolutionary Dynamic Data Investment Evaluator) project. EDDIE is a financial forecasting tool developed at the University of Essex. It is based on genetic programming, a branch of evolutionary computation. As a tool, EDDIE works with the user in the following way. The user supplies EDDIE with a set of factors or opinions (collectively called indicators) that he/she believes to be relevant to forecasting. Using historical data, EDDIE helps the user to explore possible interactions between these indicators. Through supervised learning, EDDIE builds decision trees that can be used for forecasting. EDDIE is not designed to replace experts; it needs expert knowledge to succeed. Experts play the role of identifying relevant indicators and evaluating the decision trees.

EDDIE has been tested extensively on a large amount of data for identifying investment and arbitrage opportunities.
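A heavily simplified sketch of the idea (hypothetical code; real EDDIE uses full genetic programming with crossover and mutation, richer indicators, and a tunable fitness function): decision trees over numeric indicators are generated and selected by their hit rate on labelled historical data.

```python
import random

# A tree is a prediction leaf ("buy"/"hold") or a node
# ("if", indicator_index, threshold, true_branch, false_branch).

def predict(tree, indicators):
    if tree in ("buy", "hold"):
        return tree
    _, idx, threshold, yes, no = tree
    return predict(yes if indicators[idx] > threshold else no, indicators)

def random_tree(rng, depth=3, n_indicators=2):
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(("buy", "hold"))
    return ("if", rng.randrange(n_indicators), rng.uniform(0, 1),
            random_tree(rng, depth - 1, n_indicators),
            random_tree(rng, depth - 1, n_indicators))

def fitness(tree, history):
    """Fraction of historical cases where the recommendation matched."""
    return sum(predict(tree, ind) == label
               for ind, label in history) / len(history)

def evolve(history, rng, pop_size=30, generations=20):
    """Keep the fitter half each generation and refill with fresh trees
    (real GP would apply crossover and mutation here instead)."""
    pop = [random_tree(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: fitness(t, history), reverse=True)
        pop = pop[:pop_size // 2] + [random_tree(rng)
                                     for _ in range(pop_size - pop_size // 2)]
    return max(pop, key=lambda t: fitness(t, history))
```

The evolved tree is readable, which matters in this setting: an expert can inspect which indicator interactions the tree exploits before trusting its recommendations.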

References:
E.P.K. Tsang & J. Li, EDDIE for financial forecasting, in S-H. Chen (ed.), Genetic Algorithms and Programming in Computational Finance, Kluwer Series in Computational Finance, 2002, Chapter 7, 161-174

E.P.K. Tsang, P. Yung & J. Li, EDDIE-Automation, a decision support tool for financial forecasting, Journal of Decision Support Systems, Special Issue on Data Mining for Financial Decision Making, Vol. 37, No. 4, 2004

AttachmentSize
2004-5-1 Programme.pdf2.51 MB
2004-5-1 Lecture1.pdf2.51 MB
2004-5-1 Lecture2.pdf1.43 MB
2004-5-1 Lecture3.pdf1.99 MB
TsangPicture.jpg523.61 KB

The Unreasonable Effectiveness of Logic

Ian Gent, Phil Wadler, Pedro Vasconcelos, Kevin Hammond

Mon, 21 Mar 2005

Speaker:
Professor Philip Wadler

Affiliation:
University of Edinburgh

Biography:
Philip Wadler likes to introduce theory into practice, and practice into theory. Two examples of theory into practice: GJ, the basis for Sun's new version of Java with generics, derives from quantifiers in second-order logic. His work on XQuery marks one of the first efforts to apply mathematics to formulate an industrial standard.

An example of practice into theory: Featherweight Java specifies the core of Java in less than one page of rules. He is a principal designer of the Haskell programming language, and he co-authored Introduction to Functional Programming, which has been translated into Dutch, German, and Japanese. He appears in position 67 of Citeseer's list of most-cited authors in Computer Science.

Philip Wadler is a Professor of Theoretical Computer Science at the University of Edinburgh, and holds a Royal Society-Wolfson Research Merit Fellowship. Previously, he worked or studied at Avaya Labs, Bell Labs, Glasgow, Chalmers, Oxford, CMU, Xerox Parc, and Stanford, and lectured as a guest professor in Paris, Sydney, and Copenhagen. He served as Editor in Chief of the Journal of Functional Programming, published by Cambridge University Press, and sits on the Executive Committee of the ACM Special Interest Group on Programming Languages. He has been invited to speak in Aizu, Buenos Aires, Copenhagen, Denver, Edinburgh, Florham Park, Gdansk, London, Montreal, New Delhi, Oxford, Portland, Rome, Santa Fe, Sydney, Tallinn, Ullapool, Victoria, Williamstown, Yorktown Heights, and Zurich.

Further information:
http://homepages.inf.ed.ac.uk/wadler/

Venue:
School of Computer Science, Jack Cole Building

Programme:
10.00 – 11.00 The unreasonable effectiveness of logic
People might be forgiven for thinking that computing is not so much a science as an industry. Ask someone to name a prominent computer scientist and you are more likely to hear the name Bill Gates than Alan Turing. In fact, computing is both a science and an industry, each stimulating the other.

Everyone knows that logic and computing have something to do with each other, but few understand the remarkable correspondence that links them. A model of logic and a model of computing, each published at the dawn of the computer era, turned out, half a century later, to coincide exactly. We will follow this correspondence through three strands of work, connecting three varieties of logic, three researchers at Edinburgh's Laboratory for the Foundations of Computer Science, and three applications to web technology.

11.00 – 11.30 Coffee

11.30 – 12.30 Call-by-value is dual to call-by-name
The rules of classical logic may be formulated in pairs corresponding to De Morgan duals: rules about "and" are dual to rules about "or". A line of work, including that of Filinski (1989), Griffin (1990), Parigot (1992), Danos, Joinet, and Schellinx (1995), Selinger (1998, 2001), and Curien and Herbelin (2000), has led to the startling conclusion that call-by-value is the De Morgan dual of call-by-name.

This lecture presents a dual calculus that corresponds to the classical sequent calculus of Gentzen (1935) in the same way that the lambda calculus of Church (1932,1940) corresponds to the intuitionistic natural deduction of Gentzen (1935). It includes crisp formulations of call-by-value and call-by-name that are obviously dual; no similar formulations appear in the literature. The paper gives a CPS translation and its inverse, and shows that the translation is both sound and complete, strengthening a result in Curien and Herbelin (2000).
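The operational difference between the two strategies, and the way a CPS translation makes evaluation order explicit, can be seen in a few lines. This is an illustrative Python sketch, far removed from the dual calculus itself: call-by-name is mimicked with thunks, and continuations stand in for the CPS translation.

```python
# Call-by-value evaluates an argument before the call; call-by-name
# passes an unevaluated thunk that is only forced on use.

def diverge():
    raise RuntimeError("this argument never terminates")

def const_cbn(thunk):
    return 42            # thunk never forced, so diverge() never runs

# const_cbn(diverge) returns 42, whereas the call-by-value call
# const_cbn(diverge()) raises before the function body is reached.

# A continuation-passing-style (CPS) translation pins evaluation order
# down in the program text: each step hands its result to an explicit
# continuation k instead of returning.
def add_cps(x, y, k):
    k(x + y)

def square_cps(x, k):
    k(x * x)

def pythagoras_cps(a, b, k):
    # the order (a*a, then b*b, then the sum) is now explicit
    square_cps(a, lambda a2:
        square_cps(b, lambda b2:
            add_cps(a2, b2, k)))
```

Choosing where arguments are evaluated in such a translation is exactly the call-by-value versus call-by-name choice that the lecture's dual calculus captures symmetrically.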

14.15 – 15.15 The Girard-Reynolds isomorphism (second edition)
Jean-Yves Girard and John Reynolds independently discovered the second-order polymorphic lambda calculus, F2. Girard additionally proved a Representation Theorem: every function on natural numbers that can be proved total in second-order intuitionistic predicate logic, P2, can be represented in F2. Reynolds additionally proved an Abstraction Theorem: every term in F2 satisfies a suitable notion of logical relation; and formulated a notion of parametricity satisfied by well-behaved models.

We observe that the essence of Girard's result is a projection from P2 into F2, and that the essence of Reynolds's result is an embedding of F2 into P2, and that the Reynolds embedding followed by the Girard projection is the identity. We show that the inductive naturals are exactly those values of type natural that satisfy Reynolds's notion of parametricity, and as a consequence characterize situations in which the Girard projection followed by the Reynolds embedding is also the identity.

An earlier version of this work used a logic over untyped terms. This version uses a logic over typed terms, similar to ones considered by Abadi and Plotkin and Takeuti, which better clarifies the relationship between F2 and P2.

AttachmentSize
2004-5-2 Programme.pdf73.58 KB
2004-5-2 Lecture1.pdf1.14 MB
2004-5-2 Lecture2.pdf409.39 KB
2004-5-2 Lecture3.pdf303.86 KB
WadlerPicture.jpg433.67 KB

Modern Cryptography

Ian Gent, Matt Robshaw, Graham Kirby, Steve Linton

Fri, 02 Dec 2005

Speaker:
Dr Matt Robshaw

Affiliation:
France Telecom

Biography:
Dr Robshaw graduated from St Andrews in 1988 and completed his PhD at Royal Holloway, University of London. During that time his research focused on cryptography in general and stream ciphers in particular. In 1993 he took up a research position with RSA Data Security in California.

After more than six years working on a variety of cryptographic projects, he left RSA as Principal Research Scientist and returned to academia. Joining the staff of Royal Holloway, University of London in 2000, he became a Reader in Information Security. Recently Dr Robshaw decided to return to industry and took a position at France Telecom Research and Development based in Paris.

Further information:
http://www.isg.rhul.ac.uk/~mrobshaw/

Venue:
AM School of Computer Science Jack Cole Building, PM Physics & Astronomy Building

Abstract:
Over the course of three lectures we will look at the development of modern cryptography. We will review the range of techniques available and the much smaller set of algorithms in everyday use.

We will also consider some new and developing research directions that are likely to influence our daily life in years to come.

Programme:
10.00 – 11.00 The Development of Modern Cryptography
2nd Year Laboratory
Jack Cole Building

11.00 – 11.30 Coffee
Coffee area
Jack Cole Building

11.30 – 12.30 The Deployment of Modern Cryptography
2nd Year Laboratory
Jack Cole Building

14.30 – 15.30 New Directions
Physics Lecture Theatre B
Physics & Astronomy Building

AttachmentSize
2005-6-1 Programme.pdf74.42 KB
2005-6-1 Lecture1.pdf694.2 KB
RobshawPicture.jpg288.39 KB

Thinking Out of the Computer Science Cargo Cult Box

Ian Gent, Harold Thimbleby, James McKinna

Tue, 02 May 2006

Speaker:
Professor Harold Thimbleby

Affiliation:
University of Wales Swansea

Biography:
Harold Thimbleby is Professor of Computer Science, Swansea University. He joined the Department in 2005, and he directs the Future Interaction Technologies Laboratory. Harold published his first paper, on menu selection, in 1978, and has since written over 360 refereed papers and articles in many forms - from newspaper articles to Encyclopedia Britannica. He wrote User Interface Design, published in the ACM Press Frontier Series in 1990; he is currently writing his fifth book, Press On, to be published by MIT Press.

He is a Royal Society-Wolfson Research Merit Award Holder. He was 28th Gresham Professor of Geometry. He was awarded the British Computer Society Wilkes Medal, and won a Toshiba Year of Invention prize. He is a visiting professor at UCL and Middlesex University.

Further information:
http://www.cs.swan.ac.uk/~csharold/

Venue:
School of Computer Science, Jack Cole Building

Programme:
Tuesday 2nd May 2006
10.00 - 11.00 The cargo cult of everyday computing (and a cure)
We are surrounded by embedded computers in interactive devices – mobile phones, car radios, airplanes, medical devices, to name just a few. Although these are massive markets and certainly meet many consumer needs, the computer science behind them is seriously flawed, and makes these devices unnecessarily hard to use.

11.00 – 11.30 Coffee

11.30 – 12.30 The cargo cult of mobile phones (and a cure)
The user interface of a mobile phone allows the user to search for phone functions, such as setting ring tones, dialing, texting, and so on. From a computer science perspective, this just requires a searching algorithm. We therefore compare typical phone algorithms with standard computer science algorithms -- and find that current phones are feeble compared to what they could be!
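To make the comparison concrete (hypothetical code, not from the lecture): a trie over the phone's function names answers a prefix query in time proportional to the query's length, whereas a naive handset effectively scans every name in the menu.

```python
class Trie:
    """Prefix tree over a phone's function names; a prefix query costs
    time proportional to the query, not to the number of functions."""
    def __init__(self):
        self.children = {}
        self.terminal = False

    def insert(self, word):
        node = self
        for ch in word:
            node = node.children.setdefault(ch, Trie())
        node.terminal = True

    def with_prefix(self, prefix):
        node = self
        for ch in prefix:               # walk down: O(len(prefix))
            if ch not in node.children:
                return []
            node = node.children[ch]
        results = []
        def collect(n, acc):            # enumerate the matching subtree
            if n.terminal:
                results.append(prefix + acc)
            for ch, child in sorted(n.children.items()):
                collect(child, acc + ch)
        collect(node, "")
        return results
```

With the phone's functions inserted once at startup, typing "ri" immediately narrows the menu to "ringer volume" and "ringtone", however many hundreds of functions the handset has.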

14.30 – 15.30 The cargo cult of calculators (and a cure)
Finally, we take a single, extended example of cargo cult computer science: despite their huge market and evident success, calculators are shown to be veritable weapons of maths destruction. After reviewing their problems, and diagnosing them as failures to apply elementary computer science, we show what solutions can be used to make them much better. A new and exciting approach (which we exhibited at the Royal Society summer science exhibition in July) will be demonstrated: indeed, a new weapon of maths construction.

Departmental Seminar
Room 1.33a, Jack Cole Building

Wednesday 3rd May, 2006
14.00 – 15.00 The cargo cult of scientific computing (and a cure)
After exposing widespread problems in the consumer market of embedded computer systems in the previous day's three lectures, we now turn to the academic domain of serious computer science to seek refreshment... Unfortunately, we find the same problems, of unreliable and flawed results widespread throughout the scientific research literature... a cargo cult computer science indeed! More importantly, we move on to discuss what we can do about it.

AttachmentSize
2005-6-2 Programme.pdf23.54 KB
2005-6-2 Lecture1.pdf3.12 MB
2005-6-2 Lecture2.pdf2.51 MB
2005-6-2 Lecture3.pdf1.63 MB
2005-6-2 Lecture4.pdf1.19 MB
ThimblebyPicture.jpg281.87 KB

If Software is the Solution, What is the Problem?

Ron Morrison, Bashar Nuseibeh, Ian Sommerville

Fri, 01 Dec 2006

Speaker:
Professor Bashar Nuseibeh

Affiliation:
The Open University

Biography:
Bashar Nuseibeh is a Professor and Director of Research in Computing at The Open University (OU), UK, and a Visiting Professor at Imperial College London and the National Institute of Informatics, Tokyo. Previously he was a Reader at Imperial College and Head of its Software Engineering Laboratory. His research interests are in software requirements engineering and design, software process modelling and technology, and technology transfer. He has published over 100 refereed papers and consulted widely with industry, working with organisations such as the UK National Air Traffic Services (NATS), Texas Instruments, Praxis Critical Systems, Philips Research Labs, and NASA. He has also served as Principal or Co-Investigator on a number of UK and EU-funded research projects on software engineering, security engineering, and learning technologies.

Professor Nuseibeh is Editor-in-Chief of the Automated Software Engineering Journal, Associate Editor of IEEE Transactions on Software Engineering, and a member of the Editorial Board of five other international journals. He was a founder and Chairman of the British Computer Society's Requirements Engineering Specialist Group (1994-2004), and is currently Chair of IFIP Working Group 2.9 (Software Requirements Engineering). He has served as Programme Chair of major conferences in his field, including the 13th IEEE International Conference on Automated Software Engineering (ASE’98), the 5th IEEE International Symposium on Requirements Engineering (RE’01), and the 27th ACM/IEEE International Conference on Software Engineering (ICSE-2005).

Professor Nuseibeh holds an MSc and PhD in Software Engineering from Imperial College London, and a First Class Honours BSc in Computer Systems Engineering from the University of Sussex, UK. He received a 2002 Philip Leverhulme Prize for outstanding research achievements in software engineering, an ICSE-2003 "Most Influential Paper" award, and a number of other best paper and service awards. In 2005 he was awarded a Senior Research Fellowship of the Royal Academy of Engineering and The Leverhulme Trust. He is a Fellow of the British Computer Society and a Chartered Engineer (C.Eng.).

Further information:
http://mcs.open.ac.uk/ban25/

Venue:
School of Computer Science, Jack Cole Building

Programme:
Over the course of three lectures, we will look at the discipline of requirements engineering, for the development of software-intensive systems. We will begin with a review of this multi-disciplinary field in the first lecture, examine some problem-oriented analysis techniques in the second lecture, and reflect on new research directions in the emerging field of security requirements engineering in the third lecture.

Generic information:
https://staffres.cs.st-andrews.ac.uk/2006_2007/Teaching/Teaching_UG/Hons/Level_3/CS3051-SE/StudRes/Lectures/DLS-Audio/

AttachmentSize
2006-7-1 Programme.pdf80.35 KB
2006-7-1 Lecture1.pdf3.19 MB
NuseibehPicture.jpg362.73 KB

Model-Driven, Component Engineering

Ron Morrison, Colin Atkinson, Ian Sommerville

Wed, 02 May 2007

Speaker:
Professor Colin Atkinson

Affiliation:
University of Mannheim

Biography:
Colin Atkinson currently holds the chair of Software Engineering at the University of Mannheim in Germany. Before that he held a joint position as a professor at the University of Kaiserslautern and project leader at the affiliated Fraunhofer Institute for Experimental Software Engineering. From 1991 until 1997 he was an Assistant Professor of Software Engineering at the University of Houston – Clear Lake. His research interests are focused on the use of model-driven and component-based approaches in the development of dependable computing systems. He received a Ph.D. and M.Sc. in computer science from Imperial College, London, in 1990 and 1985 respectively, and received his B.Sc. in Mathematical Physics from the University of Nottingham in 1983.

Further information:
http://swt.informatik.uni-mannheim.de/index.php?id=9,5,0,0,1,0

Venue:
School of Computer Science, Jack Cole Building

Abstract:
Over the course of three lectures we will look at how component and service-based systems can be developed in a model-driven way. We will review the state-of-the art in component-based development, service-oriented architectures and model-driven development and will examine how these are integrated in the KobrA method. We will then look at how these paradigms might evolve in the future.

AttachmentSize
2006-7-2 Programme.pdf77.96 KB
2006-7-2 Lecture1.pdf209.14 KB
2006-7-2 Lecture2.pdf251.64 KB
2006-7-2 Lecture3.pdf551.99 KB
AtkinsonPicture.jpg756.48 KB

Scheduling Real-time Systems

Ron Morrison, Alan Burns, Ian Sommerville

Thu, 22 Nov 2007

Speaker:
Professor Alan Burns

Affiliation:
University of York

Biography:
Professor Alan Burns has worked for many years on a number of different aspects of real-time systems engineering. He joined the University of York in January 1990 and was subsequently promoted to a Personal Chair in 1994.

His research activities have covered a number of aspects of real-time and safety-critical systems including: requirements for such systems, the specification of safety and timing needs, system architectures appropriate for the design process, the assessment of languages for use in the real-time safety-critical domain, distributed operating systems, the formal specification of scheduling algorithms and implementation strategies, and the design of dependable user interfaces to safety-critical applications.

He has authored/co-authored over 350 papers/reports and 8 books. His teaching activities include courses in Operating Systems, Scheduling and Real-time Systems.

Further information:
http://www.cs.york.ac.uk/people/bio.php?person=burns

Abstract:
Real-time systems are required to satisfy constraints over when computation takes place. For example, control loops must meet stringent periodicity and jitter requirements, and signal processing procedures must complete by defined deadlines. The main obstacle to meeting these requirements is the limited resources on which real-time systems are usually implemented. Scheduling theory is concerned with the design, implementation and evaluation of resource management algorithms.

In these talks the general notion of processor scheduling will be described, with details given of the commonly used fixed-priority scheduling approach. Results from this approach will be outlined, as will the current topics being addressed by the real-time scheduling research community. These topics include probabilistic approaches to execution-time analysis, and the challenges presented by multi-core platforms.
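The fixed-priority approach mentioned above comes with a standard schedulability test, response-time analysis, which can be sketched in a few lines. The task set below is invented purely for illustration; the iteration itself is the textbook fixed-point computation for preemptive fixed-priority scheduling.

```python
import math

def response_time(tasks, i):
    """Worst-case response time of task i under preemptive fixed-priority
    scheduling: R = C_i + sum over higher-priority tasks j of
    ceil(R / T_j) * C_j, iterated to a fixed point."""
    C, T = tasks[i]
    R = C
    while True:
        # Tasks 0..i-1 are assumed to have higher priority.
        interference = sum(math.ceil(R / tasks[j][1]) * tasks[j][0]
                           for j in range(i))
        R_next = C + interference
        if R_next == R:
            return R       # converged: worst-case response time found
        if R_next > T:
            return None    # exceeds period (taken here as the deadline)
        R = R_next

# Illustrative task set: (C, T) = (worst-case execution time, period),
# listed in rate-monotonic priority order (shorter period first).
tasks = [(1, 4), (2, 6), (3, 12)]
for i in range(len(tasks)):
    print(f"task {i}: R = {response_time(tasks, i)}")
# task 0: R = 1, task 1: R = 3, task 2: R = 10 -- all within their periods
```

A task set is deemed schedulable when every task's computed response time is no greater than its deadline, as in this example.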


Generic information:
http://studres.cs.st-andrews.ac.uk/Library/Distlec/2007-1

AttachmentSize
2007-8-1 Programme.pdf105.18 KB
2007-8-1 Lecture1.pdf144.15 KB
2007-8-1 Lecture2.pdf117.93 KB
2007-8-1 Lecture3.pdf314.62 KB
2007-8-1 Picture.jpg748.65 KB

Market-Based Systems

Alan Dearle, Dave Cliff, Ian Sommerville

Wed, 05 Mar 2008

Speaker:
Professor Dave Cliff

Affiliation:
University of Bristol

Biography:
Dave Cliff is a Professor of Computer Science at the University of Bristol. He has a BSc in Computer Science and an MA and PhD in Cognitive Science. He has previously worked in faculty posts at the University of Sussex (UK), at the MIT Artificial Intelligence Lab (USA), and at the University of Southampton (UK). He also spent seven years working in industry, initially as a senior research scientist at the Hewlett-Packard Labs European Research Centre in Bristol, UK, where he founded and led HP's Complex Adaptive Systems research group. At HP, he developed adaptive autonomous trading algorithms and automatic optimization and design techniques for market mechanisms and online exchanges. He has also been a Director for Deutsche Bank's Foreign Exchange Complex Risk Group on Deutsche's Foreign Exchange trading floor in the City of London. In October 2005, Dave was appointed Director of a UK national research and training initiative addressing issues in the science and engineering of Large-Scale Complex IT Systems (LSCITS). He is author or co-author on over 70 academic publications, inventor or co-inventor on 15 patents, and he has undertaken advisory and consultancy work for a number of major companies and for the UK Government. He has given well over 100 invited keynote lectures and seminars, and he and his work have frequently been featured in the press and on TV and radio.

Further information:
http://www.cs.bris.ac.uk/People/personal.jsp?person=143218

Venue:
Jack Cole Building, School of Computer Science

Abstract:
Over the last ten years, computer giants IBM and Hewlett-Packard have each invested significant research effort in developing algorithms that embody strategies for trading in "electronic marketplaces", and in algorithms that offer radical new types of electronic marketplace. This industrial research has been paralleled internationally by a number of academic research groups with similar ambitions. Some of this research is motivated by the desire to create autonomous agents for e-commerce applications, some of it is aimed at doing better resource allocation and control in large-scale distributed data-centers and grid systems, and some of it is aimed at creating predictive models of real financial systems. As it happens, in the last few years there has been an explosion of interest in using such techniques in the global financial markets.

Absolutely no previous knowledge of economics is required.
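To make the idea of algorithmic traders concrete: the sketch below is not one of the HP or IBM algorithms discussed in the lectures, but a "zero-intelligence constrained" (ZIC) trader in the spirit of Gode and Sunder's classic baseline, against which later adaptive traders (such as Cliff's ZIP) are usually compared. Each trader quotes a random price on the profitable side of its private limit price; the limit prices and trade rule are invented for illustration.

```python
import random

def zic_session(buyer_limits, seller_limits, rounds=2000, seed=0):
    """Zero-Intelligence-Constrained traders in a continuous double auction:
    each trader quotes a uniformly random price constrained never to trade
    at a loss. Returns the list of transaction prices (midpoints here, a
    simplification of the usual earlier-quote rule)."""
    rng = random.Random(seed)
    buyers, sellers = list(buyer_limits), list(seller_limits)
    prices = []
    for _ in range(rounds):
        if not buyers or not sellers:
            break
        b = rng.randrange(len(buyers))
        s = rng.randrange(len(sellers))
        bid = rng.uniform(0, buyers[b])      # never bid above the limit
        ask = rng.uniform(sellers[s], 200)   # never ask below the limit
        if bid >= ask:                       # crossing quotes -> a trade
            prices.append((bid + ask) / 2)
            buyers.pop(b)                    # one unit each, then leave
            sellers.pop(s)
    return prices

# Illustrative demand and supply schedules (one unit per trader).
trades = zic_session(buyer_limits=[140, 130, 120, 110, 100],
                     seller_limits=[60, 70, 80, 90, 100])
print(len(trades), "trades at prices", [round(p) for p in trades])
```

Gode and Sunder's surprising finding, which motivates much of this field, was that even such unintelligent traders extract most of the available surplus; the interesting question the lectures pursue is how much better adaptive traders can do.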

Programme:
These three lectures take a selective walk through the motivation, the background, the key results and the state of the art, and end with some wild hand-wavy speculations on where things will go next.

10.00 – 11.00 Lecture 1: Rationale and Background
Here we'll find out why computer scientists should care about market-based systems, review some notable applications, and also cover some of the background economics. They call economics "the dismal science" for a reason, so that background economics stuff won't delay us too long...
2nd Year Laboratory
Jack Cole Building

11.00 – 11.30 Coffee
Coffee area
Jack Cole Building

11.30 – 12.30 Lecture 2: Artificial Trading Agents for Fun and Profit
This lecture tells the story of some of the best-known algorithms used for autonomous "trader-robots", and how they were found to consistently beat human traders.
2nd Year Laboratory
Jack Cole Building

14.30 – 15.30 Lecture 3: What's hot, what's not, and where next: Tales from the City
This lecture looks at work on automatic optimization and design of trader-agents and online market mechanisms, with particular reference to current hot topics in automated trading technology in the financial markets.
2nd Year Laboratory
Jack Cole Building

AttachmentSize
DCliff_1_050308.pdf6.5 MB
DCliff_2_050308.pdf2.59 MB
DCliff_3_050308.pdf3.35 MB
2007-8-2 Picture.JPG2.62 MB
2007-8-2 Programme.pdf37.11 KB

Human-Computer Interaction: as it was, as it is, and as it may be

Thu, 06 Nov 2008

Speaker:
Professor Alan Dix

Affiliation:
Lancaster University

Biography:
Professor Alan Dix is Professor of Human Computer Interaction at Lancaster University. Starting as a mathematician, Alan became involved with HCI in the 1980s while at the University of York. He worked at Huddersfield and Staffordshire Universities and was involved in two startup companies before coming to Lancaster in 2000. He is the principal author of a leading HCI textbook, now in its 3rd edition. Alan has a wide range of HCI interests, from formal methods through mobile systems and CSCW to interactive art and social technologies.

Further information:
http://www.comp.lancs.ac.uk/~dixa/

Venue:
School of Computer Science, Jack Cole Building

Programme:
10.00 – 11.00 Lecture 1: Whose Computer Is It Anyway
Computers only exist because they do things for people. In that sense there is no computation that is not, to some extent, directly or indirectly part of human-computer interaction. Sometimes the computer is so buried in automatic systems that this human connection can be ignored, but for the majority of computer systems that today's students are likely to be working on during their careers, the impacts on the user are at the heart of what it means to be effective. For most of the current generation of computer users, the image of using a computer is centred around the graphical user interface of windows and icons, and the metaphorical 'desktop' set upon a real desktop somewhere.

In this first lecture I will discuss some of the well-known, and less well-known, origins of the graphical interface, and also the now established methods of design and evaluation, supporting software architectures, and techniques for formal analysis.

11.00 – 11.30 Coffee, Coffee area, Jack Cole Building


11.30 – 12.30 Lecture 2: The Great Escape
Increasingly the computer is escaping both the office desktop and the desktop interface. For half the world's population in China, the Indian subcontinent and Africa, it is likely that their primary, and maybe only, means of accessing the global world of information and computation will be through a phone. In our own homes we interact with computers in the kitchen, living room and even bathroom. Even when we feel we are in the contained world of a traditional desktop computer sitting on a desktop, still the computation spills out across the internet. Computation has no physical bounds and human interaction with information and computation is dispersed throughout our world and our lives.

In this second lecture we will see how research in human-computer interaction is addressing these challenges in areas such as mobile interfaces, ubiquitous computing and social networking.


14.00 – 15.30 Lecture 3: Connected, but Under Control, Big, but Brainy?
Our academic work, our social life, even our personal memories live not only in our computers, but out in the 'cloud'. And out there the whole web of human information is becoming linked data, semantically defined and interconnected. This same web has the information capacity and the computational power of a human brain, and yet often seems more like a giant haystack than an intelligent aid.

In this last lecture I will present some of the work I have been personally involved with that seeks to extract structure from informal human data and reason in a more humane way using highly structured formal data. I will draw on both academic and commercial experience in constructing web interfaces that allow the strengths of human and computer intelligence to work together. We will see how personal ontologies could help computers help us, and perhaps soon to reason over the whole web ... just to help you order a pizza.

Generic information:
You can find the video clips related to this lecture series by following the link https://studres.cs.st-andrews.ac.uk/Library/DistLec/2008-1/ (you must be within the School to access them).

AttachmentSize
readme.txt76 bytes
StA-L1-whose-computer-v0-2.ppt2.31 MB
human_interaction_Picture.jpg314.69 KB
StA-L2-great-escape-v0.ppt1.92 MB
StA-L3-big-and-brainy-v1-1.ppt1.16 MB

Delay Tolerant and Opportunistic Networks

Mon, 23 Feb 2009

Speaker:
Prof. Jon Crowcroft

Affiliation:
University of Cambridge

Biography:
Jon Crowcroft is the Marconi Professor of Networked Systems in the Computer Laboratory of the University of Cambridge. Prior to that he was Professor of Networked Systems at UCL in the Computer Science Department. He is a Fellow of the ACM, the British Computer Society, the IEE, the Royal Academy of Engineering and the IEEE. He was a member of the IAB 1996-2002, attended the first 50 IETF meetings, and was general chair for ACM SIGCOMM 1995-99. He has published five books; the latest is the Linux TCP/IP Implementation, published by Wiley in 2001. He is the Principal Investigator in the Computer Lab for the EU Haggle project in DTN, the EU Social Networks project, the EPSRC TINA project on location sensors and wireless networking of airports, and the ITA project in next-generation wireless networks. Industrial experience: he worked for the Bloomsbury Computer Consortium for two years, spent a sabbatical at Hewlett-Packard Research Labs Bristol, and has served on the technical advisory boards of 10 startups (Ensim, Orchestream, Bandwiz, Nexthop, Interprovider, Corvil, Ethos, Hidden Footprints and others). He is on the Technical Advisory Board for Microsoft Research Cambridge and MPI, previously for DoCoMo Labs, California, and has been visiting faculty at Intel Research. He has been a technical reviewer for corporate data network strategy for Ericsson, and a consultant to Reuters, the BBC, Nortel, Cisco and Oftel (now Ofcom), amongst others. His research interests are communications and multimedia systems, especially Internet related. He is currently on sabbatical at IMDEA Networks, Madrid.

Further information:
http://www.cl.cam.ac.uk/users/jac22/

Venue:
Cole 0.35 MSc Lab

Abstract:
We are so used to networks that are "always there", so-called infrastructural networks such as the phone system, the Internet and the cellular networks (GSM, CDMA, 3G), that we forget that once upon a time (why, only in the 1970s) computer communications was fraught with problems of reliability, and challenged by the very high cost (or low availability) of connectivity and capacity. Once we had UUCP and E-Mail, which predate any of today's infrastructures but coped very well with these challenges. Now it appears that it is worth revisiting these ideas for a variety of reasons: it looks like we cannot afford to build a Solar System-wide Internet just yet; it looks like one can build effective end-to-end mobile applications out of wireless communication opportunities that arise from infrequent and short contacts between devices carried by people in close proximity, and then wait until these people move on geographically to the next hop; and it is interesting to speculate that these systems may actually have much higher potential capacity than infrastructural wireless access networks, although they present other challenges (notably higher delay). This set of talks will be about the last 10 years of work leading up to our current understanding of how to build Delay Tolerant and Opportunistic Networks, and how to model their performance.

Programme:
10.00 – 11.00 Lecture 1: Delay Tolerant Networking - it really is rocket science.
Room 0.35
(Jack Cole Building)
In this lecture we review the DTN work over the last 10 years, starting from the origins as an initiative to provide a commodity network system for the planned manned mission to Mars, and ending up with an architecture for any network that is challenged by frequent disruptions, from oceanographic sensing, to disaster relief communications when everything else has failed, and on to building useful services in places where the population cannot (yet) afford an infrastructure.
11.00 – 11.30 Coffee
Coffee Room
(Jack Cole Building)
Coffee & Tea with Biscuits

11.30 – 12.30 Lecture 2: Opportunistic Networking - Making people network.
Room 0.35
(Jack Cole Building)
4 billion people have cell phones. Most have not only a radio for voice communication, but also Bluetooth, a short-range radio which allows devices to communicate directly (without involving a cell tower). Increasingly, devices also have WiFi, which can likewise be used without any recourse to a provider. We can build networks that use encounters between devices carried by people, and then use the natural mobility of humans (walking, cycling, driving, in trains, planes etc.) to carry stored data to the next hop. Such systems can be used in a wide variety of scenarios: for disaster communication when the infrastructure is broken, for networks in developing regions (or out in the middle of the ocean, or in space) where there isn't any infrastructure anyway, and for applications which may enjoy high capacity but do not mind higher (or even uncertain) delays. In the process of designing and building such systems, we may accidentally (on purpose) design systems that simply work better in the now more traditional setting of the Internet, but cope more seamlessly with the occasional glitches that show up there. We may find it easier to build applications on such systems that tolerate occasional (or frequent) disruptions. One interesting synergy I will touch on here is that applications for opportunistic networks often entail an unspecified sender or recipient (i.e. they are data dissemination applications), which resembles some of the new ideas in Data Oriented Networking in the Internet. At the same time as resource pooling, multipath routing and interest-based delivery are being explored for the Internet to support this, such approaches have already proved natural in designing data forwarding schemes in Opportunistic Networks. We'll look at one such protocol.
14.30 – 15.30 Lecture 3: How much delay must I tolerate in my DTN/Oppnet?
Room 0.35
(Jack Cole Building)
It turns out that we have a wealth of data emerging from measurements made by wireless networking researchers (e.g. on Crawdad), data that is also increasingly of interest to social scientists (anthropologists trying to understand human society) and medical researchers (epidemiologists trying to understand the spread and evolution of diseases).
In this final part of the lecture, I will look at the emerging models we have both for delivery success and delay, and for capacity of DTNs.
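The simplest delay model of this kind can be sketched as a toy simulation. The sketch below assumes uniformly random pairwise encounters and unlimited copying (epidemic forwarding); the node count and seeds are illustrative, not drawn from the measurement datasets the lecture discusses.

```python
import random

def epidemic_delivery_delay(n=50, seed=1):
    """Toy store-carry-forward model: at each time step one uniformly
    random pair of nodes meets, and a node holding a copy of the message
    hands one over (epidemic forwarding). Returns the number of
    encounters until node n-1 (the destination) receives the message
    that originates at node 0."""
    rng = random.Random(seed)
    has_copy = [False] * n
    has_copy[0] = True
    t = 0
    while not has_copy[n - 1]:
        t += 1
        a, b = rng.sample(range(n), 2)   # one random encounter
        if has_copy[a] or has_copy[b]:
            has_copy[a] = has_copy[b] = True
    return t

# Average delay over a few runs: as more nodes carry copies, delivery
# accelerates -- the "infection" spreads logistically, exactly the
# analogy with epidemiology mentioned above.
delays = [epidemic_delivery_delay(seed=s) for s in range(20)]
print(sum(delays) / len(delays))
```

Replacing the uniform-encounter assumption with empirical contact traces (e.g. from Crawdad) is precisely what turns toy models like this into the delivery and capacity models the lecture covers.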

Generic information:
You can find the video clips related to this lecture series by following the link https://studres.cs.st-andrews.ac.uk/Library/DistLec/2009-1/ (you must be within the School to access them).




AttachmentSize
Crowcroft Picture.JPG969.08 KB
Lecture 0.ppt550.5 KB
Lecture 1.pdf4.86 MB
Lecture 2.ppt2.4 MB
Lecture 3 (01-13).ppt2.05 MB
Lecture 3 (14-25).ppt3.4 MB
Lecture 3 (26-37).ppt2.63 MB

Cryptography: From Black Art to Popular Science

Wed, 18 Nov 2009

Speaker 1:
Prof Fred Piper

Affiliation (Speaker 1):
Royal Holloway, University of London

Biography (Speaker 1):
Prof Fred Piper BSc PhD (London) CEng CMath FIEE ARCS DIC FIMA was appointed Professor of Mathematics at the University of London in 1975 and has worked in information security since 1979. In 1985, he formed a company, Codes & Ciphers Ltd, which offers consultancy advice in all aspects of information security. He has acted as a consultant to over 80 companies including a number of financial institutions and major industrial companies in the UK, Europe, Asia, Australia, South Africa and the USA. The consultancy work has been varied and has included algorithm design and analysis, work on EFTPOS and ATM networks, data systems, security audits, risk analysis and the formulation of security policies. He has lectured worldwide on information security, both academically and commercially, has published more than 100 papers and is joint author of Cipher Systems (1982), one of the first books to be published on the subject of protection of communications, Secure Speech Communications (1985), Digital Signatures - Security & Controls (1999) and Cryptography: A Very Short Introduction (2002).

Fred has been a member of a number of DTI advisory groups. He has also served on a number of Foresight Crime Prevention Panels and task forces concerned with fraud control, security and privacy. He is currently a member of the Scientific Council of the Smith Institute, the Board of Trustees for Bletchley Park and the Board of the Institute of Information Security professionals. He is also a member of (ISC)2’s European Advisory Board, the steering group of the DTI’s Cyber Security KTN, ISSA’s advisory panel and the BCS’s Information Security Forum.

In 2002, he was awarded an IMA Gold Medal for “services to mathematics” and received an honorary CISSP for “leadership in Information Security”. In 2003, Fred received an honorary CISM for “globally recognised leadership” and “contribution to the Information Security Profession”.

Speaker 2:
Prof Peter Wild

Affiliation (Speaker 2):
Royal Holloway, University of London

Biography (Speaker 2):
Prof Peter Wild BSc (Adelaide) PhD(London) received his B.Sc. (Hons) degree in Pure Mathematics in 1976 from the University of Adelaide, and the Ph.D. degree in Mathematics in 1980 from the University of London. He has worked at the Ohio State University, Columbus, Ohio; the University of Adelaide; and with the CSIRO, Australia. In 1984 he joined Royal Holloway where he is currently employed as a Professor in Mathematics. His research interests are in combinatorics, design theory, cryptography and coding theory. He has acted as a data security consultant for a number of companies offering advice in algorithm analysis, key management and user identification protocols.

Venue:
Purdie Building LTB

AttachmentSize
Programme dls_sem1 09.pdf32.27 KB
lec Picture.jpg278.21 KB
Slides from Third Lecture (PDF)341.73 KB
Slides from first lecture (PDF)259.25 KB
Slides from second lecture (PDF)267.53 KB

Parallelism and the Exascale Challenge

Thu, 29 Apr 2010

Speaker:
Prof Arthur Trew

Affiliation:
Director of EPCC, University of Edinburgh

Biography:
Originally an astrophysicist, Arthur Trew became increasingly interested in the use of computers to solve problems in physics and astronomy which were not amenable to traditional techniques. In 1990 he worked with a small group in the Department of Physics and Astronomy at Edinburgh to found EPCC as a research institute for novel computational science research and development. Today, EPCC has 75 staff and is one of the leading centres in Europe, undertaking a wide range of collaborative projects with academic researchers and industry. In 1995, Arthur became Director of EPCC; in this role one of his key aims is exploiting computational science linkages between academic disciplines and between academia and industry.

In 2001, he became the Deputy Director of the National e-Science Centre (NeSC), which aims to understand better how to manage, and extract information from, large scientific data sets. He is also a Director of UOE HPCx Ltd, a wholly-owned subsidiary of the University of Edinburgh, which was formed to manage the £54M HPCx and, more recently, the £113M HECToR projects. As the Service Director, he is the main link to the Research Councils for both facilities.

Today, he holds a wide range of research grants and contracts and, since 2006, has held the Chair of Computational Science at the University of Edinburgh.


Venue:
PHYSICS (lecture theatre A)

Abstract:
Work is underway to develop computers able to perform a million, million, million basic numerical operations per second (1 exaflops, in the jargon) and manage a million, million, million bytes of data (1 exabyte). By enabling expensive, dangerous or impossible experiments to be replaced by computer simulations, these computers will be central to the development of many scientific fields in the 2020s. These computers will inevitably be massively parallel, dividing their work among vast numbers of "ordinary" microprocessors. This short course of lectures will review the challenges of building, managing and, especially, programming these giant future computers, and explore some of the approaches currently being developed to meet the "Exascale Challenge".

Programme:
Lecture 1: Computer Simulation: Setting the Scene

Computer simulation is now generally regarded as the third scientific paradigm, complementing theory and experiment, and is of particular use when the system is too complex for theoretical investigation and too large, too small, too fast, too slow, or simply too expensive to experiment on. Meeting the computational needs in many fields has already driven us towards parallel computing approaches, but scientific needs are continuing to outstrip our ability to deliver fast enough computers, especially now that microprocessor clock frequencies have peaked. This lecture will review the scientific drivers and the challenges we face in achieving increasing performance.

Lecture 2: The Exascale Challenge

Today, the fastest computers in the world struggle to deliver petaflops (10^15 flops) performance, but design work is already underway on an exaflops (10^18 flops) computer and the applications to run on it. Achieving such an increase in performance will require a quantum leap in computer technology, and hence research into a wide variety of computer science problems: from fault tolerance to power-aware system software, and from new programming paradigms to validation methods. This lecture will consider the key challenges to be faced over the next 5-10 years if we are to build a workable exascale computer.

Lecture 3: The Exascale Solution(?)

Building a useable exascale computer will require us to face some challenges on an unprecedented scale. In other cases, problems such as fault tolerance, considered solved for many years, recur. In all cases, the solution must be accomplished within extremely tight limits on bandwidth and power consumption. This lecture will review recent studies into exascale computer design.

AttachmentSize
Programme dls_sem2 10.pdf31.44 KB
Lecture1.pdf5.51 MB
Lecture 2.pdf2.87 MB
Lecture 3.pdf1.19 MB
Trew_Picture.jpg462.79 KB
IMGP3719.JPG2.13 MB
IMGP3729.JPG2.18 MB
IMGP3732.JPG1.74 MB
IMGP3740.JPG2.14 MB
IMGP3741.JPG1.65 MB
IMGP3742.JPG1.69 MB
IMGP3743.JPG2.05 MB
IMGP3745.JPG2.1 MB
IMGP3746.JPG1.88 MB
Lecture1.aac9.87 MB
Lecture2.aac12.24 MB
Lecture3.aac9.74 MB

Machines Reasoning about Machines

Mon, 15 Nov 2010

Speaker:
J Strother Moore

Affiliation:
Admiral B.R. Inman Centennial Chair in Computing Theory at the University of Texas at Austin

Biography:
J Strother Moore holds the Admiral B.R. Inman Centennial Chair in Computing Theory at the University of Texas at Austin. He is also a Visiting Professor at the University of Edinburgh, where he spends several months each year. He is the author of many books and papers on automated theorem proving and mechanical verification of computing systems. Along with Boyer he is a co-author of the Boyer-Moore theorem prover and the Boyer-Moore fast string searching algorithm.
With Matt Kaufmann he is the co-author of the ACL2 theorem prover. Moore got his SB from MIT in 1970 and his PhD from the University of Edinburgh in 1973. Moore was a founder of Computational Logic, Inc., and served as its chief scientist for ten years. He served as chair of the UT Austin CS department for eight years. He and Bob Boyer were awarded the McCarthy Prize in 1983 and the Current Prize in Automatic Theorem Proving by the American Mathematical Society in 1991. In 1999, they were awarded the Herbrand Award for their work in automatic theorem proving. Boyer, Moore, and Kaufmann were awarded the 2005 ACM Software Systems Award for the Boyer-Moore theorem prover.
Moore is a Fellow of both the American Association for Artificial Intelligence and the ACM and is a member of the National Academy of Engineering.

Venue:
Cole 133a/b, Physics (Lecture Theatre B)

Abstract:
Computer hardware and software can be modeled precisely in mathematical logic. If expressed appropriately, these models can be executable. The "appropriate" logic is an axiomatically formalized functional programming language. This allows models to be used as simulation engines or rapid prototypes. But because they are formal they can be manipulated by symbolic means: theorems can be proved about them, directly, with mechanical theorem provers. But how practical is this vision of machines reasoning about machines?

In this highly personal talk, I will describe the 40-year history of the "Boyer-Moore Project" and discuss progress toward making formal verification practical. Among other examples I will describe important theorems about commercial microprocessor designs, including parts of processors by AMD, Motorola, IBM, Rockwell-Collins and others. Some of these microprocessor models execute at 90% of the speed of C models and have had important functional properties verified. In addition, I will describe a model of the Java Virtual Machine, including class loading and bytecode verification, and the proofs of theorems about JVM methods. In the latter half of this 3-hour seminar we will look closely at how such machines are formalized and how the theorem prover is "taught" to reason about them, by looking at simpler examples drawn from list processing and a "toy" version of the JVM.
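The core idea, an executable formal model about which theorems can be stated, can be illustrated in miniature. The sketch below is a hypothetical toy, not an ACL2 example from the lectures: a tiny expression language, a stack machine, and a compiler, all ordinary executable functions, so the compiler-correctness "theorem" run(compile(e)) = eval(e) can be checked mechanically (here only by testing, where a theorem prover like ACL2 would prove it by induction for all expressions).

```python
def eval_expr(e):
    """The specification: e is an int literal or a tuple ('+', e1, e2)."""
    if isinstance(e, int):
        return e
    _, e1, e2 = e
    return eval_expr(e1) + eval_expr(e2)

def compile_expr(e):
    """Compile an expression to code for a two-instruction stack machine."""
    if isinstance(e, int):
        return [('push', e)]
    _, e1, e2 = e
    return compile_expr(e1) + compile_expr(e2) + [('add',)]

def run(code, stack):
    """The executable machine model: interpret stack-machine code."""
    for instr in code:
        if instr[0] == 'push':
            stack = stack + [instr[1]]
        else:  # 'add' pops two values and pushes their sum
            stack = stack[:-2] + [stack[-2] + stack[-1]]
    return stack

# The "theorem", checked on one instance of the model:
e = ('+', 1, ('+', 2, 3))
assert run(compile_expr(e), []) == [eval_expr(e)]
print(run(compile_expr(e), []))  # [6]
```

The lectures' point is that the same style scales: when the machine is a formal JVM or microprocessor model rather than this toy, the equality above becomes a theorem proved once for all programs, not tested case by case.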

Programme:
There will be 3 one-hour sessions, starting at 11.00am.

AttachmentSize
Programme_dls_sem1_10.pdf29.1 KB
lecture1.pdf552.97 KB
lecture2.pdf119.26 KB
lecture3.pdf99.66 KB
lecture1-session-log.txt29.73 KB
lecture2-session-log.txt53.34 KB
lecture3-session-log.txt127.65 KB

From Recommendation to Reputation: Information Discovery Gets Personal

Fri, 22 Apr 2011

Speaker:
Barry Smyth

Affiliation:
University College Dublin

Biography:
Prof. Barry Smyth holds the Digital Chair of Computer Science at University College Dublin. He is the Director of CLARITY.

Abstract:
These lectures will focus on how personalization techniques and recommender systems are being used in response to the information overload problem that faces web users every day. Personalization research brings together ideas from artificial intelligence, user profiling, information retrieval and user-interface design to provide users with more proactive and intelligent information services that are capable of predicting the needs of individuals and adapting to their implicit preferences. We will review core ideas from recommender systems research, drawing on the many practical examples that have underpinned modern web success stories, from e-commerce to mobile applications. In addition we will explore how the next generation of web search is likely to be influenced by recommender systems techniques that can facilitate a more social and collaborative approach to web search, which complements the purely algorithmic focus of contemporary search engines.
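One of the core ideas reviewed here, user-based collaborative filtering, fits in a short sketch. The ratings matrix below is invented for illustration; the method (cosine similarity between users, then similarity-weighted scores for unrated items) is the standard textbook formulation, not a specific system from the lectures.

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors (0 = unrated)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(ratings, user, k=2):
    """User-based collaborative filtering: score each item the user has
    not rated by the similarity-weighted ratings of the k most similar
    users, and return the unrated items ranked by that score."""
    sims = sorted(((cosine(ratings[user], r), u)
                   for u, r in enumerate(ratings) if u != user),
                  reverse=True)[:k]
    scores = {}
    for item, rating in enumerate(ratings[user]):
        if rating == 0:  # unrated by this user
            num = sum(s * ratings[u][item] for s, u in sims)
            den = sum(s for s, _ in sims)
            scores[item] = num / den if den else 0.0
    return sorted(scores, key=scores.get, reverse=True)

# Rows = users, columns = items, 0 = no rating (illustrative data).
R = [[5, 4, 0, 1],
     [4, 5, 4, 0],
     [1, 0, 5, 4],
     [0, 1, 4, 5]]
print(recommend(R, user=0))  # unrated items ranked for user 0
```

User 0's tastes align with user 1's, so user 1's high rating of item 2 dominates the prediction, which is exactly the "people like you liked this" intuition the lectures build on.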

Programme:
Physics, Lecture Theatre B: 11.00 – 12.00
Purdie, Lecture Theatre A: 14.00 – 17.00

Generic information:
The slides for this lecture are available here.

AttachmentSize
Programme dls_sem2 11.pdf260.59 KB