We are starting to work on this right now and, as usual, will use the blog as a platform for discussion. Some background for the workshop can be found in the post “Starting a GSS reading list”, in particular the paper “Algorithms, Games and the Internet” by Papadimitriou. The internet – which includes the users! – is a paradigmatic example of a global system, and studying it by means of iterated games in large populations with randomized matches between players – which include computers! – turns out to be a very fruitful approach. Remarkably, the same approach makes a lot of sense when studying markets, including the marketplaces that play an essential role in the dynamics of the global urban system.
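To make the approach concrete, here is a minimal sketch of iterated games in a large population with randomized matching – a toy Prisoner's Dilemma with imitate-the-better-peer updating. The payoffs, population size and update rule are my own illustrative assumptions, not any specific model from the literature:

```python
import random

# Payoffs for the Prisoner's Dilemma: (my payoff, opponent's payoff).
PAYOFF = {
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}

def play_round(strategies, rng):
    """Randomly pair the whole population and return each agent's payoff."""
    order = list(range(len(strategies)))
    rng.shuffle(order)
    payoffs = [0] * len(strategies)
    for i, j in zip(order[::2], order[1::2]):
        pi, pj = PAYOFF[(strategies[i], strategies[j])]
        payoffs[i], payoffs[j] = pi, pj
    return payoffs

def imitate(strategies, payoffs, rng):
    """Each agent copies a randomly chosen peer if the peer scored higher."""
    new = list(strategies)
    for i in range(len(strategies)):
        j = rng.randrange(len(strategies))
        if payoffs[j] > payoffs[i]:
            new[i] = strategies[j]
    return new

rng = random.Random(0)
strategies = ["C"] * 50 + ["D"] * 50
for _ in range(20):
    payoffs = play_round(strategies, rng)
    strategies = imitate(strategies, payoffs, rng)
print(strategies.count("D"), "defectors out of", len(strategies))
```

Even this toy version exhibits the point of the approach: aggregate behaviour (how cooperation spreads or collapses) emerges from nothing but local interactions and random matching.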
Some pointers regarding Tuesday AM discussion (posted by SB)
– Discussion about how shocks can get amplified in a network economy
Bak, P., Chen, K., Scheinkman, J., Woodford, M., 1993. Aggregate fluctuations from independent sectoral shocks: self-organized criticality in a model of production and inventory dynamics. Ricerche Economiche 47, 3–30.
– Related to this: is there any cascade model of how the subprime market was able to trigger the financial crisis? One argument runs as follows: the volume of the subprime market was very small compared to overall global financial assets. However, instability had been building up because the risk associated with all mortgage-related securities had been underestimated. The default of subprime mortgages then triggered a shift in market players’ beliefs about a much larger portion of the market.
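A back-of-the-envelope way to see how a small shock can be amplified is a threshold cascade on a random network: each institution defaults once the fraction of its defaulted counterparties exceeds its tolerance, and the underestimated risk in the argument above corresponds to a low tolerance. The graph construction, thresholds and numbers below are purely illustrative assumptions:

```python
import random

def cascade(n, degree, threshold, seed_frac, rng):
    """Fraction of nodes in default after seeding a small initial shock.

    A node defaults once the fraction of its neighbours in default
    exceeds `threshold` -- a stand-in for how fragile beliefs about
    asset quality make the system.
    """
    # Toy network: each node picks `degree` random counterparties.
    neighbours = [rng.sample(range(n), degree) for _ in range(n)]
    failed = set(rng.sample(range(n), max(1, int(seed_frac * n))))
    changed = True
    while changed:
        changed = False
        for v in range(n):
            if v in failed:
                continue
            bad = sum(1 for u in neighbours[v] if u in failed)
            if bad / degree > threshold:
                failed.add(v)
                changed = True
    return len(failed) / n

rng = random.Random(1)
# The same tiny shock (1% of nodes), two different levels of fragility.
robust  = cascade(500, 6, threshold=0.5,  seed_frac=0.01, rng=rng)
fragile = cascade(500, 6, threshold=0.15, seed_frac=0.01, rng=rng)
print(f"robust: {robust:.2f}  fragile: {fragile:.2f}")
```

With a high tolerance the shock stays local; with a low one (one defaulted counterparty is enough to tip a node) the same 1% shock sweeps through essentially the whole network – the qualitative shape of the subprime argument.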
Information, contributions and presentations for the Workshop on Urban Development and Global Systems Science, organized by EUNOIA (Brussels, Feb. 13-14, 2013), are now available.
The Centre for Systems Studies, based in the Business School at the University of Hull (UK), has a strong international reputation for its cutting-edge work on the theory, methodology and practice of systems thinking. The Business School is currently advertising 11 PhD scholarships: 7 funded by the University, and 4 funded by the Business School itself (the terms and conditions of the two types of scholarship are slightly different – see below).
We would like to strongly encourage people with exciting research ideas in the broad field of systems thinking to apply for these scholarships. While there will be applicants from across all the research interests of the Business School, and the scholarships are allocated on merit, the Centre for Systems Studies has in previous years always been able to put forward several strong candidates, and we hope to do so again in 2013.
Please click on the links below for details of the two types of scholarship. Applicants need to apply for both a PhD place and a scholarship by 11 January 2013, with a view to starting a full-time PhD in September 2013.
For details on how to apply to our PhD programme go to:
For the 7 University Scholarships, see:
For the 4 Business School Scholarships, see:
Notes from the Saturday ICT workshop (by Patrik Jansson, 2012-11-22)
(Ilan Chabay started out with a summary of the Thursday and Friday Narratives workshops – that part is reported elsewhere.)
The second topic was introduced by Jeremy Gibbons. We need robust modelling – we cannot assume a single shared context, even within a long-lived single-person project, and more urgently still in larger collaborations. We need assumptions to be explicit, documented, transparent and checkable. Challenge 1: make computational science results transparent and repeatable. Challenge 2: provide languages that let you write a high-level model of your program and let the computer generate the low-level code.
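A tiny sketch of what Challenge 2 asks for: one high-level model serving both as the reference semantics and as the source for generated low-level code. The expression AST and the C-style output below are my own illustrative choices, not a proposal from the talk:

```python
from dataclasses import dataclass

class Expr: pass

@dataclass
class Var(Expr):
    name: str

@dataclass
class Const(Expr):
    value: float

@dataclass
class Add(Expr):
    left: Expr
    right: Expr

@dataclass
class Mul(Expr):
    left: Expr
    right: Expr

def evaluate(e, env):
    """Reference semantics: run the high-level model directly."""
    if isinstance(e, Var):   return env[e.name]
    if isinstance(e, Const): return e.value
    if isinstance(e, Add):   return evaluate(e.left, env) + evaluate(e.right, env)
    if isinstance(e, Mul):   return evaluate(e.left, env) * evaluate(e.right, env)
    raise TypeError(e)

def to_c(e):
    """Generate low-level (C-style) code from the same model."""
    if isinstance(e, Var):   return e.name
    if isinstance(e, Const): return repr(e.value)
    if isinstance(e, Add):   return f"({to_c(e.left)} + {to_c(e.right)})"
    if isinstance(e, Mul):   return f"({to_c(e.left)} * {to_c(e.right)})"
    raise TypeError(e)

model = Add(Mul(Var("a"), Var("x")), Var("b"))        # a*x + b
print(to_c(model))                                    # ((a * x) + b)
print(evaluate(model, {"a": 2.0, "x": 3.0, "b": 1.0}))  # 7.0
```

Because both the interpreter and the code generator read the same model, the assumptions live in one explicit, checkable place – which is exactly the transparency that Challenge 1 asks for.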
The third topic was presented by Michael Resch: “Verification and Validation of Simulation Models”. There is a chain (or tower) of models, from theory through modelling, numerical modelling (such as discretization) and programming to running the code and interpreting the results. To be sure about the validity of the results we need Challenge 3: validation and verification at each step (each level). This is a major challenge with many sub-parts. If we carefully explain all the potential “bugs” that could in principle invalidate our results, we could easily project the image that “they have no credibility”. Hence the pedagogical Challenge 4: how do we present results together with their uncertainties? There is also a historical dimension, as science moves forward and consensus changes (due to improvements in theory, models and data): journalists dig up old results (which we now know are incorrect) and make headlines out of the “contradictions” they find.
The last discussion topic was introduced by David De Roure: “Knowledge Infrastructure for Global Systems Science”. This comes back to the transparency and repeatability (and the multiple meanings of those terms) mentioned by Jeremy. The main message was that methods are as important as the data. Bundles of workflows, documents and data make up “computational research objects”. An important Challenge 5 here is how to represent these research objects so that they can be mixed and matched freely. Some support for automatic curation and repair would also be needed.
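One way such a bundle could be represented as a data structure – the fields and the checksum-based curation check below are illustrative assumptions of mine, not an existing standard:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class Artifact:
    """One component of a research object: a workflow, document or dataset."""
    name: str
    kind: str           # e.g. "workflow", "document", "data"
    content: bytes
    checksum: str = ""

    def __post_init__(self):
        # Record a fingerprint at bundling time.
        self.checksum = hashlib.sha256(self.content).hexdigest()

@dataclass
class ResearchObject:
    """A bundle of workflows, documents and data that travels together."""
    title: str
    artifacts: list = field(default_factory=list)

    def verify(self):
        """Curation check: does every artifact still match its checksum?"""
        return all(
            hashlib.sha256(a.content).hexdigest() == a.checksum
            for a in self.artifacts
        )

ro = ResearchObject("toy-study")
ro.artifacts.append(Artifact("pipeline", "workflow", b"step1; step2"))
ro.artifacts.append(Artifact("results", "data", b"1,2,3"))
print(ro.verify())                       # True

ro.artifacts[1].content = b"tampered"    # simulate silent corruption
print(ro.verify())                       # False
```

The `verify` step is the seed of the “automatic curation and repair” idea: once components carry their own fingerprints, a repository can detect which part of a mixed-and-matched bundle has drifted.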
Saturday ICT chairs + presenters
- Patrik Jansson – Chalmers Univ. of Techn., email@example.com
- Co-chair of “Models and Narratives in GSS”
- Ilan Chabay
- Co-chair and talk: “Models and Narratives in Global Systems Science”
- Jeremy Gibbons
- Talk: Dependable Modelling
- Michael Resch
- Talk: Verification and Validation of Simulation Models
- David De Roure
- Talk: “Knowledge Infrastructure for Global Systems Science”
- Ulf Dahlsten (first hour)
- Ralph Dum
- David Tabara
- several others (unfortunately I did not make a list)
Notes from the Friday ICT workshop (by Patrik Jansson, 2012-11-22)
The Friday ICT workshop started with a welcome by the chair, Patrik Jansson (Chalmers), presenting the workshop theme “Computer Science meets Global Systems Science” and the participants. After a brief round of introductions, the two-hour workshop slot was split into three rounds of presentation plus discussion.
First out was Martin Elsman (HIPERFIT), talking about domain-specific languages (DSLs) in general and about how computer science can help modellers handle risk modelling and prediction in particular. The examples given concerned modelling complex financial contracts within one bank. In the discussion, Ulf identified the challenge “simulate the global financial markets and different regulations”. (Several workshop participants could be part of a consortium around this topic.) Another challenge identified was to avoid the “tower of Babel” problem – making sure that the different DSLs and models have well-defined semantics and share a common infrastructure.
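To give a flavour of what a contract DSL can look like, here is a toy sketch in the spirit of combinator libraries for financial contracts. The combinators and the flat-rate valuation are illustrative assumptions of mine, not Elsman's actual design:

```python
from dataclasses import dataclass

@dataclass
class Contract: pass

@dataclass
class ZCB(Contract):
    """Zero-coupon bond: pay `amount` at time `t` (in years)."""
    t: float
    amount: float
    currency: str = "EUR"

@dataclass
class Both(Contract):
    """Hold two contracts at once."""
    left: Contract
    right: Contract

@dataclass
class Scale(Contract):
    """Multiply all payments by a constant factor."""
    factor: float
    contract: Contract

def value(c, rate=0.03):
    """Present value under a single flat interest rate (a toy model)."""
    if isinstance(c, ZCB):
        return c.amount / (1 + rate) ** c.t
    if isinstance(c, Both):
        return value(c.left, rate) + value(c.right, rate)
    if isinstance(c, Scale):
        return c.factor * value(c.contract, rate)
    raise TypeError(c)

# A small portfolio: 100 EUR in one year plus twice 50 EUR in two years.
portfolio = Both(ZCB(1, 100.0), Scale(2.0, ZCB(2, 50.0)))
print(round(value(portfolio), 2))   # 191.35
```

The point of the combinator style is that complex contracts are built from a handful of well-defined primitives, so every valuation function (here a single flat rate; in practice a proper stochastic model) works for all contracts by construction.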
The second part was introduced by Zhangang Han (Beijing), talking about Human Machine Integrated Decision Making. (Picture: a big command centre with a common large screen and many terminals. It could be used for natural-disaster command and control or for more long-term planning.) Lots of data come in as input, models are run, and domain experts interpret the output and help decision makers decide on the next actions. Challenge 1: find a compatible framework which can incorporate workers in their different roles so that they can communicate efficiently. Challenge 2: real-world data always contradict each other – what is the correct interpretation? Challenge 3: visualise what is going on and map the data and interpretations onto policies.
The discussion raised a few other points as well. When we present results to real decision makers we often get feedback which contradicts our results. Based on the assumption that the policy-makers really know things we don’t, we can sometimes adjust parameters to make the results coincide (though sometimes this is just confusing).
As a recent example, Z. Han’s group modelled education in Beijing, China, where the number of pupils is going down and some schools need to be merged (closed down). The group could then show simulations of the alternatives.
Patrik: we seem to need a whole infrastructure (or ecosystem) of raw data, analysed data at different levels, simulation results and interpretations. Perhaps new standards, data collection agencies, etc. are needed. Chris Barrett remarked that data provenance and meta-data management schemes are important, and that survey science and survey statistics have methods which should be adaptable here.
The third topic (education) was introduced by Johan Jeuring (Utrecht) with a presentation on “Technology for learning modelling languages”. Two examples were presented: an online mathematical problem-solving tool and an automatic programming tutor which gives hints for next steps or checks your own steps. Challenge 1: how do we set up generic systems for learning modelling languages (systems which can be specialised to many modelling languages)? Challenge 2: how do we give feedback? Question: should awareness/dissemination be done via Massive Open Online Courses? Challenge 3: how do we handle multiple natural languages? Merijn: learning is also a social experience – in an online interactive course the students interact both with the system and with their peers.
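A caricature of how such a step-based tutor could check steps and give hints: the exercise carries its worked solution as a list of intermediate forms, and the tutor either validates a student's proposed step or reveals the next one. The representation below is an illustrative assumption, far simpler than a real tutor:

```python
# Worked solution for a small exercise, as a list of intermediate forms.
WORKED_SOLUTION = ["2*x + 4 = 10", "2*x = 6", "x = 3"]

def check_step(current, proposed):
    """Is `proposed` a valid later form, reachable from `current`?"""
    i = WORKED_SOLUTION.index(current)
    return proposed in WORKED_SOLUTION[i + 1:]

def hint(current):
    """Reveal the next intermediate form, or None if the exercise is done."""
    i = WORKED_SOLUTION.index(current)
    return WORKED_SOLUTION[i + 1] if i + 1 < len(WORKED_SOLUTION) else None

print(check_step("2*x + 4 = 10", "2*x = 6"))   # True
print(check_step("2*x + 4 = 10", "x = 5"))     # False
print(hint("2*x = 6"))                         # x = 3
```

A real system would of course compare steps up to algebraic equivalence and support many solution paths, which is precisely where Challenge 1 (generic, specialisable machinery) bites.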
Friday ICT chairs + presenters
- Patrik Jansson; Chalmers Univ. of Techn., firstname.lastname@example.org
- Chair of “Computer Science meets GSS”
- Martin Elsman; DIKU, email@example.com
- Talk: “Domain Specific Languages”
- Zhangang Han; Beijing Normal Univ., firstname.lastname@example.org
- Talk: Human Machine Integrated Decision Making
- Johan Jeuring; Utrecht University and Open University, CS professor, DSLs, teaching technology
- Talk: Technology for Learning Modelling Languages
- Colin Harrison; IBM’s Enterprise Initiatives Team
- Plenary talk “Information Society and Energy Addiction”
Other Friday ICT workshop participants:
- Chris Barrett, Virginia Bioinformatics Institute
- David de Roure, director of the eResearch center, Oxford, (digital social science)
- Jeff Johnson, Open Univ., UK
- John Sutcliffe-Braithwaite, PublicComputing BV, Computer engineer, work for small company, spec.
- Jonathan Reades, UCL, London, large data sets
- José Javier Ramasco, Mallorca, Complex networks
- Katarzyna Szkuta, crossover, Tech4i2, ltd
- Luis Bettencourt, Santa Fe
- Mario Rasetti, Institute of Scientific Interchange Foundation (ISI), Torino, emeritus, Physicist
- Maxi San Miguel, CSIC, Mallorca, Spain, Physicist by training, com
- Mereijn Terheggen, Factlink company, collective knowledge platform
- Per Öster, CSC, Finland (Director of research environments, ICT tools for research)
- Qian Ye, Beijing Normal University, Integrated risk governance
- Steven Bishop, Math
- Ulf Dahlsten
- Vittorio Loreto
- Wanglin Yan, Keio University, Tokyo, GIS, Sustainability