1st Open GSS Conference

GSS conference – two plenary summary transcripts already available

Dear colleagues,

Please find attached two summary transcripts of the plenaries we held at our exciting conference. I think this is prime reflective material for all of us in this emergent community. There will be more, and I’ll put them all in a consolidated report soon. But for the time being this gives you a taste of what is to come and of the major challenges we have ahead!

Thanks for all your contributions!

J. David Tabara

 

Reports from the 1st Open Global Systems Science Conference, November 8th–10th, 2012, Brussels, Belgium:

 

ICT challenges for GSS, part 3

Notes from the Saturday ICT workshop (by Patrik Jansson, 2012-11-22)

(Ilan Chabay started out with a summary of the Thursday and Friday Narratives workshops – that part is reported elsewhere.)

The second topic was introduced by Jeremy Gibbons. We need robust modelling – we cannot assume a single shared context. This holds even for a long-lived single-person project, and more urgently for larger collaborations. We need assumptions to be explicit, documented, transparent, checkable. Challenge 1: make computational science results transparent and repeatable. Challenge 2: provide languages which let you write a high-level model of your program and let the computer generate the low-level code.
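
As a concrete, hypothetical illustration of Challenge 2, the sketch below shows a tiny, deeply embedded modelling language in Haskell: the same high-level description gets one interpretation that evaluates it against explicitly named assumptions, and another that generates low-level code from it. The toy “damage” model and all identifiers are assumptions made for this example, not material from the talk.

```haskell
-- Minimal sketch of Challenge 2 (illustrative only): a tiny, deeply
-- embedded modelling language with two interpretations.

-- High-level model: arithmetic over named, documented model parameters.
data Expr
  = Lit Double
  | Var String          -- a named model assumption / parameter
  | Add Expr Expr
  | Mul Expr Expr

-- Meaning 1: evaluate the model against explicit parameter values.
eval :: [(String, Double)] -> Expr -> Maybe Double
eval _   (Lit x)   = Just x
eval env (Var v)   = lookup v env   -- fails visibly if an assumption is missing
eval env (Add a b) = (+) <$> eval env a <*> eval env b
eval env (Mul a b) = (*) <$> eval env a <*> eval env b

-- Meaning 2: generate low-level (here C-like) code from the same description,
-- keeping the high-level model as the single source of truth.
genC :: Expr -> String
genC (Lit x)   = show x
genC (Var v)   = v
genC (Add a b) = "(" ++ genC a ++ " + " ++ genC b ++ ")"
genC (Mul a b) = "(" ++ genC a ++ " * " ++ genC b ++ ")"

-- Hypothetical example model: damage = baseline + sensitivity * temp.
damage :: Expr
damage = Add (Var "baseline") (Mul (Var "sensitivity") (Var "temp"))
```

For instance, evaluating damage with baseline 1.0, sensitivity 2.0 and temp 3.0 gives Just 7.0, while genC damage gives the string "(baseline + (sensitivity * temp))".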

Third, Michael Resch talked about “Verification and Validation of Simulation Models”. There is a chain (or tower) of models from theory, through modelling, numeric modelling (such as discretization), programming, running, and interpreting the results. To be sure about the validity of the results we need Challenge 3: validation and verification at each step (each level). This is a major challenge with many sub-parts. If we carefully explain all the potential “bugs” which could in principle invalidate our results, we could easily project the image that “they have no credibility”. Thus there is the pedagogical Challenge 4: how to present results with uncertainties? There is also a historical dimension, as science moves forward and consensus changes (due to improvements of theory, models and data). Journalists dig up old results (which we now know are incorrect) and make headlines based on the “contradictions” found.
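
As one small, hypothetical illustration of a single rung of the Challenge 3 ladder, the sketch below performs code verification of a discretization: an explicit Euler scheme for dx/dt = -k*x is checked against the known analytic solution x(t) = x0*exp(-k*t). The step count and tolerance are assumptions chosen for the example, not values from the talk.

```haskell
-- One explicit Euler step for dx/dt = -k * x.
eulerStep :: Double -> Double -> Double -> Double
eulerStep k h x = x + h * (-k * x)

-- Integrate from t = 0 to t = tEnd in n equal steps, starting from x0.
euler :: Double -> Double -> Int -> Double -> Double
euler k tEnd n x0 = iterate (eulerStep k h) x0 !! n
  where h = tEnd / fromIntegral n

-- Verification of this one step of the chain: with a fine enough step size,
-- the numerical result should agree with the analytic solution to within
-- a stated tolerance (here 1e-3, an assumption for the example).
verified :: Bool
verified = abs (euler 1.0 1.0 10000 1.0 - exp (-1.0)) < 1e-3
```

Validation (does the model describe reality?) is a separate question that such a check cannot answer; it only builds confidence in the numerical and programming steps.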

The last discussion topic was introduced by David De Roure: “Knowledge Infrastructure for Global Systems Science”. This comes back to the transparency and repeatability (and the multiple meanings of that) mentioned by Jeremy. The main message was that methods are as important as the data. Bundles of workflows, documents and data make up “computational research objects”. An important Challenge 5 here is how to represent these research objects so that they can be mixed and matched freely. Some support for automatic curation and repair would also be needed.
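
As a small sketch of what Challenge 5 could look like in code (an illustration added here, not a proposal from the talk), a research object can be modelled as an explicit bundle of data, workflows and documents carrying enough metadata to support mixing, matching and automatic curation. All field names and types below are assumptions.

```haskell
-- Hypothetical representation of a "computational research object":
-- an explicit bundle of data, methods and documents plus metadata.
data ResearchObject = ResearchObject
  { roId        :: String       -- stable identifier (e.g. a DOI)
  , roDatasets  :: [Resource]   -- input and output data
  , roWorkflows :: [Resource]   -- executable methods / scripts
  , roDocuments :: [Resource]   -- papers, notes, provenance reports
  , roDependsOn :: [String]     -- identifiers of other research objects
  } deriving Show

data Resource = Resource
  { resName     :: String
  , resLocation :: String       -- URL or archive path
  , resChecksum :: String       -- supports automatic curation and repair
  } deriving Show
```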

Saturday ICT chairs + presenters

  • Patrik Jansson – Chalmers Univ. of Techn., patrikj@chalmers.se
    • Co-chair of “Models and Narratives in GSS”
  • Ilan Chabay
    • Co-chair and talk: “Models and Narratives in Global Systems Science”
  • Jeremy Gibbons
    • Talk: Dependable Modelling
  • Michael Resch
    • Talk: Verification and Validation of Simulation Models
  • David De Roure
    • Talk: “Knowledge Infrastructure for Global Systems Science”

Other participants:

  • Ulf Dahlsten (first hour)
  • Ralph Dum
  • David Tabara
  • several others (unfortunately I did not make a list)

ICT challenges for GSS, part 2

Notes from the Friday ICT workshop (by Patrik Jansson, 2012-11-22)

The Friday ICT workshop started with a welcome by the chair, Patrik Jansson (Chalmers), presenting the workshop theme “Computer Science meets Global Systems Science” and the participants. After a brief round of introductions, the two-hour workshop slot was split into three rounds of presentation + discussion.

First out was Martin Elsman (HIPERFIT) talking about Domain Specific Languages in general and how computer science can help modellers handle risk modelling and predictions in particular. The examples given were about modelling complex financial contracts within one bank. In the discussion, Ulf identified the challenge “simulate the global financial markets and different regulations”. (Several workshop participants could be part of a consortium around this topic.) Another challenge identified was to avoid the “tower of Babel” problem – making sure that the different DSLs and models have well-defined semantics and share common infrastructure.
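
In the spirit of the talk, though not taken from it, the sketch below shows a minimal contract DSL with a well-defined denotational semantics (here simply the net cash flow per currency), the kind of shared, explicit meaning that helps avoid the “tower of Babel” problem. The constructors, the flow function and the example trade are all illustrative assumptions.

```haskell
data Currency = EUR | USD deriving (Eq, Show)

-- A tiny combinator language for financial contracts.
data Contract
  = Zero                    -- no rights, no obligations
  | One Currency            -- receive one unit of a currency
  | Scale Double Contract   -- multiply all flows by a constant
  | Both Contract Contract  -- hold two contracts at once
  | Give Contract           -- reverse the direction of all flows

-- A simple denotational semantics: the net flow in a given currency.
flow :: Currency -> Contract -> Double
flow cur c = case c of
  Zero       -> 0
  One cur'   -> if cur == cur' then 1 else 0
  Scale k c' -> k * flow cur c'
  Both a b   -> flow cur a + flow cur b
  Give c'    -> negate (flow cur c')

-- Example: pay 100 EUR and receive 130 USD (a spot FX trade).
-- flow EUR fxTrade == -100, flow USD fxTrade == 130.
fxTrade :: Contract
fxTrade = Both (Give (Scale 100 (One EUR))) (Scale 130 (One USD))
```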

The second part was introduced by Zhangang Han (Beijing) talking about Human Machine Integrated Decision Making. (Picture: a big command centre with a common large screen and many terminals. It could be used for natural-disaster command and control or for more long-term planning.) Lots of data come in as input, models are run, and domain experts interpret the results and help decision makers decide on the next actions. Challenge 1: find a compatible framework which can incorporate the workers in their different roles so that they can communicate in an efficient way. Challenge 2: real-world data always contradict each other – what is the correct interpretation? Challenge 3: visualise what is going on and map the data and interpretations to policies.

The discussion raised a few other points as well: when we present results to real decision makers we often get feedback which contradicts our results. Based on the assumption that the policy-maker really knows things we don’t, we can sometimes adjust parameters to make these results coincide (but sometimes it is just confusing).

As a recent example, Z. Han’s group modelled education in Beijing, China, where the number of pupils was going down and some schools needed to be merged (closed down). The group could then show simulations of the alternatives.

Patrik: we seem to need a whole infrastructure (or ecosystem) of raw data, analysed data at different levels, simulation results and interpretations. Perhaps new standards, data collection agencies, etc. are needed. Chris Barrett remarked that data provenance and meta-data management schemes are important. Survey science and survey statistics have methods which should be adaptable here.

The third topic (education) was introduced by Johan Jeuring (Utrecht) with a presentation on “Technology for learning modelling languages”. Two examples were presented: an online mathematical problem-solving tool and an automatic programming tutor, giving hints for the next steps or checking your own steps. Challenge 1: how do we set up generic systems for learning modelling languages (which can be specialised to many modelling languages)? Challenge 2: how do we give feedback? Question: should awareness/dissemination be done via massive open online courses? Challenge 3: how do we handle multiple natural languages? Merijn: learning is also a social experience – in an online interactive course the students interact both with the system and with their peers.
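
A minimal, hypothetical sketch of the feedback idea behind Challenges 1 and 2 (not Jeuring’s actual tutor): a student’s rewriting step is checked against the rewrite rules of a toy expression language, and a hint shows what the rules would do next. A real tutor would accept any admissible rewrite and be generic over many modelling languages; this sketch only follows the leftmost rule.

```haskell
-- Toy "modelling language": integer expressions to be simplified step by step.
data Term = Num Int | Plus Term Term deriving (Eq, Show)

-- One admissible rewrite step: evaluate the leftmost addition of two numbers.
step :: Term -> Maybe Term
step (Plus (Num a) (Num b)) = Just (Num (a + b))
step (Plus a b) = case step a of
  Just a' -> Just (Plus a' b)
  Nothing -> Plus a <$> step b
step (Num _) = Nothing

-- Feedback: is the submitted step the one the rules would take here?
checkStep :: Term -> Term -> String
checkStep current submitted
  | step current == Just submitted = "Correct step."
  | otherwise                      = "That is not a valid next step here."

-- Hint: show what the rules would do next.
hint :: Term -> String
hint t = case step t of
  Nothing -> "The expression is already fully simplified."
  Just t' -> "Try rewriting it to: " ++ show t'
```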

Friday ICT chairs + presenters

  • Patrik Jansson; Chalmers Univ. of Techn., patrikj@chalmers.se
    • Chair of “Computer Science meets GSS”
  • Martin Elsman; DIKU, mael@diku.dk
    • Talk: “Domain Specific Languages”
  • Zhangang Han; Beijing Normal Univ., zhan@bnu.edu.cn
    • Talk: Human Machine Integrated Decision Making
  • Johan Jeuring, Utrecht University and Open University, CS prof, DSL, teaching technology
    • Talk: Technology for Learning Modelling Languages
  • Colin Harrison; IBM’s Enterprise Initiatives Team
    • Plenary talk “Information Society and Energy Addiction”

Other Friday ICT workshop participants:

  • Chris Barrett, Virginia Bioinformatics Institute
  • David De Roure, director of the Oxford e-Research Centre (digital social science)
  • Jeff Johnson, Open Univ., UK
  • John Sutcliffe-Braithwaite, PublicComputing BV, computer engineer, works for a small company, spec.
  • Jonathan Reades, UCL, London, large data sets
  • José Javier Ramasco, Mallorca, Complex networks
  • Katarzyna Szkuta, Crossover, Tech4i2 Ltd
  • Luís Bettencourt, Santa Fe Institute
  • Mario Rasetti, Institute of Scientific Interchange Foundation (ISI), Torino, emeritus, Physicist
  • Maxi San Miguel, CSIC, Mallorca, Spain, Physicist by training, com
  • Merijn Terheggen, FactLink, collective knowledge platform
  • Per Öster, CSC, Finland (Director of research environments, ICT tools for research)
  • Qian Ye, Beijing Normal University, Integrated risk governance
  • Steven Bishop, UCL, Mathematics
  • Ulf Dahlsten
  • Vittorio Loreto
  • Wanglin Yan, Keio University, Tokyo, GIS, Sustainability

ICT challenges for GSS, part 1

Notes from the Thursday ICT workshop (by Patrik Jansson, 2012-11-22)

The Thursday ICT workshop theme was introduced by Ulf Dahlsten already in the plenary: “The ICT challenges to Global Systems Science”. The workshop started with Per Öster (CSC, Finland) talking about e-Science and European grid computing. Complex science (with Global Systems Science as an example) puts new demands on ICT tools. The same questions come up: How to handle data? How to access computing resources? How to control access (easy-to-use authentication)? Examples of existing infrastructure: EUDAT.eu (Collaborative Data Infrastructure) and EGI.eu (European Grid Infrastructure). Science gateways provide a low entry threshold.

Even when the basic infrastructure is in place, there is still a lot of work needed for a new field to be well supported. And it cannot be constructed by the implementors alone – co-development is important (users + implementors). We work with the research communities to build systems which work for them.

Challenge 1: Develop “science gateways” suited for Global Systems Science.

Challenge 2: Handle the uneven access (globally) to data sets (much more is available in the developed world). We need to identify data sources and quality control of them.

——

Next was Vittorio Loreto (everyaware.eu) on “Turning citizens into sensors”, expanding on the earlier plenary talk. The example was: how to enhance public awareness of climate issues? (An interesting side-line: a recent paper shows that “Environmental awareness does not lead to smaller carbon footprint”.) The measurement data collection works fine (position data, sensor boxes for pollution measurements, etc.). Challenge 3: How do we (automatically) handle unstructured input (like users recording or writing down comments) in connection with the structured data?

Z. Han: We have the technology and the expertise to collect data, but management is very important. Examples from China show that many sectors collect data without releasing them to the public. The open-source slogan “release early, release often” is not easy to apply to (politically) sensitive data.

Trista Patterson:
Challenge 4: How do we create communities with a joint language and trust to enable rapid feedback pre-publication?
Crowd-sourcing successes like Wikipedia are inspiring but lead us to
Challenge 5: How do we achieve representativeness among the contributors (currently 87% male, for example)? Improving diversity is important.

Thursday ICT chairs + presenters

  • Ulf Dahlsten
    • Chair of “The ICT challenges to Global Systems Science”
    • Plenary talk: “Global Systems and The Challenges”
  • Per Öster
    • Talk: “e-Science and European Grid Computing”
  • Vittorio Loreto
    • Plenary talk: “Participation awareness and learning”
    • Talk: “Enhance environmental awareness through social information technologies”
  • Christopher Barrett
    • Plenary talk: “Simulation of Very Large Systems”

Other Thursday ICT workshop participants (incomplete list):

  • Merijn Terheggen; FactLink.com
  • Martin Elsman; DIKU
  • José Javier Ramasco; IFISC
  • Trista Patterson
  • Luís Bettencourt

Some random notes and thoughts from the GSS conference


 

Some quotes that characterise GSS very well

‘ICT will not lead to sustainability; it will lead to behavioural change towards sustainability.’ The role of technology cannot be to help make our unsustainable lifestyle sustainable; it has to be to help us change this lifestyle.

‘A research agenda of GSS should be driven by Pasteur’s principle’: that it is global challenges (like global warming, the energy crisis, the financial crisis) that drive a highly ambitious research agenda in close connection with society. Scientists should not be in love with their models but with their eventual usefulness.

GSS should contribute via models and data and their dissemination to global reasoning on global challenges. 

‘Prediction’: What does it mean in a constantly branching world (no repeatable experiment)?  (‘You cannot buy Apple stock today at the price of 10 years ago’)

GSS should link the top-down and bottom-up aspects of governance and decision making in society. GSS should make use of institutions (they are of use in a society and evolved as something useful) and make use of citizen-driven initiatives.

Data are artefacts: they are created for use and their functionality is defined by usage.

Data are social objects: they cannot be seen independently of their use.

Simulation can be wrong: How to account for that in policy decisions? What to do with probabilistic results in policy decisions? What about lower and upper bounds of prediction?

At the core of GSS is an understanding of what the results of models and data analysis mean and how to interpret them.

Software does what it is programmed to do, not what it is supposed to do.

GSS and ICT

Models and data: big, unstructured, and constantly growing. The cost per GB of data is constantly going down (see e.g. the human genome) and the cost has shifted towards data analytics.

Data platforms on data from third-world countries (most of the data that could drive a GSS agenda are from developed countries).

Ocean of unstructured data: how to make sense of these data (the need for data analytics is today more urgent than the need for data gathering).

Data are as much numerical as procedural. In particular agent-based modelling strongly depends on procedural data.
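
A tiny sketch (added here for illustration, not from the notes) of what procedural data means in agent-based modelling: the behavioural rule is itself part of the data describing an agent, not just its numerical state. The averaging rule and the fully connected population are assumptions made for the example.

```haskell
type Opinion = Double

-- Procedural "data": the behavioural rule itself.
type Rule = Opinion -> [Opinion] -> Opinion

-- One such rule: move halfway towards the average opinion of the neighbours.
averaging :: Rule
averaging me neighbours
  | null neighbours = me
  | otherwise       = (me + avg) / 2
  where avg = sum neighbours / fromIntegral (length neighbours)

-- One synchronous update of a fully connected population under a given rule.
-- For example, stepAll averaging [0, 1] == [0.5, 0.5].
stepAll :: Rule -> [Opinion] -> [Opinion]
stepAll rule pop =
  [ rule me [ x | (j, x) <- zip [0 ..] pop, j /= i ]
  | (i, me) <- zip [0 :: Int ..] pop ]
```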

 

Simulations should not be over-interpreted beyond their validity range. One has to understand the underlying assumptions.

Need for model integration

‘Eternity of models and data’: how to ensure that models and data are usable in the future. What about data (like Twitter data) that cannot be stored?

Workshop on narratives

‘Narratives are a primary means for people to make sense of the world’

A very useful distinction regarding the use of ICT in narratives is to distinguish between

ICT as a tool to analyse narratives and ICT as a tool to help create/form narratives.

The analysis part includes ICT tools for analysing online media (blogs, Twitter, etc.) in terms of expressions of public sentiment. The ICT tools here include semantic analysis, network analysis, Bayesian methods and more.

The use of ICT in the creation of narratives includes data visualisation, games, the use of online media to enhance messages, etc. Often the use of ICT can be seen less as a means to create narratives than as a means to spread and enhance them (e.g. YouTube).

With regard to the use of narratives as a means to convey messages, there seems to be a strong feeling that narratives could be abused. In the light of this fear it might be good to distinguish between prescriptive and descriptive narratives (or, more high-flying, normative and ontological ones): that is, narratives that are supposed to clarify a concept and narratives that are meant to incite action.

Narratives are used in the linkage of decisions in society and models/data driven knowledge in several ways:

– to condense the message from the findings of models and data analysis (at the risk of oversimplifying). This need to condense the message is related in part to the fact that there is little evidence that policy makers regularly base their decisions on existing data – probably mainly because the data are not presented to them in a way that would allow the data to guide their decisions.

– to help decision makers navigate in uncertain situations: narratives as a form of heuristics.

– narratives as a tool for traders.

– narratives as a guiding principle of action within a community. Banking was mentioned, and the fact that the banking narrative evolved over the years (from making a reasonable living towards making an insane amount of money) and is now becoming a societal narrative no longer restricted to the banking profession.

Colin Harrison:

Can we develop a science of cities, and what is the role of GSS in city planning?

What can ICT do to reduce energy consumption? (There is more to it than putting sensors everywhere.)

Workshop on global markets

EC officials from DG MARKT (the Directorate-General in charge of, among other things, financial regulation) were present and (as should be the case in GSS) their needs were driving the discussion of a possible research agenda.

Their main immediate need is a better understanding of the highly entangled banking networks. Their suggestion is to develop a research programme that identifies the data needed (and thereby would guide EC regulation on what data banks should be obliged to provide) and, based on these data, allows the EC to better understand banking networks (in particular shadow banking and the networks of risk transfer).

From this immediate need (responding to which would be a case in point for GSS) there could follow more long-term agendas, like better understanding the issue of trust in financial systems via agent-based models.