Participatory modeling: citizens as environmental modelers

Can we complement or even replace physical laws with the experience of local people to model environmental phenomena? An article [1] showed that an air quality map generated by local people could compare well in quality with one generated by a physical model. In the context of a participatory and cost-effective process for monitoring natural resources, pollution, or other kinds of commons, can we also empower local communities to form an emergent computational model of a local phenomenon using their tacit knowledge coupled with real measurements?

With the democratization of sensors and their integration into smartphones, the participatory sensing approach brings interesting new perspectives compared to traditional sensor networks and opens up new urban usages: mobility and openness of the monitoring network, person-centric and low-cost data, involvement and empowerment of the public, and a potential shortening of the sensing-acting loop.

Challenge: covering large areas with sensors

Aside from the data quality problem, a remaining issue in the sensor network domain is the sparse spatio-temporal coverage of measurements. Among other things, the participatory approach tackles this issue with the assumption that a large amount of low-quality data (plus statistical methods to get reliable outputs) could replace or even outperform the limited amount of high-quality data that is currently the norm.
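As a rough back-of-the-envelope check of that assumption (a minimal Python sketch with made-up noise levels, ignoring calibration bias, which matters a lot in practice), averaging many noisy readings can approach the accuracy of a single well-calibrated one:

```python
import random

# "Many cheap sensors vs. one good sensor": averaging many noisy readings can
# rival a single accurate one. Values are invented; real low-cost sensors also
# have calibration bias, which simple averaging does not remove.

random.seed(0)
true_value = 65.0                                            # "true" level, e.g. dB(A)

cheap = [random.gauss(true_value, 6.0) for _ in range(200)]  # 200 noisy phone readings
good = random.gauss(true_value, 1.0)                         # 1 calibrated station reading

print(abs(sum(cheap) / len(cheap) - true_value))             # error of the crowd average
print(abs(good - true_value))                                # error of the single station
```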

1st (limited) solution: optimizing the collective strategy of sensing

In the case of “controlled participation” (i.e. an organized campaign with participants accepting tasks), one solution is to optimize data gathering using a multi-agent patrolling strategy [2], partitioning and distributing sampling tasks among N agents (“citizen watchers”) with spatio-temporal preferences, so as to maximize the coverage of a region. We can also take into account related factors such as the sampling resolution. We could, for instance, provide a monitoring network with dynamic resolution by automatically deploying these agents to the places where they are needed most: a higher density of sensors in areas temporarily requiring finer granularity, and only a few sensors in uninteresting or stable regions.
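As a toy sketch of the kind of task allocation involved (this is not the patrolling strategy of [2]; the watcher names, preference functions and the (x, y, hour) cell format are invented for illustration), sampling cells could be greedily assigned to the citizen watchers who prefer them most, while capping each one's workload:

```python
# Greedy toy allocation of sampling cells to "citizen watchers" with
# spatio-temporal preferences. Purely illustrative assumptions throughout.

def allocate_cells(cells, watchers, max_tasks_per_watcher):
    """cells: list of (x, y, hour); watchers: dict name -> preference function."""
    assignment = {name: [] for name in watchers}
    for cell in cells:
        # Rank watchers by declared preference for this cell, skipping overloaded ones.
        candidates = sorted(
            (name for name in watchers
             if len(assignment[name]) < max_tasks_per_watcher),
            key=lambda name: watchers[name](cell),
            reverse=True,
        )
        if candidates:
            assignment[candidates[0]].append(cell)
    return assignment

# Two watchers: one prefers mornings, one prefers the east side of the area.
watchers = {
    "alice": lambda c: 1.0 if c[2] < 12 else 0.2,
    "bob":   lambda c: 1.0 if c[0] > 5 else 0.3,
}
cells = [(x, y, h) for x in range(10) for y in range(2) for h in (9, 18)]
plan = allocate_cells(cells, watchers, max_tasks_per_watcher=25)
print({name: len(tasks) for name, tasks in plan.items()})
```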

But even in this best case (controlled participation), it is not certain that a solution exists to fully monitor a region using only patrolling strategies, because of the variability of the phenomenon, the size of the covered region, the number of participants/agents, the number of available sensor devices, etc. (I am not saying we need to monitor continuously everywhere, just that we need a representative picture of the phenomenon.)

2nd solution: Combining sensors + physical modelling

A partial solution is to interpolate or extend measurements with computational models. The common case is to use a physical model simulating the phenomenon. “Data assimilation” is the process by which measurements and model predictions are combined to obtain an accurate representation of the state of the modeled system. Data assimilation is recognized as essential in weather, oceanography, climate analysis, and forecasting activities.
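As a toy illustration of the idea (a scalar update in the spirit of a Kalman analysis step, with invented numbers, nothing like a real assimilation system for weather or air quality), a model prediction and a sensor measurement can be blended by weighting each with its assumed uncertainty:

```python
# Scalar "analysis step": blend a model prediction with a sensor measurement,
# weighting each by its assumed variance. All numbers are made up.

def assimilate(prediction, pred_var, measurement, meas_var):
    """Return the combined estimate and its variance."""
    gain = pred_var / (pred_var + meas_var)            # how much to trust the sensor
    estimate = prediction + gain * (measurement - prediction)
    variance = (1.0 - gain) * pred_var
    return estimate, variance

# Model predicts 60 dB(A) with high uncertainty; a phone measured 68 dB(A) nearby.
print(assimilate(prediction=60.0, pred_var=25.0, measurement=68.0, meas_var=4.0))
# -> roughly (66.9, 3.4): the estimate is pulled toward the more trusted measurement
```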

Related issues

But generally, a “plug-and-play modelling system” either doesn’t exist for the general public or has a high entry barrier:

  • Existence: First, cities/local communities need to ensure the existence of numerical simulators modeling the given phenomenon and integrating data assimilation with sensor measurements. That is already a big step.
  • Granularity: They need to ensure that such a model is accurate enough at a local (city-level) or hyper-local (district/street-level) scale so that local measurement practices are relevant.
  • External input data: In general, such simulators require external input data, e.g. geographical topology or meteorology. Even with accurate algorithms and enough local measurements, the results can be inaccurate or uncertain if complementary input data are missing (non-existent, private, or costly) or of poor quality. For instance, on top of air sensor networks, air pollution modeling requires data about emission sources such as traffic density. Recently, some research [4] integrated traffic flow sensors with air modeling to predict where air pollution would become concentrated by tracking traffic: “By showing when and where pollutants exceed targets, low emission zones could be created, by banning traffic from sensitive areas of the city”. If such data are of poor quality or missing, the forecast of the local air quality index won’t be accurate at all.
  • Experts & high cost: Simulators are not directly accessible to the general public or to cities because they are 1) experimental or 2) proprietary, 3) quite complex to parameterize, which creates a dependency on experts, and 4) demanding in computational power.

Optimizing data gathering using a patrolling strategy + physical modeling is a hybrid solution that is quite difficult to set up in real situations at a local/city scale, not only because of its internal complexity but also because of its dependencies on external factors (complementary data, required experts). How can we reduce such dependencies?

3rd Proposed Solution: Combining Sensors + Empirical Modeling

A complementary and pragmatic approach, maybe under-exploited, would be to use local knowledge to model phenomena. How could we take the wealth of local knowledge that people have from their experience and transfer it into a computational form? I’m not talking about subjective measurement (e.g. “bad” or “good” weather) but rather about qualitative modeling. A lot of environmental phenomena can be observed directly or indirectly via visible proxies. Local people accumulate observations, learn, and build their own cognitive models of complex phenomena (e.g. rainfall, fisheries, noise/air pollution, traffic flow). Though informal, these models are valuable because they are based on observations over long periods of time and sometimes incorporate subtle multivariate cross-checks for environmental change. In the context of cost-effective and lightweight modelling, can we transfer these distributed cognitive models, using fuzzy logic techniques, into an aggregated computational model able to approximate phenomena and interpolate measurements?
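As a minimal sketch of what such a transfer could look like, a qualitative rule such as “in the morning, near the market, noise is high” can be encoded with fuzzy sets. The rule itself, the membership functions and the 70/50 dB(A) levels are illustrative assumptions, not values from any cited project:

```python
# Encoding one qualitative local rule with fuzzy membership functions.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on the plateau [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def morning(hour):         # fuzzy set "morning", roughly 6h-11h
    return trapezoid(hour, 5, 7, 10, 12)

def near_market(dist_m):   # fuzzy set "near the market", within ~150 m
    return trapezoid(dist_m, -1, 0, 80, 150)

def predicted_noise(hour, dist_m, high_db=70.0, baseline_db=50.0):
    strength = min(morning(hour), near_market(dist_m))   # fuzzy AND of the antecedents
    return baseline_db + strength * (high_db - baseline_db)

print(predicted_noise(hour=8, dist_m=40))    # rule fully applies -> 70.0
print(predicted_noise(hour=14, dist_m=40))   # afternoon -> falls back to 50.0
```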

Example: noise pollution

For instance, in the NoiseTube project [5] (a participatory approach to monitoring noise pollution), local people know that the noise level stays roughly the same throughout the morning at a given location, or that the noise is globally the same along a given part of the street. I think such observations could be used to reduce the required coverage of measurements. By measuring for just 10 minutes at a given location to get a representative observation, and then extending that observation to the whole street or to a slice of the day using such local knowledge, we cover a larger spatio-temporal space while limiting the effort of data gathering, without needing a simulator implementing a sound propagation model. Such a model is implicitly built in people’s heads.
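A small sketch of this “extend one measurement” idea (segment names, time slices and the data structure are illustrative assumptions, not the NoiseTube implementation): a single 10-minute reading is propagated over the street segments and time slice its contributor declares representative:

```python
from collections import defaultdict

# (segment_id, time_slice) -> list of dB(A) readings attributed to that cell
coverage = defaultdict(list)

def extend_observation(level_db, declared_segments, declared_slices):
    """Apply one measured level to every (segment, slice) the user marks as similar."""
    for segment in declared_segments:
        for time_slice in declared_slices:
            coverage[(segment, time_slice)].append(level_db)

# A 68 dB(A) reading, declared valid for the whole street during the morning.
extend_observation(68.0,
                   declared_segments=["main_street_west", "main_street_east"],
                   declared_slices=["morning"])

# "Interpolated" value for a cell that was never physically measured:
cell = ("main_street_east", "morning")
print(sum(coverage[cell]) / len(coverage[cell]))   # -> 68.0
```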

This figure shows a potential user interface for the mobile application. After measuring the sound level at a given place with their phone, the user is invited to use their knowledge to extend the observation by indicating with a finger (touch screen) the area that this observation is representative of, and likewise for the time.

Of course, with such an approach there is conflict resolution to handle at the collective level. But my point here is to emphasize the new role of the public: people are not just carriers of environmental sensors forming a mobile network, they are also modelers, able to predict the state of a phenomenon at a place or time where no measurement exists. Note: in such an empirical model, the spatio-temporal space will certainly not be partitioned into a homogeneous grid of spatio-temporal cells, as is commonly the case in physical modelling (e.g. weather or climate, see figure a), but will be closer to the cognitive representation people have of their environment, e.g. in urban space, a space delimited by streets and districts, with a notion of “morning”, etc.

a) Partition of space generally used by physical modeling vs. b) a more human one
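As a tiny illustration of the conflict-resolution point above (purely illustrative; no particular scheme is prescribed here), when several contributors declare different levels for the same human-defined cell, e.g. a street segment during the “morning”, a robust statistic such as the median keeps a single outlier from dominating:

```python
import statistics

def resolve(declared_levels_db):
    """declared_levels_db: dB(A) values contributed by different people for one cell."""
    return statistics.median(declared_levels_db)

print(resolve([66.0, 68.0, 85.0]))   # -> 68.0, the 85 dB(A) outlier does not dominate
```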

Back to the future with Indigenous knowledge

To conclude on the understanding of complex phenomena by local people, I present some examples from an article about how traditional peoples deal with ecosystem complexity [3], such as the management of reef and lagoon tenure systems in Oceania by traditional societies.

Some societies have experience in reading environmental variables to deal with ecological complexity without the techniques and quantitative tools of Western science, relying instead on “rules of thumb” based on a historical and ever-expanding cultural understanding of the environment.

To announce opening and closing dates for permissible harvest, many indigenous groups of the Pacific Northwest of North America hold a “salmon ceremony”. A ritual leader and his helpers would watch the salmon swimming up the river and presumably make a qualitative assessment of whether a sufficient number of mature fish (spawners) were escaping upstream to perpetuate the population. Then the fishery would be declared open and the event marked by a ceremony. The author notes: “[..] It may be argued that such a system, led by an experienced leader, can produce results similar to one achieved by a biological management system with population models, counting fences, daily data management and harvest quota enforcement – but without the whole research infrastructure, quantitative data needs, and associated costs.” Interesting.

Note

A related question is: why do we need people to sense an area at all? Why not deploy lots of small low-cost sensors at fixed locations in the city (the urban sensing vision), or use low-cost mobile machines if we are interested in a dynamic monitoring system? For air pollution, if we only want a low-cost and reliable solution, we could imagine an infrastructure of autonomous helium balloons equipped with air sensors, GPS, and an internet connection, a solution maybe more sustainable and realistic for continuously monitoring a large area than relying on the idea that lots of people will wear such air sensors in the future (e.g. the Montre Verte project). Maybe. But in this case we also lose the freedom and empowerment aspect for people, and the different nature of the data (people-centric data). I think these points have strong implications because of the direct connections they create between the commons and the public. They are probably the shortest path to changing human behavior, because we shouldn’t forget that sustainability-related issues are mainly anthropogenic. (I will expand this point and talk about possible reflexive/closed-loop approaches in a future post.)

References

  • [1] – Cinderby, S., Snell, C., & Forrester, J. (2008). Participatory GIS and its application in governance: the example of air quality and the implications for noise pollution. Local Environment, 13(4), 309-320. doi: 10.1080/13549830701803265.
  • [2] – Almeida, A., Ramalho, G., Santana, H., & Tedesco, P. (2004). Recent Advances on Multi-Agent Patrolling. Operations Research, 526-535.
  • [3] – Berkes, F., & Berkes, M. K. (2009). Ecological complexity, fuzzy logic, and holism in indigenous knowledge. Futures, 41(1), 6-12. doi: 10.1016/j.futures.2008.07.003
  • [4] – Costabile, F., & Allegrini, I. (2008). A new approach to link transport emissions and air quality: An intelligent transport system based on the control of traffic air pollution. Environmental Modelling and Software, 23(3), 258-267.
  • [5] – NoiseTube project

