Recent Publications

He M, Vafeiadis V, Qin S, Ferreira JF.  2016.  Reasoning about Fences and Relaxed Atomics. 24th Euromicro International Conference on Parallel, Distributed, and Network-Based Processing.

For efficiency reasons, weak (or relaxed) memory is now the norm on modern architectures. To cater for this trend, modern programming languages are adapting their memory models. The new C11 memory model allows several levels of memory weakening, including non-atomics, relaxed atomics, release-acquire atomics, and sequentially consistent atomics. Under such weak memory models, multithreaded programs exhibit more behaviours, some of which would have been inconsistent under the traditional strong (i.e. sequentially consistent) memory model. This makes the task of reasoning about concurrent programs even more challenging. The GPS framework, recently developed by Turon et al., has taken a step towards tackling this challenge. By integrating ghost states, per-location protocols and separation logic, GPS can successfully verify programs with release-acquire atomics. In this paper, we present a program logic, an enhancement of the GPS framework, that can support the verification of a larger class of C11 programs, namely programs with release-acquire atomics, relaxed atomics and release-acquire fences. Key elements of our proposed logic include two new types of assertions, a more expressive resource model and a set of newly-designed verification rules.

Even C, Bosser A-G, Ferreira JF, Buche C, Stéphan F, Cavazza M, Lisetti C.  2016.  Supporting Social Skills Rehabilitation with Virtual Storytelling. 29th International FLAIRS Conference.

Social skills training (SST) has recently emerged as a typical application for emotional conversational agents (ECA). While a number of prototypes have targeted the general population, fewer have been used for psychiatric patients, despite the widely recognised potential of ECA technologies in the field of mental health. Social cognition impairment is, however, a symptom widely shared among psychiatric patients suffering from pathologies such as schizophrenia. Going further than SST, rehabilitation programmes involving role-play as well as problem solving have been successfully used by clinicians, drastically improving the quality of life of patients suffering from such disabilities. One of the challenges of these programmes is to ensure that patients will be able to adapt their behaviour when the situation varies, rather than training them with the appropriate behaviour for a set of specific situations.
In this paper, we describe a novel approach for the development of a serious game supporting rehabilitation programmes for social skills, which will primarily target schizophrenia patients. We propose to use an ECA in combination with a narrative generation engine derived from interactive storytelling research to provide varied situations. This approach reflects the combination of both role-play and problem solving exercises on which remediation therapies rely, and has the potential to support patients' progress and motivation through the rehabilitation programme.

Couto R, Ribeiro AN, Campos JC.  2016.  Validating an approach to formalize use cases with ontologies. Proceedings of the 13th International Workshop on Formal Engineering Approaches to Software Components and Architectures. 205:1-15.

Use case driven development methodologies put use cases at the center of the software development process. However, in order to support automated development and analysis, use cases need to be appropriately formalized. This will also help guarantee consistency between requirements specifications and the developed solutions. Formal methods tend to suffer from take-up issues, as industry is usually reluctant to accept them. In this context, it is relevant not only to produce languages and approaches to support formalization, but also to perform their validation. In previous work we developed an approach to formalize use cases resorting to ontologies. In this paper we present the validation of one such approach. Through a three-stage study, we evaluate the acceptance of the language and supporting tool. The first stage focuses on the acceptance of the process and language, the second on the support the tool provides to the process, and the third on the tool's usability aspects. Results show that test subjects found the approach feasible and useful, and the tool easy to use.

Moreno CB, Almeida PS, Lerche C.  2016.  The problem with embedded CRDT counters and a solution. PaPoC '16 Proceedings of the 2nd Workshop on the Principles and Practice of Consistency for Distributed Data.
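
As background for the counter design this entry discusses, a minimal state-based grow-only counter (G-Counter) can be sketched as follows. This is an illustrative sketch only, with hypothetical names; the paper's specific problem, counters embedded inside map CRDTs, is not reproduced here.

```python
# Minimal state-based G-Counter sketch (illustrative, not the paper's code).
# Each replica increments only its own entry; merge takes the pointwise
# maximum, so concurrent increments are never lost.

class GCounter:
    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.counts = {}  # replica id -> increments observed from that replica

    def increment(self, n=1):
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + n

    def value(self):
        return sum(self.counts.values())

    def merge(self, other):
        # Join: pointwise max over all replica entries.
        for rid, c in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), c)

a, b = GCounter("a"), GCounter("b")
a.increment(2)
b.increment(3)
a.merge(b)
b.merge(a)
assert a.value() == b.value() == 5  # both replicas converge to the same total
```

Merging is idempotent, commutative and associative, which is what makes the state-based counter converge under eventual consistency.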
Zawirski M, Moreno CB, Preguiça N, Shapiro M.  2016.  Eventually Consistent Register Revisited. PaPoC '16 Proceedings of the 2nd Workshop on the Principles and Practice of Consistency for Distributed Data.
Campos JC, Fayollas C, Martinie C, Navarre D, Palanque P, Pinto M.  2016.  Systematic Automation of Scenario-Based Testing of User Interfaces. In Proceedings of the 8th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pages 138-148.

Ensuring the effectiveness factor of usability means ensuring that the application allows users to reach their goals and perform their tasks. One of the few means for reaching this goal relies on task analysis and proving the compatibility between the interactive application and its task models. Synergistic execution enables the validation of a system against its task model by co-executing the system and the task model and comparing the behavior of the system against what is prescribed in the model. This allows a tester to explore scenarios in order to detect deviations between the two behaviors. Manual exploration of scenarios, however, does not guarantee a good coverage of the analysis. To address this, we resort to model-based testing (MBT) techniques to automatically generate, from the task model, scenarios to be co-executed over the task model and the system. During this generation step we explore the possibility of including considerations about user error in the analysis. The automation of the execution of the scenarios closes the process. We illustrate the approach with an example.
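
The generation step can be pictured with a toy sketch. The task-model encoding below (sequence and choice operators, leaf tasks) and all names are hypothetical, chosen only to illustrate how enumerating a task model yields scenarios to co-execute against the system; the paper's actual task-model notation is richer.

```python
# Toy scenario generation from a task-model tree (illustrative only).
# Nodes: ("seq", children) runs children in order; ("choice", children)
# performs exactly one child; ("task", name) is a leaf action.
# Enumerating all scenarios gives the test cases for co-execution.

def scenarios(node):
    kind = node[0]
    if kind == "task":
        return [[node[1]]]
    if kind == "seq":
        result = [[]]
        for child in node[1]:
            result = [prefix + s for prefix in result for s in scenarios(child)]
        return result
    if kind == "choice":
        return [s for child in node[1] for s in scenarios(child)]
    raise ValueError(f"unknown node kind: {kind}")

model = ("seq", [("task", "power on"),
                 ("choice", [("task", "set rate"), ("task", "set volume")]),
                 ("task", "start infusion")])

for s in scenarios(model):
    print(s)
# One scenario per branch of the choice, each a sequence of user actions.
```

Each generated scenario would then be replayed over both the task model and the running system, flagging any point where the two behaviors diverge.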

Harrison M, Campos JC, Ruksenas R, Curzon P.  2016.  Modelling information resources and their salience in medical device design. In Proceedings of the 8th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, pages 194-203.

The paper describes a model that includes an explicit description of the information resources that are assumed to guide use, enabling a focus on properties of "plausible interactions". The information resources supported by an interactive system should be designed to encourage the correct use of the system. These resources signpost a user's interaction, helping to achieve desired goals. Analysing assumptions about information resource support is particularly relevant when a system is safety critical, that is, when the consequences of interaction failure could be dangerous, or walk-up-and-use, where interaction failure may lead to reluctance to use the system, with expensive consequences. The paper shows that expressing these resource constraints still admits a wider set of behaviours than would occur in practice. A resource may be more or less salient at a particular stage of the interaction and, as a result, potentially overlooked. For example, the resource may be accessible but not used because it does not seem relevant to the current goal. The paper describes how the resource framework can be augmented with additional information about the salience of the assumed resources. A medical device that is in common use in many hospitals is used as illustration.

Bernardeschi C, Domenici A, Masci P.  2016.  Modeling Communication Network Requirements for an Integrated Clinical Environment in the Prototype Verification System. ICTS4eHealth - 1st IEEE Workshop on ICT solutions for eHealth.

Health care practices increasingly rely on complex technological infrastructure, and new approaches to the integration of information and communication technology in those practices lead to the development of such concepts as integrated clinical environments and smart intensive care units. These concepts refer to hospital settings where therapy relies heavily on inter-operating medical devices, supervised by clinicians assisted by advanced monitoring and coordinating software. In order to ensure safety and effectiveness of patient care, it is necessary to specify the requirements of such socio-technical systems in the most rigorous and precise way. This paper presents an approach to the formalization of system requirements for communication networks deployed in an integrated clinical environment, based on the higher-order logic language of a theorem-proving environment, the Prototype Verification System.

Coelho F, Pereira JO, Vilaça R, Oliveira R.  2016.  Holistic Shuffler for the Parallel Processing of SQL Window Functions. Distributed Applications and Interoperable Systems - 16th IFIP WG 6.1 International Conference, DAIS 2016, Held as Part of the 11th International Federated Conference on Distributed Computing Techniques, DisCoTec 2016, Heraklion, Crete, Greece, June 2016. :75–81.

Window functions are a sub-class of analytical operators that allow data to be handled in a derived view of a given relation, while taking into account their neighboring tuples. Current systems bypass parallelization opportunities that become especially relevant when considering Big Data, as data is naturally partitioned.
We present a shuffling technique to improve the parallel execution of window functions when the query holds a partitioning clause that does not match the natural partitioning of the relation. We evaluated this technique with a non-cumulative ranking function and were able to reduce data transfer among parallel workers by 85% when compared to a naive approach.
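
The baseline that the paper improves on can be sketched as a naive hash shuffle: route every row to the worker owning its PARTITION BY key, then compute the ranking locally. The sketch below is illustrative only, with hypothetical names; it is the naive approach, not the paper's holistic shuffler.

```python
# Naive shuffle sketch for a ranking window function (illustrative only;
# the paper's holistic shuffler reduces this data movement further).
# Rows are routed to the worker owning their PARTITION BY key, so each
# worker can compute RANK() locally over complete partitions.

def shuffle(rows, key, n_workers):
    buckets = [[] for _ in range(n_workers)]
    for row in rows:
        buckets[hash(row[key]) % n_workers].append(row)
    return buckets

def local_rank(rows, key, order):
    # RANK() OVER (PARTITION BY key ORDER BY order DESC), ties ignored.
    parts = {}
    for row in rows:
        parts.setdefault(row[key], []).append(row)
    out = []
    for part in parts.values():
        part.sort(key=lambda r: r[order], reverse=True)
        for i, row in enumerate(part, start=1):
            out.append({**row, "rank": i})
    return out

rows = [{"region": "eu", "sales": 10}, {"region": "eu", "sales": 30},
        {"region": "us", "sales": 20}]
ranked = [r for bucket in shuffle(rows, "region", 2)
          for r in local_rank(bucket, "region", "sales")]
```

In this naive scheme every row whose key hashes to a remote worker must be transferred; exploiting the relation's natural partitioning is what lets the paper's technique avoid most of that traffic.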

Coelho F, Pereira JO, Vilaça R, Oliveira R.  2016.  Reducing Data Transfer in Parallel Processing of SQL Window Functions. Proceedings of the 6th International Conference on Cloud Computing and Services Science. :343-347.

Window functions are a sub-class of analytical operators that allow data to be handled in a derived view of a given relation, while taking into account their neighboring tuples. We propose a technique that can be used in the parallel execution of this operator when data is naturally partitioned. The proposed method benefits the cases where the required partitioning is not the natural partitioning employed. Preliminary evaluation shows that we are able to limit data transfer among parallel workers to 14% of the transfer registered with a naive approach.

Maia F, Matos M, Coelho F.  2016.  Towards Quantifiable Eventual Consistency. Proceedings of the 6th International Conference on Cloud Computing and Services Science. :368-370.

In the pursuit of highly available systems, storage systems began offering eventually consistent data models. These models are suitable for a number of applications, but not for all. In this paper we discuss a system that offers an eventually consistent data model but can also, when needed, offer a strongly consistent one.

Lourenço CB, Frade MJ, Pinto JS.  2016.  Formalizing Single-Assignment Program Verification: An Adaptation-Complete Approach. Programming Languages and Systems - 25th European Symposium on Programming, ESOP 2016, Held as Part of the European Joint Conferences on Theory and Practice of Software, ETAPS 2016, Eindhoven, The Netherlands, April 2-8, 2016, Proceedings. 9632:41–67.

Deductive verification tools typically rely on the conversion of code to a single-assignment (SA) form. In this paper we formalize program verification based on the translation of While programs annotated with loop invariants into a dynamic single-assignment language with a dedicated iterating construct, and the subsequent generation of compact, indeed linear-size, verification conditions. Soundness and completeness proofs are given for the entire workflow, including the translation of annotated programs to SA form. The formalization is based on a program logic that we show to be adaptation-complete. Although this important property has not, as far as we know, been established for any existing program verification tool, we believe that adaptation-completeness is one of the major motivations for the use of SA form as an intermediate language. Our results show that SA form indeed allows tools to achieve the maximum degree of adaptation when handling subprograms.
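
The effect of an SA translation on straight-line code can be illustrated with a simple renaming pass. This is a sketch of the general idea only, with hypothetical names; the paper's translation also handles loops and invariants via its dedicated iterating construct, which this sketch does not cover.

```python
# Single-assignment renaming for straight-line code (illustrative sketch).
# Each assignment to x introduces a fresh version x1, x2, ...; uses on the
# right-hand side refer to the latest version, so in the output every
# variable is assigned exactly once.

import re

def to_sa(program):
    version = {}  # variable -> current version number
    out = []
    for stmt in program:
        lhs, rhs = (s.strip() for s in stmt.split(":="))
        # Rewrite variable uses on the right-hand side to current versions.
        rhs = re.sub(r"[a-z]+",
                     lambda m: f"{m.group(0)}{version.get(m.group(0), 0)}",
                     rhs)
        version[lhs] = version.get(lhs, 0) + 1
        out.append(f"{lhs}{version[lhs]} := {rhs}")
    return out

print(to_sa(["x := x + 1", "y := x * 2", "x := x + y"]))
# ['x1 := x0 + 1', 'y1 := x1 * 2', 'x2 := x1 + y1']
```

Because each version is assigned only once, a verification-condition generator can treat every SA assignment as a pure equation, which is what enables the compact, linear-size verification conditions the paper describes.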

Ramos MVM, de Queiroz RJGB, Moreira N, Almeida JB.  2016.  On the Formalization of Some Results of Context-Free Language Theory. Logic, Language, Information, and Computation – 23rd International Workshop, WoLLIC 2016. 9803

This work describes a formalization effort, using the Coq proof assistant, of fundamental results related to the classical theory of context-free grammars and languages. These include closure properties (union, concatenation and Kleene star), grammar simplification (elimination of useless symbols, inaccessible symbols, empty rules and unit rules), the existence of a Chomsky Normal Form for context-free grammars and the Pumping Lemma for context-free languages. The result is an important set of libraries covering the main results of context-free language theory, with more than 500 lemmas and theorems fully proved and checked. This is probably the most comprehensive formalization of classical context-free language theory done in the Coq proof assistant to date, and includes, notably, the formalization of the Pumping Lemma for context-free languages.

Cledou MG, Barbosa LS.  2016.  An Ontology for Licensing Public Transport Services. Proceedings of the 9th International Conference on Theory and Practice of Electronic Governance. :230–239.

By 2050 it is expected that 66% of the world population will reside in cities, compared to 54% in 2014. One particular challenge associated with urban population growth concerns transportation systems, and to address it, governments are investing significant efforts in enhancing public transport services. An important aspect of public transport is ensuring that the licensing of such services fulfils existing government regulations. Due to the differences in government regulations, and to the difficulties in ensuring the fulfillment of their specific features, many local governments develop tailored Information and Communication Technology (ICT) solutions to automate the licensing of public transport services. In this paper we propose an ontology for licensing such services following the REFSENO methodology. In particular, the ontology captures common concepts involved in the application and processing stages of licensing public bus passenger services. The main contribution of the proposed ontology is to define a common vocabulary to share knowledge between domain experts and software engineers, and to support the definition of a software product line for families of public transport licensing services.

Gonçalves RC, Pereira JO, Jimenez-Peris R.  2016.  An RDMA Middleware for Asynchronous Multi-stage Shuffling in Analytical Processing. DAIS '16: Proceedings of the 16th IFIP International Conference on Distributed Applications and Interoperable Systems.

A key component in large scale distributed analytical processing is shuffling, the distribution of data to multiple nodes such that the computation can be done in parallel. In this paper we describe the design and implementation of a communication middleware to support data shuffling for executing multi-stage analytical processing operations in parallel. The middleware relies on RDMA (Remote Direct Memory Access) to provide basic operations to asynchronously exchange data among multiple machines. Experimental results show that the RDMA-based middleware can reduce the cost of communication operations in parallel analytical processing tasks by 75%, when compared with a sockets-based middleware.