Journal Articles

Silva JMC, Carvalho P, Lima SR.  2017.  A Modular Traffic Sampling Architecture: Bringing Versatility and Efficiency to Massive Traffic Analysis. Journal of Network and Systems Management. :1–26.
Bernardeschi C, Domenici A, Masci P.  2017.  A PVS-Simulink Integrated Environment for Model-Based Analysis of Cyber-Physical Systems. IEEE Transactions on Software Engineering. (In Press) https://doi.org/10.1109/TSE.2017.2694423

This paper presents a methodology, with supporting tool, for formal modeling and analysis of software components in cyber-physical systems. Using our approach, developers can run an integrated simulation of logic-based specifications of software components and Simulink models of continuous processes. The integrated simulation is useful to validate the characteristics and functionalities of discrete system components early in the development process. The same logic-based specifications can also be formally verified using the Prototype Verification System (PVS), to gain additional confidence that the software design is compliant with specific safety requirements. Modeling patterns are defined for generating the logic-based specifications from the more familiar automata-based formalism. The ultimate aim of this work is to facilitate the introduction of formal verification technologies in the software development process of cyber-physical systems, which typically requires the integrated use of different formalisms and tools. A case study from the medical domain is used throughout the paper to illustrate the approach. A PVS model of a pacemaker is interfaced with a Simulink model of the human heart. The overall cyber-physical system is co-simulated to validate design requirements through exploration of relevant test scenarios. Formal verification with the PVS theorem prover is demonstrated for the pacemaker model for specific safety aspects of the pacemaker design.

Harrison MD, Masci P, Campos JC, Curzon P.  2017.  Verification of User Interface Software: the Example of Use-Related Safety Requirements and Programmable Medical Devices. IEEE Transactions on Human-Machine Systems.

One part of demonstrating that a device is acceptably safe, often required by regulatory standards, is to show that it satisfies a set of requirements known to mitigate hazards. This paper is concerned with how to demonstrate that a user interface software design is compliant with use-related safety requirements. A methodology is presented based on the use of formal methods technologies to provide guidance to developers about addressing three key verification challenges: (i) how to validate a model, and show that it is a faithful representation of the device; (ii) how to formalize requirements given in natural language, and demonstrate the benefits of the formalization process; (iii) how to prove requirements of a model using readily available formal verification tools. A model of a commercial device is used throughout the paper to demonstrate the methodology. A representative set of requirements is considered. They are based on US Food and Drug Administration (FDA) draft documentation for programmable medical devices, and on best practice in user interface design illustrated in relevant international standards. The methodology aims to demonstrate how to achieve the FDA's agenda of using formal methods to support the approval process for medical devices.

Campos JC, Sousa M, Alves M, Harrison M.  2016.  Formal Verification of a Space System's User Interface with the IVY workbench. IEEE Transactions on Human-Machine Systems. 46(2):303-316.

This paper describes the application of the IVY workbench to the formal analysis of a user interface for a safety-critical aerospace system. The operation manual of the system was used as a requirement document, and this made it possible to build a reference model of the user interface, focusing on navigation between displays, the information provided by each display, and how they are interrelated. Usability-related property specification patterns were then used to derive relevant properties for verification. This paper discusses both the modeling strategy and the analytical results found using the IVY workbench. The purpose of the reference model is to provide a standard against which future versions of the interface may be assessed.

Gonçalves RC, Batory D, Sobral JL.  2016.  ReFlO: an interactive tool for pipe-and-filter domain specification and program generation. Software and Systems Modeling. 15(2):377-395.

ReFlO is a framework and interactive tool to record and systematize domain knowledge used by experts to derive complex pipe-and-filter (PnF) applications. Domain knowledge is encoded as transformations that alter PnF graphs by refinement (adding more details), flattening (removing modular boundaries), and optimization (substituting inefficient PnF graphs with more efficient ones). All three kinds of transformations arise in reverse-engineering legacy PnF applications. We present the conceptual foundation and tool capabilities of ReFlO, illustrate how parallel PnF applications are designed and generated, and how domain-specific libraries of transformations are developed.
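
As a rough illustration of the kind of transformation ReFlO systematizes (the encoding and names below are ours, not the tool's), a pipe-and-filter graph can be modelled as an algebraic term, with optimization as bottom-up rewriting of subgraphs:

```haskell
-- A hypothetical term encoding of pipe-and-filter graphs: primitive
-- filters composed in sequence or in parallel.
data PnF = Filter String | Seq PnF PnF | Par PnF PnF
  deriving (Eq, Show)

-- An optimization is a rewrite: substitute an inefficient subgraph by an
-- equivalent, more efficient one, applied bottom-up over the whole term.
optimize :: [(PnF, PnF)] -> PnF -> PnF
optimize rules = go
  where
    go (Seq a b) = apply (Seq (go a) (go b))
    go (Par a b) = apply (Par (go a) (go b))
    go f         = apply f
    apply g      = maybe g id (lookup g rules)

-- Example: fuse two adjacent map filters into a single, cheaper one.
exampleGraph :: PnF
exampleGraph =
  optimize [(Seq (Filter "mapF") (Filter "mapG"), Filter "mapGF")]
           (Seq (Filter "read") (Seq (Filter "mapF") (Filter "mapG")))
-- exampleGraph == Seq (Filter "read") (Filter "mapGF")
```

Refinement and flattening fit the same mold, with rules that add detail or erase modular boundaries rather than swap subgraphs.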

Jongmans S-STQ, Clarke D, Proença J.  2016.  A procedure for splitting data-aware processes and its application to coordination. Science of Computer Programming. 115-116:47-78.

Paulo J, Pereira JO.  2016.  Efficient Deduplication in a Distributed Primary Storage Infrastructure. ACM Transactions on Storage (TOS). 12(4):35.
Lourenço CB, Frade MJ, Pinto JS.  2016.  A Single-Assignment Translation for Annotated Programs. CoRR. abs/1601.00584

We present a translation of While programs annotated with loop invariants into a dynamic single-assignment language with a dedicated iterating construct. We prove that the translation is sound and complete. This is a companion report to our paper Formalizing Single-assignment Program Verification: an Adaptation-complete Approach.
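
For intuition, here is the straight-line core of a single-assignment translation, in schematic notation of our own (the paper's While language and dedicated iterating construct differ): each assignment targets a fresh version of the variable, so every later read refers unambiguously to exactly one definition.

```latex
x := x + 1;\ y := x \times 2
\quad\rightsquigarrow\quad
x_1 := x_0 + 1;\ y_1 := x_1 \times 2
```

The paper's contribution is, roughly, the treatment of loops: the dedicated iterating construct carries the loop invariant and introduces fresh versions for the variables modified by the loop body.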

Machado N, Quinta D, Lucia B, Rodrigues L.  2016.  Concurrency Debugging with Differential Schedule Projections. ACM Transactions on Software Engineering and Methodology (TOSEM). 25(2)
Macedo N, Jorge T, Cunha A.  2016.  A feature-based classification of model repair approaches. IEEE Transactions on Software Engineering.
Almeida JB, Barbosa M, Pacheco H, Pereira V.  2016.  A Tool-Chain for High-Assurance Cryptographic Software. ERCIM News. 2016

Cryptography is an inherently interdisciplinary area and the development of high-quality cryptographic software is a time-consuming task drawing on skills from mathematics, computer science and electrical engineering, only achievable by highly skilled programmers. The challenge is to map high-level cryptographic specifications phrased using mathematical abstractions into efficient implementations at the level of C or assembly that can be deployed on a target computational platform, whilst adhering to the specification both in terms of correctness and security. The High Assurance Software Laboratory at INESC-TEC maintains a domain-specific toolchain for the specification, implementation and verification of cryptographic software centred on CAO, a cryptography analyses and operations-aware language.

Fontaínhas EG, Andrade F, Almeida JB.  2016.  Do consentimento para a utilização de testemunhos de conexão (cookies). SCIENTIA IVRIDICA. 65(341):173-206.

Article 5(3) of the ePrivacy Directive (Directive 2002/58/EC) sets out the requirements for storing, and gaining access to, information stored in the terminal equipment of a user or subscriber. This rule applies to the use of cookies, understood in the sense of the definition given in RFC 6265 of the Internet Engineering Task Force (IETF).

In its original version, the ePrivacy Directive allowed the use of electronic communications networks to store information, or to gain access to information stored, in the terminal equipment of a subscriber or user, on condition that the subscriber or user was provided with clear and comprehensive information, in particular about the purposes of the processing, and, cumulatively, was guaranteed the right to refuse such processing (the right of opt-out). In 2009, the Citizens' Rights Directive (Directive 2009/136/EC) reworded Article 5(3) of the ePrivacy Directive, making the use of cookies conditional on the prior consent of the person concerned (the right of opt-in).

The new consent requirement has unsettled established practice around the use of cookies and underlies a heated debate fuelled by doubts about its interpretation and about the conditions of its practical implementation.

With this work we seek to contribute to clarifying the concepts of cookie and of consent as the legitimising basis for their use.

Ramos MVM, Almeida JB, Moreira N, de Queiroz RJGB.  2016.  Formalization of the Pumping Lemma for Context-Free Languages. Journal of Formalized Reasoning. 9(2):53-68.

Context-free languages are highly important in computer language processing technology as well as in formal language theory. The Pumping Lemma is a property that is valid for all context-free languages, and is used to show the existence of non-context-free languages. This paper presents a formalization, using the Coq proof assistant, of the Pumping Lemma for context-free languages.
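
For reference, the property being formalized, in its classical form:

```latex
\forall L\ \text{context-free}.\ \exists n \geq 1.\ \forall z \in L.\
|z| \geq n \implies \exists u, v, w, x, y.\
z = uvwxy \,\wedge\, |vx| \geq 1 \,\wedge\, |vwx| \leq n \,\wedge\,
\forall i \geq 0.\ u v^{i} w x^{i} y \in L
```

A language is shown not to be context-free by exhibiting, for every candidate bound n, a string of length at least n that admits no such decomposition.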

Madeira A, Neves R, Martins MA.  2016.  An exercise on the generation of many-valued dynamic logics. J. Log. Algebr. Meth. Program. 85:1011–1037.

Neves R, Barbosa LS, Hofmann D, Martins MA.  2016.  Continuity as a computational effect. J. Log. Algebr. Meth. Program. 85:1057–1085.

The original purpose of component-based development was to provide techniques to master complex software, through composition, reuse and parametrisation. However, such systems are rapidly moving towards a level in which software becomes prevalently intertwined with (continuous) physical processes. A possible way to accommodate the latter in component calculi relies on a suitable encoding of continuous behaviour as (yet another) computational effect. This paper introduces such an encoding through a monad which, in the compositional development of hybrid systems, may play a role similar to the one played by 1+, powerset, and distribution monads in the characterisation of partial, nondeterministic and probabilistic components, respectively. This monad and its Kleisli category provide a universe in which the effects of continuity over (different forms of) composition can be suitably studied.
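
A minimal Haskell sketch of the construction, in the set-theoretic flavour only (the names `Traj`, `dur`, `at` are ours): a computation returns a trajectory, i.e. a duration together with a value for each instant, and sequential composition runs the continuation's trajectory from the point where the first one stops.

```haskell
-- A trajectory: a duration d >= 0 and a position for each instant in [0, d].
data Traj a = Traj { dur :: Double, at :: Double -> a }

instance Functor Traj where
  fmap f (Traj d g) = Traj d (f . g)

instance Applicative Traj where
  pure x    = Traj 0 (const x)        -- the instantaneous trajectory at x
  tf <*> tx = tf >>= \f -> fmap f tx

instance Monad Traj where
  -- Run the first trajectory; from its endpoint, run the continuation,
  -- concatenating the two durations.
  Traj d g >>= k = Traj (d + dur (k (g d))) step
    where step t | t <= d    = at (k (g t)) 0
                 | otherwise = at (k (g d)) (t - d)
```

For instance, `Traj 1 id >>= \p -> Traj 1 (\t -> p + t)` is the two-second trajectory moving at unit speed: the first second traces `id`, the second continues from endpoint 1.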

Neves R, Madeira A, Martins MA, Barbosa LS.  2016.  Proof theory for hybrid(ised) logics. Sci. Comput. Program. 126:73–93.

Hybridisation is a systematic process along which the characteristic features of hybrid logic, both at the syntactic and the semantic levels, are developed on top of an arbitrary logic framed as an institution. In a series of papers this process has been detailed and taken as a basis for a specification methodology for reconfigurable systems. The present paper extends this work by showing how a proof calculus (in both a Hilbert and a tableau based format) for the hybridised version of a logic can be systematically generated from a proof calculus for the latter. Such developments provide the basis for a complete proof theory for hybrid(ised) logics, and thus pave the way to the development of (dedicated) proof support. Highlights: decidability results for hybrid(ised) logics; generation of Hilbert calculi for hybrid(ised) logics; generation of tableau systems for hybrid(ised) logics.
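
For orientation, the hybrid machinery being generated is the standard one: nominals $i$ name individual states, and satisfaction operators $@_i$ evaluate a formula at the named state. A typical semantic clause and two standard Hilbert axioms (textbook hybrid logic, not the paper's generated calculus for an arbitrary base institution):

```latex
M, w \models @_{i}\,\varphi \;\iff\; M, s_{i} \models \varphi
\quad\text{where } s_{i} \text{ is the unique state named by } i

(K_{@}):\ @_{i}(\varphi \to \psi) \to (@_{i}\varphi \to @_{i}\psi)
\qquad
(\mathit{SelfDual}):\ @_{i}\varphi \leftrightarrow \lnot\, @_{i}\lnot \varphi
```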

Madeira A, Neves R, Barbosa LS, Martins MA.  2016.  A method for rigorous design of reconfigurable systems. Sci. Comput. Program. 132:50–76.

Reconfigurability, understood as the ability of a system to behave differently in different modes of operation and commute between them along its lifetime, is a cross-cutting concern in modern Software Engineering. This paper introduces a specification method for reconfigurable software based on a global transition structure to capture the system's reconfiguration space, and a local specification of each operation mode in whatever logic (equational, first-order, partial, fuzzy, probabilistic, etc.) is found expressive enough for handling its requirements. In the method these two levels are not only made explicit and juxtaposed, but formally interrelated. The key to achieve such a goal is a systematic process of hybridisation of logics through which the relationship between the local and global levels of a specification becomes internalised in the logic itself.

Silva JMC, Carvalho P, Lima SR.  2016.  Inside packet sampling techniques: exploring modularity to enhance network measurements. International Journal of Communication Systems.

Almeida PS, Moreno CB, Farach-Colton M, Jesus P, Mosteiro MA.  2016.  Fault-tolerant aggregation: Flow-Updating meets Mass-Distribution. Distributed Computing.

Flow-Updating (FU) is a fault-tolerant technique that has proved to be efficient in practice for the distributed computation of aggregate functions in communication networks where individual processors do not have access to global information. Previous distributed aggregation protocols, based on repeated sharing of input values (or mass) among processors, sometimes called Mass-Distribution (MD) protocols, are not resilient to communication failures (or message loss) because such failures yield a loss of mass. In this paper, we present a protocol which we call Mass-Distribution with Flow-Updating (MDFU). We obtain MDFU by applying FU techniques to classic MD. We analyze the convergence time of MDFU showing that stochastic message loss produces low overhead. This is the first convergence proof of an FU-based algorithm. We evaluate MDFU experimentally, comparing it with previous MD and FU protocols, and verifying the behavior predicted by the analysis. Finally, given that MDFU incurs a fixed deviation proportional to the message-loss rate, we adjust the accuracy of MDFU heuristically in a new protocol called MDFU with Linear Prediction (MDFU-LP). The evaluation shows that both MDFU and MDFU-LP behave very well in practice, even under high rates of message loss and even changing the input values dynamically.
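
To make the flow mechanism concrete, below is a loss-free synchronous reconstruction of plain Flow-Updating for the average, in our own simplified form (MDFU itself and the handling of message loss are not modelled; all names are ours). Each node's estimate is its input plus the flows it has received; every round a node restores flow symmetry, averages its estimate with its neighbours' reported ones, and re-adjusts its outgoing flows towards that average. Because flows are overwritten rather than accumulated, lost messages merely leave the previous flow in place, which is why mass is conserved.

```haskell
import qualified Data.Map.Strict as Map

type Node = Int
-- Messages in flight, keyed by (sender, receiver): the flow the sender
-- pushes along that edge, and the sender's current estimate.
type Msgs = Map.Map (Node, Node) (Double, Double)

-- One synchronous, loss-free round: every node reads the last message from
-- each neighbour, symmetrises flows (f_ij := -f_ji), averages the estimates,
-- and adjusts its outgoing flows so that all parties would estimate 'a'.
step :: Map.Map Node [Node] -> Map.Map Node Double -> Msgs -> Msgs
step nbrs inputs msgs = Map.fromList
  [ ((i, j), (negate fji + (a - ej), a))
  | (i, js) <- Map.toList nbrs
  , let recv = [ (j, msgs Map.! (j, i)) | j <- js ]
        ei   = inputs Map.! i + sum [ fji | (_, (fji, _)) <- recv ]
        a    = (ei + sum [ e | (_, (_, e)) <- recv ])
               / fromIntegral (length js + 1)
  , (j, (fji, ej)) <- recv
  ]

-- A path 1 - 2 - 3 with inputs 0, 10, 20; all estimates converge to 10.
demo :: Int -> Map.Map Node Double
demo k = estimates (iterate (step nbrs inputs) msgs0 !! k)
  where
    nbrs   = Map.fromList [(1, [2]), (2, [1, 3]), (3, [2])]
    inputs = Map.fromList [(1, 0), (2, 10), (3, 20)]
    msgs0  = Map.fromList [ ((i, j), (0, inputs Map.! i))
                          | (i, js) <- Map.toList nbrs, j <- js ]
    estimates m = Map.mapWithKey
      (\i js -> inputs Map.! i + sum [ fst (m Map.! (j, i)) | j <- js ]) nbrs
```

Here `demo 0` returns the raw inputs, and `demo k` approaches the true average 10 at every node as k grows (on this path it stabilises within a couple of rounds).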

Cunha A, Garis A, Riesco D.  2015.  Translating between Alloy Specifications and UML Class Diagrams Annotated with OCL. Software and Systems Modeling. 14(1):5-25.

Model-driven engineering (MDE) is a software engineering approach based on model transformations at different abstraction levels. It prescribes the development of software by successively transforming the models from abstract (specifications) to more concrete ones (code). Alloy is an increasingly popular lightweight formal specification language that supports automatic verification. Unfortunately, its widespread industrial adoption is hampered by the lack of an ecosystem of MDE tools, namely code generators. This paper presents a model transformation from Alloy to UML class diagrams annotated with OCL (UML+OCL) and shows how an existing transformation from UML+OCL to Alloy can be improved to handle dynamic issues. The proposed bidirectional transformation enables a smooth integration of Alloy in the current MDE contexts, by allowing UML+OCL specifications to be transformed to Alloy for validation and verification, to correct and possibly refine them inside Alloy, and to translate them back to UML+OCL for sharing with stakeholders or to reuse current model-driven architecture tools to refine them toward code.

Macedo H, Oliveira JN.  2015.  A linear algebra approach to OLAP. Formal Aspects of Computing. 27(2):283-307.

Inspired by the relational algebra of data processing, this paper addresses the foundations of data analytical processing from a linear algebra perspective. The paper investigates, in particular, how aggregation operations such as cross tabulations and data cubes essential to quantitative analysis of data can be expressed solely in terms of matrix multiplication, transposition and the Khatri–Rao variant of the Kronecker product. The approach offers a basis for deriving an algebraic theory of data consolidation, handling the quantitative as well as qualitative sides of data science in a natural, elegant and typed way. It also shows potential for parallel analytical processing, as the parallelization theory of such matrix operations is well acknowledged.
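
As a small worked instance of the idea, in our notation (not necessarily the paper's): let $P_A$ be the Boolean projection matrix of attribute $A$, with $P_A[a,r] = 1$ exactly when record $r$ takes value $a$, and let $m$ be the column of a numeric measure. Then:

```latex
\mathrm{crosstab}(A,B) \;=\; P_A \cdot P_B^{\top}
\qquad\qquad
\mathrm{crosstab}_{m}(A,B) \;=\; P_A \cdot \mathrm{diag}(m) \cdot P_B^{\top}
```

Entry $(a,b)$ of the first product counts the records with $A = a$ and $B = b$; the second sums the measure $m$ over those records instead of counting them.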

Cunha J, Fernandes JP, Mendes J, Saraiva JA.  2015.  Embedding, Evolution, and Validation of Spreadsheet Models in Spreadsheet Systems. IEEE Transactions on Software Engineering. 41(3):241-263.

This paper proposes and validates a model-driven software engineering technique for spreadsheets. The technique that we envision builds on the embedding of spreadsheet models under a widely used spreadsheet system. This means that we enable the creation and evolution of spreadsheet models under a spreadsheet system. More precisely, we embed ClassSheets, a visual language with a syntax similar to the one offered by common spreadsheets, that was created with the aim of specifying spreadsheets. Our embedding allows models and their conforming instances to be developed under the same environment. In practice, this convenient environment enhances evolution steps at the model level while the corresponding instance is automatically co-evolved. Finally, we have designed and conducted an empirical study with human users in order to assess our technique in production environments. The results of this study are promising and suggest that productivity gains are realizable under our model-driven spreadsheet development setting.