Publications

Machado P, Campos JC, Abreu R.  2013.  MZoltar: automatic debugging of Android applications. Proceedings of the 2013 International Workshop on Software Development Lifecycle for Mobile - DeMobile. :9–16.

Automated diagnosis of errors and/or failures detected during software testing can greatly improve the efficiency of the debugging process, and thus help to make applications more reliable. In this paper, we propose an approach, dubbed MZoltar, offering dynamic analysis (namely, spectrum-based fault localization) of mobile apps that produces a diagnostic report to help identify potential defects quickly. The approach also offers a graphical representation of the diagnostic report, making it easier to understand. Our experimental results show that the approach requires low runtime overhead (5.75% on average), while the tester needs to inspect 5 components (statements in this paper) on average to find the fault.
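To make the "spectrum-based" part concrete, the sketch below computes a suspiciousness score per program component from per-test coverage and pass/fail data, using the Ochiai coefficient, one similarity metric commonly used in this family of techniques. The class and method names are illustrative only; this is not MZoltar's actual API.

```java
/** Illustrative spectrum-based fault localization scoring (Ochiai coefficient). */
public class SpectrumScorer {

    /**
     * hits[t][c] is true if test t executed component c;
     * passed[t] is true if test t passed.
     * Returns one suspiciousness score per component.
     */
    public static double[] ochiai(boolean[][] hits, boolean[] passed) {
        int components = hits[0].length;
        int totalFailed = 0;
        for (boolean p : passed) if (!p) totalFailed++;

        double[] score = new double[components];
        for (int c = 0; c < components; c++) {
            int failedAndCovered = 0;   // failing tests that executed c
            int passedAndCovered = 0;   // passing tests that executed c
            for (int t = 0; t < hits.length; t++) {
                if (hits[t][c]) {
                    if (passed[t]) passedAndCovered++; else failedAndCovered++;
                }
            }
            int covered = failedAndCovered + passedAndCovered;
            double denom = Math.sqrt((double) totalFailed * covered);
            score[c] = denom == 0 ? 0.0 : failedAndCovered / denom;
        }
        return score;  // components are ranked by descending score for inspection
    }
}
```

Components are then ranked by descending score; the five components the tester needs to inspect on average corresponds to how far down such a ranking the real fault typically appears.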

Campos JC, Abreu R, Fraser G, d'Amorim M.  2013.  Entropy-based test generation for improved fault localization. IEEE/ACM 28th International Conference on Automated Software Engineering - ASE. :257–267.

Spectrum-based Bayesian reasoning can effectively rank candidate fault locations based on passing/failing test cases, but the diagnostic quality highly depends on the size and diversity of the underlying test suite. As test suites in practice often do not exhibit the necessary properties, we present a technique to extend existing test suites with new test cases that optimize the diagnostic quality. We apply probability theory concepts to guide test case generation using entropy, such that the amount of uncertainty in the diagnostic ranking is minimized. Our ENTBUG prototype extends the search-based test generation tool EVOSUITE to use entropy in the fitness function of its underlying genetic algorithm, and we applied it to seven real faults. Empirical results show that our approach reduces the entropy of the diagnostic ranking by 49% on average (compared to using the original test suite), leading to a 91% average reduction in the number of diagnosis candidates that need to be inspected to find the true fault.
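As an illustration of the entropy computation such a fitness function relies on, the sketch below computes the Shannon entropy of a diagnostic ranking from (unnormalized) candidate probabilities; a generator in the spirit of ENTBUG would favour test cases whose predicted execution lowers this value. The names are hypothetical, not ENTBUG's or EVOSUITE's code.

```java
/** Illustrative Shannon entropy of a diagnostic ranking. */
public class DiagnosticEntropy {

    /**
     * probabilities[i] is the (unnormalized) likelihood that candidate i
     * is the true fault. Lower entropy means a more conclusive diagnosis.
     */
    public static double entropy(double[] probabilities) {
        double sum = 0.0;
        for (double p : probabilities) sum += p;

        double h = 0.0;
        for (double p : probabilities) {
            if (p <= 0) continue;          // 0 * log(0) is treated as 0
            double q = p / sum;            // normalize to a probability distribution
            h -= q * (Math.log(q) / Math.log(2));
        }
        return h;  // in bits; a search-based generator would minimize this value
    }
}
```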

Gouveia C, Campos JC, Abreu R.  2013.  Using HTML5 visualizations in software fault localization. First IEEE Working Conference on Software Visualization - VISSOFT. :1–10.

Testing and debugging is the most expensive, error-prone phase in the software development life cycle. Automated software fault localization can drastically improve the efficiency of this phase, thus improving the overall quality of the software. Amongst the most well-known techniques, due to its efficiency and effectiveness, is spectrum-based fault localization. In this paper, we propose three dynamic graphical forms using HTML5 to display the diagnostic reports yielded by spectrum-based fault localization. The proposed visualizations, namely Sunburst, Vertical Partition, and Bubble Hierarchy, have been implemented within the GZOLTAR toolset, replacing the previous, less intuitive OpenGL-based visualizations. The GZOLTAR toolset is a plug-and-play plugin for the Eclipse IDE, to ease worldwide adoption. Finally, we performed a user study with GZOLTAR and confirmed that the visualizations help to drastically reduce the time needed in debugging (e.g., all participants using the visualizations were able to pinpoint the fault, whereas only 35% of those using traditional methods found it). The group that used the visualizations took on average 9 minutes and 17 seconds less than the group that did not use them.

Harrison M, Masci P, Campos JC, Curzon P.  2013.  Automated theorem proving for the systematic analysis of interactive systems. Proceedings of the 5th International Workshop on Formal Methods for Interactive Systems.

This paper describes the use of an automated theorem prover to analyse properties of interactive behaviour. It offers an alternative to model checking for the analysis of interactive systems. There are situations, for example when demonstrating safety, in which alternative complementary analyses provide assurance to the regulator. The rigour and detail offered by theorem proving makes it possible to explore features of the design of the interactive system, as modelled, beyond those that would be revealed using model checking. Theorem proving can also speed up proof in some circumstances. The paper illustrates how a theory generated as a basis for theorem proving (using PVS) was developed systematically from a MAL model used to model check the same properties. It also shows how the CTL properties used to check the original model can be translated into theorems.
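As a schematic illustration of the kind of translation involved, an invariance property established by a model checker corresponds, in a theorem prover such as PVS, to an induction over the model's transition relation; init, trans and φ below are generic placeholders rather than the paper's actual MAL theory.

```latex
% CTL model-checking obligation:  AG φ
% Corresponding inductive proof obligations over the transition relation:
\forall s.\; \mathit{init}(s) \Rightarrow \varphi(s)
  \qquad\text{(base case)}
\forall s, s'.\; \varphi(s) \land \mathit{trans}(s, s') \Rightarrow \varphi(s')
  \qquad\text{(inductive step)}
```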

Couto R, Ribeiro AN, Campos JC.  2012.  A Java based PSM/PIM and pattern inference approach. 35th Annual Software Engineering Workshop - SEW.

Due to the constant increase in the number of platforms and languages available to software developers, software development is reaching high levels of complexity. New techniques are needed to abstract away this underlying complexity. A solution to this problem was presented by the Object Management Group (OMG) with the specification of Model Driven Engineering (MDE). MDE bases its development process on the definition and transformation of models, specifically Computation Independent Models (CIM), Platform Independent Models (PIM) and Platform Specific Models (PSM). The Unified Modeling Language (UML) allows the creation of Platform Specific Models (PSM) and Platform Independent Models (PIM), or even more specific diagrams such as class diagrams. Some years before MDE appeared, Erich Gamma et al. catalogued a set of proven means of producing software. These means are called design patterns, and their importance has already been widely recognized. These patterns are useful not only in software development, but also in the software analysis process. Based on Java programs, this document demonstrates the feasibility of transforming source code into MDE models. The code is transformed into PIM and PSM diagrams, from which design patterns are inferred. A tool which implements these functionalities is specified. Implemented as a plugin, it maps the information onto a metamodel to obtain an intermediate representation of the information. Based on that representation it provides information abstraction, by transforming PSM into PIM models. Design pattern inference is possible due to the representation of the information contained in the metamodel as Prolog facts, which are the basis for the design pattern search. Being a reverse engineering process, it allows the process to be started from source code (and not from models, as prescribed by MDE).
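To give a flavour of the representation described above, the hypothetical sketch below emits Prolog-style facts (class/1, extends/2, calls/2) from a tiny class model; a pattern search could then query such facts to recognize structures like Composite. This illustrates the idea only and is not the plugin's actual code.

```java
import java.util.List;

/** Illustrative emission of Prolog-style facts from a tiny class model. */
public class FactEmitter {

    public record ClassInfo(String name, String superClass, List<String> callees) {}

    /** Prints facts such as "class(composite)." and "extends(composite, component)." */
    public static void emitFacts(List<ClassInfo> classes) {
        for (ClassInfo c : classes) {
            System.out.printf("class(%s).%n", c.name().toLowerCase());
            if (c.superClass() != null) {
                System.out.printf("extends(%s, %s).%n",
                        c.name().toLowerCase(), c.superClass().toLowerCase());
            }
            for (String callee : c.callees()) {
                System.out.printf("calls(%s, %s).%n",
                        c.name().toLowerCase(), callee.toLowerCase());
            }
        }
    }

    public static void main(String[] args) {
        emitFacts(List.of(
                new ClassInfo("Composite", "Component", List.of("Component")),
                new ClassInfo("Leaf", "Component", List.of())));
    }
}
```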

Silva CE, Campos JC.  2012.  Can GUI implementation markup languages be used for modelling? Human Centred Software Engineering - HCSE. 7623:112-129.

The current diversity of available devices and form factors increases the need for model-based techniques to support adapting applications from one device to another. Most work on user interface modelling is built around declarative markup languages. Markup languages play a relevant role, not only in the modelling of user interfaces, but also in their implementation. However, the languages used by each community (modellers/developers) have, to a great extent, evolved separately. This means that the step from concrete model to final interface becomes needlessly complicated, requiring either compilers or interpreters to bridge this gap. In this paper we compare a modelling language (UsiXML) with several markup implementation languages. We analyse whether it is feasible to use the implementation languages as modelling languages.

Harrison M, Campos JC, Masci P, Thomas N.  2012.  Modelling and systematic analysis of interactive systems. Proceedings of the Workshop on Formal Methods in Human-Machine Interaction (Formal H). :25-28.

Two aspects of our research concern the application of formal methods in human-computer interaction. The first aspect is the modelling and analysis of interactive devices, with a particular emphasis on the user-device dyad. The second is the modelling and analysis of ubiquitous systems where there are many users, one might say crowds of users. The common thread of both is to articulate and prove properties of interactive systems, to explore interactive behaviour as it influences the user, with a particular emphasis on interaction failure. The goal is to develop systematic techniques that can be packaged in such a way that they can be used effectively by developers. This “whitepaper” will briefly describe the two approaches and their potential value, as well as their limitations and development opportunities.

Silva JL, Campos JC, Harrison M.  2012.  Formal analysis of Ubiquitous Computing environments through the APEX framework. Symposium on Engineering Interactive Computing Systems - EICS. :131-140.

Ubiquitous computing (ubicomp) systems involve complex interactions between multiple devices and users. This complexity makes it difficult to establish whether: (1) observations made about use are truly representative of all possible interactions; (2) desirable characteristics of the system are true in all possible scenarios. To address these issues, techniques are needed that support an exhaustive analysis of a system's design. This paper demonstrates one such exhaustive analysis technique that supports the early evaluation of alternative designs for ubiquitous computing environments. The technique combines models of behavior within the environment with a virtual world that allows its simulation. The models support checking of properties based on patterns. These patterns help the analyst to generate and verify relevant properties. Where these properties fail, the scenarios suggested by the failure provide an important aid to redesign. The proposed technique uses APEX, a framework for rapid prototyping of ubiquitous environments based on Petri nets. The approach is illustrated through a smart library example. Its benefits and limitations are discussed.

Couto R, Ribeiro AN, Campos JC.  2012.  A Patterns Based Reverse Engineering Approach for Java Source Code. 35th Annual Software Engineering Workshop - SEW. :140-147.

The ever-increasing number of platforms and languages available to software developers means that the software industry is reaching high levels of complexity. Model Driven Architecture (MDA) presents a solution to the problem of improving software development processes in this changing and complex environment. MDA-driven development is based on the definition and transformation of models. Design patterns provide a means to reuse proven solutions during development. Identifying design patterns in the models of an MDA approach helps not only their understanding, but also the identification of good practices during analysis. However, when analyzing or maintaining code that has not been developed according to MDA principles, or that has been changed independently from the models, the need arises to reverse engineer the models from the code prior to pattern identification. The approach presented herein consists in transforming source code into models, and inferring design patterns from these models. Erich Gamma's catalogue of patterns provides a starting point for the pattern inference process. MapIt, the tool which implements these functionalities, is described.

Couto R, Ribeiro AN, Campos JC.  2012.  MapIt: A Model Based Pattern Recovery Tool. 8th International Workshop on Model-based Methodologies for Pervasive and Embedded Software - MOMPES. 7706.

Design patterns provide a means to reuse proven solutions during development, but also to identify good practices during analysis. These are particularly relevant in complex and critical software, such as ubiquitous and pervasive systems. Model Driven Engineering (MDE) presents a solution to this problem through the use of high-level models. As part of an effort to develop approaches to the migration of applications to mobile contexts, this paper reports on a tool that identifies design patterns in source code. Code is transformed into both platform-specific and platform-independent models, and from these, design patterns are inferred. MapIt, the tool which implements these functionalities, is described.

Campos JC, Mendes S.  2011.  FlexiXML - A portable user interface rendering engine for UsiXML. User Interface Extensible Markup Language - UsiXML. :158-168.

A considerable amount of effort in software development is dedicated to the user interaction layer. Given the complexity inherent to the development of this layer, it is important to be able to analyse the concepts and ideas being used in the development of a given user interface. This analysis should be performed as early as possible. Model-based user interface development provides a solution to this problem by providing developers with tools that enable both modelling, and reasoning about, user interfaces at different levels of abstraction. Of particular interest here is the possibility of animating the models to generate actual user interfaces. This paper describes FlexiXML, a tool that performs the rendering and animation of user interfaces described in the UsiXML modelling language.

Campos JC, Machado J.  2011.  Supporting requirements formulation in software formal verification. Latin-American Symposium on Dependable Computing - LADC - supplemental proceedings.

Formal verification tools such as model checkers have reached a stage where their applicability in the development process of dependable and safety critical systems has become viable. While the formal verification step in tools such as model checkers is fully automated, writing appropriate models and properties is a skillful process. In particular, a correct understanding of the logics used to express properties is needed to guarantee that properties correctly encode the original requirements. In this paper we illustrate how a patterns-based tool can help simplify the process of generating logical formulae from informally expressed requirements.
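For example, in the widely used property specification patterns of Dwyer et al., the engineer selects a pattern and a scope and the tool produces the corresponding temporal logic formula, such as the CTL renderings below, rather than writing the formula by hand (the atomic propositions P and S stand for requirement-specific conditions):

```latex
% "Globally, S responds to P"   (response pattern, global scope)
\mathbf{AG}\,\bigl(P \Rightarrow \mathbf{AF}\,S\bigr)

% "Globally, P never occurs"    (absence pattern, global scope)
\mathbf{AG}\,\neg P
```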

Machado J, Campos JC.  2011.  Partial Plant Models in Formal Verification of Industrial Automation Discrete Systems. Latin-American Symposium on Dependable Computing - LADC - supplemental proceedings.

The use of a plant model for the formal verification of industrial automation systems controllers is realistic because all automation systems are composed of a controller and a plant. Therefore, if the plant model is not used, there is a part of the system that is not considered. However, while there are cases where the use of a plant model makes the formal verification results more realistic and robust, there are other cases where this does not hold. Moreover, the discussion presented in this paper concerns the need to use, not the complete plant model, but partial plant models, in order to facilitate the formal verification of Industrial Automation Discrete Event Systems Controllers.

Barbosa A, Paiva A, Campos JC.  2011.  Test case generation from mutated task models. Proceedings of the 3rd Symposium on Engineering Interactive Computing Systems - EICS. :175-184.

This paper describes an approach to the model-based testing of graphical user interfaces from task models. Starting from a task model of the system under test, oracles are generated whose behaviour is compared with the execution of the running system. The use of task models means that the effort of producing the test oracles is reduced. It does also mean, however, that the oracles are confined to the set of expected user behaviours for the system. The paper focuses on solving this problem. It shows how task mutations can be generated automatically, enabling a broader range of user behaviours to be considered. A tool, based on a classification of user errors, generates these mutations. A number of examples illustrate the approach.
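A minimal sketch of what such mutations might look like for a linear task sequence is given below; the operators (omission, repetition, adjacent swap) follow a generic classification of user errors and are not necessarily the paper's exact operators.

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative mutations of a task sequence based on common user error types. */
public class TaskMutator {

    /** Omission: drop the i-th task (the user skips a step). */
    public static List<String> omit(List<String> tasks, int i) {
        List<String> m = new ArrayList<>(tasks);
        m.remove(i);
        return m;
    }

    /** Repetition: perform the i-th task twice (the user repeats a step). */
    public static List<String> repeat(List<String> tasks, int i) {
        List<String> m = new ArrayList<>(tasks);
        m.add(i, tasks.get(i));
        return m;
    }

    /** Order error: swap the i-th task with its successor. */
    public static List<String> swapWithNext(List<String> tasks, int i) {
        List<String> m = new ArrayList<>(tasks);
        String tmp = m.get(i);
        m.set(i, m.get(i + 1));
        m.set(i + 1, tmp);
        return m;
    }
}
```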

Freire L, Arezes P, Campos JC.  2011.  A importância das avaliações qualitativas em sistemas E-learning [The importance of qualitative evaluations in E-learning systems]. Occupational Safety and Hygiene - SHO. :274-278.

Research on the usability of E-learning systems is increasingly common. From this research it seems clear that a large part of it points to the need for a critical discussion of usability evaluation methods specific to educational systems, namely regarding the importance of focusing on methods targeted at the main users. Meanwhile, it also seems urgent and fundamental to investigate how ergonomic analyses have been applied, as well as the adaptations such usability methods require, so as to define evaluations that are more coherent with the nature of the system. Thus, based on the observation of E-learning systems, this paper presents a literature review of the main forms of usability evaluation in educational systems. As a result of this review, Ergonomics studies lead us to consider the possibility of analysing usability from a more human point of view, one that takes into account the influence that the user's background may have on the interaction with the system and, consequently, which usability attributes will be most significant for a given user profile. In this sense, the present study argues, through an overview of the state of the art, for the need and importance of the researcher having a precise notion of the instruments at their disposal, in order to ascertain what data the user may provide and how relevant the results obtained are for introducing improvements into the system. Throughout the paper, qualitative usability evaluation methods are discussed, based on experiences with academic and/or commercial E-learning systems involving users from different countries and cultures. That said, this paper aims to portray a paradigm shift in evaluation that has been observed in recent years, both in the area of ergonomics and in the area of educational systems design.