Oliveira JN.  2014.  Relational Algebra for Just Good Enough Hardware. RAMiCS - 14th International Conference on Relational and Algebraic Methods in Computer Science. 8428:119-138.

Device miniaturization is pointing towards tolerating imperfect hardware provided it is “good enough”. Software design theories will have to face the impact of such a trend sooner or later.
A school of thought in software design is relational: it expresses specifications as relations and derives programs from specifications using relational algebra.
This paper proposes that linear algebra be adopted as an evolution of relational algebra able to cope with the quantification of the impact of imperfect hardware on (otherwise) reliable software.
The approach is illustrated by developing a monadic calculus for component oriented software construction with a probabilistic dimension quantifying (by linear algebra) the propagation of imperfect behaviour from lower to upper layers of software systems.
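The propagation idea can be hinted at with a toy sketch (not the paper's monadic calculus; all names are invented): each software layer is modeled as a probabilistic function returning a distribution over outputs, and Kleisli composition in the distribution monad multiplies and accumulates probabilities, quantifying how imperfect lower-layer behaviour surfaces at upper layers.

```python
# Toy sketch (invented names, not the paper's calculus): layers as
# probabilistic functions X -> Dist Y, composed Kleisli-style so that
# failure probabilities accumulate across layers.

def kleisli(f, g):
    """Compose two probabilistic functions in the distribution monad.
    f and g map a value to a dict {outcome: probability}."""
    def composed(x):
        out = {}
        for y, p in f(x).items():
            for z, q in g(y).items():
                out[z] = out.get(z, 0.0) + p * q
        return out
    return composed

# A "faulty increment" layer: correct with probability 0.99,
# leaves its input unchanged otherwise.
def faulty_inc(x):
    return {x + 1: 0.99, x: 0.01}

# Stacking two such layers propagates the imperfection upward:
# fully correct with probability about 0.98 only.
stack = kleisli(faulty_inc, faulty_inc)
result = stack(0)
```

The distribution `result` assigns roughly 0.98 to the correct answer 2, about 0.02 to the off-by-one outcome 1, and a residual mass to 0; the quantities compose by plain matrix-like multiplication, which is the point of the linear-algebra reading.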

Ferreira JF, Mendes A, Cunha A, Moreno CB, Silva P, Barbosa LS, Oliveira JN.  2011.  Logic Training through Algorithmic Problem Solving. Tools for Teaching Logic. 6680:62-69.

Although much of mathematics is algorithmic in nature, the skills needed to formulate and solve algorithmic problems do not form an integral part of mathematics education. In particular, logic, which is central to algorithm development, is rarely taught explicitly at pre-university level, under the justification that it is implicit in mathematics and therefore does not need to be taught as an independent topic. This paper argues in the opposite direction, describing a one-week workshop held at the University of Minho, in Portugal, whose goal was to introduce high-school students to calculational principles and techniques of algorithmic problem solving supported by calculational logic. The workshop resorted to recreational problems to convey the principles and to software tools, the Alloy Analyzer and NetLogo, to animate models.

Macedo H, Oliveira JN.  2010.  Matrices As Arrows! A Biproduct Approach to Typed Linear Algebra. 10th International Conference on Mathematics of Program Construction. 6120:271–287.

Motivated by the need to formalize generation of fast running code for linear algebra applications, we show how an index-free, calculational approach to matrix algebra can be developed by regarding matrices as morphisms of a category with biproducts. This shifts the traditional view of matrices as indexed structures to a type-level perspective analogous to that of the pointfree algebra of programming. The derivation of fusion, cancellation and abide laws from the biproduct equations makes it easy to calculate algorithms implementing matrix multiplication, the kernel operation of matrix algebra, ranging from its divide-and-conquer version to the conventional, iterative one. From errant attempts to learn how particular products and coproducts emerge from biproducts, we not only rediscovered block-wise matrix combinators but also found a way of addressing other operations calculationally, such as Gaussian elimination. A strategy for addressing vectorization along the same lines is also given.
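The divide-and-conquer product that the block-wise combinators support can be mimicked in miniature (a sketch only, not the paper's categorical derivation; helper names are invented): square matrices of size 2^k are split into quadrants, multiplied recursively, and recombined.

```python
# Sketch (invented helpers): block-wise divide-and-conquer matrix
# multiplication for square matrices of size 2^k, plain lists of rows.

def mat_mul(a, b):
    """Conventional, iterative matrix product (the base case)."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][t] * b[t][j] for t in range(k)) for j in range(m)]
            for i in range(n)]

def mat_add(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def split(m):
    """Split a matrix into its four quadrants (the block combinators)."""
    h = len(m) // 2
    return ([row[:h] for row in m[:h]], [row[h:] for row in m[:h]],
            [row[:h] for row in m[h:]], [row[h:] for row in m[h:]])

def join(tl, tr, bl, br):
    """Reassemble four quadrants into one matrix."""
    return [l + r for l, r in zip(tl, tr)] + [l + r for l, r in zip(bl, br)]

def dc_mul(a, b):
    """Divide-and-conquer product, recursing on quadrants."""
    if len(a) == 1:
        return [[a[0][0] * b[0][0]]]
    a11, a12, a21, a22 = split(a)
    b11, b12, b21, b22 = split(b)
    return join(
        mat_add(dc_mul(a11, b11), dc_mul(a12, b21)),
        mat_add(dc_mul(a11, b12), dc_mul(a12, b22)),
        mat_add(dc_mul(a21, b11), dc_mul(a22, b21)),
        mat_add(dc_mul(a21, b12), dc_mul(a22, b22)),
    )
```

Both versions agree on any 2^k square input; the paper's point is that this agreement falls out of the biproduct laws rather than index chasing.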

Oliveira JN.  2009.  Extended Static Checking by Calculation using the Pointfree Transform. LerNet ALFA Summer School. 5520:195–251.

The pointfree transform offers to the predicate calculus what the Laplace transform offers to the differential/integral calculus: the possibility of changing the underlying mathematical space so as to enable agile algebraic calculation. This paper addresses the foundations of the transform and its application to a calculational approach to extended static checking (ESC) in the context of abstract modeling. In particular, a calculus is given whose rules help in breaking the complexity of the proof obligations involved in static checking arguments. The close connection between such calculus and that of weakest pre-conditions makes it possible to use the latter in ESC proof obligation discharge, where pointfree notation is again used, this time to calculate with invariant properties to be maintained. A connection with the “everything is a relation” lemma of Alloy is established, showing how close to each other the pointfree and Alloy notations are. The main advantage of this connection is that of complementing pen-and-paper pointfree calculations with model checking support wherever sizable abstract models are to be validated.
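The flavour of the transform can be hinted at with a small invented example (not the paper's calculus): with relations as sets of pairs, the quantified pointwise formula "there exists b such that aRb and bSc" becomes plain membership in the composite R ; S, and "for all a, b: aRb implies aSb" becomes the inclusion R ⊆ S; no bound variables remain.

```python
# Invented toy example of the pointfree idea: relations as sets of
# pairs, with quantified formulas traded for algebraic operators.

def compose(r, s):
    """R ; S = {(a, c) | exists b. aRb and bSc}."""
    return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

def converse(r):
    """R converse: swap the two sides of every pair."""
    return {(b, a) for (a, b) in r}

def included(r, s):
    """R ⊆ S: the pointfree rendering of 'forall a,b. aRb => aSb'."""
    return r <= s

R = {(1, 'a'), (2, 'b')}
S = {('a', True), ('b', False)}
RS = compose(R, S)          # equals {(1, True), (2, False)}
```

The payoff claimed by the paper is that proof obligations phrased this way can be discharged by equational calculation instead of quantifier manipulation.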

Oliveira JN.  2008.  Transforming Data by Calculation. Generative and Transformational Techniques in Software Engineering 2007. 5235:134-195.

This paper addresses the foundations of data-model transformation. A catalog of data mappings is presented which includes abstraction and representation relations and associated constraints. These are justified in an algebraic style via the pointfree transform, a technique whereby predicates are lifted to binary relation terms (of the algebra of programming) in a two-level style encompassing both data and operations. This approach to data calculation, which also includes transformation of recursive data models into “flat” database schemes, is offered as an alternative to standard database design from abstract models. The calculus is also used to establish a link between the proposed transformational style and bidirectional lenses developed in the context of the classical view-update problem.
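One mapping of the flavour catalogued, representing a recursive (tree-shaped) model by a "flat" table, can be sketched as follows (a toy illustration, not the paper's calculus; the row format is invented): the representation flattens a tree into rows carrying parent references, and the abstraction rebuilds the tree, so abstraction after representation is the identity.

```python
# Toy sketch (invented row format): a tree is (label, [children]);
# its flat representation is a list of rows (id, parent_id, label).

def flatten(tree, rows=None, parent=None):
    """Representation: recursive model -> flat table."""
    if rows is None:
        rows = []
    node_id = len(rows)              # ids assigned in visit order
    label, children = tree
    rows.append((node_id, parent, label))
    for child in children:
        flatten(child, rows, node_id)
    return rows

def rebuild(rows):
    """Abstraction: flat table -> recursive model."""
    kids = {i: [] for (i, _, _) in rows}
    labels = {i: lab for (i, _, lab) in rows}
    root = None
    for (i, parent, _) in rows:
        if parent is None:
            root = i
        else:
            kids[parent].append(i)
    def build(i):
        return (labels[i], [build(c) for c in kids[i]])
    return build(root)

t = ('a', [('b', []), ('c', [('d', [])])])
assert rebuild(flatten(t)) == t      # abstraction . representation = id
```

The constraint that every non-root row points at an existing id is an instance of the "associated constraints" the catalog attaches to each mapping.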

Oliveira JN, Rodrigues C.  2006.  Pointfree Factorization of Operation Refinement. FM - Formal Methods 2006. 4085:236–251.

The standard operation refinement ordering is a kind of “meet of opposites”: non-determinism reduction suggests “smaller” behaviour while increase of definition suggests “larger” behaviour. Groves’ factorization of this ordering into two simpler relations, one per refinement concern, makes it more mathematically tractable but is far from fully exploited in the literature. We present a pointfree theory for this factorization which is more agile and calculational than the standard set-theoretic approach. In particular, we show that factorization leads to a simple proof of structural refinement for arbitrary parametric types and exploit factor instantiation across different subclasses of (relational) operation. The prospect of generalizing the factorization to coalgebraic refinement is discussed.
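The ordering being factorized can be shown concretely (a set-theoretic toy sketch, not the paper's pointfree theory; function names are invented): T refines S when T is at least as defined as S and, on S's domain, no more nondeterministic than S allows; one predicate per refinement concern.

```python
# Toy sketch (invented names) of the standard refinement ordering and
# its two Groves-style factors, with operations as sets of pairs.

def dom(r):
    return {a for (a, _) in r}

def more_defined(s, t):
    """Factor 1, increase of definition: dom S ⊆ dom T."""
    return dom(s) <= dom(t)

def less_nondet(s, t):
    """Factor 2, nondeterminism reduction: on S's domain,
    every output of T is one that S already allows."""
    return all((a, b) in s for (a, b) in t if a in dom(s))

def refines(s, t):
    """T refines S: the conjunction of the two factors."""
    return more_defined(s, t) and less_nondet(s, t)

S = {(1, 'a'), (1, 'b')}     # nondeterministic at 1, undefined at 2
T = {(1, 'a'), (2, 'c')}     # picks 'a', adds behaviour at 2
assert refines(S, T)
```

Keeping the two factors separate is what makes calculation tractable: each refinement step can be justified against one concern at a time.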

Cunha A, Oliveira JN, Visser J.  2006.  Type-safe two-level data transformation. FM 2006: Formal Methods. 4085:284-299.

A two-level data transformation consists of a type-level transformation of a data format coupled with value-level transformations of data instances corresponding to that format. Examples of two-level data transformations include XML schema evolution coupled with document migration, and data mappings used for interoperability and persistence.
We provide a formal treatment of two-level data transformations that is type-safe in the sense that the well-formedness of the value-level transformations with respect to the type-level transformation is guarded by a strong type system. We rely on various techniques for generic functional programming to implement the formalization in Haskell.
The formalization addresses various two-level transformation scenarios, covering fully automated as well as user-driven transformations, and allowing transformations that are information-preserving or not. In each case, two-level transformations are disciplined by one-step transformation rules and type-level transformations induce value-level transformations. We demonstrate an example hierarchical-relational mapping and subsequent migration of relational data induced by hierarchical format evolution.
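The coupling idea can be hinted at in a few lines (sketched here in Python rather than the paper's Haskell encoding; the Rule record and the unzip example are invented): a two-level rule packages the type-level step together with the value-level migrations it induces, so the two levels cannot drift apart.

```python
# Invented sketch of a two-level rule: a type-level step bundled with
# the value-level migrations it induces. Types appear as names only.
from typing import Any, Callable, NamedTuple

class Rule(NamedTuple):
    source: str                    # type-level: source format
    target: str                    # type-level: target format
    to: Callable[[Any], Any]       # value-level forward migration
    fro: Callable[[Any], Any]      # value-level backward migration

# Example rule: a list of pairs is represented column-wise by a pair
# of lists; the two migrations witness the isomorphism.
unzip_rule = Rule(
    source='[(a, b)]',
    target='([a], [b])',
    to=lambda xs: ([a for a, _ in xs], [b for _, b in xs]),
    fro=lambda p: list(zip(p[0], p[1])),
)

data = [(1, 'x'), (2, 'y')]
assert unzip_rule.fro(unzip_rule.to(data)) == data   # information-preserving
```

A composite transformation is then a chain of such one-step rules, with the composed migrations obtained by composing the `to` and `fro` fields in opposite orders.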

Cortes B, Oliveira JN.  2006.  Relational Sampling for Data Quality Auditing and Decision Support. Enterprise Information Systems V. :82–88.

This paper presents a strategy for applying sampling techniques to relational databases, in the context of data quality auditing or decision support processes. Fuzzy cluster sampling is used to survey sets of records for correctness of business rules. Relational algebra estimators are presented as a data-quality auditing tool.

Oliveira JN, Rodrigues C.  2004.  Transposing Relations: from Maybe Functions to Hash Tables. MPC - 7th International Conference on Mathematics of Program Construction. 3125:334-356.

Functional transposition is a technique for converting relations into functions aimed at developing the relational algebra via the algebra of functions. This paper attempts to develop a basis for generic transposition. Two instances of this construction are considered, one applicable to any relation and the other applicable to simple relations only. Our illustration of the usefulness of the generic transpose takes advantage of the free theorem of a polymorphic function. We show how to derive laws of relational combinators as free theorems of their transposes. Finally, we relate the topic of functional transposition with the hashing technique for efficient data representation.
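The two instances of transposition mentioned can be sketched concretely (toy code, not the paper's generic construction; names are invented): the power transpose turns any relation into a set-valued function, while the "Maybe" transpose turns a simple relation into a partial function; the hashing connection then reads the set-valued function as a table of buckets.

```python
# Invented sketch of the two transposes, with relations as sets of pairs.

def power_transpose(r):
    """Any relation R : A -> B becomes a function A -> P(B)."""
    table = {}
    for a, b in r:
        table.setdefault(a, set()).add(b)
    return lambda a: table.get(a, set())

def maybe_transpose(r):
    """A simple relation (at most one output per input) becomes a
    function A -> Maybe B, with None playing the role of Nothing."""
    table = {}
    for a, b in r:
        assert a not in table, "relation is not simple"
        table[a] = b
    return lambda a: table.get(a)

R = {(1, 'x'), (1, 'y'), (2, 'z')}       # not simple at 1
S = {(1, 'x'), (2, 'z')}                 # simple
lookup_R = power_transpose(R)            # bucket-style lookup
lookup_S = maybe_transpose(S)            # partial-function lookup
```

Read `power_transpose` over key collisions and the bucket analogy with hash tables is immediate, which is the direction the paper pushes in.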

Oliveira JN, Macedo H.  2017.  The Data Cube As a Typed Linear Algebra Operator. Proc. of the 16th Int. Symposium on Database Programming Languages. :6:1–6:11.
Oliveira JN.  2015.  Metaphorisms in Programming. RAMiCS - 15th International Conference on Relational and Algebraic Methods in Computer Science.

This paper introduces the metaphorism pattern of relational specification and addresses how specifications following this pattern can be refined into recursive programs. Metaphorisms express input-output relationships which preserve relevant information while at the same time some intended optimization takes place. Text processing, sorting, representation changers, etc., are examples of metaphorisms. The kind of metaphorism refinement proposed in this paper is a strategy known as change of virtual data structure. The paper gives sufficient conditions for such implementations to be calculated using relation algebra and illustrates the strategy with the derivation of quicksort as example.
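The quicksort example can be mimicked in miniature (a sketch of the "change of virtual data structure" idea, not the paper's relational calculation; names are invented): a binary search tree is interposed as the virtual intermediate structure, and fusing the two phases away yields the familiar recursive program.

```python
# Invented sketch: quicksort via a virtual binary search tree.
# The tree records the pivoting; it never needs to be materialized
# in the fused program, but showing it exposes the derivation's shape.

def build(xs):
    """Build the virtual BST: pivot at the root, partitions below."""
    if not xs:
        return None
    pivot, rest = xs[0], xs[1:]
    return (build([x for x in rest if x < pivot]),
            pivot,
            build([x for x in rest if x >= pivot]))

def flatten(t):
    """In-order flattening of the virtual tree gives the sorted list."""
    if t is None:
        return []
    left, pivot, right = t
    return flatten(left) + [pivot] + flatten(right)

def quicksort(xs):
    # Deforesting this composition (fusing flatten into build) yields
    # the usual recursive quicksort, with no tree left at run time.
    return flatten(build(xs))
```

The "metaphor" here is the tree: it carries exactly the information the specification preserves, while the optimization (sortedness) is what in-order flattening extracts.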

Pontes R, Matos M, Oliveira JN, Pereira JO.  2015.  Implementing a Linear Algebra Approach to Data Processing. Grand Timely Topics in Software Engineering - International Summer School GTTSE 2015, Braga, Portugal, August 23-29, 2015, Tutorial Lectures. 10223:215–222.
Cunha A, Oliveira JN, Macedo N, Pacheco H.  2013.  Composing least-change lenses. Proceedings of the First International Workshop on Bidirectional Transformations (BX 2012). 57:19.

Non-trivial bidirectional transformations (BXs) are inherently ambiguous, as there are in general many different ways to consistently translate an update from one side to the other. Existing BX languages and frameworks typically satisfy fundamental first principles which ensure acceptable and stable (well-behaved) translation. Unfortunately, these give little insight about how a particular update translation is chosen among the myriad possible. From the user perspective, such unpredictability may hinder the adoption of BX frameworks.
The problem can be remedied by imposing a “principle of least change” which, in a state-based framework, amounts to translating each update in a way such that its result is as close as possible to the original state, according to some distance measure.
Starting by formalizing such BXs in the particular framework of lenses, this paper discusses whether such least-change lenses can be defined by composition, an essential construct of BX frameworks. For sequential composition, two (dual) update translation alternatives are presented: a classical deterministic one and a nondeterministic one. A key ingredient of the approach is the elegant formalization of the main concepts in relation algebra, which exposes several similarities and dualities.
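The least-change principle itself can be illustrated on a toy state-based lens (an invented example, not the paper's formalization): among candidate sources consistent with the updated view, put returns one at minimal distance from the original source.

```python
# Invented toy lens: the view keeps the even entries of the source.
# put realizes the principle of least change over an explicit
# candidate space, which keeps the example finite.

def get(src):
    """View function: project out the even entries."""
    return [x for x in src if x % 2 == 0]

def distance(a, b):
    """A crude distance: length difference plus positional mismatches."""
    return abs(len(a) - len(b)) + sum(x != y for x, y in zip(a, b))

def put(src, view, candidates):
    """Least-change put: among candidates consistent with the new
    view, pick one closest to the original source."""
    consistent = [c for c in candidates if get(c) == view]
    return min(consistent, key=lambda c: distance(src, c))

src = [1, 2, 3, 4]
new_view = [2, 8]                      # the 4 in the view was edited to 8
candidates = [[2, 8], [1, 2, 3, 8], [1, 2, 8], [8, 2]]
best = put(src, new_view, candidates)  # keeps the odd entries intact
```

The well-behavedness laws alone would accept `[2, 8]` as a translation, discarding the odd entries; the distance measure is what rules it out, which is exactly the predictability gap the paper addresses.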

Oliveira JN.  2012.  Typed Linear Algebra for Weighted (Probabilistic) Automata. CIAA - 17th International Conference on Implementation and Application of Automata. 7381:52-65.

There is a need for a language able to reconcile the recent upsurge of interest in quantitative methods in the software sciences with logic and set theory that have been used for so many years in capturing the qualitative aspects of the same body of knowledge. Such a lingua franca should be typed, polymorphic, diagrammatic, calculational and easy to blend with traditional notation.
This paper puts forward typed linear algebra (LA) as a candidate notation for such a role. Typed LA emerges from regarding matrices as morphisms of suitable categories whereby traditional linear algebra is equipped with a type system.
In this paper we show typed LA at work in describing weighted (probabilistic) automata. Some attention is paid to the interface between the index-free language of matrix combinators and the corresponding index-wise notation, so as to blend with traditional set theoretic notation.
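The matrix view of weighted automata can be sketched in a few lines (a toy illustration of the general idea, not the paper's typed combinators; names are invented): each letter gets a transition matrix, and the weight of a word is the product of the initial vector, the letter matrices in sequence, and the final vector.

```python
# Invented sketch: a weighted automaton as one matrix per letter,
# with word weight computed by vector-matrix products.

def vec_mat(v, m):
    """Row vector times matrix, plain lists."""
    return [sum(v[i] * m[i][j] for i in range(len(v)))
            for j in range(len(m[0]))]

def weight(init, trans, final, word):
    """Weight of a word: init * M(a1) * ... * M(ak) * final."""
    v = init
    for a in word:
        v = vec_mat(v, trans[a])
    return sum(x * y for x, y in zip(v, final))

# Two-state probabilistic automaton over {a, b}; rows index the
# current state, columns the next state.
trans = {
    'a': [[0.5, 0.5], [0.0, 1.0]],
    'b': [[1.0, 0.0], [0.5, 0.5]],
}
init = [1.0, 0.0]        # start in state 0
final = [0.0, 1.0]       # accept in state 1
p = weight(init, trans, final, 'aa')   # probability of reading "aa"
```

Since every row of each letter matrix sums to 1, the matrices are stochastic and the automaton is probabilistic; dropping that constraint gives a general weighted automaton over the same scheme.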

Macedo H, Oliveira JN.  2012.  Towards Linear Algebras of Components. FACS - Formal Aspects of Component Software. 6921:300-303.

We show that there is significant foundational work on what could lead to a linear algebra approach to software components. Should these be weighted automata or machines in the sense of Bloom, we show that there is a constructive, fully typed approach to such matrix models. However abstract or primitive these may look to practitioners, they promise the calculational style which is the hallmark of engineering disciplines.