blockchain, conference

27th International Conference on Types for Proofs and Programs (TYPES 2021)

Publication year: 2022

This paper addresses the verification of Bitcoin smart contracts using the interactive theorem prover Agda. It focuses on two standard smart contracts that govern the distribution of Bitcoins, Pay to Public Key Hash (P2PKH) and Pay to Multisig (P2MS). Both are written in Bitcoin’s low-level language Script, and provide the security property of access control to the distribution of Bitcoins.

The paper introduces an operational semantics of the Script commands used in P2PKH and P2MS, and formalises it in the Agda proof assistant using Hoare triples. It advocates weakest preconditions in the context of Hoare triples as the appropriate notion for verifying access control. Two methodologies for obtaining human-readable weakest preconditions are discussed in order to close the validation gap between user requirements and formal specification of smart contracts: (1) a step-by-step approach, which works backwards instruction by instruction through a script, sometimes stepping over several instructions in one go; (2) symbolic execution of the code and translation into a nested case distinction, which allows one to read off weakest preconditions as the disjunction of accepting paths. A syntax for equational reasoning with Hoare triples is defined in order to formalise those approaches in Agda.
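The backward, step-by-step reading can be made concrete with a small stack-machine sketch — in Python rather than the paper's Agda formalisation, with `hash160` and `check_sig` as toy stand-ins for Bitcoin's actual cryptographic primitives (all names here are illustrative assumptions, not the paper's definitions):

```python
# Minimal sketch of evaluating the P2PKH locking script
#   OP_DUP OP_HASH160 <pubKeyHash> OP_EQUALVERIFY OP_CHECKSIG
# The weakest precondition can be read off by working backwards:
# the stack must hold [sig, pubkey] with hash160(pubkey) == pkh
# and check_sig(sig, pubkey).

def hash160(pubkey):
    return ("h", pubkey)              # toy hash; stands in for RIPEMD160(SHA256(.))

def check_sig(sig, pubkey):
    return sig == ("sig-by", pubkey)  # toy signature check

def run_p2pkh(stack, pkh):
    """Return True iff the script accepts; top of stack = last element."""
    if len(stack) < 2:
        return False
    stack = list(stack)
    stack.append(stack[-1])             # OP_DUP
    stack.append(hash160(stack.pop()))  # OP_HASH160
    stack.append(pkh)                   # push <pubKeyHash>
    if stack.pop() != stack.pop():      # OP_EQUALVERIFY
        return False
    pubkey, sig = stack.pop(), stack.pop()
    return check_sig(sig, pubkey)       # OP_CHECKSIG

alice = "alice-key"
assert run_p2pkh([("sig-by", alice), alice], hash160(alice))      # owner unlocks
assert not run_p2pkh([("sig-by", "eve"), alice], hash160(alice))  # wrong signature
```

Symbolically executing this interpreter along its single accepting path yields exactly the access-control condition in the comment as the weakest precondition.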

We consider pure equational theories, based on recursive definitions of their function symbols, that allow substitution but disallow induction, which we denote as PETS. We show that the Bounded Arithmetic theory S^1_2 proves the consistency of PETS. Our approach employs models for PETS based on approximate values, resembling notions from domain theory, in Bounded Arithmetic, which may be of independent interest.

journal, knowledge tech

Robotics and Computer-Integrated Manufacturing (2022), 74, 102281

Publication year: 2022

In the context of Industry 4.0, smart factories use advanced sensing and data analytic technologies to understand and monitor the manufacturing processes. To enhance production efficiency and reliability, statistical Artificial Intelligence (AI) technologies such as machine learning and data mining are used to detect and predict potential anomalies within manufacturing processes. However, due to the heterogeneous nature of industrial data, the knowledge extracted from industrial data is sometimes presented in a complex structure. This gives rise to the semantic gap issue: a lack of interoperability among different manufacturing systems. Furthermore, as Cyber-Physical Systems (CPS) are becoming more knowledge-intensive, uniform knowledge representation of physical resources and real-time reasoning capabilities for analytic tasks are needed to automate the decision-making processes for these systems. These requirements highlight the potential of using symbolic AI for predictive maintenance.

To automate and facilitate predictive analytics in Industry 4.0, in this paper we present a novel Knowledge-based System for Predictive Maintenance in Industry 4.0 (KSPMI). KSPMI is developed based on a novel hybrid approach that leverages both statistical and symbolic AI technologies. The hybrid approach uses statistical AI technologies such as machine learning and chronicle mining (a special type of sequential pattern mining) to extract machine degradation models from industrial data. Symbolic AI technologies, especially domain ontologies and logic rules, then use the extracted chronicle patterns to query and reason over system input data with rich domain and contextual knowledge. This hybrid approach uses Semantic Web Rule Language (SWRL) rules generated from chronicle patterns, together with domain ontologies, to perform ontology reasoning, which enables the automatic detection of machinery anomalies and the prediction of future events’ occurrence. KSPMI is evaluated and tested on both real-world and synthetic data sets.
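As a purely illustrative sketch of the chronicle idea (event names and time bounds are invented, not taken from KSPMI, and the SWRL rule is replaced by plain Python): a chronicle lists event types together with time-gap constraints between consecutive occurrences, and a match against the observed event log triggers a prediction.

```python
# A chronicle: each entry is (event_type, allowed gap to previous match),
# where the gap is an inclusive (min, max) interval or None for the first event.
# All names and bounds below are hypothetical examples.
chronicle = [("vibration_high", None),
             ("temp_spike", (0, 10)),
             ("pressure_drop", (0, 5))]

def matches(log, chronicle):
    """log: list of (event_type, timestamp), time-ordered.
    Greedy left-to-right match of the chronicle against the log."""
    prev_t, i = None, 0
    for ev, t in log:
        want, gap = chronicle[i]
        if ev == want and (gap is None or
                           (prev_t is not None and gap[0] <= t - prev_t <= gap[1])):
            prev_t, i = t, i + 1
            if i == len(chronicle):
                return True  # full degradation pattern observed
    return False

log = [("vibration_high", 3), ("temp_spike", 9), ("other", 10), ("pressure_drop", 12)]
assert matches(log, chronicle)  # pattern found -> a rule would flag a predicted failure
```

In KSPMI the consequent of such a match is expressed as a SWRL rule over the domain ontology rather than as Python code; the sketch only shows the temporal-pattern side.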

Following the trend of Industry 4.0, the business model of steel manufacturing is transforming from a historically inwardly focused supplier/customer relationship to one that embraces the wider end-to-end supply chain and improves productivity more holistically. However, the data and information required for supply chain planning and steelmaking process modelling are normally distributed over scattered sources across organisation boundaries and research communities. This leads to a major problem concerning semantic interoperability. To address this issue, this paper introduces a Common Reference Ontology for Steelmaking (CROS). CROS serves as a shared steelmaking resource and capability model that aims to facilitate knowledge modelling, knowledge sharing and information management. In contrast to most existing steelmaking ontologies, which merely focus on conceptual modelling, our work pays special attention to the real-world implementation and utilisation aspects of CROS. The functionality and usefulness of CROS are evaluated and tested on a real-world condition-based monitoring and maintenance task for cold rolling mills at Tata Steel in the United Kingdom.

conference, knowledge tech

Proceedings of the 4th IEEE International Conference on Industrial Cyber-Physical Systems (ICPS), pp. 31-38

Publication year: 2021

The Asset Administration Shell (AAS) is a fundamental concept in the Reference Architecture Model for Industry 4.0 (RAMI 4.0); it provides a virtual and digital representation of all information and functions of a physical asset in a manufacturing environment. Recently, Semantic AASs have emerged that add knowledge representation formalisms to enhance the digital representation of physical assets. In this paper, we provide a comprehensive survey of the scientific contributions to Semantic AASs that model the Information and Communication Layer within RAMI 4.0, and summarise and demonstrate their structure, communication, functionalities, and use cases. We also highlight the challenges for future development of Semantic AASs.

This paper introduces the Steel Cold Rolling Ontology (SCRO) to model and capture domain knowledge of cold rolling processes and activities within a steel plant. A case study is set up that uses real-world cold rolling data sets to validate the performance and functionality of SCRO. This includes using the Ontop framework to deploy virtual knowledge graphs for data access, data integration, data querying, and condition-based maintenance purposes. SCRO is evaluated using OOPS!, the ontology pitfall detection system, and feedback from domain experts from Tata Steel.

blockchain, conference

The 5th International Conference on Future Networks & Distributed Systems (pp. 795-802). ACM.

Publication year: 2021

We consider the challenge of determining which rights are available for licensing within the context of algorithmic licensing of movies using blockchain technology. We model the space of rights using set theory. We define algorithms within our modelling context that implement basic transactions for our use case, and argue for their correctness. We evaluate our algorithms through experiments within a multi-node Hyperledger Fabric network, and establish the feasibility of our algorithms for the intended use case.
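A toy sketch of the set-theoretic modelling (territories, windows, and the transaction name are hypothetical, not the paper's algorithms): if each right is a (territory, window) pair, then checking availability against already-granted exclusive licences is a plain set difference.

```python
# Rights as sets of (territory, window) pairs; all identifiers invented.
def grant(requested, granted_exclusive):
    """A basic licensing transaction: grant only the rights that no
    existing exclusive licence already covers, and record them."""
    newly = requested - granted_exclusive
    return newly, granted_exclusive | newly

granted = {("UK", "2022-Q1")}
newly, granted = grant({("UK", "2022-Q1"), ("DE", "2022-Q1")}, granted)
assert newly == {("DE", "2022-Q1")}   # UK/Q1 was already taken
assert ("DE", "2022-Q1") in granted   # ledger state updated
```

In the paper such transactions run as chaincode on a multi-node Hyperledger Fabric network; the sketch only illustrates the underlying set operations.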

Cyber Physical Trust Systems (CPTS) are Cyber Physical Systems and Internet of Things enriched with trust as an explicit, measurable, testable and verifiable system component. In this paper, we propose to use blockchain, a distributed ledger technology, as the trust-enabling system component for CPTS. We propose two schemes for CPTSs driven by blockchain, corresponding to two typical network model cases. We show that our proposed approach achieves security properties such as device identification, authentication, integrity, and non-repudiation, and provides protection against popular attacks such as replay and spoofing. We provide formal proofs of these properties using the Tamarin Prover tool. We describe the results of a proof-of-concept implementation of a CPTS driven by blockchain for physical asset management and present a performance analysis of our implementation. We identify use cases in which CPTSs driven by blockchain find applications.

journal

Annals of Pure and Applied Logic, Volume 170, Issue 10, pp. 1176-1187

Publication year: 2019

This paper studies the complexity of constant depth propositional proofs in the cedent and sequent calculus. We discuss the relationships between the size of tree-like proofs, the size of dag-like proofs, and the heights of proofs. The main result is to correct a proof construction in an earlier paper about transformations from proofs with polylogarithmic height and constantly many formulas per cedent.

blockchain, book chapter

In IoT Security (eds M. Liyanage, A. Braeken, P. Kumar and M. Ylianttila), Wiley, pp. 265-277.

Publication year: 2019

Cyber Physical Trust Systems (CPTS) are Cyber Physical Systems and Internet of Things enriched with trust as an explicit, measurable, testable system component. In this chapter, we propose to use blockchain technology as the trust-enabling system component for CPTS. Our proposed approach shows that a blockchain-based CPTS achieves the security properties of data authenticity, identity and integrity. We describe results of a testbed which implements a blockchain-based CPTS for physical asset management.

journal

Journal of Logic and Computation, Volume 28, Issue 6, September 2018, Pages 1125–1187.

Publication year: 2018

We introduce a system of Hyper Natural Deduction for Gödel Logic as an extension of Gentzen’s system of Natural Deduction. A deduction in this system consists of a finite set of derivations which uses the typical rules of Natural Deduction, plus additional rules providing means for communication between derivations. We show that our system is sound and complete for infinite-valued propositional Gödel Logic, by giving translations to and from Avron’s Hypersequent Calculus. We provide conversions for normalization extending usual conversions for Natural Deduction and prove the existence of normal forms for Hyper Natural Deduction for Gödel Logic. We show that normal deductions satisfy the subformula property.

The Cobham Recursive Set Functions (CRSF) provide an analogue of polynomial time computation which applies to arbitrary sets. We give three new equivalent characterizations of CRSF. The first is algebraic, using subset-bounded recursion and a form of Mostowski collapse. The second is our main result: the CRSF functions are shown to be precisely the functions computed by a class of uniform, infinitary, Boolean circuits. The third is in terms of a simple extension of the rudimentary functions by transitive closure and subset-bounded recursion.

conference

24th International Workshop on Logic, Language, Information, and Computation (WoLLIC), 2017. Pages: 31-47

Publication year: 2017

We define a new class of total search problems as a subclass of Megiddo and Papadimitriou’s class of total $\NP$ search problems, in which solutions are verifiable in $\AC^0$. We denote this class $\forall\exists\AC^0$. We show that all total $\NP$ search problems are equivalent, with respect to $\AC^0$-many-one reductions, to search problems in $\forall\exists\AC^0$. Furthermore, we show that $\forall\exists\AC^0$ contains well-known problems such as the Stable Marriage and the Maximal Independent Set problems. We introduce the class of Inflationary Iteration problems in $\forall\exists\AC^0$ and show that it characterizes the provably total $\NP$ search problems of the bounded arithmetic theory corresponding to polynomial time. Cook and Nguyen introduced a generic way of defining a bounded arithmetic theory $\VC$ for complexity classes $\C$ which can be obtained using a complete problem. For such $\C$ we define a new class $\KPT[\C]$ of $\forall\exists\AC^0$ search problems based on Student-Teacher games in which the student has computing power limited to $\AC^0$. We prove that $\KPT[\C]$ characterizes the provably total $\NP$ search problems of the bounded arithmetic theory corresponding to $\C$. All our characterizations are obtained via “new-style” witnessing theorems, where reductions are provable in a theory corresponding to $\AC^0$.

We study consistency search problems for Frege and extended Frege proofs, namely the NP search problems of finding syntactic errors in Frege and extended Frege proofs of contradictions. The input is a polynomial time function, or an oracle, describing a proof of a contradiction; the output is the location of a syntactic error in the proof. The consistency search problems for Frege and extended Frege systems are shown to be many-one complete for the provably total NP search problems of the second order bounded arithmetic theories U^1_2 and V^1_2, respectively.

We show that logics based on linear Kripke frames – with or without constant domains – that have a scattered end piece are not recursively enumerable. This is done by reduction to validity in all finite classical models.

feasible set fcts, journal

Lecture Notes Series, Institute for Mathematical Sciences, National University of Singapore: Volume 33, 2017, pp. 55-116

Publication year: 2017

This paper introduces bounded fragments of Kripke-Platek set theory which characterise the *Cobham Recursive Set Functions*.

edited volumes

12th Conference on Computability in Europe, CiE 2016, Paris, France, June 27 - July 1, 2016, Proceedings, LNCS 9709

Publication year: 2016

This paper introduces the *Cobham Recursive Set Functions* (CRSF) as a version of polynomial time computable functions on general sets, based on a limited (bounded) form of epsilon-recursion. This is inspired by Cobham’s classic definition of polynomial time functions based on limited recursion on notation. We introduce a new set composition function, and a new smash function for sets which allows polynomial increases in the ranks and in the cardinalities of transitive closures. We bootstrap CRSF, prove closure under (unbounded) replacement, and prove that any CRSF function is embeddable into a smash term. When restricted to natural encodings of binary strings as hereditarily finite sets, the CRSF functions define precisely the polynomial time computable functions on binary strings. Prior work of Beckmann, Buss and Friedman and of Arai introduced set functions based on safe-normal recursion in the sense of Bellantoni-Cook. We prove an equivalence between our class CRSF and a variant of Arai’s predicatively computable set functions.
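For orientation, Cobham's number-theoretic template — the scheme that CRSF transfers to sets via bounded epsilon-recursion — reads as follows (a standard textbook formulation, not quoted from the paper):

```latex
% Limited recursion on notation: f is admitted only with an explicit bound.
\begin{align*}
  f(\vec{x}, 0)      &= g(\vec{x}) \\
  f(\vec{x}, s_i(y)) &= h_i(\vec{x}, y, f(\vec{x}, y)) \qquad (i \in \{0,1\}) \\
  f(\vec{x}, y)      &\le k(\vec{x}, y)
\end{align*}
```

Here the bound $k$ is a term built using the smash function $x \# y = 2^{|x|\cdot|y|}$; the set-theoretic smash function introduced in the paper plays the analogous role of bounding ranks and cardinalities of transitive closures.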

We consider intermediate predicate logics defined by fixed well-ordered (or dually well-ordered) linear Kripke frames with constant domains where the order-type of the well-order is strictly smaller than $\omega^\omega$. We show that two such logics of different order-type are separated by a first-order sentence using only one monadic predicate symbol. Previous results by Minari, Takano and Ono, as well as the second author, obtained the same separation but relied on the use of predicate symbols of unbounded arity.

We introduce the *safe recursive set functions* based on a Bellantoni-Cook style subclass of the primitive recursive set functions. We show that the functions computed by safe recursive set functions under a list encoding of finite strings by hereditarily finite sets are exactly the polynomial growth rate functions computed by alternating exponential time Turing machines with polynomially many alternations. We also show that the functions computed by safe recursive set functions under a more efficient binary tree encoding of finite strings by hereditarily finite sets are exactly the quasipolynomial growth rate functions computed by alternating quasipolynomial time Turing machines with polylogarithmically many alternations.
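One concrete way to picture such encodings (a generic construction from von Neumann bits and Kuratowski pairs — not necessarily the exact encoding used in the paper) is the following list encoding of binary strings as hereditarily finite sets, sketched with Python frozensets:

```python
# Hereditarily finite sets modelled as (nested) frozensets.

def kpair(a, b):
    """Kuratowski ordered pair <a, b> = {{a}, {a, b}}."""
    return frozenset({frozenset({a}), frozenset({a, b})})

ZERO = frozenset()        # von Neumann 0 = {}
ONE = frozenset({ZERO})   # von Neumann 1 = {0}

def encode(bits):
    """Encode a string like "101" as a nested list <b0, <b1, <..., {}>>>."""
    s = frozenset()        # empty list encoded as {}
    for b in reversed(bits):
        s = kpair(ONE if b == "1" else ZERO, s)
    return s

assert encode("") == frozenset()
assert encode("1") == kpair(ONE, frozenset())
assert encode("10") != encode("01")   # encoding distinguishes strings
```

The binary-tree encoding mentioned above replaces this right-nested list by balanced nesting, which is what makes it more efficient for the quasipolynomial characterisation.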

We characterize the safe recursive set functions on arbitrary sets in definability-theoretic terms. In its strongest form, we show that a function on arbitrary sets is safe recursive if, and only if, it is uniformly definable in some polynomial level of a refinement of Jensen’s J-hierarchy, relativised to the transitive closure of the function’s arguments.

We observe that safe-recursive functions on infinite binary strings are equivalent to functions computed by so-called infinite-time Turing machines in time less than $\omega^\omega$. We also give a machine model for safe recursion which is based on set-indexed parallel processors and the natural bound on running times.

conference

30th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS), 2015. Pages: 547-558

Publication year: 2015

We introduce a Hyper Natural Deduction system as an extension of Gentzen’s Natural Deduction system. A Hyper Natural Deduction consists of a finite set of derivations which may use, besides typical Natural Deduction rules, additional rules providing means for communication between derivations. We show that our Hyper Natural Deduction system is sound and complete for infinite-valued propositional Gödel Logic, by giving translations to and from Avron’s Hypersequent Calculus. We also provide conversions for normalisation and prove the existence of normal forms for our Hyper Natural Deduction system.

edited volumes

11th Conference on Computability in Europe, CiE 2015, Bucharest, Romania, June 29-July 3, 2015. Proceedings, LNCS 9136

Publication year: 2015

special issues

special issue of the journal Logical Methods in Computer Science

Publication year: 2014

A propositional proof system is weakly automatizable if there is a polynomial time algorithm which separates satisfiable formulas from formulas which have a short refutation in the system, with respect to a given length bound. We show that if the resolution proof system is weakly automatizable, then parity games can be decided in polynomial time. We give simple proofs that the same holds for depth-1 propositional calculus (where resolution has depth 0) with respect to mean payoff and simple stochastic games. We define a new type of combinatorial game and prove that resolution is weakly automatizable if and only if one can separate, by a set decidable in polynomial time, the games in which the first player has a positional winning strategy from the games in which the second player has a positional winning strategy.

Our main technique is to show that a suitable weak bounded arithmetic theory proves that both players in a game cannot simultaneously have a winning strategy, and then to translate this proof into propositional form.

edited volumes

10th Conference on Computability in Europe, CiE 2014, Budapest, Hungary, June 23-27, 2014, Proceedings, LNCS 8493

Publication year: 2014

This paper concerns the second order systems U^1_2 and V^1_2 of bounded arithmetic, which have proof theoretic strengths corresponding to polynomial space and exponential time computation. We formulate improved witnessing theorems for these two theories by using S^1_2 as a base theory for proving the correctness of the polynomial space or exponential time witnessing functions. We develop the theory of nondeterministic polynomial space computation in U^1_2. Kołodziejczyk, Nguyen, and Thapen have introduced local improvement properties to characterize the provably total NP functions of these second order theories. We show that the strengths of their local improvement principles over U^1_2 and V^1_2 depend primarily on the topology of the underlying graph, not on the number of rounds in the local improvement games. The theory U^1_2 proves the local improvement principle for linear graphs even without restricting to logarithmically many rounds. The local improvement principle for grid graphs with only logarithmically many rounds is complete for the provably total NP search problems of V^1_2. Related results are obtained for local improvement principles with one improvement round, and for local improvement over rectangular grids.

conference

38th International Symposium on Mathematical Foundations of Computer Science, MFCS 2013, Klosterneuburg, Austria, August 26-30, 2013. LNCS 8087. Pages: 111-122

Publication year: 2013

A propositional proof system is weakly automatizable if there is a polynomial time algorithm which separates satisfiable formulas from formulas which have a short refutation in the system, with respect to a given length bound. We show that if the resolution proof system is weakly automatizable, then parity games can be decided in polynomial time. We also define a combinatorial game and prove that resolution is weakly automatizable if and only if one can separate, by a set decidable in polynomial time, the games in which the first player has a positional winning strategy from the games in which the second player has a positional winning strategy.

special issues

special issue of the Journal of Logic and Computation, Volume 23, Number 4

Publication year: 2013

This volume documents the presentations that were given as part of the research programme Semantics and Syntax: A Legacy of Alan Turing, held at the Isaac Newton Institute for the Mathematical Sciences in Cambridge, UK, 9 January – 6 July 2012.

conference

Hardware and Software: Verification and Testing. 8th International Haifa Verification Conference, HVC 2012, Haifa, Israel, November 6-8, 2012. Pages: 274-275

Publication year: 2012

We explore the support of automatic verification via careful design of a domain specific language (DSL) in the context of algebraic specification. Formally, a DSL is a loose specification whose logical closure we regard as implicitly encoded “domain knowledge”. We systematically exploit this “domain knowledge” for automatic verification. We illustrate these ideas within the Railway Domain using the algebraic specification language Casl and an existing DSL, designed by Bjørner, for modelling railways. Empirical evidence for the benefit of our approach is given in the form of the successful automatic verification of four railway track plans of real-world complexity.