Vol 17, No 4 (2010)
Articles
7-16 474
Abstract
Elliptic Curve Cryptography (ECC) is widely known as a secure and reliable
cryptographic scheme. In many situations the original cryptographic algorithm is
modified to improve its efficiency with respect to properties, such as power or memory
consumption, that were not the focus of the original design. For all such
modifications it is crucial that the functionality and correctness of the original
algorithm are preserved. In particular, various projective coordinate systems are
applied in order to reduce the computational complexity of elliptic curve encryption
by avoiding division in finite fields. This work investigates the possibilities of
automated proofs of the correctness of different algorithmic variants. We introduce
the theorems required to prove the correctness of a modified algorithm
variant, as well as the lemmas and definitions necessary to prove these goals.
The correctness proof of the projective coordinate system transformation has been
carried out in practice with the help of the interactive formal verification system
XeriFun.
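The division-avoidance idea can be sketched in a few lines. The following is a toy illustration, not the paper's formalization: the curve y² = x³ + 2x + 3 over GF(97) and the base point (3, 6) are made-up parameters, and the doubling formulas are the standard projective ones, where x = X/Z and y = Y/Z.

```python
# Hypothetical toy parameters (not from the paper): curve y^2 = x^3 + 2x + 3
# over GF(97), point P = (3, 6).
p, a = 97, 2

def affine_double(x, y):
    # Affine doubling needs a modular inverse, i.e. a field division.
    lam = (3 * x * x + a) * pow(2 * y, -1, p) % p
    x3 = (lam * lam - 2 * x) % p
    return x3, (lam * (x - x3) - y) % p

def projective_double(X, Y, Z):
    # Standard projective doubling (x = X/Z, y = Y/Z): no inversion,
    # only field multiplications and additions.
    W = (a * Z * Z + 3 * X * X) % p
    S = Y * Z % p
    B = X * Y * S % p
    H = (W * W - 8 * B) % p
    X3 = 2 * H * S % p
    Y3 = (W * (4 * B - H) - 8 * Y * Y * S * S) % p
    Z3 = 8 * S * S * S % p
    return X3, Y3, Z3

X3, Y3, Z3 = projective_double(3, 6, 1)
zinv = pow(Z3, -1, p)          # the single division, deferred to the end
assert (X3 * zinv % p, Y3 * zinv % p) == affine_double(3, 6)
```

The correctness obligation verified in the paper is exactly the final assertion: converting the projective result back to affine coordinates agrees with the affine formulas.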
17-26 475
Abstract
This article discusses the problem of preparing test data for functional testing with
a defined level of test coverage. Application of the method simplifies the management
of the software project configuration, keeping requirements, code and tests in a consistent
state. A classification of software defects is presented in the article. An approach to
formalizing the analysis of the target system's code and requirements is proposed. This method
is based on representing equivalence class partitioning as logical equations. An
original method for solving these equations is also provided. The applicability of the method
in real industrial projects is discussed.
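The core idea — one equivalence class per truth-value assignment of the requirement clauses, with test data obtained by solving for a witness — can be sketched as follows. The clauses and domain here are invented for illustration and are not the paper's formalism.

```python
from itertools import product

# Hypothetical example (not from the paper): two requirement clauses
# split the input domain; each combination of their truth values is
# one equivalence class.
clauses = {
    "bulk":     lambda qty, vip: qty >= 100,
    "priority": lambda qty, vip: vip,
}

def find_representative(assignment, domain):
    # Solve the "logical equation": find an input on which every clause
    # takes the truth value demanded by this equivalence class.
    for qty, vip in domain:
        if all(f(qty, vip) == assignment[name] for name, f in clauses.items()):
            return qty, vip
    return None  # infeasible class: no test data needed

domain = [(q, v) for q in (1, 50, 100, 500) for v in (False, True)]
tests = []
for values in product([False, True], repeat=len(clauses)):
    assignment = dict(zip(clauses, values))
    rep = find_representative(assignment, domain)
    if rep is not None:
        tests.append((assignment, rep))

# One concrete input per feasible equivalence class.
for assignment, rep in tests:
    print(assignment, "->", rep)
```

Here brute-force search stands in for the paper's original equation-solving method; the output is one representative test input per feasible class.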
27-40 556
Abstract
The paper deals with conformance testing based on formal specifications. The concept
of safe testing was earlier proposed by the authors for trace-based conformance.
This concept is extended here to the case of (weak) simulation based on a relation between
specification and implementation states. A theory of safe simulation of systems with
refusals and destructions is proposed. The problems of complete testing and sufficient
conditions for the existence of a complete test suite are discussed. A practical algorithm
of complete testing for restricted classes of specifications and implementations is
described.
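For readers unfamiliar with simulation-based conformance, the basic relation can be computed as a greatest fixpoint. The sketch below checks plain strong simulation between two finite labelled transition systems; it is a minimal illustration, not the authors' theory of safe simulation with refusals and destructions.

```python
# Minimal sketch: spec state s simulates impl state i iff every
# labelled step of i can be matched by a step of s with the same label,
# leading again to related states.
def simulates(impl, spec, i0, s0):
    # impl/spec: dict mapping a state to a set of (label, next_state)
    pairs = {(i, s) for i in impl for s in spec}
    changed = True
    while changed:                      # greatest-fixpoint refinement
        changed = False
        for (i, s) in set(pairs):
            for (lbl, i2) in impl[i]:
                if not any(lbl == l2 and (i2, s2) in pairs
                           for (l2, s2) in spec[s]):
                    pairs.discard((i, s))
                    changed = True
                    break
    return (i0, s0) in pairs

# Toy systems: the spec additionally allows an "err" self-loop.
impl = {"a": {("req", "b")}, "b": {("ack", "a")}}
spec = {"x": {("req", "y"), ("err", "x")}, "y": {("ack", "x")}}
assert simulates(impl, spec, "a", "x")
```

The implementation is simulated by the specification, but not vice versa: the spec's "err" step has no match in the implementation.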
41-51 508
Abstract
Probabilistic systems of interacting nondeterministic intelligent agents are considered.
The states of the agents in these systems are probabilistic databases, and the
activity of the agents is controlled by probabilistic logic programs. Moreover, the
communication channels between agents are also probabilistic. We show how such systems
can be polynomially transformed to finite-state Markov decision processes. This allows
one to transfer the known results on verifying temporal properties of finite-state
Markov processes to probabilistic multi-agent systems of the considered type.
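The payoff of such a reduction is that standard MDP verification queries become available. The toy model below (invented for illustration, not from the paper) computes the maximum probability of eventually reaching a goal state by value iteration — a typical query over a finite-state MDP.

```python
# Hypothetical 3-state MDP: from state 0 an agent chooses action
# "safe" or "risky"; we compute the maximum probability of eventually
# reaching goal state 2.
mdp = {
    0: {"safe":  [(0.5, 0), (0.5, 2)],
        "risky": [(0.4, 1), (0.6, 2)]},
    1: {"stay":  [(1.0, 1)]},            # trap state: goal unreachable
    2: {"done":  [(1.0, 2)]},            # goal (absorbing)
}
goal = 2

prob = {s: (1.0 if s == goal else 0.0) for s in mdp}
for _ in range(200):                     # iterate to (near) convergence
    prob = {s: (1.0 if s == goal else
                max(sum(p * prob[t] for p, t in outcomes)
                    for outcomes in mdp[s].values()))
            for s in mdp}

print(round(prob[0], 3))
```

Although "risky" reaches the goal with probability 0.6 in one step, repeatedly choosing "safe" avoids the trap entirely, so the optimal reachability probability from state 0 converges to 1.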
52-59 541
Abstract
A new data structure is suggested for symbolic model checking of distributed systems
defined by linear functions of integer variables.
60-69 567
Abstract
The paper deals with an expressive logic language LF and its calculus. Formulas of
this language consist of large-block structural elements, such as type quantifiers.
The language LF contains only two logic symbols, "any" and "is", which form the set of logic
connectives of the language. A logic calculus JF and a complete strategy for automated
proof search based on a single unary rule of inference are considered. This calculus has a
number of other features that reduce the combinatorial complexity of finding
deductions in comparison with known systems for automated theorem proving,
such as the resolution method and the Gentzen calculus. Problems of an effective implementation of JF as a program system for automated theorem proving are considered.
70-77 548
Abstract
This paper introduces a method for static semantic analysis of source code at compilation
time, directly within standard compilers. The method is implemented via unified
integration with Java compilers to get full access to the Abstract Syntax Tree (AST)
of compiled files after the semantic analysis stage of the compilation process. The unified
integration is implemented by common AST interfaces and adapters to the AST implementations
of the Sun/Oracle javac and Eclipse Compiler for Java (ecj) compilers. This method
provides transparent integration with the Eclipse and NetBeans IDEs without the need for any
special plugins. Several examples of program verification rules are presented to demonstrate
the method.
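The shape of such a verification rule — walk a fully parsed AST and report nodes that violate a coding rule — can be shown in miniature. The sketch below uses Python's ast module purely as an analogy; the paper's actual integration targets the javac and ecj Java ASTs, and the rule itself is invented here.

```python
import ast

# Analogy only: a rule that flags an empty "except" block, the kind of
# AST-level check the paper's framework would run inside the compiler.
RULE = "empty except block swallows errors"

def check(source):
    violations = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.ExceptHandler)
                and len(node.body) == 1
                and isinstance(node.body[0], ast.Pass)):
            violations.append((node.lineno, RULE))
    return violations

code = """
try:
    risky()
except ValueError:
    pass
"""
print(check(code))   # one violation, reported at the handler's line
```

The key property mirrored from the paper is that the rule runs over the tree the compiler itself produced, so no separate parser is needed.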
78-87 656
Abstract
Adaptive symmetry reduction is a technique which exploits the similarity of components
in systems of regular structure. It helps to reduce the effect of state explosion
when exploring the reachable states of a system. It initially assumes perfect symmetry of
states and tracks symmetry violations on-the-fly by exploring an extended state space.
In this paper we show that the technique is applicable to LTL model checking as well.
To this end, we incorporate the operations over the extended state space into the
classic double depth-first search algorithm. The nested search is modified to detect a
feasible pseudo-cycle through an accepting state of the Büchi automaton instead of a cycle. We show
that the existence of a pseudo-cycle is equivalent to the satisfiability of an indexed LTL formula
on a model of the system. This result opens the way to implementing
adaptive symmetry reduction in an LTL model checker.
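The double depth-first search being modified is the classic accepting-cycle detection for Büchi automata. A minimal sketch of the unmodified algorithm (without the paper's extended state space or pseudo-cycles) looks like this:

```python
# Classic double (nested) DFS: the outer DFS explores the state space;
# from each accepting state, in post-order, a nested DFS hunts for a
# cycle back to that state.
def has_accepting_cycle(succ, init, accepting):
    outer_seen, inner_seen = set(), set()

    def inner(start, s):                 # nested DFS: cycle through start?
        for t in succ[s]:
            if t == start:
                return True
            if t not in inner_seen:
                inner_seen.add(t)
                if inner(start, t):
                    return True
        return False

    def outer(s):
        outer_seen.add(s)
        for t in succ[s]:
            if t not in outer_seen and outer(t):
                return True
        # post-order: launch the nested search from accepting states
        return s in accepting and inner(s, s)

    return outer(init)

succ = {0: [1], 1: [2], 2: [1]}          # 1 -> 2 -> 1 is a cycle
assert has_accepting_cycle(succ, 0, accepting={1})
assert not has_accepting_cycle(succ, 0, accepting={0})
```

Sharing `inner_seen` across nested searches is safe precisely because the nested search starts in post-order, which is why the paper can graft its pseudo-cycle test onto this structure.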
88-100 524
Abstract
This paper presents the extendable multi-language analysis and verification system
SPECTRUM, which is being developed within the framework of the SPECTRUM project.
The project's prospects are discussed using the example of C program verification.
The project aims at the development of a new integrated approach to program verification
which will allow the integration, unification and combination of program verification
techniques, together with the accumulation and reuse of knowledge about them. One of the
project's features is the use of the specialized executable specification language
Atoment. This language provides a unified format to represent both verification methods
and the data for them (program models, annotations, logic formulas). The C-targeted
component of the SPECTRUM system is based on our two-level C program verification
method. This method is a good illustration of the integrated approach, since
it provides complex C program verification that combines the operational, axiomatic and
transformational approaches.
101-110 600
Abstract
Deductive verification and synthesis of binary addition programs are performed on
the basis of the rules of program correctness for statements of the predicate programming
language P. The paper presents a sketch of the verification and synthesis of programs
for the ripple-carry, carry look-ahead and Ling adders. The correctness conditions of the
programs were translated into the specification language of the PVS verification system.
The time needed to prove the program correctness conditions in PVS exceeds the
time of ordinary programming by a factor of more than 10. However, for program synthesis,
the development of PVS theories and proofs is easier and faster than for program
verification.
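The flavour of the correctness condition for the simplest of these adders can be shown directly. The sketch below is an executable illustration, not the paper's predicate program or its PVS theory: a ripple-carry adder over bit lists, checked exhaustively against ordinary integer addition for 4-bit operands.

```python
# Toy ripple-carry adder; bit lists are least-significant-bit first.
def ripple_carry(a_bits, b_bits):
    out, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        out.append(a ^ b ^ carry)
        carry = (a & b) | (carry & (a ^ b))   # majority of a, b, carry
    return out + [carry]

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def to_int(bits):
    return sum(b << i for i, b in enumerate(bits))

# The correctness condition, checked exhaustively for 4-bit operands:
# the adder agrees with integer addition.
assert all(to_int(ripple_carry(to_bits(x, 4), to_bits(y, 4))) == x + y
           for x in range(16) for y in range(16))
```

A deductive proof, as in the paper, establishes this condition for every width at once, where exhaustive testing only covers one.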
111-124 566
Abstract
A verifying compiler is a computer program that translates programs written
by a human from a high-level language into equivalent executable programs and, in addition,
proves (verifies) mathematical statements specified by the human about the properties of the
programs being translated. The purpose of the F@BOOL@ project is to develop a
transparent-to-users, compact and portable verifying compiler F@BOOL@ for annotated
computational programs that uses effective and sound automatic SAT-solvers (i.e.
programs that check the satisfiability of propositional Boolean formulas in conjunctive
normal form) as the means of automatic validation of correctness conditions (instead of
semi-automatic proof techniques). The key idea is the Boolean representation of all data,
instead of Boolean abstraction or first-order representation. (This distinguishes
F@BOOL@ from SLAM.) Our project is aimed at the verification of functional
properties, and it assumes the generation of first-order verification conditions (from
invariants) and the validation/refutation of each verification condition using SAT-solvers
after a "conservative" translation of the verification conditions into Boolean form. During
the period from 2006 to 2009, the then-popular SAT-solver zChaff was used in the
F@BOOL@ project. The first three verification experiments exercised
with its help are: swapping the values of two variables, checking whether three
input values are the lengths of the sides of an equilateral or isosceles triangle, and
detecting a unique fake in a set of 15 coins. The paper presents a general outline of the
project and the details of the last (the most extensive) experiment.
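The Boolean-representation idea behind the first experiment can be illustrated in a few lines. This is a stand-in sketch, not F@BOOL@ itself: 2-bit values are represented as tuples of Booleans, and exhaustive enumeration over the Boolean variables plays the role a SAT-solver would play in refuting the negated verification condition.

```python
from itertools import product

# Toy illustration: the XOR-swap program  x ^= y; y ^= x; x ^= y
# over 2-bit data represented purely as Booleans.
def xor_swap(x, y):
    x = tuple(a ^ b for a, b in zip(x, y))
    y = tuple(a ^ b for a, b in zip(x, y))
    x = tuple(a ^ b for a, b in zip(x, y))
    return x, y

# Verification condition: for all Boolean inputs, the final (x, y)
# equals the initial (y, x).  A SAT-solver would refute its negation;
# brute force over the 4 Boolean variables stands in for it here.
vc_holds = all(xor_swap((x0, x1), (y0, y1)) == ((y0, y1), (x0, x1))
               for x0, x1, y0, y1 in product([0, 1], repeat=4))
assert vc_holds
```

Because the data are already Boolean, no abstraction is lost in the translation to a propositional formula, which is the project's point of contrast with SLAM.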
125-136 537
Abstract
We consider the problem of integrating a formal verification method (model checking)
into the process of designing complex distributed software systems in order to improve
software quality. We use an approach based on Model-Driven Engineering. It allows
us to structure the design process by selecting and verifying a system core that consists
of independent subsystems and is responsible for the logical management of the entire
system as a whole. The proposed method was tested on a real control system for vessel
power supply.
ISSN 1818-1015 (Print)
ISSN 2313-5417 (Online)