The Formal Analysis, Algorithm for System Theories (FAST) Group at the Centre for Advanced Studies develops and applies mathematics and logic to the design and analysis of algorithms and complex computational systems. We are especially interested in bringing the clarity and insight of formal theories to hard application problems of real practical significance.
As the complexity of information systems and the cost of potential failures increase, it becomes ever more important to prove that the design of critical system components is correct. Logical errors in computer hardware and software can have significant economic and societal impact, while errors in the embedded systems increasingly used in safety-critical applications such as drive-by-wire and implantable medical devices can lead to loss of human life. One typical solution to the challenge of provably correct design is the application of formal methods.
Mathematically precise formal models allow the unambiguous specification of requirements and the construction of designs; formal verification allows design decisions to be checked and design properties to be proved; and verified models enable automated software synthesis. The subject provides an overview of the formal background needed for the elaboration and analysis of formal models of IT components and systems: the modelling paradigms, the widely used formal modelling languages, and the related automated verification and validation techniques.
• Automata theory and algorithms
Automata theory is the study of abstract machines and automata, as well as the computational problems that can be solved using them. It is a theory in theoretical computer science and discrete mathematics (a subject of study in both mathematics and computer science). The word automata (the plural of automaton) comes from the Greek word αὐτόματα, which means "self-acting". A well-known type of automaton is the finite-state machine, which consists of states (often drawn as circles) and transitions (drawn as arrows). As the automaton sees a symbol of input, it makes a transition (or jump) to another state, according to its transition function, which takes the current state and the most recent symbol as its inputs. In mathematics and computer science, an algorithm is an unambiguous specification of how to solve a class of problems. Algorithms can perform calculation, data processing, and automated reasoning tasks. As an effective method, an algorithm can be expressed within a finite amount of space and time and in a well-defined formal language for calculating a function. Starting from an initial state and initial input (perhaps empty), the instructions describe a computation that, when executed, proceeds through a finite number of well-defined successive states, eventually producing "output" and terminating at a final ending state.
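The transition-function description above can be sketched directly as a small deterministic finite automaton (DFA). The particular language chosen here, binary strings with an even number of 1s, is an illustrative assumption, not an example taken from the text:

```python
# A minimal DFA sketch: a transition table, a start state, and a set of
# accepting states. The "even number of 1s" language is illustrative.

def run_dfa(transitions, start, accepting, word):
    """Feed each input symbol to the transition function and report
    whether the final state is accepting."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# DFA accepting binary strings that contain an even number of 1s.
transitions = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

print(run_dfa(transitions, "even", {"even"}, "1011"))  # → False (three 1s)
print(run_dfa(transitions, "even", {"even"}, "1001"))  # → True (two 1s)
```

Each `(state, symbol)` pair maps to exactly one successor state, which is precisely the transition function described above.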
• Modeling and analysis of complex and reactive systems
Modeling and Analysis of Complex Systems introduces students to the mathematical/computational modeling and analysis developed in the emerging interdisciplinary field of complex systems science. Complex systems are systems made of a large number of microscopic components interacting with each other in nontrivial ways. Many real-world systems can be understood as complex systems, where critically important information resides in the relationships between the parts and not necessarily within the parts themselves. For reactive (complex) systems such as flexible manufacturing systems (FMS), appropriate specification and modeling tools are essential: tools that can facilitate, besides traditional quantitative analysis, qualitative analysis involving complex and/or dynamic data and relationships. The ideal tools have to support model simulation, validation, verification, and redesign, which are crucial in the development and advancement of such systems.
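As a toy sketch of how local interactions between many simple components produce global behaviour, the one-dimensional cellular automaton below updates each cell from its two neighbours. The choice of Rule 90 and the fixed zero boundaries are assumptions made purely for illustration:

```python
# Toy complex system: a one-dimensional cellular automaton. Each cell
# updates from its neighbours only, yet global patterns emerge.
# Rule 90 (XOR of the two neighbours) is an illustrative choice.

def step(cells):
    """One synchronous update: each cell becomes the XOR of its two
    neighbours, with fixed zero boundaries."""
    n = len(cells)
    return [
        (cells[i - 1] if i > 0 else 0) ^ (cells[i + 1] if i < n - 1 else 0)
        for i in range(n)
    ]

cells = [0] * 7 + [1] + [0] * 7   # single active cell in the middle
for _ in range(4):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Running it prints the beginning of a Sierpinski-triangle pattern, a small example of structure "residing in the relationships between the parts".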
• Programming language semantics and type systems
In programming language theory, semantics is the field concerned with the rigorous mathematical study of the meaning of programming languages. It does so by evaluating the meaning of syntactically valid strings defined by a specific programming language, showing the computation involved. Evaluating a syntactically invalid string, by contrast, yields no computation. Semantics describes the processes a computer follows when executing a program in that specific language. This can be shown by describing the relationship between the input and output of a program, or by explaining how the program will be executed on a certain platform, hence creating a model of computation. Type systems have become enormously important in language design because they can be used to formalize the definition of a language’s data types and their proper usage in programs. Type systems are often associated with syntax, especially for languages whose programs are type checked at compile time. For these languages, a type system is a definitional extension that imposes specific syntactic constraints (such as the requirement that all variables referenced in a program be declared) that cannot be expressed in BNF or EBNF. For languages whose programs are type checked at run time, a type system can be viewed as part of the language’s semantics. Thus, a language’s type system forms a bridge between syntax and semantics, and can properly be viewed in either realm.
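How a type system and a semantics fit together can be sketched with a miniature expression language. The tuple-encoded syntax and the two types, int and bool, are hypothetical, invented only for this example:

```python
# A miniature language with a type checker (static discipline) and a
# big-step evaluator (semantics). Expressions are tuples:
# ("lit", v), ("add", e1, e2), ("less", e1, e2), ("if", c, t, e).

def typecheck(e):
    tag = e[0]
    if tag == "lit":
        return "bool" if isinstance(e[1], bool) else "int"
    if tag == "add":
        if typecheck(e[1]) == typecheck(e[2]) == "int":
            return "int"
        raise TypeError("add expects two ints")
    if tag == "less":
        if typecheck(e[1]) == typecheck(e[2]) == "int":
            return "bool"
        raise TypeError("less expects two ints")
    if tag == "if":
        if typecheck(e[1]) != "bool":
            raise TypeError("condition must be bool")
        t = typecheck(e[2])
        if t != typecheck(e[3]):
            raise TypeError("branches must have the same type")
        return t

def evaluate(e):
    tag = e[0]
    if tag == "lit":
        return e[1]
    if tag == "add":
        return evaluate(e[1]) + evaluate(e[2])
    if tag == "less":
        return evaluate(e[1]) < evaluate(e[2])
    if tag == "if":
        return evaluate(e[2]) if evaluate(e[1]) else evaluate(e[3])

prog = ("if", ("less", ("lit", 1), ("lit", 2)), ("lit", 10), ("lit", 20))
assert typecheck(prog) == "int"   # well-typed programs...
print(evaluate(prog))             # ...evaluate without error → 10
```

The constraint "both branches of an `if` must have the same type" is exactly the kind of rule the text says cannot be expressed in BNF or EBNF.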
• Process algebras
In computer science, the process calculi (or process algebras) are a diverse family of related approaches for formally modelling concurrent systems. Process calculi provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent agents or processes. They also provide algebraic laws that allow process descriptions to be manipulated and analyzed, and permit formal reasoning about equivalences between processes (e.g., using bisimulation). Leading examples of process calculi include CSP, CCS, ACP, and LOTOS. More recent additions to the family include the π-calculus, the ambient calculus, PEPA, the fusion calculus, and the join-calculus. Our formal methods group aims to help hardware and software engineers build more reliable systems through process calculus. The term "process algebra" was coined in 1982 by Bergstra & Klop, for whom a process algebra was a structure in the sense of universal algebra satisfying a particular set of axioms. More broadly, process algebra is an algebraic approach to the study of concurrent processes: its tools are algebraic languages for the specification of processes and the formulation of statements about them, together with calculi for the verification of these statements.
The most popular books on process algebra are:
1. R. Milner (1989): Communication and Concurrency, Prentice Hall
2. C.A.R. Hoare (1985): Communicating Sequential Processes, Prentice Hall
3. M. Hennessy (1988): Algebraic Theory of Processes, MIT Press
4. R. Milner (1999): Communicating and Mobile Systems: The pi-calculus, Cambridge University Press
5. M. Hennessy (2007): A Distributed Pi-Calculus, Cambridge University Press
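The bisimulation equivalence mentioned above can be checked naively on small labelled transition systems. The fixed-point sketch below, and the classic CCS-style pair a.(b + c) versus a.b + a.c, are illustrative assumptions rather than a production tool:

```python
# Naive strong-bisimulation check over labelled transition systems.
# Start from the full relation and repeatedly discard pairs that
# violate the transfer property, until a fixed point is reached.

def bisimilar(trans, p, q):
    """trans maps each state to a set of (label, successor) pairs."""
    states = set(trans)
    rel = {(a, b) for a in states for b in states}

    def ok(a, b, current):
        # Every move of a must be matched by some equally labelled
        # move of b leading to a related successor.
        return all(any(l2 == l1 and (s1, s2) in current
                       for (l2, s2) in trans[b])
                   for (l1, s1) in trans[a])

    changed = True
    while changed:
        changed = False
        for (a, b) in list(rel):
            if not (ok(a, b, rel) and ok(b, a, rel)):
                rel.discard((a, b))
                changed = True
    return (p, q) in rel

# P = a.(b.0 + c.0) versus Q = a.b.0 + a.c.0 -- trace-equivalent
# but famously NOT bisimilar: Q commits to b or c too early.
trans = {
    "P": {("a", "P1")}, "P1": {("b", "0"), ("c", "0")},
    "Q": {("a", "Q1"), ("a", "Q2")}, "Q1": {("b", "0")}, "Q2": {("c", "0")},
    "0": set(),
}
print(bisimilar(trans, "P", "Q"))  # → False
print(bisimilar(trans, "P", "P"))  # → True
```

Real tools use far more efficient partition-refinement algorithms; this sketch only makes the definition concrete.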
• Telecommunications software and protocol analysis
Telecommunications networks have evolved from basic wireline services to highly complex systems involving a wide range of technologies and delivering a host of services. Telecommunications software must keep pace with the ongoing evolution of architectures and technologies and provide the flexibility to support the unique and demanding engineering and field workflows of very large service providers. A protocol analyzer is a tool (hardware or software) used to capture and analyze signals and data traffic over a communication channel. Such a channel may range from a local computer bus to a satellite link, and it provides a means of communication using a standard communication protocol (networked or point-to-point). Each type of communication protocol has a different tool to collect and analyze its signals and data.
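At its core, a software protocol analyzer decodes captured bytes against a protocol's field layout. The 8-byte header format below (version, flags, length, sequence number) is hypothetical, invented only for illustration:

```python
# Sketch of the decoding step of a protocol analyzer: unpack raw
# captured bytes according to a (hypothetical) header layout.

import struct

# version:1 byte, flags:1 byte, length:2 bytes, seq:4 bytes, big-endian.
HEADER = struct.Struct("!BBHI")

def decode_header(frame):
    version, flags, length, seq = HEADER.unpack_from(frame)
    return {"version": version, "flags": flags,
            "length": length, "seq": seq,
            "payload": frame[HEADER.size:HEADER.size + length]}

frame = HEADER.pack(1, 0x02, 5, 42) + b"hello"
print(decode_header(frame))
# → {'version': 1, 'flags': 2, 'length': 5, 'seq': 42, 'payload': b'hello'}
```

A real analyzer does this per protocol layer (link, network, transport) and adds capture, filtering, and timing analysis on top.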
• Computational biology
Computational biology involves the development and application of data-analytical and theoretical methods, mathematical modeling, and computational simulation techniques to the study of biological, ecological, behavioral, and social systems. The field is broadly defined and includes foundations in biology, applied mathematics, statistics, biochemistry, chemistry, biophysics, molecular biology, genetics, genomics, computer science, and evolution. Computational biology is different from biological computing, which is a subfield of computer science and computer engineering using bioengineering and biology to build computers, but is similar to bioinformatics, which is an interdisciplinary science using computers to store and process biological data.
• Constraint programming
In computer science, constraint programming is a programming paradigm wherein relations between variables are stated in the form of constraints. Constraints differ from the common primitives of imperative programming languages in that they do not specify a step or sequence of steps to execute, but rather the properties of a solution to be found. This makes constraint programming a form of declarative programming. The constraints used in constraint programming are of various kinds: those used in constraint satisfaction problems (e.g. "A or B is true"), linear inequalities (e.g. "x ≤ 5"), and others. Constraints are usually embedded within a programming language or provided via separate software libraries.
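The "properties of a solution, not steps to execute" idea can be made concrete with a minimal backtracking solver. Both the solver and the small integer instance below are illustrative assumptions; real constraint solvers add propagation and much smarter search:

```python
# Minimal constraint satisfaction by backtracking: variables with
# finite domains, constraints as plain predicates over assignments.

def solve(domains, constraints, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(domains):
        return assignment
    var = next(v for v in domains if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        # Only check constraints whose variables are all assigned.
        if all(check(assignment) for scope, check in constraints
               if scope <= set(assignment)):
            result = solve(domains, constraints, assignment)
            if result:
                return result
        del assignment[var]
    return None

# Declarative problem statement: x + y == 10 and x < y.
domains = {"x": range(1, 10), "y": range(1, 10)}
constraints = [
    ({"x", "y"}, lambda a: a["x"] + a["y"] == 10),
    ({"x", "y"}, lambda a: a["x"] < a["y"]),
]
print(solve(domains, constraints))  # → {'x': 1, 'y': 9}
```

Note that the problem is stated purely as relations between `x` and `y`; the search procedure is generic and knows nothing about the particular constraints.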
• Model checking
Model checking is a method for formally verifying finite-state concurrent systems. Specifications about the system are expressed as temporal logic formulas, and efficient symbolic algorithms are used to traverse the model defined by the system and check whether the specification holds. Extremely large state spaces can often be traversed in minutes. The technique has been applied to several complex industrial systems, such as the Futurebus+ and PCI local bus protocols.
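The explicit-state core of this idea is an exhaustive search of the reachable states. The breadth-first safety check and the naive two-process mutual-exclusion model below are illustrative assumptions; symbolic model checkers do the same conceptually, but over formula-encoded sets of states:

```python
# Explicit-state model checking of a safety property ("a bad state is
# never reachable") by exhaustive breadth-first search.

from collections import deque

def check_safety(initial, successors, is_bad):
    """Returns a counterexample path to a bad state, or None if safe."""
    queue = deque([(initial, [initial])])
    seen = {initial}
    while queue:
        state, path = queue.popleft()
        if is_bad(state):
            return path
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    return None

# Two processes cycling idle -> wait -> critical -> idle, with naive
# successors that ignore any lock, so mutual exclusion can fail.
STEP = {"idle": "wait", "wait": "critical", "critical": "idle"}

def successors(state):
    p, q = state
    return {(STEP[p], q), (p, STEP[q])}

trace = check_safety(("idle", "idle"), successors,
                     lambda s: s == ("critical", "critical"))
print(trace)  # a counterexample path ending in the unsafe state
```

The returned path is exactly the kind of counterexample trace a model checker reports to the designer when a specification is violated.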
• Combinatorial search and optimization
In computer science and artificial intelligence, combinatorial search studies search algorithms for solving instances of problems that are believed to be hard in general, by efficiently exploring the usually large solution space of these instances. Combinatorial search algorithms achieve this efficiency by reducing the effective size of the search space or by employing heuristics. Some algorithms are guaranteed to find the optimal solution, while others may only return the best solution found in the part of the state space that was explored. In applied mathematics and theoretical computer science, combinatorial optimization consists of finding an optimal object from a finite set of objects. In many such problems, exhaustive search is not feasible. Combinatorial optimization operates on optimization problems in which the set of feasible solutions is discrete or can be reduced to a discrete one, and in which the goal is to find the best solution. Common problems involving combinatorial optimization are the travelling salesman problem (TSP) and the minimum spanning tree problem (MST).
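For a tiny instance, the TSP can be solved by plain exhaustive enumeration, which also shows why exhaustive search stops being feasible as instances grow (the number of tours is factorial in the number of cities). The distance matrix below is made up for illustration:

```python
# Brute-force combinatorial optimization: enumerate every tour of a
# tiny travelling salesman instance and keep the cheapest.

from itertools import permutations

def tsp_brute_force(dist):
    n = len(dist)
    best_tour, best_cost = None, float("inf")
    for perm in permutations(range(1, n)):       # fix city 0 as the start
        tour = (0,) + perm
        cost = sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_tour, best_cost = tour, cost
    return best_tour, best_cost

dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
tour, cost = tsp_brute_force(dist)
print(tour, cost)  # optimal tour of cost 18
```

With 4 cities this examines only 6 tours; at 20 cities it would be over 10^17, which is why branch-and-bound, heuristics, and approximation algorithms matter.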
Centre for Advanced Studies
Dr. A.P.J. Abdul Kalam Technical University, New Campus(Lucknow)
Sec-11, Jankipuram, Vistar Yojna, Lucknow, Uttar Pradesh-226031