
2. Symbolic Systems (GOFAI)

Traditionally, AI was based on the ideas of logic, rule systems, linguistics, and the concept of rationality. At its roots are programming languages such as Lisp and Prolog, though newer systems tend to use more popular procedural languages. Expert systems are the largest successful example of this paradigm. An expert system consists of a detailed knowledge base and a complex rule system that makes use of it. Such systems have been used for tasks such as medical diagnosis support and credit checking.
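
A minimal sketch (in Python, independent of any particular package) of the knowledge-base-plus-rules idea: facts are asserted into a working memory, and rules fire repeatedly until nothing new can be derived (forward chaining). The fact and rule names are purely illustrative.

    # Toy forward-chaining rule engine.
    facts = {"fever", "cough"}

    # Each rule pairs a set of required facts with the fact it concludes.
    rules = [
        ({"fever", "cough"}, "flu-suspected"),
        ({"flu-suspected"}, "recommend-rest"),
    ]

    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True

    print(facts)   # {'fever', 'cough', 'flu-suspected', 'recommend-rest'}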

2.1 AI class/code libraries

These are libraries of code or classes for use in programming within the artificial intelligence field. They are not meant as stand-alone applications, but rather as tools for building your own applications.

ACL2

ACL2 (A Computational Logic for Applicative Common Lisp) is a theorem prover for industrial applications. It is both a mathematical logic and a system of tools for constructing proofs in the logic. ACL2 works with GCL (GNU Common Lisp).

AI Kernel

The AI Kernel is a reusable artificial intelligence engine that uses natural language processing and an Activator/Context model to allow multitasking between installed cells.

AI Search II

Basically, the library offers the programmer a set of search algorithms that may be used to solve all kinds of different problems. The idea is that, when developing problem-solving software, the programmer should be able to concentrate on the representation of the problem to be solved and should not need to bother with the implementation of the search algorithm that will actually conduct the search. This idea is realized by a set of search classes that may be incorporated into other software through C++'s derivation and inheritance features. The following search algorithms have been implemented:

This library has a corresponding book, "Object-Oriented Artificial Intelligence, Using C++".
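
As a rough illustration of the separation described above (this is a Python sketch, not the library's C++ interface), a generic search routine can be written against an abstract problem class, so that only the problem representation changes from application to application:

    from collections import deque

    class Problem:
        """Problem representation: override these three methods."""
        def initial_state(self): raise NotImplementedError
        def is_goal(self, state): raise NotImplementedError
        def successors(self, state): raise NotImplementedError

    def breadth_first_search(problem):
        """Generic breadth-first search; knows nothing about the problem domain."""
        start = problem.initial_state()
        frontier = deque([(start, [])])
        seen = {start}
        while frontier:
            state, path = frontier.popleft()
            if problem.is_goal(state):
                return path + [state]
            for nxt in problem.successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, path + [state]))
        return None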

Alchemy

Alchemy is a software package providing a series of algorithms for statistical relational learning and probabilistic logic inference, based on the Markov logic representation. Alchemy allows you to easily develop a wide range of AI applications, including:

Aleph

Aleph (A Learning Engine for Proposing Hypotheses) is an Inductive Logic Programming (ILP) system, intended as a prototype for exploring ideas. Aleph is implemented in Prolog by Dr Ashwin Srinivasan at the Oxford University Computing Laboratory and is written specifically for compilation with the YAP Prolog compiler.

Microprograms

A collection of case-based reasoning "micro" versions of dissertation programs that were developed for pedagogical purposes. These programs are meant to distill key aspects of the original programs into a form that can be easily understood, modified, and extended.

Chess In Lisp (CIL)

The CIL (Chess In Lisp) foundation is a Common Lisp implementation of all the core functions needed for development of chess applications. The main purpose of the CIL project is to get AI researchers interested in using Lisp to work in the chess domain.

clasp

clasp is an answer set solver for (extended) normal logic programs. It combines the high-level modeling capacities of answer set programming (ASP) with state-of-the-art techniques from the area of Boolean constraint solving. The primary clasp algorithm relies on conflict-driven nogood learning, a technique that proved very successful for satisfiability checking (SAT). Unlike other learning ASP solvers, clasp does not rely on legacy software, such as a SAT solver or any other existing ASP solver. Rather, clasp has been genuinely developed for answer set solving based on conflict-driven nogood learning. clasp can be applied as an ASP solver (on LPARSE output format), as a SAT solver (on simplified DIMACS/CNF format), or as a PB solver (on OPB format).
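
For example, when used as a SAT solver, clasp reads the standard DIMACS CNF format. The Python snippet below is only a sketch (the clasp command line at the end is illustrative; consult the clasp documentation for options); it writes the instance (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2 OR x3) to a file:

    # DIMACS CNF: header "p cnf <variables> <clauses>", each clause ends with 0.
    cnf = ("p cnf 3 3\n"
           "1 2 0\n"
           "-1 2 0\n"
           "-2 3 0\n")

    with open("example.cnf", "w") as f:
        f.write(cnf)

    # then run, e.g.:  clasp example.cnf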

ConceptNet

ConceptNet aims to give computers access to common-sense knowledge, the kind of information that ordinary people know but usually leave unstated. The data in ConceptNet was collected from ordinary people who contributed it over the Web. ConceptNet represents this data in the form of a semantic network, and makes it available to be used in natural language processing and intelligent user interfaces.

This API provides Python code with access to both ConceptNet 3 and the development database that will become ConceptNet 4, and the natural language tools necessary to work with it. It uses Django for interacting with the database.

ERESYE

ERESYE means ERlang Expert SYstem Engine. It is a library for writing expert systems and rule processing engines in the Erlang programming language. It allows you to create multiple engines, each with its own facts and rules to be processed.

FFLL

The Free Fuzzy Logic Library (FFLL) is an open source fuzzy logic class library and API that is optimized for speed critical applications, such as video games. FFLL is able to load files that adhere to the IEC 61131-7 standard.
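
The following is not the FFLL API, only a minimal Python illustration of the fuzzy-logic ideas such a library packages up: membership functions, rule evaluation, and a crude defuzzification step, in the style of a simple game-AI controller. All names and constants are made up for the example.

    def triangular(x, a, b, c):
        """Membership of x in a triangular fuzzy set with peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    health = 35.0                            # crisp input, 0..100

    low  = triangular(health, -1, 0, 60)     # degree to which "health is low"
    high = triangular(health, 40, 100, 101)  # degree to which "health is high"

    # Rules: IF health is low THEN retreat; IF health is high THEN attack.
    # Defuzzify with a weighted average of representative output values.
    retreat_value, attack_value = 10.0, 90.0
    output = (low * retreat_value + high * attack_value) / (low + high)
    print(output)                            # aggression level for the agent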

FLiP

Flip is a logical framework written in Python. A logical framework is a library for defining logics and writing applications such as theorem provers. The checker can use different logics; Flip comes with several. You can add another logic, or add axioms and derived rules, by writing a module in Python. Python is both the object language and the metalanguage. Formulas, inference rules, and entire proofs are Python expressions. Prover commands are Python functions.

Fuzzy sets for Ada

Fuzzy sets for Ada is a library providing implementations of confidence factors with the operations not, and, or, xor, +, and *, classical fuzzy sets with the set-theoretic operations and the operations of the possibility theory, intuitionistic fuzzy sets with the operations on them, fuzzy logic based on the intuitionistic fuzzy sets and the possibility theory; fuzzy numbers, both integer and floating-point with conventional arithmetical operations, and linguistic variables and sets of linguistic variables with operations on them. String-oriented I/O is supported.

HTK

The Hidden Markov Model Toolkit (HTK) is a portable toolkit for building and manipulating hidden Markov models. HTK consists of a set of library modules and tools available in C source form. The tools provide sophisticated facilities for speech analysis, HMM training, testing and results analysis. The software supports HMMs using both continuous density mixture Gaussians and discrete distributions and can be used to build complex HMM systems. The HTK release contains extensive documentation and examples.
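
The example below is not HTK itself (which is driven by its own tools and configuration files); it is only a small NumPy sketch of the core computation HMM toolkits build on, the forward algorithm for a discrete-output HMM:

    import numpy as np

    pi = np.array([0.6, 0.4])            # initial state probabilities
    A  = np.array([[0.7, 0.3],           # state transition matrix
                   [0.4, 0.6]])
    B  = np.array([[0.5, 0.4, 0.1],      # emission probabilities (2 states x 3 symbols)
                   [0.1, 0.3, 0.6]])

    obs = [0, 2, 1]                      # observed symbol indices

    alpha = pi * B[:, obs[0]]            # forward variables for the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # recursion: propagate and weight by emission

    print(alpha.sum())                   # P(observation sequence | model)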

JCK

JCK is a new library providing constraint programming and search for Java.

Source and documentation are available from the link above.

KANREN

KANREN is a declarative logic programming system with first-class relations, embedded in a pure functional subset of Scheme. The system has a set-theoretical semantics, true unions, fair scheduling, first-class relations, lexically-scoped logical variables, depth-first and iterative deepening strategies. The system achieves high performance and expressivity without cuts.

LK

LK is an implementation of the Lin-Kernighan heuristic for the Traveling Salesman Problem and for the minimum weight perfect matching problem. It is tuned for 2-d geometric instances, and has been applied to certain instances with up to a million cities. Also included are instance generators and Perl scripts for munging TSPLIB instances.

This implementation introduces "efficient cluster compensation", an experimental algorithmic technique intended to make the Lin-Kernighan heuristic more robust in the face of clustered data.

LingPipe

LingPipe is a state-of-the-art suite of natural language processing tools written in Java that performs tokenization, sentence detection, named entity detection, coreference resolution, classification, clustering, part-of-speech tagging, general chunking, and fuzzy dictionary matching.

Logfun

Logfun is a library of logic functors. A logic functor is a function that can be applied to zero, one or several logics so as to produce a new logic as a combination of argument logics. Each argument logic can itself be built by combination of logic functors. The signature of a logic is made of a parser and a printer of formulas, logical operations such as a theorem prover for entailment between formulas, and more specific operations required by Logical Information Systems (LIS). Logic functors can be concrete domains like integers, strings, or algebraic combinators like product or sum of logics.

Logic functors are coded as Objective Caml modules. A logic semantics is associated with each of these logic functors. This makes it possible to define properties of logics, such as the consistency and completeness of the entailment prover, and to prove under which conditions a generated entailment prover satisfies these properties, given the properties of the argument logics.

Loom

* Note: Loom has been succeeded by PowerLoom.

Loom is a language and environment for constructing intelligent applications. The heart of Loom is a knowledge representation system that is used to provide deductive support for the declarative portion of the Loom language. Declarative knowledge in Loom consists of definitions, rules, facts, and default rules. A deductive engine called a classifier utilizes forward-chaining, semantic unification, and object-oriented truth maintenance technologies in order to compile the declarative knowledge into a network designed to efficiently support on-line deductive query processing.

The Loom system implements a logic-based pattern matcher that drives a production rule facility and a pattern-directed method dispatching facility that supports the definition of object-oriented methods. The high degree of integration between Loom's declarative and procedural components permits programmers to utilize logic programming, production rule, and object-oriented programming paradigms in a single application. Loom can also be used as a deductive layer that overlays an ordinary CLOS network. In this mode, users can obtain many of the benefits of using Loom without impacting the function or performance of their CLOS-based applications.

maxent

The Maximum Entropy Toolkit provides a set of tools and a library for constructing maximum entropy (maxent) models in either Python or C++. Maximum entropy modeling is a general-purpose machine learning framework that has proved to be highly expressive and powerful in statistical natural language processing, statistical physics, computer vision, and many other fields.

It features conditional maximum entropy models, L-BFGS and GIS parameter estimation, Gaussian Prior smoothing, a C++ API, a Python extension module, a command line utility, and good documentation. A Java version is also available.
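
A short sketch of how the Python binding is typically used; the class and method names here (MaxentModel, add_event, train, eval) are recalled from the toolkit's documentation and should be treated as assumptions to check against the current manual.

    from maxent import MaxentModel        # assumed module/class names; verify locally

    m = MaxentModel()
    m.begin_add_event()
    m.add_event(["word=rain", "prev=the"], "NOUN")     # context features, outcome
    m.add_event(["word=quickly", "suffix=ly"], "ADV")
    m.end_add_event()

    m.train(30, "lbfgs", 2)               # iterations, estimation method, Gaussian prior
    print(m.eval(["word=rain"], "NOUN"))  # probability of the outcome given the context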

Nyquist

Nyquist is a language for sound synthesis and music composition, based on Lisp and developed within the Computer Music Project at CMU. The Computer Music Project is developing computer music and interactive performance technology to enhance human musical experience and creativity. This interdisciplinary effort draws on Music Theory, Cognitive Science, Artificial Intelligence and Machine Learning, Human Computer Interaction, Real-Time Systems, Computer Graphics and Animation, Multimedia, Programming Languages, and Signal Processing. A paradigmatic example of these interdisciplinary efforts is the creation of interactive performances that couple human musical improvisation with intelligent computer agents in real time.

OpenCyc

OpenCyc is the open source version of Cyc, the largest and most complete general knowledge base and commonsense reasoning engine. It includes an ontology based on 6,000 concepts and 60,000 assertions about them.

Pattern

Pattern is a web mining module for the Python programming language. It bundles tools for data retrieval (Google + Twitter + Wikipedia API, web spider, HTML DOM parser), text analysis (rule-based shallow parser, WordNet interface, syntactical + semantical n-gram search algorithm, tf-idf + cosine similarity + LSA metrics) and data visualization (graph networks).
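
A short example of the kind of pipeline Pattern is aimed at; the names used (pattern.web.Twitter, pattern.en.parse) follow its documentation as remembered here and may differ between versions:

    from pattern.web import Twitter
    from pattern.en import parse

    twitter = Twitter(language="en")
    for result in twitter.search("artificial intelligence", count=5):
        # parse() returns the text annotated with part-of-speech and chunk tags.
        print(parse(result.text))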

PowerLoom

PowerLoom is the successor to the Loom knowledge representation system. It provides a language and environment for constructing intelligent, knowledge-based applications. PowerLoom uses a fully expressive, logic-based representation language (a variant of KIF). It uses a natural deduction inference engine that combines forward and backward chaining to derive what logically follows from the facts and rules asserted in the knowledge base. While PowerLoom is not a description logic, it does have a description classifier which uses technology derived from the Loom classifier to classify descriptions expressed in full first order predicate calculus (see paper). PowerLoom uses modules as a structuring device for knowledge bases, and ultra-lightweight worlds to support hypothetical reasoning.

To implement PowerLoom, we developed a new programming language called STELLA, which is a Strongly Typed, Lisp-like LAnguage that can be translated into Lisp, C++, and Java. PowerLoom is written in STELLA and is therefore available in Common Lisp, C++, and Java versions.

PyCLIPS

PyCLIPS is an extension module for the Python language that embeds full CLIPS functionality in Python applications. This means that you can provide Python with a strong, reliable, widely used and well documented inference engine.
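
Roughly, usage looks like the sketch below; the function names (clips.Build, clips.Assert, clips.Reset, clips.Run) mirror CLIPS top-level commands and are given here from memory, so check them against the PyCLIPS manual.

    import clips   # the PyCLIPS extension module

    # Define a CLIPS rule from Python (ordinary CLIPS syntax inside the string).
    clips.Build("""
    (defrule hello
       (greeting ?name)
       =>
       (printout t "Hello " ?name crlf))
    """)

    clips.Reset()
    clips.Assert("(greeting world)")
    clips.Run()                       # fires the rule, printing "Hello world"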

Pyke

Pyke is a knowledge-based inference engine (expert system) written in 100% Python that can:

python-dlp

python-dlp aims to be a contemporary expert system based on Semantic Web technologies. Traditionally, expert systems are an application of computing and artificial intelligence that aims to support software that attempts to reproduce the deterministic behavior of one or more human experts in a specific problem domain. python-dlp utilizes the efficient RETE_UL algorithm as the 'engine' for the expert system.

Reverend

Reverend is a general purpose Bayesian classifier written in Python. It is designed to be easily extended to any application domain.
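
Typical usage is along these lines; the class and method names are taken from Reverend's documentation as remembered here and should be double-checked:

    from reverend.thomas import Bayes

    guesser = Bayes()
    guesser.train("spam", "buy cheap pills now")      # class label, sample text
    guesser.train("ham",  "meeting agenda for tuesday")

    # Returns the candidate classes with their probabilities.
    print(guesser.guess("cheap pills"))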

Screamer

Screamer is an extension of Common Lisp that adds support for nondeterministic programming. Screamer consists of two levels. The basic nondeterministic level adds support for backtracking and undoable side effects. On top of this nondeterministic substrate, Screamer provides a comprehensive constraint programming language in which one can formulate and solve mixed systems of numeric and symbolic constraints. Together, these two levels augment Common Lisp with practically all of the functionality of both Prolog and constraint logic programming languages such as CHiP and CLP(R). Furthermore, Screamer is fully integrated with Common Lisp. Screamer programs can coexist and interoperate with other extensions to Common Lisp such as CLOS, CLIM and Iterate.

SPASS

SPASS: An Automated Theorem Prover for First-Order Logic with Equality

If you are interested in first-order logic theorem proving, the formal analysis of software, systems, and protocols, formal approaches to AI planning, decision procedures, or modal logic theorem proving, SPASS may offer you the right functionality.

Torch

Torch is a machine-learning library, written in C++. Its aim is to provide state-of-the-art implementations of the best algorithms. It is, and will remain, under continuous development.

Torch is an open library whose authors encourage everybody to develop new packages to be included in future versions on the official website.

2.2 AI software kits, applications, etc.

These are various applications, software kits, etc., meant for research in the field of artificial intelligence. Their ease of use will vary, as they were designed to meet particular research interests rather than to be easy-to-use commercial packages.

ASA - Adaptive Simulated Annealing

ASA (Adaptive Simulated Annealing) is a powerful global optimization algorithm, implemented in C, that is especially useful for nonlinear and/or stochastic systems.

ASA was developed to statistically find the best global fit of a nonlinear non-convex cost function over a D-dimensional space. The algorithm permits an annealing schedule for the 'temperature' T decreasing exponentially in annealing time k: T = T_0 exp(-c k^(1/D)). The introduction of re-annealing also permits adaptation to changing sensitivities in the multi-dimensional parameter space. This annealing schedule is faster than fast Cauchy annealing, where T = T_0/k, and much faster than Boltzmann annealing, where T = T_0/ln k.
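
A quick numeric comparison of the three cooling schedules mentioned above (plain Python, not the ASA C code; the constants T_0, c and D are arbitrary illustrative values):

    import math

    T0, c, D = 100.0, 1.0, 4

    def asa(k):        return T0 * math.exp(-c * k ** (1.0 / D))   # exponential in k^(1/D)
    def cauchy(k):     return T0 / k
    def boltzmann(k):  return T0 / math.log(k)

    for k in (10, 1000, 100000):
        print(k, asa(k), cauchy(k), boltzmann(k))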

Babylon

BABYLON is a modular, configurable, hybrid environment for developing expert systems. Its features include objects, rules with forward and backward chaining, logic (Prolog) and constraints. BABYLON is implemented and embedded in Common Lisp.

cfengine

Cfengine, or the configuration engine, is a very high-level language for building expert systems that administer and configure large computer networks. Cfengine uses the idea of classes and a primitive form of intelligence to define and automate the configuration of large systems in the most economical way possible. Cfengine is designed to be a part of computer immune systems.

CLIPS

CLIPS is a productive development and delivery expert system tool which provides a complete environment for the construction of rule- and/or object-based expert systems.

CLIPS provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules of thumb," which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or to create new components). The procedural programming capabilities provided by CLIPS are similar to capabilities found in languages such as C, Pascal, Ada, and LISP.

EMA-XPS - A Hybrid Graphic Expert System Shell

EMA-XPS is a hybrid graphic expert system shell based on the ASCII-oriented shell Babylon 2.3 of the German National Research Center for Computer Sciences (GMD). In addition to Babylon's AI power (object-oriented data representation; forward- and backward-chaining rules, collectible into sets; Horn clauses; and constraint networks), it provides a graphical interface based on the X11 Window System and the OSF/Motif widget library.

Eprover

The E Equational Theorem Prover is a purely equational theorem prover. The core proof procedure operates on formulas in clause normal form, using a calculus that combines superposition (with selection of negative literals) and rewriting. No special rules for non-equational literals have been implemented, i.e., resolution is simulated via paramodulation and equality resolution. The basic calculus is extended with rules for AC redundancy elimination, some contextual simplification, and pseudo-splitting. The latest version of E also supports simultaneous paramodulation, either for all inferences or for selected inferences.

E is based on the DISCOUNT-loop variant of the given-clause algorithm, i.e. a strict separation of active and passive facts. Proof search in E is primarily controlled by a literal selection strategy, a clause evaluation heuristic, and a simplification ordering. The prover supports a large number of preprogrammed literal selection strategies, many of which are only experimental. Clause evaluation heuristics can be constructed on the fly by combining various parameterized primitive evaluation functions, or can be selected from a set of predefined heuristics. Supported term orderings are several parameterized instances of Knuth-Bendix-Ordering (KBO) and Lexicographic Path Ordering (LPO).

FOOL & FOX

FOOL stands for the Fuzzy Organizer OLdenburg. It is the result of a project at the University of Oldenburg. FOOL is a graphical user interface for developing fuzzy rule bases. FOOL will help you to create and maintain a database that specifies the behavior of a fuzzy controller or similar system.

FOX is a small but powerful fuzzy engine which reads this database, reads some input values and calculates the new control value.

FreeHAL

FreeHAL is a self-learning conversation simulator which uses semantic nets to organize its knowledge.

FreeHAL uses a semantic network, pattern matching, stemmers, part-of-speech databases, part-of-speech taggers, and Hidden Markov Models. Both the online and the downloadable versions support text-to-speech (TTS) output.

FUF and SURGE

FUF is an extended implementation of the formalism of functional unification grammars (FUGs) introduced by Martin Kay specialized to the task of natural language generation. It adds the following features to the base formalism:

These extensions allow the development of large grammars that can be processed efficiently and can be maintained and understood more easily. SURGE is a large syntactic realization grammar of English written in FUF. SURGE was developed to serve as a black-box syntactic generation component in a larger generation system, encapsulating a rich knowledge of English syntax. SURGE can also be used as a platform for exploring grammar writing from a generation perspective.

GATE

GATE (General Architecture for Text Engineering) is an architecture, framework and development environment for developing, evaluating and embedding Human Language Technology.

GATE is made up of three elements:

The Grammar Workbench

This appears to be obsolete: it is gone from the site, though its parent project is still ongoing.

The Grammar Workbench, or GWB for short, is an environment for the comfortable development of Affix Grammars in the AGFL-formalism. Its purposes are:

GSM Suite

The GSM Suite is a set of programs for using Finite State Machines in a graphical fashion. The suite consists of programs that edit, compile, and print state machines. Included in the suite are an editor program, gsmedit; a compiler, gsm2cc, that produces a C++ implementation of a state machine; a PostScript generator, gsm2ps; and two other minor programs. GSM is licensed under the GNU General Public License and so is free for your use under the terms of that license.

Isabelle

Isabelle is a popular generic theorem prover developed at Cambridge University and TU Munich. Existing logics like Isabelle/HOL provide a theorem proving environment ready to use for sizable applications. Isabelle may also serve as a framework for rapid prototyping of deductive systems. It comes with a large library including Isabelle/HOL (classical higher-order logic), Isabelle/HOLCF (Scott's Logic for Computable Functions with HOL), Isabelle/FOL (classical and intuitionistic first-order logic), and Isabelle/ZF (Zermelo-Fraenkel set theory on top of FOL).

Jess, the Java Expert System Shell

Jess is a clone of the popular CLIPS expert system shell written entirely in Java. With Jess, you can conveniently give your applets the ability to 'reason'. Jess is compatible with all versions of Java starting with version 1.0.2. Jess implements the following constructs from CLIPS: defrules, deffunctions, defglobals, deffacts, and deftemplates.

learn

Learn is a vocabulary-learning program with a memory model.

LISA

LISA (Lisp-based Intelligent Software Agents) is a production-rule system heavily influenced by JESS (Java Expert System Shell). It has at its core a reasoning engine based on the Rete pattern matching algorithm. LISA also provides the ability to reason over ordinary CLOS objects.

Livingstone2

Livingstone2 (L2) is a reusable artificial intelligence (AI) software system designed to assist spacecraft, life support systems, chemical plants or other complex systems in operating robustly with minimal human supervision, even in the face of hardware failures or unexpected events.

NICOLE

NICOLE (Nearly Intelligent Computer Operated Language Examiner) is an experiment in whether a computer, given enough combinations of how words, phrases, and sentences are related to one another, could talk back to you. It attempts to simulate a conversation by learning how words are related to other words. A human communicates with NICOLE via the keyboard, and NICOLE responds with its own sentences, which are automatically generated based on what NICOLE has stored in its database. Each new sentence that is typed in and that NICOLE does not yet know is added to NICOLE's database, thus extending its knowledge base.

Otter: An Automated Deduction System

Our current automated deduction system Otter is designed to prove theorems stated in first-order logic with equality. Otter's inference rules are based on resolution and paramodulation, and it includes facilities for term rewriting, term orderings, Knuth-Bendix completion, weighting, and strategies for directing and restricting searches for proofs. Otter can also be used as a symbolic calculator and has an embedded equational programming system.

PVS

PVS is a verification system: that is, a specification language integrated with support tools and a theorem prover. It is intended to capture the state-of-the-art in mechanized formal methods and to be sufficiently rugged that it can be used for significant applications. PVS is a research prototype: it evolves and improves as we develop or apply new capabilities, and as the stress of real use exposes new requirements.

SNePS

The long-term goal of The SNePS Research Group is the design and construction of a natural-language-using computerized cognitive agent, and carrying out the research in artificial intelligence, computational linguistics, and cognitive science necessary for that endeavor. The three-part focus of the group is on knowledge representation, reasoning, and natural-language understanding and generation. The group is widely known for its development of the SNePS knowledge representation/reasoning system, and Cassie, its computerized cognitive agent.

Soar

Soar has been developed to be a general cognitive architecture. We intend ultimately to enable the Soar architecture to:

In other words, our intention is for Soar to support all the capabilities required of a general intelligent agent.

TCM

TCM (Toolkit for Conceptual Modeling) is our suite of graphical editors. TCM contains graphical editors for Entity-Relationship diagrams, Class-Relationship diagrams, Data and Event Flow diagrams, State Transition diagrams, Jackson Process Structure diagrams and System Network diagrams, Function Refinement trees and various table editors, such as a Function-Entity table editor and a Function Decomposition table editor. TCM is easy to use and performs numerous consistency checks, some of them immediately, some of them upon request.

Yale

YALE (Yet Another Learning Environment) is an environment for machine learning experiments. Experiments can be made up of a large number of arbitrarily nestable operators, and their setup is described by XML files which can easily be created with a graphical user interface. Applications of YALE cover both research and real-world learning tasks.

WEKA

WEKA (Waikato Environment for Knowledge Analysis) is a state-of-the-art facility for applying machine learning techniques to practical problems. It is a comprehensive software "workbench" that allows people to analyse real-world data. It integrates different machine learning tools within a common framework and a uniform user interface. It is designed to support a "simplicity-first" methodology, which allows users to experiment interactively with simple machine learning tools before looking for more complex solutions.

