
Web Ontology Language

Article snapshot taken from Wikipedia with Creative Commons Attribution-ShareAlike license.

Knowledge representation and reasoning (KRR, KR&R, or KR²) is a field of artificial intelligence (AI) dedicated to representing information about the world in a form that a computer system can use to solve complex tasks, such as diagnosing a medical condition or having a natural-language dialog. Knowledge representation incorporates findings from psychology about how humans solve problems and represent knowledge, in order to design formalisms that make complex systems easier to design and build. Knowledge representation and reasoning also incorporates findings from logic to automate various kinds of reasoning.


74-673: The Web Ontology Language ( OWL ) is a family of knowledge representation languages for authoring ontologies . Ontologies are a formal way to describe taxonomies and classification networks, essentially defining the structure of knowledge for various domains: the nouns representing classes of objects and the verbs representing relations between the objects. Ontologies resemble class hierarchies in object-oriented programming but there are several critical differences. Class hierarchies are meant to represent structures used in source code that evolve fairly slowly (perhaps with monthly revisions) whereas ontologies are meant to represent information on

A URI (http://www.example.org/tea.owl, say). This example provides a sense of the syntax. To save space below, preambles and prefix definitions have been skipped. OWL classes correspond to description logic (DL) concepts, OWL properties to DL roles, while individuals are called the same way in both the OWL and the DL terminology. In the beginning, IS-A was quite simple. Today, however, there are almost as many meanings for this inheritance link as there are knowledge-representation systems. Early attempts to build large ontologies were plagued by
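
The syntax example that this passage refers to did not survive the snapshot. As a rough substitute, the following is a minimal, hypothetical sketch of such a tea ontology built with Python's rdflib library; the ontology IRI and the Tea class come from the text, while the GreenTea subclass and everything else are illustrative assumptions.

```python
from rdflib import Graph, Namespace, URIRef
from rdflib.namespace import OWL, RDF, RDFS

# Ontology IRI taken from the article; the class structure below is assumed.
ONT = URIRef("http://www.example.org/tea.owl")
TEA = Namespace("http://www.example.org/tea.owl#")

g = Graph()
g.bind("owl", OWL)
g.bind("tea", TEA)

g.add((ONT, RDF.type, OWL.Ontology))            # the ontology header
g.add((TEA.Tea, RDF.type, OWL.Class))           # an OWL class, i.e. a DL concept
g.add((TEA.GreenTea, RDF.type, OWL.Class))      # hypothetical subclass
g.add((TEA.GreenTea, RDFS.subClassOf, TEA.Tea))

print(g.serialize(format="turtle"))
```

Serializing the graph prints Turtle in which Tea appears as an owl:Class, matching the OWL class / DL concept correspondence described above.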

A knowledge base, which includes facts and rules about a problem domain, and an inference engine, which applies the knowledge in the knowledge base to answer questions and solve problems in the domain. In these early systems the facts in the knowledge base tended to be a fairly flat structure, essentially assertions about the values of variables used by the rules. Meanwhile, Marvin Minsky developed
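
As an illustration of this knowledge-base-plus-inference-engine architecture, here is a minimal sketch in Python with a deliberately toy medical domain: the facts are a flat set of variable values, and a small forward-chaining loop plays the role of the inference engine. It is not modeled on any particular historical system.

```python
# Flat facts: assertions about the values of variables, as in early expert systems.
facts = {"temperature": 39.5, "has_rash": True}

# Rules: (condition over the facts, conclusions to assert).
rules = [
    (lambda f: f.get("temperature", 0) > 38.0, {"has_fever": True}),
    (lambda f: f.get("has_fever") and f.get("has_rash"), {"suspect_measles": True}),
]

def forward_chain(facts, rules):
    """Apply rules repeatedly until no new facts are added (a toy inference engine)."""
    changed = True
    while changed:
        changed = False
        for condition, conclusions in rules:
            if condition(facts) and not all(facts.get(k) == v for k, v in conclusions.items()):
                facts.update(conclusions)
                changed = True
    return facts

print(forward_chain(facts, rules))
# {'temperature': 39.5, 'has_rash': True, 'has_fever': True, 'suspect_measles': True}
```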

296-677: A Candidate Recommendation in March 2000. In February 2001, the Semantic Web Activity replaced the Metadata Activity. In 2004 (as part of a wider revision of RDF) RDFS became a W3C Recommendation. Though RDFS provides some support for ontology specification, the need for a more expressive ontology language had become clear. As of Monday, the 31st of May, our working group will officially come to an end. We have achieved all that we were chartered to do, and I believe our work

370-493: A big part in the vision for the future Semantic Web. The automatic classification gives developers technology to provide order on a constantly evolving network of knowledge. Defining ontologies that are static and incapable of evolving on the fly would be very limiting for Internet-based systems. The classifier technology provides the ability to deal with the dynamic environment of the Internet. Recent projects funded primarily by

A class may be a subclass of many classes, a class cannot be an instance of another class). OWL DL is so named due to its correspondence with description logic, a field of research that has studied the logics that form the formal foundation of OWL. This one can be expressed as $\mathcal{SHOIN}(\mathbf{D})$, using

A common framework that allows data to be shared and reused across application, enterprise, and community boundaries.

a declarative representation language influenced by ideas from knowledge representation

In the late 1990s, the World Wide Web Consortium (W3C) Metadata Activity started work on RDF Schema (RDFS), a language for RDF vocabulary sharing. The RDF became a W3C Recommendation in February 1999, and RDFS

592-436: A knowledge base (which in the case of KL-ONE languages is also referred to as an Ontology). Another area of knowledge representation research was the problem of common-sense reasoning . One of the first realizations learned from trying to make software that can function with human natural language was that humans regularly draw on an extensive foundation of knowledge about the real world that we simply take for granted but that

666-488: A lack of clear definitions. Members of the OWL family have model theoretic formal semantics, and so have strong logical foundations. Description logics are a family of logics that are decidable fragments of first-order logic with attractive and well-understood computational properties. OWL DL and OWL Lite semantics are based on DLs. They combine a syntax for describing and exchanging ontologies, and formal semantics that gives them meaning. For example, OWL DL corresponds to

740-408: A pair of individual identifiers (that the objects identified are distinct or the same). Axioms specify the characteristics of classes and properties. This style is similar to frame languages , and quite dissimilar to well known syntaxes for DLs and Resource Description Framework (RDF). Sean Bechhofer, et al. argue that though this syntax is hard to parse, it is quite concrete. They conclude that

814-450: A set of prototypes, in particular prototypical diseases, to be matched against the case at hand. Abstract syntax In computer science , the abstract syntax of data is its structure described as a data type (possibly, but not necessarily, an abstract data type ), independent of any particular representation or encoding. This is particularly used in the representation of text in computer languages , which are generally stored in


888-479: A sharply different view of the task at hand. Consider the difference that arises in selecting the lumped element view of a circuit rather than the electrodynamic view of the same device. As a second example, medical diagnosis viewed in terms of rules (e.g., MYCIN ) looks substantially different from the same task viewed in terms of frames (e.g., INTERNIST). Where MYCIN sees the medical world as made up of empirical associations connecting symptom to disease, INTERNIST sees

962-400: A subset of first-order logic that is decidable, propositional logic was used, increasing its power by adding logics represented by convention with acronyms: The W3C-endorsed OWL specification includes the definition of three variants of OWL, with different levels of expressiveness. These are OWL Lite, OWL DL and OWL Full (ordered by increasing expressiveness). Each of these sublanguages is

1036-564: A syntactic extension of its simpler predecessor. The following set of relations hold. Their inverses do not. OWL Lite was originally intended to support those users primarily needing a classification hierarchy and simple constraints. For example, while it supports cardinality constraints, it only permits cardinality values of 0 or 1. It was hoped that it would be simpler to provide tool support for OWL Lite than its more expressive relatives, allowing quick migration path for systems using thesauri and other taxonomies . In practice, however, most of

1110-472: A tree structure as an abstract syntax tree . Abstract syntax, which only consists of the structure of data, is contrasted with concrete syntax , which also includes information about the representation. For example, concrete syntax includes features like parentheses (for grouping) or commas (for lists), which are not included in the abstract syntax, as they are implicit in the structure. Abstract syntaxes are classified as first-order abstract syntax (FOAS), if

1184-429: A wide variety of languages and notations (e.g., logic, LISP, etc.); the essential information is not the form of that language but the content, i.e., the set of concepts offered as a way of thinking about the world. Simply put, the important part is notions like connections and components, not the choice between writing them as predicates or LISP constructs. The commitment made selecting one or another ontology can produce

1258-546: Is a field of artificial intelligence that focuses on designing computer representations that capture information about the world that can be used for solving complex problems. The justification for knowledge representation is that conventional procedural code is not the best formalism to use to solve complex problems. Knowledge representation makes complex software easier to define and maintain than procedural code and can be used in expert systems . For example, talking to experts in terms of business rules rather than code lessens

1332-428: Is a treaty–a social agreement among people with common motive in sharing." There are always many competing and differing views that make any general-purpose ontology impossible. A general-purpose ontology would have to be applicable in any domain and different areas of knowledge need to be unified. There is a long history of work attempting to build ontologies for a variety of task domains, e.g., an ontology for liquids,

1406-419: Is a useful view, but not the only possible one. A different ontology arises if we need to attend to the electrodynamics in the device: Here signals propagate at finite speed and an object (like a resistor) that was previously viewed as a single component with an I/O behavior may now have to be thought of as an extended medium through which an electromagnetic wave flows. Ontologies can of course be written down in

1480-532: Is being quite well appreciated. The World Wide Web Consortium (W3C) created the Web-Ontology Working Group as part of their Semantic Web Activity. It began work on November 1, 2001 with co-chairs James Hendler and Guus Schreiber. The first working drafts of the abstract syntax , reference and synopsis were published in July 2002. OWL became a formal W3C recommendation on February 10, 2004 and

1554-546: Is ideal for the ever-changing and evolving information space of the Internet. The Semantic Web integrates concepts from knowledge representation and reasoning with markup languages based on XML. The Resource Description Framework (RDF) provides the basic capabilities to define knowledge-based objects on the Internet with basic features such as Is-A relations and object properties. The Web Ontology Language (OWL) adds additional semantics and integrates with automatic classification reasoners. In 1985, Ron Brachman categorized


1628-450: Is normative. OWL2 specifies an XML serialization that closely models the structure of an OWL2 ontology. The Manchester Syntax is a compact, human readable syntax with a style close to frame languages. Variations are available for OWL and OWL2. Not all OWL and OWL2 ontologies can be expressed in this syntax. Consider an ontology for tea based on a Tea class. First, an ontology identifier is needed. Every OWL ontology must be identified by

1702-484: Is not at all obvious to an artificial agent, such as basic principles of common-sense physics, causality, intentions, etc. An example is the frame problem , that in an event driven logic there need to be axioms that state things maintain position from one moment to the next unless they are moved by some external force. In order to make a true artificial intelligence agent that can converse with humans using natural language and can process basic statements and questions about

1776-518: Is not widely used. OWL DL is designed to provide the maximum expressiveness possible while retaining computational completeness (either φ or ¬φ holds), decidability (there is an effective procedure to determine whether φ is derivable or not), and the availability of practical reasoning algorithms. OWL DL includes all OWL language constructs, but they can be used only under certain restrictions (for example, number restrictions may not be placed upon properties which are declared to be transitive; and while

1850-437: Is the knowledge representation hypothesis first formalized by Brian C. Smith in 1985: Any mechanically embodied intelligent process will be comprised of structural ingredients that a) we as external observers naturally take to represent a propositional account of the knowledge that the overall process exhibits, and b) independent of such external semantic attribution, play a formal but causal and essential role in engendering

1924-408: Is typical today, it will be possible to define logical queries and find pages that map to those queries. The automated reasoning component in these systems is an engine known as the classifier. Classifiers focus on the subsumption relations in a knowledge base rather than rules. A classifier can infer new classes and dynamically change the ontology as new information becomes available. This capability

1998-431: Is undecidable, so no reasoning software is able to perform complete reasoning for it. In OWL 2, there are three sublanguages of the language: The OWL family of languages supports a variety of syntaxes. It is useful to distinguish high level syntaxes aimed at specification from exchange syntaxes more suitable for general use. These are close to the ontology structure of languages in the OWL family. High level syntax

2072-488: Is used to specify the OWL ontology structure and semantics. The OWL abstract syntax presents an ontology as a sequence of annotations , axioms and facts . Annotations carry machine and human oriented meta-data. Information about the classes, properties and individuals that compose the ontology is contained in axioms and facts only. Each class, property and individual is either anonymous or identified by an URI reference . Facts state data either about an individual or about

The $\mathcal{SHOIN}(\mathbf{D})$ description logic, while OWL 2 corresponds to the $\mathcal{SROIQ}(\mathbf{D})$ logic. Sound, complete, terminating reasoners (i.e. systems which are guaranteed to derive every consequence of

2220-525: The Defense Advanced Research Projects Agency (DARPA) have integrated frame languages and classifiers with markup languages based on XML. The Resource Description Framework (RDF) provides the basic capability to define classes, subclasses, and properties of objects. The Web Ontology Language (OWL) provides additional levels of semantics and enables integration with classification engines. Knowledge-representation

2294-852: The World Wide Web Consortium 's (W3C) standard for objects called the Resource Description Framework (RDF). OWL and RDF have attracted significant academic, medical and commercial interest. In October 2007, a new W3C working group was started to extend OWL with several new features as proposed in the OWL 1.1 member submission. W3C announced the new version of OWL on 27 October 2009. This new version, called OWL 2, soon found its way into semantic editors such as Protégé and semantic reasoners such as Pellet, RacerPro, FaCT++ and HermiT. The OWL family contains many species, serializations, syntaxes and specifications with similar names. OWL and OWL2 are used to refer to


2368-453: The cognitive revolution in psychology and to the phase of AI focused on knowledge representation that resulted in expert systems in the 1970s and 80s, production systems , frame languages , etc. Rather than general problem solvers, AI changed its focus to expert systems that could match human competence on a specific task, such as medical diagnosis. Expert systems gave us the terminology still in use today where AI systems are divided into

2442-406: The lumped element model widely used in representing electronic circuits (e.g. ), as well as ontologies for time, belief, and even programming itself. Each of these offers a way to see some part of the world. The lumped element model, for instance, suggests that we think of circuits in terms of components with connections between them, with signals flowing instantaneously along the connections. This

2516-402: The "HasTypeABBlood" class. If it is stated that the individual Harriet is related via "hasMother" to the individual Sue, and that Harriet is a member of the "HasTypeOBlood" class, then it can be inferred that Sue is not a member of "HasTypeABBlood". This is, however, only true if the concepts of "Parent" and "Mother" only mean biological parent or mother and not social parent or mother. To choose

2590-535: The "transfer syntax" (in communications). A compiler 's internal representation of a program will typically be specified by an abstract syntax in terms of categories such as "statement", "expression" and "identifier". This is independent of the source syntax ( concrete syntax ) of the language being compiled (though it will often be very similar). A parse tree is similar to an abstract syntax tree but it will typically also contain features such as parentheses, which are syntactically significant but which are implicit in

2664-415: The 1970s. A 2006 survey of ontologies available on the web collected 688 OWL ontologies. Of these, 199 were OWL Lite, 149 were OWL DL and 337 OWL Full (by syntax). They found that 19 ontologies had in excess of 2,000 classes, and that 6 had more than 10,000. The same survey collected 587 RDFS vocabularies. An ontology is an explicit specification of a conceptualization. The data described by an ontology in

2738-453: The 2004 and 2009 specifications, respectively. Full species names will be used, including specification version (for example, OWL2 EL). When referring more generally, OWL Family will be used. There is a long history of ontological development in philosophy and computer science. Since the 1990s, a number of research efforts have explored how the idea of knowledge representation (KR) from artificial intelligence (AI) could be made useful on

2812-465: The Internet and are expected to be evolving almost constantly. Similarly, ontologies are typically far more flexible as they are meant to represent information on the Internet coming from all sorts of heterogeneous data sources. Class hierarchies on the other hand tend to be fairly static and rely on far less diverse and more structured sources of data such as corporate databases. The OWL languages are characterized by formal semantics . They are built upon

2886-444: The OWL family is interpreted as a set of "individuals" and a set of "property assertions" which relate these individuals to each other. An ontology consists of a set of axioms which place constraints on sets of individuals (called "classes") and the types of relationships permitted between them. These axioms provide semantics by allowing systems to infer additional information based on the data explicitly provided. A full introduction to

2960-463: The RDFS meaning, and OWL Full is a semantic extension of RDF. [The closed] world assumption implies that everything we don't know is false , while the open world assumption states that everything we don't know is undefined . The languages in the OWL family use the open world assumption . Under the open world assumption, if a statement cannot be proven to be true with current knowledge, we cannot draw

3034-736: The Semantic Web Activity in September 2007. In April 2008, this group decided to call this new language OWL2, indicating a substantial revision. OWL 2 became a W3C recommendation in October 2009. OWL 2 introduces profiles to improve scalability in typical applications. Why not be inconsistent in at least one aspect of a language which is all about consistency? OWL was chosen as an easily pronounced acronym that would yield good logos, suggest wisdom, and honor William A. Martin 's One World Language knowledge representation project from


3108-663: The World Wide Web. These included languages based on HTML (called SHOE ), based on XML (called XOL, later OIL ), and various frame-based KR languages and knowledge acquisition approaches. In 2000 in the United States, DARPA started development of DAML led by James Hendler . In March 2001, the Joint EU/US Committee on Agent Markup Languages decided that DAML should be merged with OIL. The EU/US ad hoc Joint Working Group on Agent Markup Languages

3182-594: The behavior that manifests that knowledge. One of the most active areas of knowledge representation research is the Semantic Web . The Semantic Web seeks to add a layer of semantics (meaning) on top of the current Internet. Rather than indexing web sites and pages via keywords, the Semantic Web creates large ontologies of concepts. Searching for a concept will be more effective than traditional text only searches. Frame languages and automatic classification play

3256-408: The concept of frame in the mid-1970s. A frame is similar to an object class: It is an abstract description of a category describing things in the world, problems, and potential solutions. Frames were originally used on systems geared toward human interaction, e.g. understanding natural language and the social settings in which various default expectations such as ordering food in a restaurant narrow

3330-708: The conclusion that the statement is false. A relational database consists of sets of tuples with the same attributes . SQL is a query and management language for relational databases. Prolog is a logical programming language. Both use the closed world assumption . The following tools include public ontology browsers: Knowledge representation and reasoning Examples of knowledge representation formalisms include semantic networks , frames , rules , logic programs , and ontologies . Examples of automated reasoning engines include inference engines , theorem provers , model generators , and classifiers . The earliest work in computerized knowledge representation

3404-459: The core issues for knowledge representation as follows: In the early years of knowledge-based systems the knowledge-bases were fairly small. The knowledge-bases that were meant to actually solve real problems rather than do proof of concept demonstrations needed to focus on well defined problems. So for example, not just medical diagnosis as a whole topic, but medical diagnosis of certain kinds of diseases. As knowledge-based technology scaled up,

3478-494: The expressive power of the OWL is provided in the W3C's OWL Guide . OWL ontologies can import other ontologies, adding information from the imported ontology to the current ontology. An ontology describing families might include axioms stating that a "hasMother" property is only present between two individuals when "hasParent" is also present, and that individuals of class "HasTypeOBlood" are never related via "hasParent" to members of

3552-551: The expressiveness constraints placed on OWL Lite amount to little more than syntactic inconveniences: most of the constructs available in OWL DL can be built using complex combinations of OWL Lite features, and is equally expressive as the description logic S H I F ( D ) {\displaystyle {\mathcal {SHIF}}(\mathbf {D} )} . Development of OWL Lite tools has thus proven to be almost as difficult as development of tools for OWL DL, and OWL Lite

3626-544: The key discoveries of AI research in the 1970s was that languages that do not have the full expressive power of FOL can still provide close to the same expressive power of FOL, but can be easier for both the average developer and for the computer to understand. Many of the early AI knowledge representation formalisms, from databases to semantic nets to production systems, can be viewed as making various design decisions about how to balance expressive power with naturalness of expression and efficiency. In particular, this balancing act

The knowledge in an ontology) exist for these DLs. OWL Full is intended to be compatible with RDF Schema (RDFS), and to be capable of augmenting the meanings of existing Resource Description Framework (RDF) vocabulary. A model theory describes the formal semantics for RDF. This interpretation provides the meaning of RDF and RDFS vocabulary. So, the meaning of OWL Full ontologies is defined by extension of

3774-485: The letters logic above. OWL Full is based on a different semantics from OWL Lite or OWL DL, and was designed to preserve some compatibility with RDF Schema. For example, in OWL Full a class can be treated simultaneously as a collection of individuals and as an individual in its own right; this is not permitted in OWL DL. OWL Full allows an ontology to augment the meaning of the pre-defined (RDF or OWL) vocabulary. OWL Full


3848-424: The name abstract syntax may be somewhat misleading. This syntax closely follows the structure of an OWL2 ontology. It is used by OWL2 to specify semantics, mappings to exchange syntaxes and profiles. Syntactic mappings into RDF are specified for languages in the OWL family. Several RDF serialization formats have been devised. Each leads to a syntax for languages in the OWL family through this mapping. RDF/XML

3922-588: The need for larger knowledge bases and for modular knowledge bases that could communicate and integrate with each other became apparent. This gave rise to the discipline of ontology engineering, designing and building large knowledge bases that could be used by multiple projects. One of the leading research projects in this area was the Cyc project. Cyc was an attempt to build a huge encyclopedic knowledge base that would contain not just expert knowledge but common-sense knowledge. In designing an artificial intelligence agent, it

3996-408: The object-oriented community rather than AI it was quickly embraced by AI researchers as well in environments such as KEE and in the operating systems for Lisp machines from Symbolics , Xerox , and Texas Instruments . The integration of frames, rules, and object-oriented programming was significantly driven by commercial ventures such as KEE and Symbolics spun off from various research projects. At

4070-692: The other hand, proposed the use of the predicate calculus to represent common sense reasoning . Many of the early approaches to knowledge represention in Artificial Intelligence (AI) used graph representations and semantic networks , similar to knowledge graphs today. In such approaches, problem solving was a form of graph traversal or path-finding, as in the A* search algorithm . Typical applications included robot plan-formation and game-playing. Other researchers focused on developing automated theorem-provers for first-order logic, motivated by

4144-451: The process to make a medical diagnosis. Integrated systems were developed that combined frames and rules. One of the most powerful and well known was the 1983 Knowledge Engineering Environment (KEE) from Intellicorp . KEE had a complete rule engine with forward and backward chaining . It also had a complete frame-based knowledge base with triggers, slots (data values), inheritance, and message passing. Although message passing originated in

4218-432: The same information, and this can make it hard for users to formalise or even to understand knowledge expressed in complex, mathematically-oriented ways. Secondly, because of its complex proof procedures, it can be difficult for users to understand complex proofs and explanations, and it can be hard for implementations to be efficient. As a consequence, unrestricted FOL can be intimidating for many software developers. One of

4292-527: The same time, there was another strain of research that was less commercially focused and was driven by mathematical logic and automated theorem proving. One of the most influential languages in this research was the KL-ONE language of the mid-'80s. KL-ONE was a frame language that had a rigorous semantics, formal definitions for concepts such as an Is-A relation . KL-ONE and languages that were influenced by it such as Loom had an automated reasoning engine that

4366-448: The search space and allow the system to choose appropriate responses to dynamic situations. It was not long before the frame communities and the rule-based researchers realized that there was a synergy between their approaches. Frames were good for representing the real world, described as classes, subclasses, slots (data values) with various constraints on possible values. Rules were good for representing and utilizing complex logic such as

4440-436: The semantic gap between users and developers and makes development of complex systems more practical. Knowledge representation goes hand in hand with automated reasoning because one of the main purposes of explicitly representing knowledge is to be able to reason about that knowledge, to make inferences, assert new knowledge, etc. Virtually all knowledge representation languages have a reasoning or inference engine as part of

4514-412: The situation calculus. He also showed how to use resolution for question-answering and automatic programming. In contrast, researchers at Massachusetts Institute of Technology (MIT) rejected the resolution uniform proof procedure paradigm and advocated the procedural embedding of knowledge instead. The resulting conflict between the use of logical representations and the use of procedural representations


4588-484: The standard semantics of FOL. In a key 1993 paper on the topic, Randall Davis of MIT outlined five distinct roles to analyze a knowledge representation framework: Knowledge representation and reasoning are a key enabling technology for the Semantic Web . Languages based on the Frame model with automatic classification provide a layer of semantics on top of the existing Internet. Rather than searching via text strings as

4662-425: The structure is abstract but names (identifiers) are still concrete (and thus requires name resolution ), and higher-order abstract syntax , if the names themselves are abstract. To be implemented either for computation or communications, a mapping from the abstract syntax to specific machine representations and encodings must be defined; these may be called the " concrete syntax " (in language implementation) or

4736-549: The system. A key trade-off in the design of knowledge representation formalisms is that between expressivity and tractability. First Order Logic (FOL), with its high expressive power and ability to formalise much of mathematics, is a standard for comparing the expressibility of knowledge representation languages. Arguably, FOL has two drawbacks as a knowledge representation formalism in its own right, namely ease of use and efficiency of implementation. Firstly, because of its high expressive power, FOL allows many ways of expressing

4810-476: The use of mathematical logic to formalise mathematics and to automate the proof of mathematical theorems. A major step in this direction was the development of the resolution method by John Alan Robinson . In the meanwhile, John McCarthy and Pat Hayes developed the situation calculus as a logical representation of common sense knowledge about the laws of cause and effect. Cordell Green , in turn, showed how to do robot plan-formation by applying resolution to

4884-641: The working group was disbanded on May 31, 2004. In 2005, at the OWL Experiences And Directions Workshop a consensus formed that recent advances in description logic would allow a more expressive revision to satisfy user requirements more comprehensively whilst retaining good computational properties. In December 2006, the OWL1.1 Member Submission was made to the W3C. The W3C chartered the OWL Working Group as part of

4958-566: The world, it is essential to represent this kind of knowledge. In addition to McCarthy and Hayes' situation calculus, one of the most ambitious programs to tackle this problem was Doug Lenat's Cyc project. Cyc established its own Frame language and had large numbers of analysts document various areas of common-sense reasoning in that language. The knowledge recorded in Cyc included common-sense models of time, causality, physics, intentions, and many others. The starting point for knowledge representation

5032-538: Was a driving motivation for the development of IF-THEN rules in rule-based expert systems. A similar balancing act was also a motivation for the development of logic programming (LP) and the logic programming language Prolog . Logic programs have a rule-based syntax, which is easily confused with the IF-THEN syntax of production rules . But logic programs have a well-defined logical semantics, whereas production systems do not. The earliest form of logic programming

5106-453: Was based on formal logic rather than on IF-THEN rules. This reasoner is called the classifier. A classifier can analyze a set of declarations and infer new assertions, for example, redefine a class to be a subclass or superclass of some other class that wasn't formally specified. In this way the classifier can function as an inference engine, deducing new facts from an existing knowledge base. The classifier can also provide consistency checking on

5180-547: Was based on the Horn clause subset of FOL. But later extensions of LP included the negation as failure inference rule, which turns LP into a non-monotonic logic for default reasoning . The resulting extended semantics of LP is a variation of the standard semantics of Horn clauses and FOL, and is a form of database semantics, which includes the unique name assumption and a form of closed world assumption . These assumptions are much harder to state and reason with explicitly using

5254-626: Was convened to develop DAML+OIL as a web ontology language. This group was jointly funded by the DARPA (under the DAML program) and the European Union's Information Society Technologies (IST) funding project. DAML+OIL was intended to be a thin layer above RDFS , with formal semantics based on a description logic (DL). DAML+OIL is a particularly major influence on OWL; OWL's design was specifically based on DAML+OIL. The Semantic Web provides

5328-591: Was focused on general problem-solvers such as the General Problem Solver (GPS) system developed by Allen Newell and Herbert A. Simon in 1959 and the Advice Taker proposed by John McCarthy also in 1959. GPS featured data structures for planning and decomposition. The system would begin with a goal. It would then decompose that goal into sub-goals and then set out to construct strategies that could accomplish each subgoal. The Advisor Taker, on

5402-451: Was resolved in the early 1970s with the development of logic programming and Prolog , using SLD resolution to treat Horn clauses as goal-reduction procedures. The early development of logic programming was largely a European phenomenon. In North America, AI researchers such as Ed Feigenbaum and Frederick Hayes-Roth advocated the representation of domain-specific knowledge rather than general-purpose reasoning. These efforts led to

5476-620: Was soon realized that representing common-sense knowledge, knowledge that humans simply take for granted, was essential to make an AI that could interact with humans using natural language. Cyc was meant to address this problem. The language they defined was known as CycL . After CycL, a number of ontology languages have been developed. Most are declarative languages , and are either frame languages , or are based on first-order logic . Modularity—the ability to define boundaries around specific domains and problem spaces—is essential for these languages because as stated by Tom Gruber , "Every ontology
