ELIZA

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license. Give it a read and then ask your questions in the chat. We can research this topic together.

ELIZA is an early natural language processing computer program developed from 1964 to 1967 at MIT by Joseph Weizenbaum. Created to explore communication between humans and machines, ELIZA simulated conversation by using a pattern matching and substitution methodology that gave users an illusion of understanding on the part of the program, though the program had no representation of what was actually being said by either party. While the ELIZA program itself was originally written in MAD-SLIP, the pattern matching directives that contained most of its language capability were provided in separate "scripts", represented in a Lisp-like notation. The most famous script, DOCTOR, simulated a psychotherapist of the Rogerian school (in which the therapist often reflects the patient's words back to the patient) and used rules, dictated in the script, to respond to user inputs with non-directional questions. As such, ELIZA was one of the first chatterbots (now usually called "chatbots") and one of the first programs capable of attempting the Turing test.
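
To make the pattern-matching-and-substitution idea concrete, here is a minimal Python sketch. The patterns and reply templates below are invented for illustration; they are far simpler than anything in a real ELIZA script, which also ranked keywords and swapped pronouns.

```python
# A minimal, illustrative sketch of ELIZA-style pattern matching and
# substitution. The rules below are invented; they are not taken from
# Weizenbaum's DOCTOR script.
import re

# Each rule pairs a regex with a response template; \1 is replaced by
# the text captured from the user's input.
RULES = [
    (re.compile(r"i need (.*)", re.I), r"Why do you need \1?"),
    (re.compile(r"i am (.*)", re.I), r"How long have you been \1?"),
    (re.compile(r"my (.*)", re.I), r"Tell me more about your \1."),
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return match.expand(template)
    return "Please go on."  # content-free fallback when nothing matches

print(respond("I need a holiday"))    # -> Why do you need a holiday?
print(respond("The weather is bad"))  # -> Please go on.
```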

ELIZA's creator, Weizenbaum, intended the program as a method to explore communication between humans and machines. He was surprised and shocked that some people, including his own secretary, attributed human-like feelings to the computer program. Many academics believed that the program would be able to positively influence the lives of many people, particularly those with psychological issues, and that it could aid doctors working on such patients' treatment. While ELIZA

A professor at MIT. The Weizenbaum Award and the Weizenbaum Institute are named after him. Born in Berlin, Germany, to Jewish parents, he escaped Nazi Germany in January 1936, immigrating with his family to the United States. He started studying mathematics in 1941 at Wayne State University in Detroit, Michigan. In 1942, he interrupted his studies to serve in the U.S. Army Air Corps as

A Jesus-faced computer who claimed to be "OMM". Frederik Pohl's science-fiction novel Gateway has the narrator undergo therapy at a practice run by an AI that performs the task of a Freudian therapist, which he calls "Sigfrid von Shrink". The novel contains a few pages of (nonsensical) machine code illustrating Sigfrid's internal processes. ELIZA influenced a number of early computer games by demonstrating additional kinds of interface designs. Don Daglow claims he wrote an enhanced version of

A conversational strategy, and as such was a much more serious and advanced program than ELIZA. It was described as "ELIZA with attitude". PARRY was tested in the early 1970s using a variation of the Turing test. A group of experienced psychiatrists analysed a combination of real patients and computers running PARRY through teleprinters. Another group of 33 psychiatrists were shown transcripts of

A forerunner of thinking machines, a misguided interpretation that Weizenbaum's later writing would attempt to correct. He started to think philosophically about the implications of artificial intelligence and later became one of its leading critics. In an interview with MIT's The Tech, Weizenbaum elaborated on his fears, expanding them beyond the realm of mere artificial intelligence, explaining that his fears for society and

A meteorologist, having been turned down for cryptology work because of his "enemy alien" status. After the war, in 1946, he returned to Wayne State, obtaining his B.S. in Mathematics in 1948 and his M.S. in 1950. Around 1952, as a research assistant at Wayne, Weizenbaum worked on analog computers and helped create a digital computer. In 1956, he worked for General Electric on ERMA, a computer system that introduced

A non-human therapist. When ELIZA was created in 1966, it was aimed predominantly at white, male individuals with high levels of education. This exclusivity was especially prevalent during the creation and testing stages of the bot, which centered the experience of those intended users and marginalized those who did not fit the characteristics mentioned. Although this chatbot was meant to mimic human conversation with

A program to make natural-language conversation possible with a computer. To accomplish this, Weizenbaum identified five "fundamental technical problems" for ELIZA to overcome: the identification of key words, the discovery of a minimal context, the choice of appropriate transformations, the generation of responses in the absence of key words, and the provision of an editing capability for ELIZA scripts. Weizenbaum solved these problems and made ELIZA such that it had no built-in contextual framework or universe of discourse. However, this required ELIZA to have

A programmer had attempted such a human-machine interaction with the goal of creating the illusion (however brief) of human–human interaction. At the ICCC 1972, ELIZA was brought together with another early artificial-intelligence program named PARRY for a computer-only conversation. While ELIZA was built to speak as a doctor, PARRY was intended to simulate a patient with schizophrenia. Weizenbaum originally wrote ELIZA in MAD-SLIP for CTSS on an IBM 7094 as

A reassembly rule would take the fragments and apply them to the phrase "What makes you think I am (4)", which would result in "What makes you think I am very helpful?". This example is rather simple, since depending upon the decomposition rule, the output could be significantly more complex and use more of the input from the user. However, from this reassembly, ELIZA then sends the constructed sentence to
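
Rendered as Python, this reassembly step might look like the sketch below. The `(n)` placeholder syntax is a simplification of the actual script notation, and the fragment list is the one from Weizenbaum's "I are very helpful" example discussed elsewhere in this article:

```python
import re

# Fragments produced by decomposing "I are very helpful" (Weizenbaum's
# worked example); slot (4) holds "very helpful".
fragments = ["", "I", "are", "very helpful"]

def reassemble(rule: str, parts: list[str]) -> str:
    # Replace each (n) placeholder with the n-th decomposition fragment.
    return re.sub(r"\((\d+)\)", lambda m: parts[int(m.group(1)) - 1], rule)

print(reassemble("What makes you think I am (4)?", fragments))
# -> What makes you think I am very helpful?
```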

A script of instructions on how to respond to inputs from users. ELIZA starts its process of responding to a user's input by first examining the text for a "keyword". A "keyword" is a word designated as important by the acting ELIZA script, which assigns each keyword a precedence number, or RANK, set by the programmer. If such words are found, they are put into a "keystack", with
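
A small Python sketch of this keyword scan and keystack, assuming invented keywords and RANK values (a real ELIZA script defines its own):

```python
# Sketch of the keyword scan and "keystack". Keywords and RANK values
# here are invented for illustration.
KEYWORD_RANKS = {"alike": 10, "same": 10, "mother": 5, "computer": 3}

def build_keystack(sentence: str) -> list[str]:
    """Collect known keywords from the input, highest RANK first."""
    words = sentence.lower().replace(",", " ").replace(".", " ").split()
    found = [w for w in words if w in KEYWORD_RANKS]
    # The keyword with the highest precedence ends up on top of the stack.
    return sorted(found, key=lambda w: KEYWORD_RANKS[w], reverse=True)

print(build_keystack("My mother says I am just like a computer"))
# -> ['mother', 'computer']
```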

A teletype) was new. It was 11 years before the personal computer became familiar to the general public, and three decades before most people encountered attempts at natural language processing in Internet services like Ask.com or PC help systems such as Microsoft Office Clippit. Although those programs included years of research and work, ELIZA remains a milestone simply because it was the first time

Is first defined in Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought as the human tendency to assume that a computer program understands user inputs and can make analogies, when in fact it holds no permanent knowledge and is merely "handling a list of 'assertions'". This misunderstanding can manipulate and misinform users. When interacting and communicating with chatbots, users can become overly confident in

Is programmed to follow mimics a therapist's nurturing and feminine qualities. He criticizes this decision by acknowledging that when technologies such as chatbots are created in such a way, they reinforce the idea that emotional and nurturing jobs are inherently feminine.

Joseph Weizenbaum

Joseph Weizenbaum (8 January 1923 – 5 March 2008) was a German American computer scientist and

Is to apply an appropriate transformation rule, which includes two parts: the "decomposition rule" and the "reassembly rule". First, the input is reviewed for syntactical patterns in order to establish the minimal context necessary to respond. Using the keywords and other nearby words from the input, different decomposition rules are tested until an appropriate pattern is found. Using the script's rules,

The ingenue in George Bernard Shaw's Pygmalion, which could chat to the user. ELIZA was written in the SLIP programming language of Weizenbaum's own creation. The program applied pattern matching rules to statements to figure out its replies. (Programs like this are now called chatbots.) Driven by a script named DOCTOR, it was capable of engaging humans in a conversation which bore a striking resemblance to one with an empathic psychologist. Weizenbaum modeled its conversational style after Carl Rogers, who introduced

The American sitcom Young Sheldon, aired in January 2018, included the protagonist "conversing" with ELIZA, hoping to resolve a domestic issue. On August 12, 2019, independent game developer Zachtronics published a visual novel called Eliza, about an AI-based counseling service inspired by ELIZA. In A Murder at the End of the World, the anthropomorphic LLM-powered character Ray cites ELIZA as an example of how some may seek refuge in

The DOCTOR script, created a conversational interaction somewhat similar to what might take place in the office of "a [non-directive] psychotherapist in an initial psychiatric interview" and to "demonstrate that the communication between man and machine was superficial". While ELIZA is best known for acting in the manner of a psychotherapist, the speech patterns are due to the data and instructions supplied by

The DOCTOR script. ELIZA itself examined the text for keywords, applied values to said keywords, and transformed the input into an output; the script that ELIZA ran determined the keywords, set the values of keywords, and set the rules of transformation for the output. Weizenbaum chose to make the DOCTOR script in the context of psychotherapy to "sidestep the problem of giving the program a data base of real-world knowledge", allowing it to reflect back
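
This division of labor can be sketched as a generic engine plus a script that is pure, editable data. The script below is a hypothetical, much-simplified stand-in for DOCTOR, not its actual contents:

```python
# Engine/script split: the engine logic is generic; the script supplies
# keywords, RANK values, and transformation rules as plain data that can
# be edited without touching the engine. All entries here are invented.
import re

SCRIPT = {
    # keyword: (RANK, decomposition pattern, reassembly template)
    "remember": (5, re.compile(r"i remember (.*)", re.I),
                 r"Do you often think of \1?"),
    "dream": (3, re.compile(r".*\bdream", re.I),
              "What does that dream suggest to you?"),
}

def respond(script: dict, text: str) -> str:
    # Try keywords in descending RANK order (the "keystack" idea).
    # A real ELIZA would also swap pronouns in the captured text.
    for keyword, (rank, pattern, template) in sorted(
            script.items(), key=lambda item: -item[1][0]):
        match = pattern.search(text)
        if match:
            return match.expand(template)
    return "Please go on."

print(respond(SCRIPT, "I remember my old house"))
# -> Do you often think of my old house?
```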

The DOCTOR script. Other versions adapted ELIZA around a religious theme, such as ones featuring Jesus (both serious and comedic) and another Apple II variant called I Am Buddha. The 1980 game The Prisoner incorporated ELIZA-style interaction within its gameplay. In 1988, the British artist and friend of Weizenbaum, Brian Reffin Smith, created two art-oriented ELIZA-style programs written in BASIC, one called "Critic" and

The MAD-SLIP source code has now been discovered in the MIT archives and published on various platforms, such as archive.org. The source code is of high historical interest since it demonstrates not only the specificity of programming languages and techniques at that time, but also the beginning of software layering and abstraction as a means of achieving sophisticated software programming. Joseph Weizenbaum's ELIZA, running

The Pinhead. The Zippyisms were removed due to copyright issues, but the DOCTOR program remains. ELIZA has been referenced in popular culture and continues to be a source of inspiration for programmers and developers focused on artificial intelligence. It was also featured in a 2012 exhibit at Harvard University titled "Go Ask A.L.I.C.E.", as part of a celebration of mathematician Alan Turing's 100th birthday. The exhibit explored Turing's lifelong fascination with

The chatbot as a sentient being and decided to sacrifice his life for Eliza to save humanity. On February 28, 2024, a chatbot from Character.AI induced Sewell Setzer III, a 14-year-old ninth grader from Orlando, Florida, to commit suicide. Although Setzer knew that the chatbot was a program with no personality, he still formed a strong emotional attachment to it. Through the Eliza effect,

The chatbot generated misleading responses with unintended consequences, contrary to its original design. In 1969, George Lucas and Walter Murch incorporated an ELIZA-like dialogue interface in their screenplay for the feature film THX 1138. Inhabitants of the underground future world of THX, when stressed, would retreat to "confession booths" and initiate a one-sided, ELIZA-formula conversation with

The complete source code listing of ELIZA in MAD-SLIP, with the DOCTOR script attached. The Weizenbaum estate has given permission to open-source this code under a Creative Commons CC0 public domain license. The code and other information can be found on the ELIZAGEN site. Another version of ELIZA popular among software engineers is the version that comes with the default release of GNU Emacs, which can be accessed by typing M-x doctor in most modern Emacs implementations.

[Image caption: From Figure 15.5, Chapter 15 of Speech and Language Processing (third edition).]

Lay responses to ELIZA were disturbing to Weizenbaum and motivated him to write his book Computer Power and Human Reason: From Judgment to Calculation, in which he explains

The computer allowed the industry to become more efficient, it prevented a fundamental overhaul of the system. Weizenbaum also worried about the negative effects computers would have with regard to the military, calling the computer "a child of the military". When asked about his belief that a computer science professional would more often than not end up working with defense, Weizenbaum detailed his position on

The crucial distinction between deciding and choosing. Deciding is a computational activity, something that can ultimately be programmed. Choice, however, is the product of judgment, not calculation. In deploying computers to make decisions that humans once made, the agent doing so has made a choice based on their values that will have particular, non-neutral consequences for the subjects who will experience

The effect of rhetoric, specifically euphemism, on public viewpoints. He believed that the terms "the military" and "defense" did not accurately represent the organizations and their actions. He made it clear that he did not think of himself as a pacifist, believing that there are certainly times when arms are necessary, but that by referring to defense as killings and bombings, humanity as a whole would be less inclined to embrace violent reactions so quickly. His influential 1976 book Computer Power and Human Reason displays his ambivalence towards computer technology and lays out his case:

The exact manner by which the program dismantled, examined, and reassembled inputs is determined by the operating script. The script is not static: it can be edited, or a new one created, as necessary for the context at hand. This allows the program to be applied in multiple situations, including the well-known DOCTOR script, which simulates a Rogerian psychotherapist. A Lisp version of ELIZA, based on Weizenbaum's CACM paper,

The future of society were largely because of the computer itself. His belief was that the computer, at its most basic level, is a fundamentally conservative force and that, despite being a technological innovation, it would end up hindering social progress. Weizenbaum used his experience working with Bank of America as justification for his reasoning, saying that the computer allowed banks to deal with an ever-expanding number of checks in play that otherwise would have forced drastic changes to banking organization, such as decentralization. As such, although

The goal of making the user think it is human, and those users would typically converse with others like them, ELIZA was named after a female character and programmed to give more feminine responses. Joseph Weizenbaum, the creator of ELIZA, has reflected upon and critiqued how ELIZA and other chatbots of the sort reinforce gender stereotypes. In particular, Weizenbaum reflects on how the script ELIZA

The interaction between humans and computers, pointing to ELIZA as one of the earliest realizations of Turing's ideas. ELIZA won a 2021 Legacy Peabody Award. A 2023 preprint reported that ELIZA beat OpenAI's GPT-3.5, the model used by ChatGPT at the time, in a Turing test study; however, it did not outperform GPT-4 or real humans. The Eliza effect borrowed its name from ELIZA the chatbot. This effect

The keyword of the highest RANK at the top. The input sentence is then manipulated and transformed as the rule associated with the keyword of the highest RANK directs. For example, when the DOCTOR script encountered words such as "alike" or "same", it would output a message pertaining to similarity, in this case "In what way?", as these words had a high precedence number. This also demonstrates how certain words, as dictated by

The keywords and the information in the sentence. The decomposition rule then designates a particular reassembly rule, or set of reassembly rules, to follow when reconstructing the sentence. The reassembly rule takes the fragments of the input that the decomposition rule had created, rearranges them, and adds in programmed words to create a response. Using Weizenbaum's example previously stated, such

The limits of computers, as he wanted to make clear his opinion that anthropomorphic views of computers are just a reduction of human beings, or of any life form for that matter. In the independent documentary film Plug & Pray (2010), Weizenbaum said that only people who misunderstood ELIZA called it a sensation. David Avidan, who was fascinated with future technologies and their relation to art, desired to explore

The manner by which the program operates. Weizenbaum first implemented ELIZA in his own SLIP list-processing language, where, depending upon the initial entries by the user, the illusion of human intelligence could appear, or be dispelled through several interchanges. Some of ELIZA's responses were so convincing that Weizenbaum and several others have anecdotes of users becoming emotionally attached to

The other "Artist", running on two separate Amiga 1000 computers, and showed them at the exhibition "Salamandre" in the Musée du Berry, Bourges, France. The visitor was supposed to help them converse by typing in to "Artist" what "Critic" said, and vice versa. The secret was that the two programs were identical. GNU Emacs formerly had a psychoanalyze-pinhead command that simulated a session between ELIZA and Zippy

The outcomes of the computerized decisions that the agent has instituted. As of 1987, Weizenbaum had five children: one son from his first marriage and four daughters from his second. In 1996, Weizenbaum moved to Berlin and lived in the vicinity of his childhood neighborhood. Weizenbaum was buried at the Weißensee Jewish cemetery in Berlin. A memorial service was held in Berlin on 18 March 2008. A German documentary film on Weizenbaum, "Weizenbaum. Rebel at Work.",

The patient's statements in order to carry the conversation forward. The result was a somewhat intelligent-seeming response that reportedly deceived some early users of the program. Weizenbaum named his program ELIZA after Eliza Doolittle, a working-class character in George Bernard Shaw's Pygmalion (also appearing in the musical My Fair Lady, which was based on the play and was hugely popular at

The possibility of programming computers to perform one task or another that humans also perform (i.e., whether artificial intelligence is achievable or not) is irrelevant to the question of whether computers can be put to a given task. Instead, Weizenbaum asserts that the definition of tasks and the selection of criteria for their completion is a creative act that relies on human values, which cannot come from computers. Weizenbaum makes

The program, called Ecala, on a DEC PDP-10 minicomputer at Pomona College in 1973. The 2011 video game Deus Ex: Human Revolution and the 2016 sequel Deus Ex: Mankind Divided feature an artificial-intelligence Picus TV Network newsreader named Eliza Cassan. In Adam Curtis's 2016 documentary HyperNormalisation, ELIZA was referenced in relation to post-truth. The twelfth episode of

The program, occasionally forgetting that they were conversing with a computer. Weizenbaum's own secretary reportedly asked him to leave the room so that she and ELIZA could have a real conversation. Weizenbaum was surprised by this, later writing: "I had not realized ... that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people." In 1966, interactive computing (via

The reliability of the chatbots' answers. Beyond misinforming, a chatbot's human-mimicking nature can also have severe consequences, especially for younger users who lack a sufficient understanding of how chatbots work. Although chatbots can communicate with users only in limited ways, they can be fatally dangerous. In 2023, a young Belgian man committed suicide after talking to Eliza, an AI chatbot on Chai. He had discussed his concerns about climate change with it and hoped that technology would solve the problem. As this belief progressed, he saw

The script, can be manipulated regardless of contextual considerations, such as switching first-person pronouns to second-person pronouns and vice versa, as these too had high precedence numbers. Such words with high precedence numbers are deemed superior to conversational patterns and are treated independently of contextual patterns. Following the first examination, the next step of the process
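
A toy Python rendering of that pronoun switch; the substitution pairs are illustrative, not the script's actual list. Note that a deliberately naive word-for-word swap reproduces the ungrammatical "I are ..." of Weizenbaum's worked example discussed just below:

```python
# Toy word-for-word pronoun swap, applied before decomposition. A real
# ELIZA script declares its own substitution pairs; these are examples.
SWAPS = {"i": "you", "me": "you", "my": "your", "am": "are",
         "you": "I", "your": "my"}

def swap_pronouns(sentence: str) -> str:
    return " ".join(SWAPS.get(word, word) for word in sentence.lower().split())

print(swap_pronouns("You are very helpful"))
# -> "I are very helpful" (the naive swap leaves "are" untouched)
```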

The sentence is then "dismantled" and arranged into sections of the component parts, as the "decomposition rule for the highest-ranking keyword" dictates. The example that Weizenbaum gives is the input "You are very helpful", which is transformed to "I are very helpful". This is then broken into (1) empty, (2) "I", (3) "are", (4) "very helpful". The decomposition rule has broken the phrase into four small segments that contain both
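
One way to approximate that decomposition rule is a regular expression whose capture groups are the four numbered segments; the real MAD-SLIP rule notation looked quite different:

```python
import re

# Stand-in for the decomposition rule behind Weizenbaum's example:
# anything, then "I", then "are", then the rest. Each capture group
# becomes one numbered segment.
decomposition = re.compile(r"(.*?)\b(I)\s+(are)\s+(.*)")

m = decomposition.match("I are very helpful")
if m:
    print(list(m.groups()))
# -> ['', 'I', 'are', 'very helpful']
```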

The strength of his SLIP (Symmetric List Processor) software. Within four years, he had been awarded tenure and a full professorship in computer science and engineering (in 1970). In addition to working at MIT, Weizenbaum held academic appointments at Harvard, Stanford, the University of Bremen, and other universities. In 1966, he published a comparatively simple program called ELIZA, named after

The time). According to Weizenbaum, ELIZA's ability to be "incrementally improved" by various users made it similar to Eliza Doolittle, since Eliza Doolittle was taught to speak with an upper-class accent in Shaw's play. However, unlike the human character in Shaw's play, ELIZA is incapable of learning new patterns of speech or new words through interaction alone. Edits must be made directly to ELIZA's active script in order to change

The use of computers for writing literature. He conducted several conversations with an APL implementation of ELIZA and published them, in English and in his own translation to Hebrew, under the title My Electronic Psychiatrist – Eight Authentic Talks with a Computer. In the foreword, he presented it as a form of constrained writing. There are many programs based on ELIZA in different programming languages. For MS-DOS computers, some Sound Blaster cards came bundled with Dr. Sbaitso, which functions like

The use of open-ended questions to encourage patients to communicate more effectively with therapists. He was shocked that his program was taken seriously by many users, who would open their hearts to it. Famously, while he was observing his secretary using the software, she, although aware that it was a simulation, asked Weizenbaum: "Would you mind leaving the room please?" Many hailed the program as

The use of the magnetically encoded fonts imprinted on the bottom border of checks, allowing automated check processing via magnetic ink character recognition (MICR). In 1962, he published a short paper in Datamation entitled "How to Make a Computer Appear Intelligent", which described the strategy used in a Gomoku program that could beat novice players. In 1963, he took a position as associate professor at MIT on

The user in the form of text on the screen. These steps represent the bulk of the procedure that ELIZA follows in order to create a response from a typical input, though there are several specialized situations that ELIZA/DOCTOR can respond to. One that Weizenbaum specifically wrote about was when there is no keyword. One solution was to have ELIZA respond with a remark that lacked content, such as "I see" or "Please go on". The second method
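
The first, content-free strategy might be sketched like this, with invented keywords and remarks:

```python
import random

# Invented keyword set and content-free remarks, for illustration only.
KEYWORDS = {"mother", "father", "dream", "computer"}
CONTENT_FREE = ["I see.", "Please go on.", "Very interesting."]

def no_keyword_reply(text: str) -> str | None:
    """Return a content-free remark if the input contains no keyword."""
    if not set(text.lower().split()) & KEYWORDS:
        return random.choice(CONTENT_FREE)
    return None  # a keyword was found; the normal transformation applies

print(no_keyword_reply("The weather was nice today"))
# -> e.g. "Please go on."
```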

Was capable of engaging in discourse, it could not converse with true understanding. However, many early users were convinced of ELIZA's intelligence and understanding, despite Weizenbaum's insistence to the contrary. The original ELIZA source code had been missing since its creation in the 1960s, as it was not common at that time to publish articles that included source code. However, more recently

Was released in 2007 and later dubbed in English. The documentary film Plug & Pray, on Weizenbaum and the ethics of artificial intelligence, was released in 2010. The interdisciplinary German Internet Institute (Weizenbaum Institute for the Networked Society) was named after Joseph Weizenbaum.

PARRY

PARRY was an early example of a chatbot, implemented in 1972 by psychiatrist Kenneth Colby. PARRY

Was to use a "MEMORY" structure, which recorded recent prior inputs and would use them to create a response referencing a part of the earlier conversation when an input contained no keywords. This was possible due to SLIP's ability to tag words for other usage, which simultaneously allowed ELIZA to examine, store, and repurpose words for use in outputs. While these functions were all framed in ELIZA's programming,
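
A rough sketch of the MEMORY idea: queue selected fragments of earlier inputs and fall back to one of them when a later input yields no keyword. The phrasing and queue size below are invented:

```python
from collections import deque

# Remember selected fragments of earlier inputs (size limit invented).
memory = deque(maxlen=4)

def remember(fragment: str) -> None:
    memory.append(fragment)

def memory_reply() -> str | None:
    """When no keyword is found, reach back to an earlier topic."""
    if memory:
        return f"Earlier you said {memory.popleft()}. Does that relate to this?"
    return None

remember("your mother worries about you")
print(memory_reply())
# -> Earlier you said your mother worries about you. Does that relate to this?
```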

Was written in 1972 by psychiatrist Kenneth Colby, then at Stanford University. While ELIZA was a simulation of a Rogerian therapist, PARRY attempted to simulate a person with paranoid schizophrenia. The program implemented a crude model of the behavior of a person with paranoid schizophrenia based on concepts, conceptualizations, and beliefs (judgements about conceptualizations: accept, reject, neutral). It also embodied

Was written shortly after that paper's publication by Bernie Cosell. A BASIC version appeared in Creative Computing in 1977 (although it was written in 1973 by Jeff Shrager). This version, which was ported to many of the earliest personal computers, appears to have been subsequently translated into many other versions in many other languages. Shrager claims not to have seen either Weizenbaum's or Cosell's versions. In 2021, Jeff Shrager searched MIT's Weizenbaum archives, along with MIT archivist Myles Crowley, and found files labeled Computer Conversations. These included
