In programming and software design, an event is an action or occurrence recognized by software, often originating asynchronously from the external environment, that may be handled by the software. Computer events can be generated or triggered by the system, by the user, or in other ways. Events may be handled synchronously with the program flow. That is, the software may have one or more dedicated places where events are handled, frequently an event loop. However, in event-driven architecture, events are typically processed asynchronously.
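As a rough illustration of the synchronous style described above, the sketch below runs a single event loop that blocks on a queue and dispatches each event to a handler registered for its type. The Event record, the handler names, and the event types are invented for illustration and do not come from any particular framework.

```java
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

// Illustrative only: a minimal event type and a synchronous event loop.
public class EventLoopSketch {
    // A hypothetical event carrying a type tag and an arbitrary payload.
    record Event(String type, Object payload) {}

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Event> queue = new LinkedBlockingQueue<>();

        // Handlers registered per event type (e.g., by a GUI toolkit or the application).
        Map<String, Consumer<Event>> handlers = Map.of(
            "key-press", e -> System.out.println("key pressed: " + e.payload()),
            "quit",      e -> System.out.println("shutting down")
        );

        // Producers (OS callbacks, timers, other threads) would enqueue events here.
        queue.put(new Event("key-press", "A"));
        queue.put(new Event("quit", null));

        // The event loop: block until an event arrives, then dispatch it synchronously.
        while (true) {
            Event event = queue.take();                 // wait for the next event
            handlers.getOrDefault(event.type(),
                    e -> { /* unhandled events are ignored */ }).accept(event);
            if ("quit".equals(event.type())) break;     // leave the loop on shutdown
        }
    }
}
```

In an asynchronous, event-driven architecture the same events would instead be handed off to consumers on other threads or processes rather than being handled inside this single loop.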
The user can be the source of an event. The user may interact with the software through the computer's peripherals—for example, by typing on a keyboard or clicking with a mouse. Another source is a hardware device such as a timer. Software can also trigger its own set of events into the event loop, such as by communicating the completion of a task. Software that changes its behavior in response to events
A callback subroutine that handles inputs received in a program (called a listener in Java and JavaScript). Each event is a piece of application-level information from the underlying framework, typically the GUI toolkit. GUI events include key presses, mouse movement, action selections, and timers expiring. On a lower level, events can represent availability of new data for reading
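As a concrete sketch of such a listener, the following hedged example registers a callback on a Swing button; the toolkit delivers an application-level ActionEvent each time the button is activated. Swing is used here only as a familiar GUI toolkit; any toolkit with a listener mechanism looks broadly similar.

```java
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

// Illustrative sketch: a listener (callback) receiving application-level events
// from a GUI toolkit, here Java Swing.
public class ListenerSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Listener demo");
            JButton button = new JButton("Click me");

            // The lambda is the event handler; the toolkit calls it with an
            // ActionEvent each time the button is activated.
            button.addActionListener(event ->
                    System.out.println("Button pressed at " + event.getWhen()));

            frame.add(button);
            frame.pack();
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```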
a computer is known as a "Human-computer Interface (HCI)". As a field of research, human–computer interaction is situated at the intersection of computer science, behavioral sciences, design, media studies, and several other fields of study. The term was popularized by Stuart K. Card, Allen Newell, and Thomas P. Moran in their 1983 book, The Psychology of Human–Computer Interaction. The first known use
a computer with a man's name to cost more than a machine with a woman's name. Other research finds that individuals perceive their interactions with computers more negatively than those with humans, despite behaving the same way towards these machines. In human and computer interactions, a semantic gap usually exists between the human's and the computer's understandings of each other's behavior. Ontology, as a formal representation of domain-specific knowledge, can be used to address this problem by solving
a current user interface, or designing a new user interface: The iterative design process is repeated until a sensible, user-friendly interface is created. Various strategies for human–computer interaction design have developed since the field's conception in the 1980s. Most design methodologies stem from a model of how users, designers, and technical systems interact. Early methodologies treated users' cognitive processes as predictable and quantifiable, and encouraged design practitioners to look to cognitive science for areas (for example, memory and attention) to consider when designing user interfaces. Present-day models, in general, center on
312-482: A device, such as a shake, tilt, rotation, or move. A common variant in object-oriented programming is the delegate event model , which is provided by some graphic user interfaces . This model is based on three entities: Furthermore, the model requires that: C# uses events as special delegates that can only be fired by the class that declares them. This allows for better abstraction , for example: In computer programming, an event handler may be implemented using
364-643: A different approach to an application framework, based on the OpenStep framework developed at NeXT . Since the 2010s, many apps have been created with the frameworks based on Google 's Chromium project. The two prominent ones are Electron and the Chromium Embedded Framework . Free and open-source software frameworks exist as part of the Mozilla , LibreOffice , GNOME , KDE , NetBeans , and Eclipse projects. Microsoft markets
416-468: A different machine than the consumer, or consumers. Event notification platforms are normally designed so that the application producing events do not need to know which applications will consume them, or even how many applications will monitor the event stream. Event notification is sometimes used as a synonym for publish-subscribe , a term that relates to one class of products supporting event notification in networked settings. The virtual synchrony model
468-669: A display is designed, the task that the display is intended to support must be defined (e.g., navigating, controlling, decision making, learning, entertaining, etc.). A user or operator must be able to process whatever information a system generates and displays; therefore, the information must be displayed according to principles to support perception, situation awareness, and understanding. Christopher Wickens et al. defined 13 principles of display design in their book An Introduction to Human Factors Engineering . These human perception and information processing principles can be utilized to create an effective display design. A reduction in errors,
520-645: A file or network stream. Event handlers are a central concept in event-driven programming . The events are created by the framework based on interpreting lower-level inputs, which may be lower-level events themselves. For example, mouse movements and clicks are interpreted as menu selections. The events initially originate from actions on the operating system level, such as interrupts generated by hardware devices, software interrupt instructions, or state changes in polling . On this level, interrupt handlers and signal handlers correspond to event handlers. Created events are first processed by an event dispatcher within
572-809: A framework for developing Windows applications in C++ called the Microsoft Foundation Class Library , and a similar framework for developing applications with Visual Basic or C# , named .NET Framework . Several frameworks can build cross-platform applications for Linux , Macintosh, and Windows from common source code , such as Qt , wxWidgets , Juce , Fox toolkit , or Eclipse Rich Client Platform (RCP). Oracle Application Development Framework (Oracle ADF) aids in producing Java -oriented systems. Silicon Laboratories offers an embedded application framework for developing wireless applications on its series of wireless chips. MARTHA
#1732802578455624-489: A key on a keyboard or a combination of keys generates a keyboard event, enabling the program currently running to respond to the introduced data such as which key/s the user pressed. Moving a joystick generates an X-Y analogue signal. They often have multiple buttons to trigger events. Some gamepads for popular game boxes use joysticks. The events generated using a touchscreen are commonly referred to as touch events or gestures . Device events include action by or to
676-487: A manual). The use of knowledge in a user's head and knowledge in the world must be balanced for an effective design. 12. Principle of predictive aiding . Proactive actions are usually more effective than reactive actions. A display should eliminate resource-demanding cognitive tasks and replace them with simpler perceptual tasks to reduce the user's mental resources. This will allow the user to focus on current conditions and to consider possible future conditions. An example of
728-405: A number of software recognisable pointing device gestures . A mouse can generate a number of mouse events, such as mouse move (including direction of move and distance), mouse left/right button up/down and mouse wheel motion, or a combination of these gestures. For example, double-clicks commonly select words and characters within boundary, and triple-clicks select entire paragraphs. Pressing
780-456: A predictive aid is a road sign displaying the distance to a certain destination. 13. Principle of consistency . Old habits from other displays will easily transfer to support the processing of new displays if they are designed consistently. A user's long-term memory will trigger actions that are expected to be appropriate. A design must accept this fact and utilize consistency among different displays. Topics in human–computer interaction include
832-442: A reduction in required training time, an increase in efficiency, and an increase in user satisfaction are a few of the many potential benefits that can be achieved by utilizing these principles. Certain principles may not apply to different displays or situations. Some principles may also appear to be conflicting, and there is no simple solution to say that one principle is more important than another. The principles may be tailored to
884-402: A specific design or situation. Striking a functional balance among the principles is critical for an effective design. 1.Make displays legible (or audible) . A display's legibility is critical and necessary for designing a usable display. If the characters or objects being displayed cannot be discernible, the operator cannot effectively use them. 2.Avoid absolute judgment limits . Do not ask
936-404: A steady input and discussion between clients, creators, and specialists and push for specialized frameworks to be folded with the sorts of encounters clients need to have, as opposed to wrapping user experience around a finished framework. Displays are human-made artifacts designed to support the perception of relevant system variables and facilitate further processing of that information. Before
is an associated cost in time or effort. A display design should minimize this cost by allowing frequently accessed sources to be located at the nearest possible position. However, adequate legibility should not be sacrificed to reduce this cost. 9. Proximity compatibility principle. Divided attention between two information sources may be necessary for the completion of one task. These sources must be mentally integrated and are defined to have close mental proximity. Information access costs should be low, which can be achieved in many ways (e.g., proximity, linkage by common colors, patterns, shapes, etc.). However, close display proximity can be harmful by causing too much clutter. 10. Principle of multiple resources. A user can more easily process information across different resources. For example, visual and auditory information can be presented simultaneously rather than presenting all visual or all auditory information. 11. Replace memory with visual information: knowledge in
1040-415: Is more similar to A423B8 than 92 is to 93. Unnecessarily similar features should be removed, and dissimilar features should be highlighted. 6. Principle of pictorial realism . A display should look like the variable that it represents (e.g., the high temperature on a thermometer shown as a higher vertical level). If there are multiple elements, they can be configured in a manner that looks like they would in
1092-457: Is quite broad in scope. It is attended by academics, practitioners, and industry people, with company sponsors such as Google, Microsoft, and PayPal. There are also dozens of other smaller, regional, or specialized HCI-related conferences held around the world each year, including: Application framework In computer programming , an application framework consists of a software framework used by software developers to implement
#17328025784551144-434: Is said to be event-driven , often with the goal of being interactive . Event driven systems are typically used when there is some asynchronous external activity that needs to be handled by a program, such as a user pressing a mouse button. An event driven system typically runs an event loop that keeps waiting for such activities, such as input from devices or internal alarms. When one of these occurs, it collects data about
1196-572: Is sometimes used to endow event notification systems, and publish-subscribe systems, with stronger fault-tolerance and consistency guarantees. Human%E2%80%93computer interaction Human–computer interaction ( HCI ) is research in the design and the use of computer technology , which focuses on the interfaces between people ( users ) and computers . HCI researchers observe the ways humans interact with computers and design technologies that allow humans to interact with computers in novel ways. A device that allows interaction between human being and
1248-533: Is the Three Mile Island accident , a nuclear meltdown accident, where investigations concluded that the design of the human-machine interface was at least partly responsible for the disaster. Similarly, accidents in aviation have resulted from manufacturers' decisions to use non-standard flight instruments or throttle quadrant layouts: even though the new designs were proposed to be superior in basic human-machine interaction, pilots had already ingrained
1300-607: Is understood correctly. 4.Redundancy gain . If a signal is presented more than once, it is more likely to be understood correctly. This can be done by presenting the signal in alternative physical forms (e.g., color and shape, voice and print, etc.), as redundancy does not imply repetition. A traffic light is a good example of redundancy, as color and position are redundant. 5.Similarity causes confusion: Use distinguishable elements . Signals that appear to be similar will likely be confused. The ratio of similar features to different features causes signals to be similar. For example, A423B9
1352-402: Is user satisfaction, also referred to as End-User Computing Satisfaction. It goes on to say: "Because human–computer interaction studies a human and a machine in communication, it draws from supporting knowledge on both the machine and the human side. On the machine side, techniques in computer graphics , operating systems , programming languages , and development environments are relevant. On
1404-508: The instruction set level, where they complement interrupts . Compared to interrupts, events are normally implemented synchronously: the program explicitly waits for an event to be generated and handled (typically by calling an instruction that dispatches the next event), whereas an interrupt can demand immediate service. There are many situations or events that a program or system may generate or to which it may respond. Some common user generated events include: A pointing device can generate
1456-508: The "standard" layout. Thus, the conceptually good idea had unintended results. The human–computer interface can be described as the point of communication between the human user and the computer. The flow of information between the human and computer is defined as the loop of interaction . The loop of interaction has several aspects to it, including: Human–computer interaction studies the ways in which humans make—or do not make—use of computational artifacts, systems, and infrastructures. Much of
1508-418: The associated conditions and may take actions triggered by events. Event notification is an important feature in modern database systems (used to inform applications when conditions they are watching for have occurred), modern operating systems (used to inform applications when they should take some action, such as refreshing a window), and modern distributed systems, where the producer of an event might be on
1560-457: The concepts of multimodality over unimodality, intelligent adaptive interfaces over command/action based ones, and active interfaces over passive interfaces. The Association for Computing Machinery (ACM) defines human–computer interaction as "a discipline that is concerned with the design, evaluation, and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them". A key aspect of HCI
1612-456: The data obtained from affect-detection channels to improve decision models. A brain–computer interface (BCI), is a direct communication pathway between an enhanced or wired brain and an external device. BCI differs from neuromodulation in that it allows for bidirectional information flow. BCIs are often directed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions. Security interactions are
1664-440: The effectiveness of human–computer interaction. The influence of emotions in human–computer interaction has been studied in fields such as financial decision-making using ECG and organizational knowledge sharing using eye-tracking and face readers as affect-detection channels. In these fields, it has been shown that affect-detection channels have the potential to detect human emotions and those information systems can incorporate
1716-417: The emerging multi-modal and Graphical user interfaces (GUI) allow humans to engage with embodied character agents in a way that cannot be achieved with other interface paradigms. The growth in human–computer interaction field has led to an increase in the quality of interaction, and resulted in many new areas of research beyond. Instead of designing regular interfaces, the different research branches focus on
1768-418: The event source to the handler about how the event should be processed. Events are typically used in user interfaces, where actions in the outside world (such as mouse clicks, window-resizing, keyboard presses, and messages from other programs) are handled by the program as a series of events. Programs written for many windowing environments consist predominantly of event handlers. Events can also be used at
1820-461: The event and dispatches the event to the event handler software that will deal with it. A program can choose to ignore events, and there may be libraries to dispatch an event to multiple handlers that may be programmed to listen for a particular event. The data associated with an event at a minimum specifies what type of event it is, but may include other information such as when it occurred, who or what caused it to occur, and extra data provided by
1872-721: The field seek to achieve might vary. When pursuing a cognitivist perspective, researchers of HCI may seek to align computer interfaces with the mental model that humans have of their activities. When pursuing a post-cognitivist perspective, researchers of HCI may seek to align computer interfaces with existing social practices or existing sociocultural values. Researchers in HCI are interested in developing design methodologies, experimenting with devices, prototyping software, and hardware systems, exploring interaction paradigms, and developing models and theories of interaction. The following experimental design principles are considered, when evaluating
1924-514: The following : Social computing is an interactive and collaborative behavior considered between technology and people. In recent years, there has been an explosion of social science research focusing on interactions as the unit of analysis, as there are a lot of social computing technologies that include blogs, emails, social networking, quick messaging, and various others. Much of this research draws from psychology, social psychology, and sociology. For example, one study found out that people expected
1976-444: The following are common reasons: Traditionally, computer use was modeled as a human–computer dyad in which the two were connected by a narrow explicit communication channel, such as text-based terminals. Much work has been done to make the interaction between a computing system and a human more reflective of the multidimensional nature of everyday communication. Because of potential issues, human–computer interaction shifted focus beyond
2028-471: The framework. It typically manages the associations between events and event handlers, and may queue event handlers or events for later processing. Event dispatchers may call event handlers directly, or wait for events to be dequeued with information about the handler to be executed. Event notification is a term used in conjunction with communications software for linking applications that generate small messages (the "events") to applications that monitor
2080-507: The human side, communication theory , graphic and industrial design disciplines, linguistics , social sciences , cognitive psychology , social psychology , and human factors such as computer user satisfaction are relevant. And, of course, engineering and design methods are relevant." Due to the multidisciplinary nature of HCI, people with different backgrounds contribute to its success. Poorly designed human-machine interfaces can lead to many unexpected problems. A classic example
2132-455: The interface between the two is crucial to facilitating this interaction. HCI is also sometimes termed human–machine interaction (HMI), man-machine interaction (MMI) or computer-human interaction (CHI). Desktop applications, internet browsers, handheld computers, and computer kiosks make use of the prevalent graphical user interfaces (GUI) of today. Voice user interfaces (VUI) are used for speech recognition and synthesizing systems, and
2184-431: The interface to respond to observations as articulated by D. Engelbart: "If ease of use were the only valid criterion, people would stick to tricycles and never try bicycles." How humans interact with computers continues to evolve rapidly. Human–computer interaction is affected by developments in computing. These forces include: As of 2010 the future for HCI is expected to include the following characteristics: One of
2236-479: The main conferences for new research in human–computer interaction is the annually held Association for Computing Machinery 's (ACM) Conference on Human Factors in Computing Systems , usually referred to by its short name CHI (pronounced kai , or Khai ). CHI is organized by ACM Special Interest Group on Computer-Human Interaction ( SIGCHI ). CHI is a large conference, with thousands of attendants, and
2288-466: The represented environment. 7. Principle of the moving part . Moving elements should move in a pattern and direction compatible with the user's mental model of how it actually moves in the system. For example, the moving element on an altimeter should move upward with increasing altitude. 8. Minimizing information access cost or interaction cost . When the user's attention is diverted from one location to another to access necessary information, there
2340-458: The research in this field seeks to improve the human–computer interaction by improving the usability of computer interfaces. How usability is to be precisely understood, how it relates to other social and cultural values, and when it is, and when it may not be a desirable property of computer interfaces is increasingly debated. Much of the research in the field of human–computer interaction takes an interest in: Visions of what researchers in
2392-400: The semantic ambiguities between the two parties. In the interaction of humans and computers, research has studied how computers can detect, process, and react to human emotions to develop emotionally intelligent information systems. Researchers have suggested several 'affect-detection channels'. The potential of telling human emotions in an automated and digital fashion lies in improvements to
2444-501: The standard structure of application software . Application frameworks became popular with the rise of graphical user interfaces (GUIs), since these tended to promote a standard structure for applications. Programmers find it much simpler to create automatic GUI creation tools when using a standard framework, since this defines the underlying code structure of the application in advance. Developers usually use object-oriented programming (OOP) techniques to implement frameworks such that
2496-534: The study of interaction between humans and computers specifically as it pertains to information security . Its aim, in plain terms, is to improve the usability of security features in end user applications. Unlike HCI, which has roots in the early days of Xerox PARC during the 1970s, HCISec is a nascent field of study by comparison. Interest in this topic tracks with that of Internet security , which has become an area of broad public concern only in very recent years. When security features exhibit poor usability,
2548-618: The unique parts of an application can simply inherit from classes extant in the framework. Apple Computer developed one of the first commercial application frameworks, MacApp (first release 1985), for the Macintosh . Originally written in an extended (object-oriented) version of Pascal termed Object Pascal , it was later rewritten in C++ . Another notable framework for the Mac is Metrowerks' PowerPlant , based on Carbon . Cocoa for macOS offers
2600-433: The user to determine the level of a variable based on a single sensory variable (e.g., color, size, loudness). These sensory variables can contain many possible levels. 3.Top-down processing . Signals are likely perceived and interpreted by what is expected based on a user's experience. If a signal is presented contrary to the user's expectation, more physical evidence of that signal may need to be presented to assure that it
2652-425: The world . A user should not need to retain important information solely in working memory or retrieve it from long-term memory. A menu, checklist, or another display can aid the user by easing the use of their memory. However, memory use may sometimes benefit the user by eliminating the need to reference some knowledge globally (e.g., an expert computer operator would rather use direct commands from memory than refer to
#17328025784552704-428: Was in 1975 by Carlisle. The term is intended to convey that, unlike other tools with specific and limited uses, computers have many uses which often involve an open-ended dialogue between the user and the computer. The notion of dialogue likens human–computer interaction to human-to-human interaction: an analogy that is crucial to theoretical considerations in the field. Humans interact with computers in many ways, and