Graphical user interface

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based user interfaces, typed command labels or text navigation.
2,586 Related Articles

Text-based user interface

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based user interfaces, typed command labels or text navigation.
The text-based user interface (TUI), alternately terminal user interface to reflect a dependence upon the properties of computer terminals and not just text, is a retronym coined in parallel to the concept of graphical user interfaces (GUI).

User interface

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based user interfaces, typed command labels or text navigation.
The most common composite user interface (CUI) is a graphical user interface (GUI), which is composed of a tactile UI and a visual UI capable of displaying graphics.

Command-line interface

GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces (CLIs), which require commands to be typed on a computer keyboard.
Today, many end users rarely, if ever, use command-line interfaces and instead rely upon graphical user interfaces and menu-driven interactions.
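To make the contrast concrete, here is a hedged sketch of a typed-command interface built with Python's standard argparse module; the command, arguments, and flag are invented for illustration, and a GUI would expose the same action through a dialog, menu item, or button instead.

```python
# A hedged sketch of a typed-command interface using Python's standard
# argparse module; the command name, arguments and flag are invented
# for illustration only.
import argparse

parser = argparse.ArgumentParser(description="Copy a file (command-line style)")
parser.add_argument("source", help="path of the file to copy")
parser.add_argument("destination", help="where to copy it")
parser.add_argument("--overwrite", action="store_true",
                    help="replace the destination if it already exists")

# Usage the user must know and type, e.g.:
#   python copy_cli.py notes.txt backup/notes.txt --overwrite
args = parser.parse_args()
print(f"Would copy {args.source} -> {args.destination} (overwrite={args.overwrite})")
```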

Human–computer interaction

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based user interfaces, typed command labels or text navigation.
Desktop applications, internet browsers, handheld computers, ERP systems, and computer kiosks make use of the prevalent graphical user interfaces (GUIs) of today.

Distributed control system

Beyond computers, GUIs are used in many handheld mobile devices such as MP3 players, portable media players, gaming devices, smartphones and smaller household, office and industrial controls.
The processor nodes and operator graphical displays are connected over proprietary or industry standard networks, and network reliability is increased by dual redundancy cabling over diverse routes.

Graphical widget

Typically, users interact with information by manipulating visual widgets that allow for interactions appropriate to the kind of data they hold.
A graphical widget (also graphical control element or control) in a graphical user interface is an element of interaction, such as a button or a scroll bar.
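As an illustrative sketch only (using Python's standard tkinter toolkit, which is an assumption of this example rather than anything named above), the snippet below creates two such widgets: a button that fires a callback and a scroll bar attached to a text area.

```python
# A minimal sketch of graphical widgets using Python's standard tkinter
# toolkit: a button and a scroll bar attached to a text area.
import tkinter as tk

root = tk.Tk()
root.title("Widget sketch")

# A button widget: an interaction element that fires a callback when clicked.
button = tk.Button(root, text="Click me", command=lambda: print("clicked"))
button.pack(side=tk.TOP, fill=tk.X)

# A text widget with a scroll bar widget controlling its vertical view.
text = tk.Text(root, height=10)
scrollbar = tk.Scrollbar(root, orient=tk.VERTICAL, command=text.yview)
text.configure(yscrollcommand=scrollbar.set)
scrollbar.pack(side=tk.RIGHT, fill=tk.Y)
text.pack(side=tk.LEFT, fill=tk.BOTH, expand=True)

root.mainloop()
```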

Computer keyboard

GUIs were introduced in reaction to the perceived steep learning curve of command-line interfaces (CLIs), which require commands to be typed on a computer keyboard. Human interface devices used for efficient interaction with a GUI include the computer keyboard, especially when used together with keyboard shortcuts; pointing devices for cursor (or rather pointer) control, such as the mouse, pointing stick, touchpad, trackball and joystick; virtual keyboards; and head-up displays (translucent information devices at eye level).
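As a small sketch (assuming Python's standard tkinter toolkit; the Ctrl+Q shortcut and menu labels are arbitrary choices of this example), the snippet below binds a keyboard shortcut to the same action that a menu item exposes, which is how GUIs typically keep keyboard interaction efficient.

```python
# Minimal sketch, assuming Python's standard tkinter toolkit: binding a
# keyboard shortcut (Ctrl+Q) alongside the equivalent menu entry, so the
# same GUI action is reachable from both the keyboard and the pointer.
import tkinter as tk

root = tk.Tk()
root.title("Shortcut sketch")

menubar = tk.Menu(root)
filemenu = tk.Menu(menubar, tearoff=0)
# The accelerator text only labels the menu item; the binding does the work.
filemenu.add_command(label="Quit", accelerator="Ctrl+Q", command=root.destroy)
menubar.add_cascade(label="File", menu=filemenu)
root.config(menu=menubar)

# Keyboard shortcut: pressing Ctrl+Q triggers the same action as the menu item.
root.bind("<Control-q>", lambda event: root.destroy())

root.mainloop()
```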
By this time, text-only user interfaces with sparse graphics gave way to comparatively graphics-rich icons on screen.

Skin (computing)

This allows users to select or design a different skin at will, and eases the designer's work to change the interface as user needs evolve.
In computing, a skin (also known as visual styles in Windows XP) is a custom graphical appearance preset package, achieved by the use of a graphical user interface (GUI), that can be applied to specific computer software, operating systems, and websites to suit the purpose, topic, or tastes of different users.
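As a rough, widget-theme-level analogue (a sketch only; a full skin package covers far more than widget themes), the snippet below uses the themes shipped with Python's standard ttk module to let the user switch the interface's appearance at runtime without changing its behavior; which theme names are available varies by platform.

```python
# Minimal sketch of appearance switching at the widget-theme level, using
# the themes that ship with Python's standard ttk module (theme names such
# as "clam" or "default" vary by platform).
import tkinter as tk
from tkinter import ttk

root = tk.Tk()
style = ttk.Style(root)
print("Available themes:", style.theme_names())

# Let the user switch the visual appearance at will without changing behavior.
def apply_theme(name: str) -> None:
    style.theme_use(name)

selector = ttk.Combobox(root, values=style.theme_names(), state="readonly")
selector.bind("<<ComboboxSelected>>", lambda e: apply_theme(selector.get()))
selector.pack(padx=10, pady=10)
ttk.Button(root, text="Sample button").pack(padx=10, pady=10)

root.mainloop()
```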

Window (computing)

Large widgets, such as windows, usually provide a frame or container for the main presentation content such as a web page, email message or drawing.
It consists of a visual area containing some of the graphical user interface of the program it belongs to and is framed by a window decoration.

Model–view–controller

The model–view–controller pattern allows flexible structures in which the interface is independent of, and indirectly linked to, application functions, so the GUI can be customized easily.
Traditionally used for desktop graphical user interfaces (GUIs), this pattern has become popular for designing web applications.
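A minimal sketch of the pattern in plain Python (the class and method names are invented for illustration): the model holds application state, the view only renders it, and the controller translates user actions into model updates, so the GUI layer could be replaced without touching the model.

```python
# A minimal model-view-controller sketch in plain Python; the class and
# method names are invented for illustration.
class CounterModel:
    def __init__(self) -> None:
        self.value = 0

    def increment(self) -> None:
        self.value += 1


class ConsoleView:
    def render(self, value: int) -> None:
        print(f"Counter is now {value}")


class CounterController:
    def __init__(self, model: CounterModel, view: ConsoleView) -> None:
        self.model = model
        self.view = view

    def on_increment_clicked(self) -> None:
        # A user action (e.g. a button click) is routed here, applied to the
        # model, and the view is refreshed from the model's new state.
        self.model.increment()
        self.view.render(self.model.value)


if __name__ == "__main__":
    controller = CounterController(CounterModel(), ConsoleView())
    controller.on_increment_clicked()  # Counter is now 1
    controller.on_increment_clicked()  # Counter is now 2
```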

Window manager

A window manager facilitates the interactions between windows, applications, and the windowing system.
A window manager is system software that controls the placement and appearance of windows within a windowing system in a graphical user interface.
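As an illustration of that relationship (a sketch using Python's standard tkinter, whose wm-prefixed methods are requests forwarded to the window manager; nothing here comes from a specific window manager), the snippet below shows an application asking the window manager for a title, a size and position, and stacking behavior, all of which the window manager may honor or adjust.

```python
# Minimal sketch of an application talking to the window manager through
# Tk's "wm" protocol methods: title, size/position, resize limits, and
# stacking are requests that the window manager is free to honor or adjust.
import tkinter as tk

root = tk.Tk()
root.title("Window-manager requests")   # ask the WM to label the decoration
root.geometry("400x300+100+100")        # requested size and screen position
root.minsize(200, 150)                  # constrain how the WM may resize us
root.attributes("-topmost", True)       # hint: keep above other windows
root.mainloop()
```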

Windowing system

A window manager facilitates the interactions between windows, applications, and the windowing system.
It is a type of graphical user interface (GUI) which implements the WIMP (windows, icons, menus, pointer) paradigm for a user interface.

Pointing device

The WIMP style of interaction uses a virtual input device to represent the position of a pointing device's interface, most often a mouse, and presents information organized in windows and represented with icons. Human interface devices used for efficient interaction with a GUI include the computer keyboard, especially when used together with keyboard shortcuts; pointing devices for cursor (or rather pointer) control, such as the mouse, pointing stick, touchpad, trackball and joystick; virtual keyboards; and head-up displays (translucent information devices at eye level).
CAD systems and graphical user interfaces (GUI) allow the user to control and provide data to the computer using physical gestures by moving a hand-held mouse or similar device across the surface of the physical desktop and activating switches on the mouse.
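As a hedged sketch using Python's standard tkinter toolkit (an assumption of this example, not a system described above), the snippet below shows how that physical motion reaches a program: pointer movement and button presses arrive as events carrying the pointer's widget coordinates.

```python
# Minimal sketch, using Python's standard tkinter toolkit: mouse motion is
# delivered to the program as pointer coordinates, and button presses arrive
# as discrete events on the widget under the pointer.
import tkinter as tk

root = tk.Tk()
canvas = tk.Canvas(root, width=400, height=300, background="white")
canvas.pack()
status = tk.Label(root, text="Move the pointer")
status.pack()

def on_motion(event: tk.Event) -> None:
    # event.x / event.y are the pointer position in widget coordinates.
    status.configure(text=f"Pointer at ({event.x}, {event.y})")

def on_click(event: tk.Event) -> None:
    # A left-button press draws a small mark where the pointer is.
    canvas.create_oval(event.x - 3, event.y - 3, event.x + 3, event.y + 3, fill="black")

canvas.bind("<Motion>", on_motion)
canvas.bind("<Button-1>", on_click)
root.mainloop()
```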

Desktop environment

In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment in which the display represents a desktop, on which documents and folders of documents can be placed.
In computing, a desktop environment (DE) is an implementation of the desktop metaphor made of a bundle of programs running on top of a computer operating system, which share a common graphical user interface (GUI), sometimes described as a graphical shell.

Desktop metaphor

In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment in which the display represents a desktop, on which documents and folders of documents can be placed.
In computing, the desktop metaphor is an interface metaphor which is a set of unifying concepts used by graphical user interfaces to help users interact more easily with the computer.

Icon (computing)

The graphical user interface (GUI) is a form of user interface that allows users to interact with electronic devices through graphical icons and audio indicators such as primary notation, instead of text-based user interfaces, typed command labels or text navigation. The WIMP style of interaction uses a virtual input device to represent the position of a pointing device's interface, most often a mouse, and presents information organized in windows and represented with icons.
Icons as parts of the graphical user interface of the computer system, in conjunction with windows, menus and a pointing device (mouse), belong to the much larger topic of the history of the graphical user interface that has largely supplanted the text-based interface for casual use.

Post-WIMP

Applications for which WIMP is not well suited may use newer interaction techniques, collectively termed post-WIMP user interfaces.
In computing, post-WIMP ("windows, icons, menus, pointer") comprises work on user interfaces, mostly graphical user interfaces, which attempt to go beyond the paradigm of windows, icons, menus and a pointing device, i.e. WIMP interfaces.

Computer mouse

The WIMP style of interaction uses a virtual input device to represent the position of a pointing device's interface, most often a mouse, and presents information organized in windows and represented with icons. Human interface devices used for efficient interaction with a GUI include the computer keyboard, especially when used together with keyboard shortcuts; pointing devices for cursor (or rather pointer) control, such as the mouse, pointing stick, touchpad, trackball and joystick; virtual keyboards; and head-up displays (translucent information devices at eye level). In the late 1960s, researchers at the Stanford Research Institute, led by Douglas Engelbart, developed the On-Line System (NLS), which used text-based hyperlinks manipulated with a then new device: the mouse.
This motion is typically translated into the motion of a pointer on a display, which allows a smooth control of the graphical user interface.

Direct manipulation interface

The actions in a GUI are usually performed through direct manipulation of the graphical elements.
Individuals in academia and computer scientists doing research on future user interfaces often put as much or even more stress on tactile or sonic control and feedback than on the visual feedback given by most GUIs.
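Returning to the first point, here is a minimal sketch of direct manipulation (assuming Python's standard tkinter toolkit; the rectangle and bindings are invented for illustration): the user drags a graphical element and it follows the pointer, so the action is performed on the element itself rather than expressed as a typed command.

```python
# Minimal sketch of direct manipulation, assuming Python's standard tkinter
# toolkit: the user drags a rectangle with the pointer and the object follows.
import tkinter as tk

root = tk.Tk()
canvas = tk.Canvas(root, width=400, height=300, background="white")
canvas.pack()
item = canvas.create_rectangle(50, 50, 120, 100, fill="steelblue")
drag_origin = {"x": 0, "y": 0}

def start_drag(event: tk.Event) -> None:
    drag_origin["x"], drag_origin["y"] = event.x, event.y

def do_drag(event: tk.Event) -> None:
    # Move the rectangle by the amount the pointer moved since the last event.
    canvas.move(item, event.x - drag_origin["x"], event.y - drag_origin["y"])
    drag_origin["x"], drag_origin["y"] = event.x, event.y

canvas.tag_bind(item, "<Button-1>", start_drag)
canvas.tag_bind(item, "<B1-Motion>", do_drag)
root.mainloop()
```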

Personal computer

In personal computers, all these elements are modeled through a desktop metaphor to produce a simulation called a desktop environment in which the display represents a desktop, on which documents and folders of documents can be placed. The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers.
It had a graphical user interface (GUI) which later served as inspiration for Apple's Macintosh and Microsoft's Windows operating system.

WIMP (computing)

The most common combination of such elements in GUIs is the windows, icons, menus, pointer (WIMP) paradigm, especially in personal computers.
In human–computer interaction, WIMP stands for "windows, icons, menus, pointer", denoting a style of interaction using these elements of the user interface.

Ivan Sutherland

Ivan Sutherland developed Sketchpad in 1963, widely held as the first graphical computer-aided design program.
He received the Turing Award from the Association for Computing Machinery in 1988 for the invention of Sketchpad, an early predecessor to the sort of graphical user interface that has become ubiquitous in personal computers.

Xerox Alto

In the 1970s, Engelbart's ideas were further refined and extended to graphics by researchers at Xerox PARC and specifically Alan Kay, who went beyond text-based hyperlinks and used a GUI as the main interface for the Smalltalk programming language, which ran on the Xerox Alto computer, released in 1973.
The Xerox Alto is the first computer designed from its inception to support an operating system based on a graphical user interface (GUI), later using the desktop metaphor.

Pointer (user interface)

Human interface devices used for efficient interaction with a GUI include the computer keyboard, especially when used together with keyboard shortcuts; pointing devices for cursor (or rather pointer) control, such as the mouse, pointing stick, touchpad, trackball and joystick; virtual keyboards; and head-up displays (translucent information devices at eye level).
It can be used in text-based or graphical user interfaces to select and move other elements.

Douglas Engelbart

In the late 1960s, researchers at the Stanford Research Institute, led by Douglas Engelbart, developed the On-Line System (NLS), which used text-based hyperlinks manipulated with a then new device: the mouse.
He is best known for his work on founding the field of human–computer interaction, particularly while at his Augmentation Research Center Lab at SRI International, which resulted in the creation of the computer mouse and the development of hypertext, networked computers, and precursors to graphical user interfaces.