The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical part of the human–machine interface which we can see and touch.[1]
In complex systems, the human–machine interface is typically computerized; the term human–computer interface refers to this kind of system. In the context of computing, the term typically extends as well to the software dedicated to controlling the physical elements used for human–computer interaction.
The engineering of human–machine interfaces is enhanced by considering ergonomics (human factors). The corresponding disciplines are human factors engineering (HFE) and usability engineering (UE) which is part of systems engineering.
Tools used for incorporating human factors in interface design are developed based on knowledge of computer science, such as computer graphics, operating systems, and programming languages. Nowadays, the expression graphical user interface is used for the human–machine interface on computers, as nearly all of them now use graphics.
Multimodal interfaces allow users to interact using more than one modality of user input.[2]
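As a minimal sketch of the idea, the following Python fragment models a multimodal interface as a single dispatcher that merges events from several input modalities into one shared set of handlers. The event type and modality labels are hypothetical, not taken from any particular toolkit:

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str   # hypothetical labels: "speech", "touch", "keyboard"
    payload: str    # the recognized command, gesture, or keystroke

def dispatch(event: InputEvent) -> str:
    """Route an event to one shared set of handlers, whatever its modality."""
    handlers = {
        "speech":   lambda p: f"voice command: {p}",
        "touch":    lambda p: f"touch gesture: {p}",
        "keyboard": lambda p: f"keystroke: {p}",
    }
    handler = handlers.get(event.modality)
    return handler(event.payload) if handler else "unknown modality"

# The same logical action can arrive through different modalities.
print(dispatch(InputEvent("speech", "open file")))
print(dispatch(InputEvent("keyboard", "Ctrl+O")))
```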
There is a difference between a user interface and an operator interface or a human–machine interface (HMI).
In science fiction, HMI is sometimes used to refer to what is better described as a direct neural interface. However, this latter usage is seeing increasing application in the real-life use of (medical) prostheses—the artificial extension that replaces a missing body part (e.g., cochlear implants).[10][11]
In some circumstances, computers might observe the user and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces.[12][13]
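At its core, such an observe-and-react interface is a sensor-polling loop that maps body state to actions without any explicit command. The sketch below is illustrative only; read_head_pose is a stand-in for whatever real head- or gaze-tracking API a system would actually use:

```python
import random

def read_head_pose() -> float:
    """Hypothetical sensor read: the yaw angle of the user's head, in degrees.
    A real system would query an actual head- or gaze-tracker here."""
    return random.uniform(-90.0, 90.0)

def react_to_gaze(yaw: float) -> str:
    """Map the observed gaze direction to an action; no command is issued."""
    if yaw < -30:
        return "scroll view left"
    if yaw > 30:
        return "scroll view right"
    return "hold view steady"

# A few iterations of the observe-react loop.
for _ in range(3):
    yaw = read_head_pose()
    print(f"yaw={yaw:+.1f} deg -> {react_to_gaze(yaw)}")
```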
The history of user interfaces can be divided into the following phases according to the dominant type of user interface:
In the batch era, computing power was extremely scarce and expensive. User interfaces were rudimentary. Users had to accommodate computers rather than the other way around; user interfaces were considered overhead, and software was designed to keep the processor at maximum utilization with as little overhead as possible.
The input side of the user interfaces for batch machines was mainly punched cards or equivalent media like paper tape. The output side added line printers to these media. With the limited exception of the system operator's console, human beings did not interact with batch machines in real time at all.
Submitting a job to a batch machine involved first preparing a deck of punched cards that described a program and its dataset. The program cards were not punched on the computer itself but on keypunches, specialized, typewriter-like machines that were notoriously bulky, unforgiving, and prone to mechanical failure. The software interface was similarly unforgiving, with very strict syntaxes designed to be parsed by the smallest possible compilers and interpreters.
Once the cards were punched, one would drop them in a job queue and wait. Eventually, operators would feed the deck to the computer, perhaps mounting magnetic tapes to supply another dataset or helper software. The job would generate a printout, containing final results or an abort notice with an attached error log. Successful runs might also write a result on magnetic tape or generate some data cards to be used in a later computation.
The turnaround time for a single job often spanned entire days. If one was very lucky, it might be hours; there was no real-time response. But there were worse fates than the card queue; some computers required an even more tedious and error-prone process of toggling in programs in binary code using console switches. The very earliest machines had to be partly rewired to incorporate program logic into themselves, using devices known as plugboards.
Early batch systems gave the currently running job the entire computer; program decks and tapes had to include what we would now think of as operating system code to talk to I/O devices and do whatever other housekeeping was needed. Midway through the batch period, after 1957, various groups began to experiment with so-called "load-and-go" systems. These used a monitor program which was always resident on the computer. Programs could call the monitor for services. Another function of the monitor was to do better error checking on submitted jobs, catching errors earlier and more intelligently and generating more useful feedback to the users. Thus, monitors represented the first step towards both operating systems and explicitly designed user interfaces.
Main article: Command-line interface
Command-line interfaces (CLIs) evolved from batch monitors connected to the system console. Their interaction model was a series of request-response transactions, with requests expressed as textual commands in a specialized vocabulary. Latency was far lower than for batch systems, dropping from days or hours to seconds. Accordingly, command-line systems allowed the user to change their mind about later stages of the transaction in response to real-time or near-real-time feedback on earlier results. Software could be exploratory and interactive in ways not possible before. But these interfaces still placed a relatively heavy mnemonic load on the user, requiring a serious investment of effort and learning time to master.[14]
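The request-response model is easy to see in miniature. This sketch uses a toy command vocabulary (not that of any historical system): each iteration reads one textual request, executes it, and prints the response before the next request is accepted:

```python
def run_command(line: str) -> str:
    """Parse one textual request and return its response."""
    verb, _, arg = line.strip().partition(" ")
    commands = {
        "echo":  lambda a: a,
        "upper": lambda a: a.upper(),
        "help":  lambda a: "commands: echo, upper, help, quit",
    }
    action = commands.get(verb)
    return action(arg) if action else f"unknown command: {verb}"

# Scripted input stands in for a live terminal session;
# each loop pass is one request-response transaction.
for request in ["help", "echo hello", "upper hello", "quit"]:
    if request == "quit":
        break
    print(f"> {request}\n{run_command(request)}")
```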
The earliest command-line systems combined teleprinters with computers, adapting a mature technology that had proven effective for mediating the transfer of information over wires between human beings. Teleprinters had originally been invented as devices for automatic telegraph transmission and reception; they had a history going back to 1902 and had already become well-established in newsrooms and elsewhere by 1920. In reusing them, economy was certainly a consideration, but psychology and the rule of least surprise mattered as well; teleprinters provided a point of interface with the system that was familiar to many engineers and users.
The widespread adoption of video-display terminals (VDTs) in the mid-1970s ushered in the second phase of command-line systems. These cut latency further, because characters could be thrown on the phosphor dots of a screen more quickly than a printer head or carriage could move. They helped quell conservative resistance to interactive programming by cutting ink and paper consumables out of the cost picture, and were to the first TV generation of the late 1950s and 60s even more iconic and comfortable than teleprinters had been to the computer pioneers of the 1940s.
Just as importantly, the existence of an accessible screen—a two-dimensional display of text that could be rapidly and reversibly modified—made it economical for software designers to deploy interfaces that could be described as visual rather than textual. The pioneering applications of this kind were computer games and text editors; close descendants of some of the earliest specimens, such as rogue(6), and vi(1), are still a live part of Unix tradition.
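What made these interfaces visual rather than textual is exactly the property named above: the screen is a two-dimensional buffer of characters that is cheap to modify in place and just as cheap to revert. A toy model of such a buffer (a deliberately simplified stand-in for a real terminal library such as curses) makes this concrete:

```python
class Screen:
    """A toy video-terminal model: a 2-D grid of characters."""

    def __init__(self, rows: int, cols: int):
        self.cells = [[" "] * cols for _ in range(rows)]

    def write(self, row: int, col: int, text: str) -> None:
        """Overwrite cells in place, as a VDT does."""
        for i, ch in enumerate(text):
            self.cells[row][col + i] = ch

    def render(self) -> str:
        return "\n".join("".join(row) for row in self.cells)

screen = Screen(3, 20)
screen.write(0, 0, "file.txt -- editing")
screen.write(1, 0, "hello world")
print(screen.render())
screen.write(1, 6, "there")   # a rapid, reversible in-place edit
print(screen.render())
```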
Following the 1985 debut of Microsoft Windows and other graphical user interfaces, IBM created the Systems Application Architecture (SAA) standard, which includes the Common User Access (CUA) derivative. CUA successfully created what we know and use today in Windows, and most of the more recent DOS and Windows console applications use that standard as well.
CUA defined that a pulldown menu system should sit at the top of the screen and a status bar at the bottom, and that shortcut keys should stay the same for all common functionality (F2 to Open, for example, would work in all applications that followed the SAA standard). This greatly helped the speed at which users could learn an application, so it caught on quickly and became an industry standard.[15]
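The practical effect of the CUA convention is that a single keymap can be shared by every conforming application, so a habit learned in one program transfers to the next. In this sketch, only the F2-to-Open binding comes from the text above; the other entries are illustrative assumptions:

```python
# One shared keymap for all conforming applications.
# F2 -> open is taken from the text; the rest are illustrative.
CUA_KEYMAP = {
    "F1": "help",
    "F2": "open",
    "F10": "activate menu bar",
}

class ConformingApp:
    """Any application that honors the shared keymap."""

    def __init__(self, name: str):
        self.name = name

    def handle_key(self, key: str) -> str:
        action = CUA_KEYMAP.get(key, "no binding")
        return f"{self.name}: {key} -> {action}"

# Two different applications, one set of user habits.
for app in (ConformingApp("editor"), ConformingApp("spreadsheet")):
    print(app.handle_key("F2"))
```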
Main article: Graphical user interface
Main article: User interface design
Primary methods used in interface design include prototyping and simulation.
Typical human–machine interface design consists of the following stages: interaction specification, interface software specification, and prototyping.
In broad terms, interfaces generally regarded as user-friendly, efficient, or intuitive are typified by one or more particular qualities. For the purpose of example, a non-exhaustive list of such characteristics follows:
The principle of least astonishment (POLA) is a general principle in the design of all kinds of interfaces. It is based on the idea that human beings can only pay full attention to one thing at one time,[25] leading to the conclusion that novelty should be minimized.
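In code terms, minimizing novelty means an interface should fail and succeed in the ways its users already expect. A hedged illustration (both function names are invented for this example):

```python
def remove_item_surprising(items: list, index: int) -> None:
    """Astonishing: a bad index silently empties the whole list,
    a novel behavior the user must discover the hard way."""
    if not 0 <= index < len(items):
        items.clear()
    else:
        del items[index]

def remove_item_unsurprising(items: list, index: int) -> None:
    """Unsurprising: a bad index fails loudly and changes nothing,
    matching the habits users bring from every other list API."""
    if not 0 <= index < len(items):
        raise IndexError(f"index {index} out of range")
    del items[index]
```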
If an interface is used persistently, the user will unavoidably develop habits for using the interface. The designer's role can thus be characterized as ensuring the user forms good habits. If the designer is experienced with other interfaces, they will similarly have developed habits, and often make unconscious assumptions regarding how the user will interact with the interface.[26][27]
Peter Morville designed the User Experience Honeycomb framework in 2004 while leading operations in user interface design. The framework was created to guide user interface design, and it served as a guideline for many web development students for a decade.[28]
"Eurotherm Parker SSD Link Hardware L5392 | Automation Industrial". l5392.com. Retrieved 11 January 2024. https://l5392.com/blog ↩
Cohen, Philip R. (1992). "The role of natural language in a multimodal interface". Proceedings of the 5th annual ACM symposium on User interface software and technology - UIST '92. pp. 143–149. doi:10.1145/142621.142641. ISBN 0897915496. S2CID 9010570. 0897915496 ↩
"The User Experience of Libraries: Serving The Common Good User Experience Magazine". uxpamagazine.org. 7 May 2017. Retrieved 23 March 2022. https://uxpamagazine.org/the-user-experience-of-libraries/ ↩
Griffin, Ben; Baston, Laurel. "Interfaces" (Presentation): 5. Archived from the original on 14 July 2014. Retrieved 7 June 2014. The user interface of a mechanical system, a vehicle or an industrial installation is sometimes referred to as the human–machine interface (HMI). {{cite journal}}: Cite journal requires |journal= (help) http://peace.saumag.edu/faculty/kardas/Courses/CS/Interfaces2007_files/Interfaces2007.ppt ↩
"User Interface Design and Ergonomics" (PDF). Course Cit 811. NATIONAL OPEN UNIVERSITY OF NIGERIA: SCHOOL OF SCIENCE AND TECHNOLOGY: 19. Archived (PDF) from the original on 14 July 2014. Retrieved 7 June 2014. In practice, the abbreviation MMI is still frequently used although some may claim that MMI stands for something different now. http://www.nou.edu.ng/NOUN_OCL/pdf/SST/CIT%20811.pdf ↩
"Introduction Section". Recent advances in business administration. [S.l.]: Wseas. 2010. p. 190. ISBN 978-960-474-161-8. Other terms used are operator interface console (OIC) and operator interface terminal (OIT) 978-960-474-161-8 ↩
Cipriani, Christian; Segil, Jacob; Birdwell, Jay; Weir, Richard (2014). "Dexterous control of a prosthetic hand using fine-wire intramuscular electrodes in targeted extrinsic muscles". IEEE Transactions on Neural Systems and Rehabilitation Engineering. 22 (4): 828–36. doi:10.1109/TNSRE.2014.2301234. ISSN 1534-4320. PMC 4501393. PMID 24760929. Neural co-activations are present that in turn generate significant EMG levels and hence unintended movements in the case of the present human machine interface (HMI). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4501393 ↩
Citi, Luca (2009). "Development of a neural interface for the control of a robotic hand" (PDF). Scuola Superiore Sant'Anna, Pisa, Italy: IMT Institute for Advanced Studies Lucca: 5. Retrieved 7 June 2014. {{cite journal}}: Cite journal requires |journal= (help)[permanent dead link] https://7c4745ab-a-cdf32725-s-sites.googlegroups.com/a/neurostat.mit.edu/lciti/publications_files/LCitiPhD.pdf?attachauth=ANoY7cpwRib4-7KUnST5NrulgpbLiT3r10hOeyap9QXEgv64E1VioXR7n1pQYsNBNMZggwnI2V4KbZLgxVeKLcOgxz4XfJFAkqvddyQUnGqn4Mm5iLq9vDR02cHmYi6ULrK8IxWK150SirIt9acjMFcDon0dbnRwgYicc-2GeKZZCqtflZc4ZhEBORg8AzWE31XDAgoFFAfNtUxTcNR8IcJlsM7NYCGxY4M3Vn8WY6bsO1MEuyYIjmU%3D&attredirects=0 ↩
Jordan, Joel. "Gaze Direction Analysis for the Investigation of Presence in Immersive Virtual Environments" (Thesis submitted for the degree of Doctor of Philosophy). University of London: Department of Computer Science: 5. Archived (PDF) from the original on 14 July 2014. Retrieved 7 June 2014. The aim of this thesis is to investigate the idea that the direction of gaze may be used as a device to detect a sense-of-presence in Immersive Virtual Environments (IVE) in some contexts. {{cite journal}}: Cite journal requires |journal= (help) http://www0.cs.ucl.ac.uk/staff/j.jordan/thesis-jj-2011.pdf ↩
Ravi (August 2009). "Introduction of HMI". Archived from the original on 14 July 2014. Retrieved 7 June 2014. In some circumstance computers might observe the user, and react according to their actions without specific commands. A means of tracking parts of the body is required, and sensors noting the position of the head, direction of gaze and so on have been used experimentally. This is particularly relevant to immersive interfaces. http://ravi-softwares.blogspot.com/2009/08/introduction-of-hmi.html ↩
"HMI Guide". Archived from the original on 20 June 2014. http://www.anaheimautomation.com/manuals/forms/hmi-guide.php#sthash.2McqS5xo.dpbs ↩
Richard, Stéphane. "Text User Interface Development Series Part One – T.U.I. Basics". Archived from the original on 16 November 2014. Retrieved 13 June 2014. http://www.petesqbsite.com/sections/express/issue21/tuiseriespart1.htm ↩
McCown, Frank. "History of the Graphical User Interface (GUI)". Harding University. Archived from the original on 8 November 2014. {{cite journal}}: Cite journal requires |journal= (help) https://www.harding.edu/fmccown/gui/history-gui.pptx ↩
"The Xerox PARC Visit". web.stanford.edu. Retrieved 8 February 2019. https://web.stanford.edu/dept/SUL/sites/mac/parc.html ↩
"apple-history.com / Graphical User Interface (GUI)". apple-history.com. Retrieved 8 February 2019. https://apple-history.com/gui ↩
Raymond, Eric Steven (2003). "11". The Art of Unix Programming. Thyrsus Enterprises. Archived from the original on 20 October 2014. Retrieved 13 June 2014. http://homepage.cs.uri.edu/~thenry/resources/unix_art/ch11s03.html ↩
C. A. D'H Gough; R. Green; M. Billinghurst. "Accounting for User Familiarity in User Interfaces" (PDF). Retrieved 13 June 2014. {{cite journal}}: Cite journal requires |journal= (help) https://www.researchgate.net/publication/220998465 ↩
Sweet, David (October 2001). "9 – Constructing A Responsive User Interface". KDE 2.0 Development. Sams Publishing. Archived from the original on 23 September 2013. Retrieved 13 June 2014. http://openbooks.sourceforge.net/books/kde20devel/ch09.html ↩
John W. Satzinger; Lorne Olfman (March 1998). "User interface consistency across end-user applications: the effects on mental models". Journal of Management Information Systems. Managing virtual workplaces and teleworking with information technology. 14 (4). Armonk, NY: 167–193. doi:10.1080/07421222.1998.11518190. http://dl.acm.org/citation.cfm?id=1189510 ↩
Raskin, Jef (2000). The human interface : new directions for designing interactive systems (1. printing. ed.). Reading, Mass. [u.a.]: Addison Wesley. ISBN 0-201-37937-6. 0-201-37937-6 ↩
Udell, John (9 May 2003). "Interfaces are habit-forming". Infoworld. Archived from the original on 4 April 2017. Retrieved 3 April 2017. http://www.infoworld.com/article/2681144/application-development/interfaces-are-habit-forming.amp.html ↩
Wesolko, Dane (27 October 2016). "Peter Morville's User Experience Honeycomb". Medium. Retrieved 19 November 2019. https://medium.com/@danewesolko/peter-morvilles-user-experience-honeycomb-904c383b6886 ↩
"User Interface & User Experience Design | Oryzo | Small Business UI/UX". Oryzo. Retrieved 19 November 2019. https://oryzo.com/user-interface-design/ ↩
Errett, Joshua. "As app fatigue sets in, Toronto engineers move on to chatbots". CBC. CBC/Radio-Canada. Archived from the original on 22 June 2016. Retrieved 4 July 2016. http://www.cbc.ca/news/canada/toronto/toronto-chatbots-1.3581791 ↩
Martinez, Wendy L. (23 February 2011). "Graphical user interfaces: Graphical user interfaces". Wiley Interdisciplinary Reviews: Computational Statistics. 3 (2): 119–133. doi:10.1002/wics.150. S2CID 60467930. https://onlinelibrary.wiley.com/doi/10.1002/wics.150 ↩
Lamb, Gordana (2001). "Improve Your UI Design Process with Object-Oriented Techniques". Visual Basic Developer magazine. Archived from the original on 14 August 2013. Table 1. Differences between the traditional application-oriented and object-oriented approaches to UI design. https://web.archive.org/web/20130814153652/http://msdn.microsoft.com/en-us/library/aa227601(v=vs.60).aspx ↩
appleinsider.com Archived 2009-06-19 at the Wayback Machine http://www.appleinsider.com/articles/09/06/18/apple_exploring_motion_tracking_mac_os_x_user_interface.html ↩
Jakob Nielsen (April 1993). "Noncommand User Interfaces". Communications of the ACM. 36 (4). ACM Press: 83–99. doi:10.1145/255950.153582. S2CID 7684922. Archived from the original on 10 November 2006. /wiki/Jakob_Nielsen_(usability_consultant) ↩
Sharon, Taly, Henry Lieberman, and Ted Selker. "A zero-input interface for leveraging group experience in web browsing Archived 2017-09-08 at the Wayback Machine." Proceedings of the 8th international conference on Intelligent user interfaces. ACM, 2003. https://www.researchgate.net/profile/Ted_Selker/publication/221607708_A_zero-input_interface_for_leveraging_group_experience_in_Web_browsing/links/0912f50876bda91a5b000000/A-zero-input-interface-for-leveraging-group-experience-in-Web-browsing.pdf ↩