The IMPRINT tool grew out of manpower, personnel, and training (MPT) concerns identified in the mid-1970s by the U.S. Air Force, Navy, and Army. The U.S. Navy first developed the HARDMAN Comparability Methodology (HCM), with HARDMAN being a portmanteau of hardware and manpower. The Army then tailored the manual HCM, which became known as HARDMAN I, for application to a broad range of weapon systems and later developed an automated version, HARDMAN II.[6] HARDMAN II.2 was first released by the Army Research Institute (ARI) in 1985 and required a VAX-11 computer to host its suite of analytical processes. An upgraded version was released in 1990.
In HARDMAN I and II, there was no direct link between MPT and performance. To remedy this shortcoming, the U.S. Army began developing a set of software analysis modules in the mid-1980s.[7] This set of modules was called HARDMAN III; although it retained the HARDMAN name, it used a fundamentally different approach to MPT concerns than previous methods, providing an explicit link between MPT variables and soldier–system performance.[8] Software development was done by Micro Analysis and Design (MA&D).
HARDMAN III was a major development effort of ARI's System Research Laboratory. The contract that supported the work was let in a three-phase development process.[9] HARDMAN III was government-owned and consisted of a set of automated aids to assist analysts in conducting MANPRINT analyses. As PC DOS-based software, the HARDMAN III aids provided a means of estimating MPT constraints and requirements for new weapon systems very early in the acquisition process. The DOS environment imposed several limitations on the HARDMAN III tool set, the most significant being the 640K RAM limit: the original tools were designed so that individual pieces of an analysis fit within this memory. The RAM constraint restricted models to 400 operations tasks and 500 maintenance tasks.
The nine modules in HARDMAN III were:
IMPRINT, originally named the Integrated MANPRINT Tools, was first released in 1995 as a Windows application that merged the functionality of the nine HARDMAN III tools into one application. In 1997 IMPRINT was renamed the Improved Performance Research Integration Tool; the name changed, but the acronym remained the same. Between 1995 and 2006 several enhancements were made to IMPRINT, and new releases (Versions 2 through 6) were made available. IMPRINT Pro, introduced in 2007, featured a new interface design, complete integration with the Micro Saint Sharp simulation engine, and enhanced analytical capabilities, and it moved IMPRINT from being an Army tool to a tri-service tool. IMPRINT has continued to evolve since: new enhancements have been continually added, and new releases are made freely available to the user community. IMPRINT has over 800 users supporting the Army, Navy, Air Force, Marine Corps, NASA, DHS, DoT, Joint, and other organizations across the country.
Simulations, or Missions as IMPRINT calls them, contain a task network called a Network Diagram. The network diagram contains a series of tasks connected by paths that determine control flow. System objects called entities flow through the network to drive the simulation. IMPRINT also includes lower-level features such as global variables and subroutines called macros.[10]
The task node is the primary element driving a simulation's outcome. Task nodes simulate system behavior through programmer-specified effects, task durations, failure rates, and pathing. Task effects are C# expressions in which the programmer can manipulate variables and data structures when a task is invoked. Task duration can be specified as a fixed value, through a probability distribution, or with a C# expression, and task success can be specified in the same ways. Task success influences the effects of the task node and the pathing of the entity; failure consequences include task repetition, task change, and mission failure, among other options. Control flow and pathing can also be specified by the programmer. IMPRINT provides a series of other nodes with special functionality:
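The task-node mechanics described above can be sketched in a few lines. The following Python fragment uses invented names and is only an illustration (in IMPRINT itself, effects and expressions are written in C#); it models a task with a programmer-specified effect, a duration drawn from a probability distribution, and a probabilistic success outcome:

```python
import random

class TaskNode:
    """Minimal stand-in for an IMPRINT-style task node (illustrative only)."""

    def __init__(self, name, mean_duration, failure_prob, effect=None):
        self.name = name
        self.mean_duration = mean_duration  # mean task time, simulated seconds
        self.failure_prob = failure_prob    # chance the task fails
        self.effect = effect                # callable invoked when the task runs

    def run(self, state):
        if self.effect:
            self.effect(state)              # task effect: mutate shared state
        # duration sampled from a probability distribution
        duration = random.expovariate(1.0 / self.mean_duration)
        succeeded = random.random() >= self.failure_prob
        return duration, succeeded

def mark_scanned(state):
    state["targets_scanned"] += 1

state = {"targets_scanned": 0}
scan = TaskNode("Scan sector", mean_duration=30.0, failure_prob=0.1,
                effect=mark_scanned)
duration, succeeded = scan.run(state)   # state now records one scan
```

On failure, a real model would then apply one of the consequences named above: repeat the task, change tasks, or fail the mission.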
Entities are dynamic objects that arrive into the system and move through the task network. Entities flow from one task to the next based on each task's path logic. When an entity enters a task, the task's effects are triggered; when the task concludes, the entity moves to the next task. One entity is generated by default at the beginning of the simulation, and more can be generated at any point based on programmer-specified logic. When all entities reach the end node or are destroyed, the simulation concludes.[17]
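Entity flow can be illustrated with a toy network; the structure and task names below are invented for illustration and are not IMPRINT's actual representation:

```python
# Toy task network: each task names its successor; None marks the end node.
network = {
    "start":  "detect",
    "detect": "engage",
    "engage": "end",
    "end":    None,
}

def run_entity(network, start="start"):
    """Walk one entity through the network, returning the tasks it visited."""
    trace, node = [], start
    while node is not None:
        trace.append(node)       # entering a task would trigger its effects
        node = network[node]     # task concludes; entity moves on
    return trace

print(run_entity(network))       # ['start', 'detect', 'engage', 'end']
```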
Events are occurrences that happen at an instant of simulated time and change the global state of the system: the arrival or departure of an entity, the completion of a task, or some other occurrence. Events are stored in a master event log that records each scheduled event and the simulated time at which it occurs. Due to the stochastic nature of discrete-event simulation, an event will often trigger the generation of a random variate to determine the next time that same event will occur; thus, as events occur in the simulation, the event log is altered.[18]
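The master event log is the standard future-event list of discrete-event simulation. A minimal sketch (not IMPRINT's internal data structure) shows how processing one event can schedule the next occurrence of the same event via a random variate:

```python
import heapq
import random

events = [(0.0, "entity arrival")]   # (simulated time, event) pairs
heapq.heapify(events)

clock, log = 0.0, []
while events and clock < 100.0:      # stop after 100 simulated seconds
    clock, event = heapq.heappop(events)
    log.append((clock, event))       # record the event and when it occurred
    if event == "entity arrival":
        # a stochastic event schedules its own next occurrence
        next_time = clock + random.expovariate(1.0 / 20.0)
        heapq.heappush(events, (next_time, "entity arrival"))
```

Each iteration both consumes and extends the event list, which is the alteration the text describes.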
Once a task concludes, the invoking entity moves to another node that is directly connected to the current node in the task network. Because a node can connect to any number of other tasks, IMPRINT provides a number of pathing options to determine the task to which the entity moves.[19]
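Two common branching styles, probabilistic pathing and condition-based pathing, can be sketched as follows; the helper function and its signature are invented for illustration, not part of IMPRINT:

```python
import random

def choose_next(state, branches):
    """Pick the successor task for an entity leaving a node.

    branches is a list of (successor, weight) pairs for probabilistic
    pathing, or (successor, predicate) pairs for conditional pathing,
    where the first true predicate wins.
    """
    if callable(branches[0][1]):                   # conditional pathing
        for successor, predicate in branches:
            if predicate(state):
                return successor
        return None
    successors = [s for s, _ in branches]          # probabilistic pathing
    weights = [w for _, w in branches]
    return random.choices(successors, weights=weights)[0]

state = {"fuel": 10}
nxt = choose_next(state, [("refuel", lambda s: s["fuel"] < 20),
                          ("patrol", lambda s: True)])
# nxt == "refuel": the low-fuel condition matched first
```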
IMPRINT maintains a number of global variables used by the system throughout a simulation. It provides the public global variable Clock, which tracks the simulation's current time, as well as private variables such as operator workload values. IMPRINT also allows the modeler to create custom global variables that can be accessed and modified in any task node. Variables can be of any type native to C#, and the software provides a list of suggested variable types, including C# primitive data types and basic data structures. IMPRINT also lets the programmer create globally accessible subroutines called macros. Macros work like C# functions: they can specify parameters, manipulate data, and return data.[23]
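The pattern of global variables plus macros maps naturally onto ordinary functions over shared state. A hedged sketch follows (IMPRINT macros are written as C# functions; the names and data here are invented):

```python
clock = 42.5        # stand-in for IMPRINT's built-in Clock variable
detections = []     # a modeler-defined global data structure

def log_detection(range_km, confidence):
    """Macro-style subroutine: takes parameters, mutates a global, returns data."""
    detections.append({"time": clock,
                       "range_km": range_km,
                       "confidence": confidence})
    return len(detections)          # running count of detections so far

count = log_detection(12.5, 0.9)    # callable from any task's effect code
```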
IMPRINT's workload management abilities allow users to model realistic operator actions under different work overload conditions.[24] Users specify Warfighters, which represent human operators in the modeled system. Each task in IMPRINT is associated with at least one Warfighter, and a Warfighter can be assigned to any number of tasks, including tasks that execute concurrently.[25] IMPRINT tasks can be assigned VACP workload values.[26] The VACP method allows modelers to identify the visual, auditory, cognitive, and psychomotor workload of each IMPRINT task. In each task, each resource can be given a workload value between 0 and 7, with 0 being the lowest possible workload and 7 the highest possible workload for that resource. The VACP scale for each resource provides verbal anchors for certain scale values; for instance, a visual workload of 0.0 corresponds to "no visual activity", while a visual workload of 7.0 corresponds to continuous visual scanning, searching, and monitoring.[27] When a Warfighter executes a task, their workload is increased by the VACP value assigned to that task. An IMPRINT plugin module was proposed in 2013 to improve the cognitive workload estimation within IMPRINT and make the overall calculation less linear.[28] IMPRINT's custom reporting feature allows modelers to view the workload of the Warfighters in their models over time, and workload monitor nodes let modelers view the workload of a specific Warfighter as the simulation executes.[29]
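The VACP arithmetic can be sketched directly. The demand values below are invented for illustration and are not IMPRINT's benchmark anchors:

```python
def task_workload(visual, auditory, cognitive, psychomotor):
    """Sum a task's per-resource VACP demands (each rated 0-7)."""
    demands = (visual, auditory, cognitive, psychomotor)
    assert all(0.0 <= d <= 7.0 for d in demands), "VACP values lie in [0, 7]"
    return sum(demands)

# Illustrative demands for two tasks an operator performs concurrently.
monitor_display = task_workload(5.0, 1.0, 4.6, 0.0)  # visually/cognitively heavy
radio_call      = task_workload(0.0, 4.2, 1.2, 2.2)  # auditory/psychomotor

# Concurrent tasks add their demands to the Warfighter's workload.
total_workload = monitor_display + radio_call
```

A workload-over-time report would plot sums like this as tasks start and finish.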
IMPRINT has been used by scientists at the Army Research Lab to study unmanned aerial systems,[30] the workload of warfighter crews,[31][32] and human-robot interaction.[33] The United States Air Force and the Air Force Institute of Technology have used IMPRINT to study automated systems,[34][35] human systems integration,[36] and adaptive automation,[37] among other things. The Air Force Institute of Technology in particular is using IMPRINT to research the prediction of operator performance, mental workload, situational awareness, trust, and fatigue in complex systems.[38]
Mitchell, Diane K. (2003-09-01). Advanced Improved Performance Research Integration Tool (IMPRINT) Vetronics Technology Test Bed Model Development (Report). Fort Belvoir, VA: Defense Technical Information Center. doi:10.21236/ada417350.
Rusnock, Christina F.; Geiger, Christopher D. (2013). "Using Discrete-Event Simulation for Cognitive Workload Modeling and System Evaluation". IIE Annual Conference Proceedings. Norcross. pp. 2485–2494. ProQuest 1471959351.
Laughery, Ron (1999). "Using discrete-event simulation to model human performance in complex systems". Proceedings of the 31st Conference on Winter Simulation: Simulation - A Bridge to the Future (WSC '99). Vol. 1. pp. 815–820. doi:10.1145/324138.324506. ISBN 978-0-7803-5780-8. S2CID 18163468.
Mitchell, Diane K. (September 2003). Advanced Improved Performance Research Integration Tool (IMPRINT) Vetronics Technology Test Bed Model Development. Army Research Laboratory. DTIC ADA417350.
IMPRINT PRO User Guide Version 4.7. Huntington Ingalls Industries. October 2023.
HARDMAN II was formerly called MIST (Man Integrated Systems Technology).
Kaplan, J. D. (1991). Synthesizing the effects of manpower, personnel, training and human engineering. In E. Boyle, J. Ianni, J. Easterly, S. Harper, & M. Korna (Eds.), Human centered technology for maintainability: Workshop proceedings (AL-TP-1991-0010) (pp. 273–283). Wright-Patterson AFB, OH: Armstrong Laboratory.
Allender, L., Lockett, J., Headley, D., Promisel, D., Kelley, T., Salvi, L., Richer, C., Mitchell, D., Feng, T. "HARDMAN III and IMPRINT Verification, Validation, and Accreditation Report." Prepared for the US Army Research Laboratory, Human Research & Engineering Directorate, December 1994.
Adkins, R., and Dahl (Archer), S. G. "Final Report for HARDMAN III, Version 4.0." Report E-482U, prepared for the US Army Research Laboratory, July 1993.
IMPRINT PRO User Guide Version 4.7. Huntington Ingalls Industries. October 2023.
Mitchell, D. K. (2000). Mental Workload and ARL Workload Modeling Tools (ARL-TN-161). Aberdeen Proving Ground.
Cassenti, Daniel N.; Kelley, Troy D.; Carlson, Richard Alan (2013). "Differences in performance with changing mental workload as the basis for an IMPRINT plug-in proposal". 22nd Annual Conference on Behavior Representation in Modeling and Simulation (BRiMS 2013). pp. 24–31. ISBN 978-1-62748-470-1.
Hunn, Bruce P.; Heuckeroth, Otto H. (February 2006). A Shadow Unmanned Aerial Vehicle (UAV) Improved Performance Research Integration Tool (IMPRINT) Model Supporting Future Combat Systems. Army Research Laboratory. DTIC ADA443567.
Salvi, Lucia (2001). Development of Improved Performance Research Integration Tool (IMPRINT) Performance Degradation Factors for the Air Warrior Program. Army Research Laboratory. DTIC ADA387840.
Mitchell, Diane K. (September 2009). Workload Analysis of the Crew of the Abrams V2 SEP: Phase I Baseline IMPRINT Model. Army Research Laboratory. DTIC ADA508882.
Pomranky, R. A. (2006). Human Robotics Interaction Army Technology Objective Raven Small Unmanned Aerial Vehicle Task Analysis and Modeling (ARL-TR-3717).
Colombi, John M.; Miller, Michael E.; Schneider, Michael; McGrogan, Major Jason; Long, Colonel David S.; Plaga, John (December 2012). "Predictive mental workload modeling for semiautonomous system design: Implications for systems of systems". Systems Engineering. 15 (4): 448–460. doi:10.1002/sys.21210. S2CID 14094560.
Storey, Alice A.; Ramírez, José Miguel; Quiroz, Daniel; Burley, David V.; Addison, David J.; Walter, Richard; Anderson, Atholl J.; Hunt, Terry L.; Athens, J. Stephen; Huynen, Leon; Matisoo-Smith, Elizabeth A. (19 June 2007). "Radiocarbon and DNA evidence for a pre-Columbian introduction of Polynesian chickens to Chile". Proceedings of the National Academy of Sciences. 104 (25): 10335–10339. Bibcode:2007PNAS..10410335S. doi:10.1073/pnas.0703993104. PMC 1965514. PMID 17556540.
Miller, Michael; Colombi, John; Tvaryanas, Anthony (2013). "Human systems integration". Handbook of Industrial and Systems Engineering, Second Edition. Industrial Innovation. Vol. 20131247. pp. 197–216. doi:10.1201/b15964-15. ISBN 978-1-4665-1504-8.
Boeke, Danielle K.; Miller, Michael E.; Rusnock, Christina F.; Borghetti, Brett J. (2015). "Exploring Individualized Objective Workload Prediction with Feedback for Adaptive Automation". IIE Annual Conference Proceedings. Norcross. pp. 1437–1446. ProQuest 1791990382.
Rusnock, Christina F.; Boubin, Jayson G.; Giametta, Joseph J.; Goodman, Tyler J.; Hillesheim, Anthony J.; Kim, Sungbin; Meyer, David R.; Watson, Michael E. (2016). "The Role of Simulation in Designing Human-Automation Systems". Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience. Lecture Notes in Computer Science. Vol. 9744. pp. 361–370. doi:10.1007/978-3-319-39952-2_35. ISBN 978-3-319-39951-5.