Understanding COTS and Human Factors Integration

The rise of Commercial-Off-The-Shelf (COTS) systems has revolutionised the landscape of complex systems engineering. COTS products offer a promise of cost-effectiveness and ease of implementation, making them increasingly prevalent in various technical industries. However, the integration of COTS systems comes with its own set of challenges, particularly concerning Human Factors Integration (HFI). In this blog post, we will explore the nature of COTS systems, the criticality of HFI, and the challenges faced in effectively integrating COTS products into complex systems.

What are COTS Products?

First, let’s establish what we mean by COTS systems.

A COTS system is a pre-existing, commercially available software or hardware product that requires minimal customisation or modification for its use.

These products cater to a broad market and are readily available from vendors or manufacturers. They are commonly employed in diverse sectors, including business, government, and military organisations, for a wide range of applications such as process control, building management systems, and communication systems. Examples include telephony systems and CCTV systems.

The Benefits of COTS

COTS systems present several advantages, which explain their growing popularity in technical industries. The primary benefit lies in their cost-effectiveness and quick deployment. Compared to developing a custom solution from scratch, COTS products can be procured and implemented more rapidly, saving precious development time and resources. Additionally, established vendors often provide technical support, maintenance, and regular updates, reducing the burden on organisations that lack in-house capabilities for product maintenance.

As we venture further into the domain of COTS integration, we encounter the critical aspect of Human Factors Integration (HFI). HFI is a multidisciplinary approach to system design that takes into account human capabilities, limitations, and needs, ensuring that systems are user-friendly, efficient, and effective. This consideration of the physical, cognitive, and social characteristics of users is paramount in delivering a successful COTS integration process.

Challenges in Human Factors Integration for COTS Systems

However, integrating HFI into COTS systems is not without its hurdles. One of the primary challenges lies in the lack of standardisation and consistency in HFI practices. Since COTS systems are often developed by different vendors using varied methods and frameworks, establishing uniform HFI guidelines becomes a complex task.

Another significant challenge is the dearth of knowledge and expertise in HFI among COTS vendors. Some vendors may not fully comprehend the importance of factoring human factors engineering into the design of their products. This lack of awareness can lead to systems that are difficult to use and fail to meet the specific needs of users.

Opportunities for Human Factors Integration

Despite these challenges, there are promising opportunities to integrate HFI effectively into the acquisition processes of COTS systems. COTS vendors can benefit from leveraging existing HFI frameworks and standards, such as the ISO 9241 series, to guide their development efforts. These established guidelines can offer valuable insights into designing user-centric systems.

Additionally, engaging with HFI experts can be instrumental in bridging the knowledge gap and enhancing HFI awareness among COTS vendors. By seeking expert advice, vendors can enhance the usability of their products and align them better with user needs.

As the adoption of COTS systems continues to rise in safety-critical industries, the significance of HFI in their integration cannot be overstated. While COTS products offer enticing benefits of cost-effectiveness and quick deployment, they present unique challenges in terms of HFI standardisation and vendor expertise. By embracing established HFI frameworks and collaborating with experts, organisations can navigate these challenges and ensure the successful integration of COTS systems into complex technical environments.

In our next blog post, we will delve deeper into the benefits and limitations of COTS products in technical industries. Stay tuned for more insights into the intersection of COTS technology and human factors integration.

Human Factors Engineer: What It Is and How To Become One

As we return after the Easter break, many of us start to reflect on our current career paths and consider making a change. One field that has gained popularity in recent years is Human Factors Engineering. Whether you’re a new graduate or someone looking to pivot to a new career, you may be curious about what it takes to become a Human Factors Engineer. In this article, we’ll break down what a Human Factors Engineer does, the necessary skills for the job, the path to certification, where you can work as a Human Factors Engineer, and the future of the industry. So, let’s dive in and discover whether a career in Human Factors Engineering is right for you!

What Is A Human Factors Engineer?

A Human Factors Engineer is someone who studies the interactions between people, technology, and the environment. The goal is to design products, systems, and environments that optimise safety, efficiency, and user satisfaction. Human Factors Engineers consider a wide range of factors, including user experience (UX), cognitive psychology, physical capabilities, environmental conditions, and safety. They use research methods, such as surveys, experiments, and simulations, to analyse data and identify potential problems. Based on their findings, they make recommendations for design changes or improvements to ensure that the product or system is safe, efficient, and easy to use.

What Do Human Factors Engineers Do?

Human Factors Engineers work on a variety of projects, from designing airplane cockpits to developing medical devices to creating video game interfaces. They may be involved in any stage of the design process, from initial concept development to testing and evaluation. They may conduct user research, create user profiles, design prototypes, and test products or systems to ensure they meet user needs and are easy to use.

They may also work closely with other professionals, such as designers, engineers, and project managers, to ensure that products meet overall system performance goals.

What Skills Do I Need To Become A HF Engineer?

Becoming a Human Factors Engineer requires a diverse skill set that includes technical, analytical, and interpersonal abilities.

Technical skills are essential, including knowledge of human anatomy and physiology, statistics, and research methods.

Problem-solving skills are also crucial for Human Factors Engineers. You will need to identify potential problems and come up with creative solutions to improve the usability and safety of products, systems, and environments. This often involves observing users’ behaviours, analysing data, and designing studies to test potential solutions. Strong critical thinking skills and attention to detail are necessary to ensure that all factors are considered and that the final solution meets the needs of users while also meeting regulatory or stakeholder requirements.

You will also need strong analytical skills to evaluate data and identify trends, as well as communication skills to explain your findings and recommendations to others.

Additionally, you’ll need to be comfortable working collaboratively with others, as teamwork is often essential to success in this field.

How Do I Become A Human Factors Engineer?

While having a degree in a related field can be helpful in becoming a Human Factors Engineer, it is not always necessary. Pursuing certification through professional organisations such as the Human Factors and Ergonomics Society (HFES), or technician grade membership of the UK Chartered Institute of Ergonomics and Human Factors, can also provide the necessary training and knowledge to succeed in the field. In fact, some employers may even prioritise certification over traditional education.

To become certified as an Ergonomist or Human Factors Professional, you will need a minimum of two years of relevant experience and to have your qualifications reviewed by a board of peers.

This certification provides recognition of your expertise in the field and can improve your job prospects and earning potential. Certification can also be a valuable way to demonstrate your commitment to the profession and your dedication to staying up-to-date with the latest developments and trends.

One practical step that can be taken to gain experience in this field is through volunteering. This could involve volunteering at a conference, a non-profit organisation, or a charity. For example, some STEM charities in the UK offer opportunities for people to work with young people who are interested in STEM, and this can provide an opportunity for new graduates to develop their skills.

Additionally, the UK Chartered Institute of Ergonomics and Human Factors (CIEHF) message boards can be a good place to reach out and see if you can find a mentor. Mentors can also sometimes be found in your own organisation, so it can be beneficial to connect with experienced Human Factors Engineers in your workplace.

Another practical step to take is to seek out internships or apprenticeships in the field. Many companies offer internships or apprenticeships in Human Factors Engineering, which can provide valuable hands-on experience and help you develop a network of professional contacts.

A usability study conducted by a Human Factors Engineer.

Additionally, online courses can be a great way to learn more about Human Factors Engineering and develop new skills. Liv Systems, for example, offers practice-based online courses in HFE that are self-paced and supervised by experienced HF practitioners.

Where Do HF Engineers Work?

Human Factors Engineers have a wide range of career options and can work in various industries. Some of the most common industries for Human Factors Engineers include aviation, healthcare, transportation, and consumer products. However, they can also work in fields such as defence, energy, and finance.

Human Factors Engineers can work for a variety of organisations, including government agencies, academia, research firms, or consultancies. Some Human Factors Engineers also work for large corporations or startups, particularly those that focus on user-centred design.

Working as a Human Factors Engineer often involves working in interdisciplinary teams, collaborating with professionals in other fields. For example, they may work closely with designers, engineers, project managers, and marketing professionals to develop products that are both user-friendly and marketable. Collaboration is essential to ensure that a product is designed with the end-user in mind, as well as meeting business goals.

In addition to working in specific industries, Human Factors Engineers may also focus on specific types of products or systems. For example, they may specialise in medical devices or software interfaces. Specialisation in a specific area of Human Factors Engineering can lead to a more focused career and allow individuals to develop a deep understanding of a particular domain.

What’s The Future Of Human Factors Engineering?

The future of Human Factors Engineering is bright, as more companies recognise the importance of creating user-friendly products and systems. With advancing technology, Human Factors Engineers will have an increasingly critical role in designing safe and efficient products that cater to a diverse range of users.

Innovations in virtual and augmented reality offer new opportunities for Human Factors Engineers to develop groundbreaking solutions for various industries. While there may be challenges that come with these advancements, such as the shift towards automated systems, this presents an opportunity for Human Factors Engineers to focus on designing the interactions between the user and the system.

This shift reinforces the critical role of the human factor in the overall system, highlighting the need for Human Factors Engineers.

In addition to technological advances, other megatrends such as ageing populations and the need for increased security will further emphasise the importance of Human Factors Engineering. As our world becomes more complex, the need for products and systems that are easy to use and understand becomes increasingly important. Human Factors Engineers will be critical in ensuring that these products and systems are designed with the end user in mind, taking into account their abilities, needs, and preferences.

With the growing importance of creating user-friendly and accessible products and systems, the field of Human Factors Engineering will continue to be in high demand and play a crucial role in shaping our future.

Setting the Right Tone: A Human Factors Design Approach for Audible Alarms

A train driver in a modern train cab, listening to an audible alarm from the far side of the control desk.

The Problem

From experience on a number of projects in safety-critical industries, including rail and nuclear power, it has become apparent to Liv Systems that audible alarms in operational environments such as control rooms are often a contentious Human Factors issue. It is commonplace for operators to disagree about which alarm tones are appropriate for the various alarm priority levels. The range of alarm tones available for operators to select from also often seems unsatisfactory, with none of the available tones being quite what the operators are looking for. Whilst as Human Factors consultants we can provide guidance on the selection of appropriate alarm tones, we often find ourselves constrained by the need to choose from a limited selection of less-than-ideal tones.

Current approaches to developing alarm tones tend to focus primarily on ensuring that the following standards are met:

  • BS EN ISO 7731:2005 Ergonomics: Danger signals for public and work areas (auditory danger signals)
  • ISO 11429:1996 Ergonomics: System of auditory and visual danger and information signals
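As a very simplified illustration of one kind of audibility criterion these standards address (a hypothetical sketch only, not a substitute for a full BS EN ISO 7731 assessment, which also covers masked thresholds, octave-band analysis and temporal characteristics), an alarm is commonly required to exceed the ambient noise level by a clear margin:

```python
def alarm_audibility_margin(signal_db_a: float, ambient_db_a: float) -> float:
    """Return the A-weighted signal-to-noise margin in dB."""
    return signal_db_a - ambient_db_a


def passes_simple_audibility_check(signal_db_a: float, ambient_db_a: float,
                                   required_margin_db: float = 15.0) -> bool:
    """Hypothetical check: the alarm should exceed ambient noise by the
    required margin. A real ISO 7731 assessment is considerably richer
    than a single broadband comparison like this."""
    return alarm_audibility_margin(signal_db_a, ambient_db_a) >= required_margin_db


# Example: an 80 dB(A) alarm in a 62 dB(A) control room
print(passes_simple_audibility_check(80.0, 62.0))  # True (18 dB margin)
```

The point of the sketch is simply that a tone can pass this sort of numerical check while still being a poor acoustic experience, which is exactly the gap a creative design approach aims to close.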

Whilst it is important to design alarm tones that comply with these standards, focusing purely on standards compliance, in a kind of box-ticking approach, does not ensure alarm tones that represent a well-designed acoustic experience for the end user. Put simply, the resulting alarm tones can sound pretty ghastly!

The Creative Human Factors Engineering Solution

What is needed, therefore, is a creative Human Factors design approach to the development of alarm tones, so that not only are the standards met, but alarm tones are created in which the whole is greater than the sum of its parts. The development of good alarm tones should be approached not purely as a science but also as an art, incorporating creativity into the design process.

Therefore, Liv Systems have been working with the sound designer Gareth Worthy (see our previous blog post interview with Gareth) to develop a number of audible alarm tone ‘families’. Each family of alarm tones has alarms of various priority levels from ‘critical’ down to ‘low priority’ and follows a consistent sound type/category.

Audible Alarm Tone Families

We have developed three alarm tone families:

Pure Tones

These are simple square or saw waves, without the more complex acoustic features, such as rich timbre, that are associated with musical instruments.
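As a rough sketch of what a tone in this family might look like in code (assuming NumPy; the frequency, duration and amplitude values are purely illustrative, not our actual designs), a square wave can be generated by taking the sign of a sine wave:

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second, standard audio rate


def square_wave_pulse(freq_hz: float, duration_s: float,
                      amplitude: float = 0.5) -> np.ndarray:
    """Generate one square-wave alarm pulse as floating-point
    samples in [-1, 1]. Taking the sign of a sine wave yields a
    square wave at the same frequency."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return amplitude * np.sign(np.sin(2 * np.pi * freq_hz * t))


# An illustrative 1 kHz pulse lasting 200 ms
pulse = square_wave_pulse(1000.0, 0.2)
print(len(pulse))  # 8820 samples
```

In practice such pulses would be shaped, sequenced and repeated according to the alarm priority level before being exported as WAV files.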

Musical

Sounds based on musical instruments are far more acoustically complex than ‘pure tones’: for any given note on a musical instrument, in addition to the fundamental frequency of the note itself, numerous other frequencies are emitted, which give the instrument its characteristic sound or ‘timbre’. Using musical instruments as the basis for alarm tones has the advantage that, due to their acoustic complexity, they are more likely to stand out from other sounds in the environment. Also, as there may be multiple systems in a control room environment, each with their own alarm tones, the use of musical-instrument-based alarm tones increases the chance that the alarms will be distinct from other alarms in the environment.
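The idea of timbre as a fundamental plus a stack of overtones can be sketched as simple additive synthesis (again assuming NumPy, and heavily simplified: real instrument timbres also involve attack/decay envelopes, inharmonic partials and time-varying spectra):

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second


def harmonic_tone(fundamental_hz: float, duration_s: float,
                  harmonic_amplitudes=(1.0, 0.5, 0.25, 0.125)) -> np.ndarray:
    """Sum a fundamental and its integer-multiple overtones.
    The relative amplitudes of the harmonics shape the perceived
    timbre; a different amplitude recipe gives a different 'instrument'."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    tone = np.zeros_like(t)
    for n, amp in enumerate(harmonic_amplitudes, start=1):
        tone += amp * np.sin(2 * np.pi * fundamental_hz * n * t)
    return tone / np.max(np.abs(tone))  # normalise to [-1, 1]


# A 440 Hz 'note' with four harmonics, half a second long
note = harmonic_tone(440.0, 0.5)
```

Because each harmonic recipe produces a distinct spectrum, tones built this way are more likely to remain distinguishable from other sounds, and from each other, in a busy control room.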

Futuristic

We had not originally planned to develop a set of ‘futuristic’ tones, but through serendipity in the creative design process we found that some of the alarms we were developing had more of a ‘sci-fi’ feel, and furthermore these tones seemed effective and appealing. As with the musical tones, they have more acoustic complexity than the pure tones and are based on an electronic/synthesiser sound.

Next Steps

We are close to completing the design of our three families of alarm tones. We believe that by working with a sound designer, we have developed sets of alarm tones which will not only prove highly effective, with clear and consistent stratification according to alarm priority level, but will also be appealing to end users, thereby providing a significantly enhanced user experience.

Our alarm tones will be available in the near future for purchase as downloadable WAV files, based on a one-time fee (non-royalty) under a standard licence. We will provide further updates once they are available for purchase.

Use of Augmented Reality (AR) to assist wayfinding in built environments, Part 2: AR in the wider context of emerging technologies and ‘Zero UI’

In our previous blog post we introduced some of the reasons Liv Systems are interested in exploring the use of AR to assist wayfinding in built environments and gave a brief overview of our own internal research project on this topic.

In this post we’ll discuss how AR aligns with some of the wider developments in emerging digital technologies and the types of user experience they enable.

We are currently living in exciting times in terms of the developments in a number of emerging digital technologies, in particular AR, VR and AI.

These technologies have had their ups and downs over the years, and both VR and AI suffered some false starts, possibly due to over-hype (especially in the case of AI) together with earlier limitations in computer processing power. Yet even during the ‘AI winter’ research on AI progressed, and thanks to greatly improved computing power over the last two decades, together with the new approach to AI in the form of neural networks, AI is now having an enormous impact. VR is also showing far more promise than before, thanks to advances in computer graphics made possible by improvements in graphics processing units (GPUs), which have been largely driven by the computer gaming industry. AR has yet to have an equally big impact, yet the technology driving it is mostly the same as that required for VR, and AR is starting to be used for a number of enterprise applications as well as, to a more limited extent, in the consumer market (for instance with the huge popularity of ‘Pokémon Go’).

These technologies should be of great interest to human factors and UX (user experience) professionals, as they are set to fundamentally transform how we interact with the digital world and in the case of AR, even transform how we interact with the real world.

They will achieve this transformation through greatly increasing the bandwidth of the human-system interface. This will be accomplished by providing the basis for a move towards what has become known as ‘Zero UI’. A better phrase might be ‘invisible UI’, as the concept behind Zero UI is that the user interacts directly with digital information, with only minimal awareness of the graphical user interface (GUI) and physical devices required to achieve that interaction. The user interface therefore becomes, in essence, invisible.

A well-known example of Zero UI from popular culture is ‘Jarvis’ from the film Iron Man. Jarvis is Tony Stark’s Virtual Personal Assistant (VPA) and AR-based system.

Tony Stark with ‘Jarvis’ from the film ‘Iron Man’

Of course, Jarvis packs a lot of cinematic wow factor, and real-world Zero UI is not likely to be as spectacular. However, Jarvis does present a vision of how interaction with information systems could be provided in the future: interacting directly with digital information that is visually overlaid onto the real-world view, and through natural-language interaction with an AI-driven virtual personal assistant.

Natural language UIs are already in widespread use, with the popularity of Alexa, Siri and Google Assistant, in addition to the increasing use of ‘chatbots’. These systems have been made possible by greatly improved natural language processing, with neural-network-based AI enabling them to get better and better at responding appropriately to our verbal requests.

Whilst people may have at first felt a bit weird about talking to their computers or smart phones, the enormous popularity of stand-alone devices, such as the Amazon Echo and the Google Home demonstrates that many of us do like talking to our computers after all!

When natural language systems perform effectively, they can complete a task, such as playing a song we want to hear, or provide us with transport-related information such as a journey plan, based on a simple verbalised enquiry that takes just a few effortless seconds. Achieving the same results with a more traditional GUI-based interaction may require multiple clicks, gestures and text inputs on a screen-based system, which is generally more time-consuming and requires greater cognitive effort. In contrast, merely talking to a computer is more intuitive, less effortful and requires virtually no computer literacy. Hence, ironically, the more advanced UIs become and the closer we move towards Zero UI systems, the more accessible such systems may become to the less tech-savvy.

Another big advantage of Zero UI is that it will allow us to escape from our screens whilst retaining the rapid access to the digital world we have become accustomed to. Optimists may hope this will enable us to engage more with the real world around us again. Such optimism could prove unfounded, but at least we may be less likely to carelessly step out in front of traffic or walk into lamp posts or each other! Also, it will give our eyes a much-needed rest.

In our next blog post we’ll discuss what AR’s contributions are likely to be within this wider context of emerging technologies, the technology hardware used for AR and some of the possible application areas of AR.

We’d be delighted to hear from you if you’d like to discuss any of these topics with us, share ideas or find out more about our work.

Use of Augmented Reality (AR) to assist wayfinding in built environments: Introduction

Here at Liv Systems we’re currently running an internal project investigating the use of Augmented Reality (AR) as an aid to wayfinding within built environments. AR will assist wayfinding by overlaying digital navigational information onto the real-world view. This is part of our wider strategy to explore the applications and User Experience (UX) of emerging digital technologies, in particular AR, Virtual Reality (VR) and AI.

As such, we’ll be running a series of blog posts on how we believe AR can be used as a powerful and intuitive tool to aid wayfinding within built environments, transforming the task of finding our way around such environments from the disorientating and stressful experience it often is, given the current reliance on signage or map-based navigation apps, into a far less frustrating, greatly simplified task.

We’ll be approaching this topic not only from the point of view of the technology solutions available, but also based on a focus on the cognitive aspects of human navigation and the UX issues associated with designing effective AR navigation aids.

An illustration of how directional arrow and animated character based navigational information could be overlaid onto the real-world view at a London Underground station

 

Simplifying the wayfinding task in unfamiliar environments will benefit a wide range of users, including users with impairments that may make wayfinding particularly challenging. For instance, a wide range of cognitive impairments, including those associated with early-stage dementia, together with other conditions such as autism spectrum disorder and some anxiety-related conditions, can make wayfinding in unfamiliar environments enormously difficult and stressful. In fact, deterioration of navigation ability is known to be one of the earliest signs of dementia, and as such, tests of navigation skills using VR environments are now being introduced as an early-warning method for detecting the onset of dementia.

Such cognitive impairments and neurological or psychological conditions effectively act as a barrier, deterring many people from making journeys to destinations where they feel they may encounter such situations. For this reason, in addition to being of great benefit to us all, AR-assisted navigation offers the promise of removing barriers to travel for many people and allowing them to retain their independence.

I am an active participant in the ‘Cognitive Navigation’ (CogNav) Special Interest Group (https://rin.org.uk/page/CogNav) of the Royal Institute of Navigation. The CogNav group is chaired by Professor Kate Jeffery, a behavioural neuroscientist at UCL specialising in the neuroscience of navigation. The group brings together academics and industry professionals from a variety of backgrounds, including neuroscience, cognitive psychology, industrial design, human factors and architecture, with the aim of better understanding the cognitive aspects of human navigation so as to design solutions that best assist it.

As a result of my involvement in the CogNav group, Liv Systems are able to bring the latest knowledge and research findings in this area to our exploration of the use of AR for wayfinding.

AR can be used to assist wayfinding in any large, complex built environment and the wider urban realm. Large built environments that can pose significant wayfinding issues include shopping malls, hospitals, museums, rail stations and airports.

We have already produced some early stage prototypes for concept development and are now in the process of developing a fully interactive navigational AR prototype. We are planning on conducting user research with our AR prototype within a large shopping mall in the near future.

A snapshot from one of our early prototypes, developed in Proto.io, depicting the use of navigational arrows together with faded landmark overlays to assist in providing an overall sense of orientation

In our next blog post we’ll discuss how AR fits into the wider context of emerging digital technologies and how these technologies may provide the basis for a form of user interaction that has become known as ‘Zero UI’.

Please get in touch if you are interested in our work in this area and would like to know more.