How can Augmented Reality improve our manufacturing processes?

What is Augmented Reality?

The aspiration of Industry 4.0 is to fuse cutting-edge technologies to achieve a symbiosis between the physical, digital and biological spheres. One practical means to achieve this from a human-centric perspective is the virtual visualisation of information. This takes the form of several comparable technologies under the umbrella term Extended Reality (XR), which comprises Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR).

AR is a technology which superimposes images over real-world subjects, augmenting the user's view with additional information. It is most commonly exploited in smartphone applications for games, language translation, navigation and mapping.

In these applications, the user directs their phone at a subject; the phone's camera system captures the surface and calculates its orientation and shape, then projects information aligned with that surface. On a smartphone, this projection is seen virtually, through the screen. However, more bespoke systems can project directly onto the surface itself, so that the experience moves from single-user to multi-user.
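The alignment step above can be sketched mathematically. A minimal illustration, assuming a simple pinhole-camera model with an identity rotation (real AR toolkits estimate the camera pose continuously from each frame, which is considerably more involved):

```python
# Sketch of how an AR system aligns a virtual overlay with a physical
# surface: a pinhole-camera model projects a 3D point on the surface
# into 2D screen coordinates, where the overlay graphic is drawn.

def project_point(point_3d, rotation, translation, focal_length, centre):
    """Project a 3D world point into 2D image coordinates."""
    # Transform from world to camera coordinates: X_cam = R @ X + t
    x = sum(rotation[0][i] * point_3d[i] for i in range(3)) + translation[0]
    y = sum(rotation[1][i] * point_3d[i] for i in range(3)) + translation[1]
    z = sum(rotation[2][i] * point_3d[i] for i in range(3)) + translation[2]
    # Perspective divide, then shift to the image centre
    u = focal_length * x / z + centre[0]
    v = focal_length * y / z + centre[1]
    return u, v

# Assumed example values: identity rotation, camera 2 m from the
# surface, 800 px focal length, 1280x720 image.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 2.0]
u, v = project_point([0.5, 0.25, 0.0], R, t, 800.0, (640, 360))
print(u, v)  # prints 840.0 460.0
```

As the camera (and therefore the estimated rotation and translation) moves, re-running this projection each frame keeps the overlay locked to the surface.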

How can AR be used in manufacturing?

In manufacturing and production, AR has been used to overlay various information for the user: for example, flow direction and temperature through pipes, or safe operating zones around Computer Numerical Control (CNC) machines and robotic arms.

One of the main reported benefits of AR over VR is that the information relates directly to the user's physical view rather than to a purely virtual environment. By recognising key objects in the field of view and overlaying the physical and virtual environments, a user can better understand precisely how the information relates to the object in question. When operating correctly, AR is perhaps even more immersive than VR.

One challenge, however, is that reliable object-detection systems are needed to detect important objects in the physical world through cameras and map them to the virtual display shown to the user. This is a non-trivial task, but it is key to the work currently being delivered by UWE research teams as part of the Digital Engineering Technology & Innovation (DETI) initiative.
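Once an object has been detected in the camera image, its position still has to be translated into the coordinates of the display or projector. A minimal sketch of that mapping, assuming a planar homography with illustrative scale-and-shift values (in practice the homography is estimated by calibration, for example by projecting a known pattern and detecting it with the camera):

```python
# Sketch of the camera-to-display mapping an AR system needs: a 3x3
# planar homography H maps a point detected in the camera image to the
# corresponding point in the projected overlay.

def apply_homography(H, point):
    """Map a 2D point through a 3x3 homography (homogeneous coordinates)."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return u, v

# Assumed calibration result: the projector view is the camera view
# scaled by 1.5x and shifted by (100, 50) pixels.
H = [[1.5, 0.0, 100.0],
     [0.0, 1.5,  50.0],
     [0.0, 0.0,   1.0]]

# A part detected at (200, 120) in the camera frame lands here in the
# projector frame:
print(apply_homography(H, (200, 120)))  # prints (400.0, 230.0)
```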

What are we currently researching?

UWE research teams are helping to deliver a proof-of-concept system that guides a user through a specific process using Augmented Reality. This represents a step forward in the use of state-of-the-art technology in manufacturing, providing contextually aware information to the user in real time.

This feedback will show the user how accurately they are following a procedure, meaning that not only will an augmented view be projected onto the physical scene, but that this view will change as the user interacts with it.

Two case studies have been selected to demonstrate this new use of AR systems within manufacturing processes: the vacuum bagging of a composite part, and agricultural fruit picking.

An example setup for a vacuum bagging procedure with projected image and AR overlay. The projected area provides feedback and input to help guide the user through the various stages of the procedure.

The vacuum bagging demonstration will guide users with the detection of parts, damage and foreign objects, and will feed back to the user the desired position, quantity and form of the various components. The fruit picking demonstration will use existing fruit-detection machine vision algorithms to project desired fruit locations to the fruit picker.

Some example projections that a user would be presented with to ensure components are in the correct place and work features meet specification.

To ensure the results from these case studies are relevant to our current workforce, the UWE project team have been working with local composite vacuum bagging operators to determine a priority list of processes which will most benefit from this AR guidance system.

They have also developed software to upload these priority processes to the system in a way that ensures common methodologies are used, allowing the system to be applied to future developments beyond vacuum bagging.

What’s Next?

The next steps for this project are to integrate trained machine learning models which can detect the tools used for each vacuum bagging process, and to install the entire system into a fixed training bench for user testing.

The team will then apply the lessons learned from the vacuum bagging case study to transfer the AR guidance system to the automated fruit picking scenario, where AR goggles will provide a wearable AR-guided system for fruit pickers in the field.

If you would like to know more about the research happening within the Faculty of Environment and Technology at UWE Bristol please visit the website.  

More information about the DETI Skills programme at UWE Bristol can be found here.

Sign up for DETI news and updates