What is Augmented Reality (AR), and where does it fit in relation to other ‘reality’ technologies in the market today? Locating these technologies isn’t cut and dried.
There is some overlap, and they are better understood as a spectrum rather than as distinct items. The Milgram Extended Reality Spectrum is a useful framework for understanding these technologies.
Extended Reality (XR) refers to all the real-and-virtual environments generated by computer technology and wearables – in other words a spectrum of capabilities.
The AR that people are most familiar with is video see-through (VST) technology, which displays the augmented view through a tablet or phone.
At Harpak-ULMA, we partner with PTC to utilize Vuforia Chalk, which lets you annotate graphics over the live view on your tablet, phone, or even a RealWear headset.
The second type of AR is lens-based applications that employ optical see-through (OST) technology, such as the Microsoft HoloLens.
AR information is projected onto the lens, and your brain stitches the real and virtual together to create an “augmented view” of reality.
The third flavor is Projection AR, where a projected image is mapped onto a specially designed work surface, creating a direct overlay on the parts where a user is working.
The objects and users can move around in the environment, but the AR experience is limited to the fields of view of the fixed projector and the supporting tracking camera.
A projection-based Augmented Reality system can provide user instructions or assistance in a variety of media formats, delivered directly on the work surface.
AR is making it possible to train a workforce more quickly and to provide expert guidance, delivering the information people need at the right time and in the right context. Using AR technologies, staff can be guided through getting a machine back up and running or performing a product line changeover, or be alerted to a performance issue.
We see the AR value proposition as delivering significant productivity gains in human labor: improvements on the order of 30-60%, depending on the specific application.
Before deploying AR, there are some practical prerequisites to consider. Can you get a good, reliable Wi-Fi signal on the floor?
What devices will you have access to? The devices that best enable AR are relatively expensive, so there needs to be a plan for how you manage and deploy them.
Beyond that, as an OEM, if we want to offer AR experiences to our customers or even implement them internally, we need good 3D data.
You also need clear goals about what you want to accomplish with AR.
AR is a great and fascinating tool, but unless we’re solving a specific problem, there’s no hard return on the investment. Building an industrial AR experience is unlike the typical engineering development pipeline most of us are used to.
The Vuforia platform, which we utilize, is architected to make building the experiences more intuitive without a lot of additional manipulation of the data.
AR for remote service allows an expert resource, on demand, to interactively guide people through a support task.
We can guide the on-site team member through a task using simple on-screen annotations that quickly clarify the instructions.
So, if a machine breaks, rather than waiting a day or however long it takes to get a tech out to you, you can hop on a call with that tech and be guided through the repair process with either a mobile device or a headset.
The value proposition here is unequivocal: the machine gets fixed faster and without travel costs, a better outcome for both parties.
In our case, our machinery is produced overseas, so an onsite factory acceptance test (FAT) is sometimes problematic.
But FATs must take place, so we've instituted virtual FATs in which we create all the documentation and capture it in AR.
Technicians go around the machine wearing AR headsets, capturing key data and processes while the machine is running, including HMI outputs and any validation criteria our customer cares about.
We're able to document all of that using AR so that the customer can experience it for themselves, and we can follow up with live AR sessions as necessary.
This approach can be particularly useful in an automation engagement where the packaging machine may already be in the customer facility, and the automation components are being assembled outside of the facility. We can use AR to recreate a “combined” view if you will.
Using AR technology and a process called Expert Capture, technicians create training content around a specific machine, so that it can be delivered right along with the machine itself.
Training sequences might include how to perform a film or tooling change, how to diagnose common faults or work through troubleshooting procedures, or how to execute a homing adjustment on pusher arms: pretty much any procedure that merits detailed explanation.
Capturing all of that and making it available as an on-demand library of AR experiences tremendously shortens learning curves, which is ideal in a high labor turnover environment.
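To make the idea of an on-demand procedure library more concrete, here is a minimal sketch of how a captured procedure might be organized as structured data. The schema, field names, and example values are hypothetical, purely for illustration; this is not the actual format used by PTC's Vuforia Expert Capture.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical schema for a captured training procedure. Illustrative only;
# not PTC Vuforia Expert Capture's actual data format.

@dataclass
class ProcedureStep:
    title: str                    # e.g. "Remove the forming tool"
    instruction: str              # short, operator-facing text
    media_refs: List[str] = field(default_factory=list)  # captured photo/video clips

@dataclass
class Procedure:
    machine_model: str            # e.g. "TFS thermoformer"
    name: str                     # e.g. "Tooling change"
    steps: List[ProcedureStep] = field(default_factory=list)

# One entry in an on-demand library, delivered alongside the machine itself.
tooling_change = Procedure(
    machine_model="TFS thermoformer",
    name="Tooling change",
    steps=[
        ProcedureStep("Lock out the machine", "Follow the site lockout/tagout procedure."),
        ProcedureStep("Remove the forming tool", "Support the tool before releasing the clamps.",
                      media_refs=["remove_forming_tool.mp4"]),
    ],
)
print(f"{tooling_change.name}: {len(tooling_change.steps)} steps")
```

The point is simply that each captured procedure becomes a discrete, searchable asset that can ship right along with the machine.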
AR also lets staff view key performance data while a machine is running.
That can include more complex data such as OEE metrics, or simple operating sensor data like sealing pressures or temperatures that could indicate the onset of a component failure.
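As an illustration of the kind of metric such an overlay might surface, here is a minimal sketch of a standard OEE calculation (availability x performance x quality). The shift numbers are made up; a real deployment would pull these values from the machine's control system rather than hard-coding them.

```python
# Minimal sketch of the kind of metric an AR overlay might surface while a
# machine runs. The OEE formula is standard; the input values are hypothetical.

def oee(planned_minutes: float, downtime_minutes: float,
        ideal_cycle_s: float, total_count: int, good_count: int) -> dict:
    run_minutes = planned_minutes - downtime_minutes
    availability = run_minutes / planned_minutes
    performance = (ideal_cycle_s * total_count) / (run_minutes * 60)
    quality = good_count / total_count
    return {
        "availability": availability,
        "performance": performance,
        "quality": quality,
        "oee": availability * performance * quality,
    }

# Example shift: 480 planned minutes, 45 minutes of downtime,
# 3-second ideal cycle, 7,800 packs produced, 7,650 of them good.
print(oee(480, 45, 3.0, 7800, 7650))
```

In a live deployment, these inputs would stream from the machine and be rendered in the operator's field of view rather than printed to a console.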
This kind of “x-ray” vision, so to speak, gets even more compelling a bit further down the roadmap.
Combining digital twin technology with AR takes this concept to another level. It allows technicians to “see” how and where internal components are designed to operate without opening the machine up.
The big payoff of collecting all of this IoT operating data is nothing less than the holy grail of true predictive maintenance. We’ll aggregate massive volumes of anonymized IoT data into cloud-based data lakes.
That will enable us to benchmark any individual machine's performance against the entire data set – a useful analysis on its own – but it also means that data can be mined using a combination of machine learning and Artificial Intelligence to develop the predictive algorithms that will minimize unplanned machine downtime.
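As a rough sketch of what benchmarking a single machine against fleet-wide data might look like, the example below flags sensors whose latest readings sit far outside the fleet's historical distribution. The sensor names, values, and simple z-score rule are all hypothetical; a production system would rely on much richer features and learned models.

```python
import statistics

# Rough sketch: compare one machine's latest readings against anonymized
# fleet history for the same model, pulled from a cloud data lake.
# Field names, values, and the z-score threshold are hypothetical.

def flag_outliers(machine_readings: dict, fleet_readings: dict, z_threshold: float = 3.0) -> dict:
    """Return the sensors where this machine deviates notably from the fleet."""
    flags = {}
    for sensor, value in machine_readings.items():
        fleet_values = fleet_readings[sensor]
        mean = statistics.mean(fleet_values)
        stdev = statistics.stdev(fleet_values)
        z = (value - mean) / stdev if stdev else 0.0
        if abs(z) >= z_threshold:
            flags[sensor] = round(z, 1)
    return flags

machine = {"seal_temp_c": 187.0, "seal_pressure_bar": 5.1}
fleet = {
    "seal_temp_c": [172, 174, 171, 173, 175, 172, 174],
    "seal_pressure_bar": [5.0, 5.2, 5.1, 4.9, 5.0, 5.1, 5.2],
}
print(flag_outliers(machine, fleet))  # e.g. {'seal_temp_c': 9.9}
```

Deviations flagged this way become candidate inputs for the predictive models described above.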
Harpak-ULMA packaging solutions leverage Rockwell Automation’s smart connected automation platform, giving our customers a competitive edge by improving agility and performance at a lower total cost of ownership. We offer smart, connected packaging machines that reduce the complexity, time, and cost of building or extending any automated packaging solution, a major step toward building a richer, enterprise-wide, contextual view of packaging operations.