Augmented Reality

What is Augmented Reality (AR), and where does it fit in relation to the other ‘reality’ technologies on the market today? Locating these technologies isn’t cut and dried.

There is some overlap, and they are better understood as points on a spectrum rather than as distinct items. The Milgram Extended Reality Spectrum is a useful framework for understanding them.

Extended Reality (XR)

Extended Reality (XR) refers to all the real-and-virtual environments generated by computer technology and wearables – in other words a spectrum of capabilities.

  • Reality: The most familiar end of the spectrum is reality, the physical world that we can experience with all of our senses, particularly physical touch.
  • Assisted Reality: Assisted reality refers to projecting additional information into a user’s field of vision, hands-free. It doesn’t change what the user is seeing, nor is it aware of the geometry of the user’s environment. It only adds an extra layer of information into their peripheral vision; Google Glass and RealWear headsets are good examples of this.
  • Augmented Reality (AR): AR is an overlay of computer-generated content on the real world that can superficially interact with the environment in real time. With AR, there is no “occlusion” between CG content and the real world: physical objects do not visually block the digital overlay. The end user can experience this on headsets, but more commonly on typical mobile devices. Generally, these experiences include text, graphics, videos, animations, and sounds. Essentially, AR superimposes digital information on the material or physical world, like the heads-up display in a plane or a car.
  • Mixed Reality (MR): MR is an overlay of synthetic content anchored to, and interacting with, real world objects in real time. With Mixed Reality experiences, computer-generated objects are visibly obscured by objects in the physical environment. When implemented well, MR should be a perfect union of digital and physical content interacting seamlessly.
  • Augmented Virtual Reality (aVR): Immediately next to Virtual Reality on the spectrum, augmented virtual reality takes content from the physical world and adds it to a virtual experience to give the end user a more immersive feel. This could be 360º video capture for scenery or pre-recorded audio.
  • Virtual Reality (VR): On the opposite end of the spectrum is Virtual Reality (VR). These experiences are composed entirely of digital content, from computer-generated graphics to sounds and locations. In some cases this can include content meant to mimic aspects or environments of the real world, such as walking through a virtual layout of a plant. The defining trait of these experiences is not the content, but how the user experiences it: fully immersed in a digital environment.

AR for Your Packaging Operation

The AR most people are familiar with is video see-through (VST) technology, which lets you view an augmented scene through the camera of a tablet or phone.

At Harpak-ULMA, we partner with PTC to utilize Vuforia Chalk, which lets you annotate graphics over the live view on your tablet, phone, or even a RealWear headset.

 

Watch Chalk™ from PTC in action with one of our TFS thermoformers >

 

The second type of AR is lens-based: applications that employ optical see-through (OST) technology, such as the Microsoft HoloLens.

AR information is projected onto the lens, and your brain puts two and two together, stitching the digital and physical views into an “augmented view” of reality.

The third flavor is Projection AR, where a projected image is mapped onto a specially designed work surface, creating a direct overlay on the parts where a user is working.

Objects and users can move around in the environment, but the AR experience is limited to the fields of view of the fixed projector and the supporting camera used for tracking.

A projection-based Augmented Reality system can provide user instructions or assistance in a variety of media (a simple data sketch follows the list below), such as:

  • Text (for example, a cycle-time countdown)
  • Images (for example, blueprints or simple directional arrows)
  • Animations or videos
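
As a rough illustration of how those projected instructions might be represented as data, here is a minimal sketch in Python. The step types, file names, and surface coordinates are hypothetical and are not drawn from any particular projection-AR product.

```python
# Minimal sketch: a projected work instruction described as plain data.
# Step kinds, file names, and coordinates are hypothetical.
from dataclasses import dataclass

@dataclass
class ProjectedStep:
    kind: str                   # "text", "image", or "video"
    content: str                # message text or a media file name
    anchor_mm: tuple[int, int]  # (x, y) position on the calibrated work surface

toolchange_steps = [
    ProjectedStep("text",  "Cycle restarts in 90 s", (120, 40)),
    ProjectedStep("image", "arrow_left.png",         (300, 180)),
    ProjectedStep("video", "seal_head_swap.mp4",     (300, 260)),
]

for step in toolchange_steps:
    print(f"Project {step.kind} at {step.anchor_mm}: {step.content}")
```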

AR is making it possible to train a workforce more quickly, provide expert guidance, and deliver the information staff need at the right time and in the right context. Using AR technologies, staff can be guided through getting a machine back up and running or performing a product line changeover, or be alerted to a performance issue.

We see the AR value proposition as delivering significant productivity gains in human labor: improvements on the order of 30-60%, depending on the specific application.

Challenges of Deploying AR

The first challenge is connectivity: can you get a good, reliable Wi-Fi signal on the plant floor?

What devices will you have access to? The devices that best enable AR are relatively expensive, so there needs to be a plan for how you manage and deploy them.

Beyond that, as an OEM, if we want to offer AR experiences to our customers or even implement them internally, we need good 3D data.

You also need clear goals for what you want to accomplish with AR.

AR is a great and fascinating tool, but unless we’re solving a specific problem, there’s no hard return on that investment. Building an industrial AR experience is unlike the typical engineering development pipeline most of us are used to.

The Vuforia platform, which we utilize, is architected to make building the experiences more intuitive without a lot of additional manipulation of the data.

Remote Service

AR for remote service allows an expert resource, on demand, to interactively guide people through a support task.

We can guide the onsite team member through executing a task, leveraging simple on-screen annotations that quickly clarify instructions for the user.

 

So if a machine breaks, rather than waiting a day (or however long it takes to get a tech out to you), you can hop on a call with that tech and be guided through the repair process with a mobile device or a headset.

The value proposition here is unequivocal: the machine gets fixed faster and without travel costs, a better outcome for both parties.

Factory Acceptance Tests

In our case, our machinery is produced overseas, so an onsite FAT is sometimes problematic.

But FATs must take place, so we’ve instituted virtual FATs in which we create all the documentation and capture it in AR.

Technicians walk around the machine wearing AR headsets, capturing key data and processes while the machine is running, including HMI outputs and any validation criteria the customer cares about.

We’re able to document it all in AR so that a customer can experience it for themselves, and we can follow up with live AR sessions as necessary.

This approach can be particularly useful in an automation engagement where the packaging machine may already be in the customer’s facility while the automation components are being assembled outside of it. We can use AR to recreate a “combined” view, if you will.

Training Solutions

Using AR technology and a process called Expert Capture, technicians create training content around a specific machine, so that it can be delivered right along with the machine itself.

Training sequences might show how to perform a film or tooling change, guide a tech through diagnosing common faults or troubleshooting procedures, or demonstrate how to execute a homing adjustment on pusher arms: pretty much any procedure worthy of detailed explanation.

 

Capturing all of that and making it available as an on-demand library of AR experiences tremendously shortens learning curves, which is ideal in a context of high labor turnover.

IoT Service Diagnostics

IoT service diagnostics allow staff to view key performance data while a machine is running.

That data can include complex metrics such as OEE, or simple operating sensor data like sealing pressures or temperatures that could indicate the onset of a component failure.
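
To make that concrete, here is a minimal Python sketch of the kind of calculation such a view relies on. The OEE formula itself is standard (availability × performance × quality); the sensor field names, threshold, and sample values are purely illustrative and not taken from any specific Harpak-ULMA or Rockwell system.

```python
# Minimal sketch: computing OEE and flagging a sealing-temperature trend.
# The threshold and sample values below are illustrative only.

def oee(availability: float, performance: float, quality: float) -> float:
    """OEE is the product of availability, performance, and quality (each 0-1)."""
    return availability * performance * quality

def seal_temp_alert(readings_c: list[float], limit_c: float = 185.0) -> bool:
    """Flag when the average of the most recent sealing temperatures exceeds a limit."""
    recent = readings_c[-5:]
    return sum(recent) / len(recent) > limit_c

# Example shift: 90% availability, 95% performance, 99% quality -> about 84.6% OEE
print(f"OEE: {oee(0.90, 0.95, 0.99):.1%}")
print("Seal temperature alert:", seal_temp_alert([180.2, 181.0, 183.5, 186.1, 187.4, 188.0]))
```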

This kind of “x-ray” vision, so to speak, gets even more compelling a bit further down the roadmap.

Combining digital twin technology with AR takes this concept to another level. It allows technicians to “see” how and where internal components are designed to operate without opening the machine up.

The big payoff of collecting all of this IoT operating data is nothing less than the holy grail of true predictive maintenance. We’ll aggregate massive volumes of anonymized IoT data into cloud-based data lakes.

That will enable us to benchmark any individual machine's performance against the entire data set, a useful analysis on its own. It also means the data can be mined using a combination of machine learning and artificial intelligence to develop the predictive algorithms that will minimize unplanned machine downtime.
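
As a simple illustration of what that benchmarking can look like, the Python sketch below places one machine's OEE within the distribution of a fleet. The numbers are made up for illustration rather than drawn from any actual data lake.

```python
# Minimal sketch: benchmarking one machine's OEE against an anonymized fleet.
# All values are hypothetical.
from statistics import mean, pstdev

fleet_oee = [0.72, 0.81, 0.77, 0.85, 0.69, 0.90, 0.78, 0.83]  # fleet-wide OEE samples
machine_oee = 0.74                                            # the machine being benchmarked

# Percentile rank: the share of the fleet this machine outperforms.
percentile = sum(v < machine_oee for v in fleet_oee) / len(fleet_oee)

# Z-score: how far the machine sits from the fleet average, in standard deviations.
z_score = (machine_oee - mean(fleet_oee)) / pstdev(fleet_oee)

print(f"Percentile rank: {percentile:.0%}, z-score: {z_score:+.2f}")
# A strongly negative z-score is the kind of signal a predictive model would flag
# for attention before it turns into unplanned downtime.
```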

Harpak-ULMA packaging solutions leverage Rockwell Automation’s smart connected automation platform, giving our customers a competitive edge by improving agility and performance at a lower total cost of ownership. We offer smart, connected packaging machines that reduce the complexity, time, and cost of building or extending any automated packaging solution, a major step toward building a richer, enterprise-wide, contextual view of packaging operations.

Contact Us Today!