Spyglass Visual Inspection By the Numbers

Intelligently minimize defects and reduce costs

Spyglass Visual Inspection harnesses the power of AI, IIoT, and image recognition to help manufacturers improve product quality while significantly reducing the costs associated with manufacturing flaws.

Effectively addressing quality concerns is critical in manufacturing

AI helps drive improved defect detection and better business outcomes:

  • 10-15% of total operating costs are often attributable to poor-quality product (Forbes, 2018)
  • 1/3 of manufacturing executives now identify AI-driven technologies as crucial to driving customer satisfaction (Forbes, 2018)
  • $3.7 trillion - the value McKinsey forecasts AI-powered “smart factories” will generate by 2025

What is Spyglass Visual Inspection?

Spyglass Visual Inspection is a rapid time-to-value QA optimization solution for manufacturers of any scale. It is designed to:

  • Quickly and accurately detect defects so that action can be taken to reduce waste and improve customer satisfaction
  • Drive continuous quality improvement through greater visibility, with a bird’s-eye view of product quality across multiple lines or facilities
  • Use predictive analytics to proactively improve quality processes and perform root cause analysis
  • Implement and ramp up quickly, ensuring a rapid return on your investment
  • Augment your existing vision system (if you have one) to gain additional ROI on that investment
  • Use a lean approach to implementing AI and IIoT so that you control costs and gain value at every stage.

Global glass manufacturer saves over $1M quarterly with Spyglass Visual Inspection

A global automotive glass manufacturer is using Spyglass Visual Inspection today as their comprehensive platform for defect detection, prediction, and analysis. Their challenge: they needed more accurate defect detection in their glass-cutting process to reduce false positives from their existing vision system, which were costing $30 per unit across 40 production lines. They were looking for a solution that would use custom vision, image recognition, and machine learning to detect defects more accurately, at high speed, and in large volume. With Spyglass Visual Inspection, they are already achieving over $1 million in quarterly savings.

Where can I learn more?

You can review our presentation, solution overview, and customer case studies. We'd love to connect to learn more about your quality initiatives and how we could help. Connect with us here.

How do I get started? The easiest way is with a no-risk Spyglass Visual Inspection 30-Day Proof of Value.

Every manufacturer is different, and every defect detection requirement is unique. It's critical to determine quickly that Spyglass Visual Inspection is the right fit for your quality goals and operating conditions. We'll start with a risk-free Proof of Value engagement to get you up and running with image formation and labeling (we'll even loan you the hardware and software if you need it), and our data scientists will then use Deep Learning AI to train the machine learning model on your images. For most customers, the Proof of Value stage can be completed in 30 days, and if we can't demonstrate success, you're under no obligation to proceed. Next, we'll Operationalize the solution in your factories, using Spyglass as the platform to implement your customized visual inspection solution. Finally, we'll Maintain and Improve the machine learning model: on a quarterly basis, we will meet with your quality teams and help the model learn from any mistakes it has made, so its accuracy continues to improve over time.
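For the technically curious, the kind of training step described above can be sketched with a generic transfer-learning example in PyTorch. This is illustrative only and is not Mariner's actual pipeline; the image folder, class labels, and hyperparameters are placeholders.

import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

# Labeled defect images arranged as labeled_images/<class_name>/*.jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("labeled_images", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the classifier head,
# e.g. with classes such as "good", "scratch", "inclusion".
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()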

Ready to find out more? Connect with us here.


Maximize existing QA vision systems with Deep Learning AI

The costs of poor quality are high

Quality assurance matters to manufacturers. The reputation and bottom line of a company can be adversely affected if defective products are released. If a defect is not detected, and the flawed product is not removed early in the production process, the damage can be costly – and the higher the unit value, the higher those costs will be. Indeed, poor quality potentially contributes to cost in a variety of ways:

  • Re-work costs
  • Production inefficiencies
  • Wasted materials
  • Expensive and embarrassing recalls

And worst of all, dissatisfied customers can demand returns.

The problem with traditional machine vision systems

To mitigate these costs, many manufacturers install cameras to monitor their products as they move along their production lines.

However, the data obtained isn't always useful – or, more precisely, the data is useful, but existing machine vision systems may not be able to assess it accurately at full production speeds. That's because too many variables make product defect analysis and prediction difficult. Furthermore, manufacturers need to perform root cause analysis across a manufacturing process with complex variables in order to determine which combinations of variables create high-quality products and which create inferior ones. To achieve this precision, the manufacturer needs to aggregate data across multiple systems into a comprehensive view.

Legacy vision systems typically lack the accuracy, speed, and analytic capabilities required to fulfill manufacturers’ QA wish lists – again, because manufacturing processes can be incredibly complex, and older vision systems are often unable to consistently and accurately identify small flaws that may have a large impact on customer satisfaction. To further aggravate the situation, false positives (i.e., flagging defects that aren’t actually present) can bog down production schedules.

On a larger level, the inability to aggregate data from multiple production lines or factories to determine the cause of variations in quality across multiple sites also prevents a holistic view of operational efficiency.

From the top down, then, many manufacturers find the current state of machine-vision-driven QA to fall far short of its potential for reducing the costs of quality.

Integrating legacy systems and AI on Azure

To mitigate these and other problems, our Spyglass Visual Inspection solution uses Deep Learning AI to achieve visibility over the entire line, which catches defects more quickly and more accurately than existing machine vision systems. Furthermore, because of its alerting and root-cause analysis capabilities, Spyglass Visual Inspection also helps to prevent defects before they ever arise.

Spyglass Visual Inspection is an easily implemented, rapid time-to-value QA solution that can reduce costs associated with product defects and increase customer satisfaction.

It works with images from any vision system, so companies who already have systems in place can leverage them for additional return-on-investment (ROI). By using cameras and other devices already in use on the production floor, the solution takes a lean approach to implementing new and emerging technologies like IoT, Deep Learning AI, and computer vision. This ensures that manufacturers control costs and achieve value at every stage of production and are truly able to reduce their cost of quality.

 

This figure outlines the architecture of the solution. Data from existing systems enters at the edge, where edge computing provides on-premises processing and real-time, AI-driven decision-making. The data then moves to Azure, where it is processed further. AI is applied again in ways that iteratively improve the system, and the results can be viewed in Power BI for further insight.
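To make that data flow concrete, here is a minimal, hypothetical sketch of the edge-side step using the Azure IoT Edge Python SDK: a module scores each camera frame locally and forwards the verdict to Azure IoT Hub. The model call, camera read, message fields, and output route name are illustrative placeholders, not the product's actual implementation.

import json
import time

from azure.iot.device import IoTHubModuleClient, Message


def capture_frame() -> bytes:
    """Placeholder for reading a frame from the line-side camera."""
    return b""


def detect_defect(image_bytes: bytes) -> dict:
    """Placeholder for the deep-learning model deployed to the edge device."""
    return {"defect": False, "confidence": 0.0}


def main() -> None:
    # Runs inside an IoT Edge module, so connection details come from the
    # module's environment rather than hard-coded credentials.
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()
    try:
        while True:
            result = detect_defect(capture_frame())
            msg = Message(json.dumps({
                "timestamp": time.time(),
                "line_id": "LINE-01",  # illustrative metadata
                **result,
            }))
            # Forward the local verdict to IoT Hub through a module output.
            client.send_message_to_output(msg, "classification")
            time.sleep(1)
    finally:
        client.shutdown()


if __name__ == "__main__":
    main()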

Benefits of Spyglass Visual Inspection

Spyglass Visual Inspection harnesses the power of Deep Learning AI, IoT, machine vision, and Azure. The result is that manufacturers minimize defects and reduce costs through advanced analytics. For the manufacturer, the benefits that matter are:

  • Rapid ROI: Easy implementation and ramp-up enables immediate process improvements and a rapid return on your investment.
  • Greater visibility: Predictive analytics and root cause analysis drive quality improvements across multiple lines or sites.
  • Leverages existing vision systems: Extracts more value from existing industrial cameras and devices by augmenting them with AI-driven real-time insights.
  • Fully transactable on the Azure Commercial Marketplace: No lengthy delays with procurement departments – the transaction can take place entirely on Microsoft paper, fast-tracking the above benefits for manufacturers.

All of these benefits combine, of course, into one overarching, easily-understood benefit: Spyglass Visual Inspection reduces a manufacturer’s cost of quality.

Azure services

Spyglass Visual Inspection is powered by Microsoft Azure. It leverages the following Azure services:

  • Azure IoT Edge ingests images from industrial cameras on the production line and runs cloud AI algorithms locally.
  • Azure IoT Hub receives images, image metadata, and results from the defect detection analysis performed at the edge.
  • Azure Stream Analytics enables users to create dashboards that offer deep insights into the types and causes of defects that are occurring across a massive number of variables.
  • Azure Data Lake Storage/Blob Storage stores the data. Because heterogeneous data from multiple streams can be stored, additional data types can be added to image-based analysis.
  • Azure SQL Database is used to store the business rules that define what a good or bad product is and what alerts should be generated in the analytics.
  • Azure Functions/Service Bus evaluates rules and triggers alerts so you can surface the most meaningful data for business users (a minimal sketch of this step follows the list).
  • Power BI provides interactive dashboards that make data easy to access and understand, so users can make analytics-driven decisions.
  • Power Apps creates additional applications for manufacturers to act on the data and insights they have received.
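
As an illustration of how the Functions/Service Bus piece might work, here is a minimal, hypothetical Azure Function in Python that evaluates a simple business rule on a defect-detection message and raises an alert. The payload shape, Service Bus binding (declared separately in function.json), and threshold are assumptions for the sketch, not the product's actual schema.

import json
import logging

import azure.functions as func

# Illustrative rule; in the real solution, business rules live in Azure SQL Database.
DEFECT_RATE_THRESHOLD = 0.02  # alert if more than 2% of recent units are defective


def main(msg: func.ServiceBusMessage) -> None:
    payload = json.loads(msg.get_body().decode("utf-8"))

    if payload.get("defect") and payload.get("rolling_defect_rate", 0) > DEFECT_RATE_THRESHOLD:
        # In production this could notify business users via Power Apps, email,
        # or text; logging stands in for the alert here.
        logging.warning(
            "ALERT: defect rate %.1f%% on line %s exceeds threshold",
            payload["rolling_defect_rate"] * 100,
            payload.get("line_id", "unknown"),
        )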

Recommended next steps

If you want to learn more about Spyglass Visual Inspection – how it works, and the results manufacturers are achieving – be sure to visit our Deep Learning / Machine Vision resources page for eBooks, infographics, videos, and more. You can also find Spyglass Visual Inspection on Microsoft's Azure Marketplace and AppSource.

Ready to take the next step? You can also ask for a risk-free fit assessment to see if Spyglass Visual Inspection is right for your facilities and products, or feel free to contact us with any questions you might have.

This article was originally published at https://azure.microsoft.com/en-in/blog/maximize-existing-vision-systems-in-quality-assurance-with-cognitive-ai/ by Diego Tamburini, Principal Manufacturing Industry Lead, Azure Industry Experiences Team, and is updated and republished here with his kind permission.


Microsoft Azure and Intel IoT Join Forces

Intel and Microsoft have formed a partnership and launched a joint website to support their collaboration and inform customers of its benefits.

They are working together to deliver intelligent, highly scalable IoT products and services without unnecessary complexity. This collaboration supports a fast-growing ecosystem of more than two dozen partners who offer a range of end-to-end solutions for specific industries and use cases. The goal is to help their growing ecosystem of IoT partners bring ever-advancing innovation and insights to customers, allowing them to make improvements to their business, increase operational efficiency, and create new revenue streams.

Mariner is using edge-to-cloud IoT technology from Intel and Microsoft to enhance customer experiences and tackle new challenges with secure, scalable solutions. Click on the Mariner solution below to find out how these intelligent tools can create better business outcomes in any industry with production lines.

Mariner is Microsoft’s 2020 Worldwide IoT Partner of the Year and a member of Intel’s IoT Solutions Alliance. Mariner's Spyglass Visual Inspection and Spyglass Connected Factory solutions are also marketplace ready with Microsoft and Intel.

 


Insight.tech Article on Machine Learning

By Erica Stevens for Insight.tech

A Guaranteed Model for Machine Learning

On the factory floor, wasted resources stack up fast for every real or imagined defect. When a good part is mistakenly labeled flawed, there’s lost time, efficiency, and machine effort. And when a defective part goes unnoticed and becomes the end customer’s problem? The potential consequences are even more severe.  <Read More at Insight.tech>

 

See Erica's interview with Mariner's EVP of Product Development, Peter Darragh, on our product Spyglass Visual Inspection, now Microsoft and Intel marketplace ready!


Vision Systems Design Webinar: Leveraging Deep Learning and AI Applications

This live event has ended.
Click "HERE" to view the recording

Please join Mariner as we present the following Vision Systems Design webinar:

Leveraging Deep Learning and AI Applications in Manufacturing
Tuesday, December 15, 2020
12:00PM – 1:00PM EST


Manufacturers like you are successfully using artificial intelligence and deep learning in their operations today.

While these technologies help production processes reach new heights, you must carefully evaluate your options before making any decisions.

Join us on December 15th to hear real-world case studies about how a chemical factory, a glass factory, and a fabric factory reduced their costs and increased their quality, and the role AI and Deep Learning played in those successes.

The webcast will cover Cloud limitations, the latest on the edge, hybrid edge/cloud setups for Industry 4.0, and how Intel and Microsoft technologies can help make it all come together. The webcast will conclude with a Q&A.

Sign up now to keep yourself on the cutting edge of machine vision technology.


Mariner – Best Place to Work Winner 2020

Every year, the Charlotte Business Journal sponsors a “Best Place to Work” contest.  Mariner reached the #2 slot in the Small Business Category for 2020!  All employees are surveyed by an outside company and that ranking is provided to CBJ.  Faced with unprecedented challenges in 2020, we still expanded our support for work-life balance and continue to have happy employees.

It’s great to be a Mariner!


Mariner selected as a finalist for 2020 Blue Diamond Award

10/6/2020 - Mariner was named a finalist in the "Business Impact - Analytics, AI & Big Data" category for 2020. Although we didn't win, we are honored to have been selected. Congratulations to Quaero on a job well done!

 

CATC announces Charlotte’s
2020 Blue Diamond Technology Awards Finalists

Charlotte, N.C. – Charlotte Area Technology Collaborative (CATC) announces the finalists in nine award categories, representing exceptional technology innovation and talent in the Greater Charlotte region.

Each category of nominations was reviewed and voted upon by a unique panel of judges made up of industry technology and business executives.

The winners will be announced at the virtual Blue Diamond Awards Celebration the morning of October 6, 2020, 7:30 – 8:30 a.m.

Business Impact – SMB
Cloosiv
Ekos
Pet Screening

Business Impact – Education, Government
Charlotte Mecklenburg Schools
Mecklenburg County
UNC Charlotte

Business Impact – Analytics, AI & Big Data
Mariner
Quaero
Sealed Air

Business Impact – Corporate
AvidXchange
Curvature – CRM Sales Automation
Curvature – Forward Stocking Locations

Cool Innovation
ChromaSol International
EcoClosure
Lucid Drone Technologies
Stratifyd

Community Outreach
AvidXchange
Curvature
ECRS

Human Capital
Genesis 10
Goodwill University
Innovate Charlotte

IT Entrepreneur
Bryan Delaney, Skookum
Brian Kelly, CloudGenera

Student Innovator
Adonis Abdullah, UNC Charlotte CCI
Fidel Henriquez, UNC Charlotte CCI

 

 

The CATC, a 501(c)(3) organization, unites businesses, education, economic development, and community organizations to inspire, grow, and advance an inclusive technology talent pipeline. Proceeds from the annual Blue Diamond Awards Celebration support programs for middle and high school students, women in tech, and community collaboration.


IndustryWeek Webinar: Leveraging Deep Learning and AI Applications in Manufacturing 10-13-2020

This live event has ended.
Click HERE to view the recording

Please join Mariner, Intel, and Microsoft as we present the following IndustryWeek webinar:

Leveraging Deep Learning and AI Applications in Manufacturing
Tuesday, October 13, 2020
11:00AM – 12:00PM EDT 


Do AI and Deep Learning belong on the factory floor or are they just for those with their heads in the clouds?  Is being "on the edge" actually a good thing when you want to improve quality and reduce costs? 

As a manufacturer seeking to improve your production processes, you must consider these questions, and more. You face an array of technologies that promises to help you reach your goals, and you must carefully evaluate your options before making any decisions. 

The fact is that manufacturers like you are successfully using AI and Deep Learning in their operations. 

Come join us to hear real-world case studies about how a chemical factory, a glass factory and a fabric factory reduced their costs and increased their quality, and the role AI and Deep Learning played in those successes. 

In addition, we will also cover: 

  • Why the Cloud has limitations for AI and Deep Learning on the factory floor.
  • Why on-premise is fashionable again; now they call it "edge computing."
  • Why factory-floor AI and Deep Learning need a hybrid edge/cloud approach to truly deliver 4.0 Smart Factory capabilities.

 

NOTICE TO ALL PUBLIC SECTOR OR STATE-OWNED ENTITY EMPLOYEES – Federal [including Military], State, Local and Public Education

This is a Microsoft partner event. Should items of value (e.g. food, promotional items) be disbursed to event participants, these items will be available at no charge to attendees. Please check your organization's ethics policies before accepting items of value.


Peak Performance Symposium - 9/25/2020

The Peak Performance Symposium is a yearly event held for manufacturers, by manufacturers, where leading industry professionals speak on a variety of topics relevant to today’s advanced manufacturing environment. There’s a jam-packed agenda, and Mariner is proud and excited to be one of the sponsors and presenters. Please be sure to join us as we present with our customer, Milliken, on “How Milliken Leverages IoT and AI to Improve Asset Reliability” from 10:00a – 10:30a EDT in virtual Breakout Room 2.

Hope to see you there!

REGISTER

 


When visual inspection and process telemetry come together, they deliver Industry 4.0 smart factory capabilities

By Peter Darragh, EVP Product Engineering

When your job is to prevent your customers and supply chain partners from dealing with defects created by your out-of-control production line, you have to decide whether refining what you are doing now is good enough, or whether you should solve the problem with entirely new capabilities.

Computer vision system providers have decades of experience on production lines. Patents for applying machine vision to manufacturing date from the 1960s, and the technology of image formation has an even longer history. PLCs also have a long history of use in manufacturing and made MES and SCADA possible.

Perhaps it isn’t realistic to expect that your next wave of refinement will meaningfully change technology that already has decades of development poured into it, and it is time to try something different. Progress may lie not in improving what these technologies already do well, but in focusing on what they were never designed to do at all.

Computer vision systems performing visual inspection workloads are primarily responsible for annunciating defects on an HMI or sending a signal to downstream equipment to deal with the problem. But they are rarely capable of explaining how the defect was created.

Process telemetry collected by a Manufacturing Execution System (MES), data loggers, or process historians is designed to quickly explain what was happening when a lot, batch, or serialized part was made. But it is rarely capable of explaining what the computer vision system actually saw.

So, as a process engineer, you must make up for each technology’s limitations. If you don’t have a smart factory, then you, or someone you can boss around, must (a rough code sketch of steps 4–8 follows the list):

  1. Collect samples, or images of examples of the defects
  2. Define a classification or grading score to apply to the collection
  3. Analyze the frequency of the problem
  4. Match the defect to a specific lot/batch/serial number
  5. Pull the telemetry for all those defects
  6. Define a classification or grading score to ‘bin’ the data into an event.
  7. Combine those results into a workable dataset for analysis
  8. See if there is a relationship between how the item was made and the type of defect produced.
  9. Critically review the strength of those relationships to eliminate post-hoc bias and prove true causality.
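
To make steps 4 through 8 concrete, here is a rough pandas sketch: join defect records to process telemetry by lot, bin a process variable, and cross-tabulate defect types against it. The file names, column names, and temperature bands are made up for illustration; real data would come from the vision system and the MES or historian.

import pandas as pd

defects = pd.read_csv("defects.csv")        # columns: lot_id, defect_type
telemetry = pd.read_csv("telemetry.csv")    # columns: lot_id, furnace_temp, line_speed

# Steps 4-5: match each defect to the telemetry for its lot.
combined = defects.merge(telemetry, on="lot_id", how="left")

# Step 6: bin a continuous process variable into events.
combined["temp_band"] = pd.cut(
    combined["furnace_temp"],
    bins=[0, 550, 600, 650, 10_000],
    labels=["low", "normal", "high", "very_high"],
)

# Steps 7-8: cross-tabulate defect types by temperature band to look for a relationship.
print(pd.crosstab(combined["temp_band"], combined["defect_type"], normalize="index"))

# Step 9 (proving causality rather than correlation) still requires critical
# review and usually a designed experiment; this table only surfaces candidates.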

The above steps are necessary to deliver the insight and vision upon which to decide and act, as described in John Boyd’s work on Organic Design for Command and Control:

“Why insight and Vision? Without insight and vision there can be no orientation to deal with both present and future”.

Either after sufficient personal experience or sage mentoring, you know how certain defects are created and, if the universe has smiled upon you, you know how to prevent them. But knowing isn’t the same as doing. You are only halfway along your OODA loop.

Now, John Boyd was a colonel in the Air Force, and OODA has been successfully applied in military strategy. But Chet Richards’ book “Certain to Win” points out the similarities between OODA principles and the Toyota Production System (TPS), which is solidly in the realm of manufacturing.

And others, such as Nigel Duffy during his time as AI innovation leader at EY, saw OODA’s applicability to business workflows.

Whether you like it or not, as a process engineer your job is a series of never-ending OODA loops, and dealing with the present and future is the second half of OODA: the DA, the Deciding and Acting part.

In manufacturing, an automated OODA loop lives in an Industry 4.0 smart factory ‘where human beings, machines and resources communicate with each other as naturally as in a social network.’

When you blend process telemetry and visual inspection together, you can have your own 4.0 smart factory OODA loop. For one of our customers, that loop is one where Spyglass Visual Inspection (SVI) and Spyglass Connected Factory (SCF) work together to:

(O)bserve: a deep-learning model reviews the images and recognizes a defect.

(O)rient: raw telemetry from the machines is converted into out-of-control (OOC) events that are matched to the type of defect the deep-learning model identified.

(D)ecide: based on the frequency of the defect and the scale of the out-of-control process events, rules or other AI trigger a response (a minimal sketch of this step follows below).

(A)ct: a situation report with suggested remedial actions is sent via email or text to the process leads in the cell identified as the cause of the defect so they can take action.
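
As a minimal, hypothetical sketch of that Decide step: count the model's defect verdicts and their matched OOC events per production cell, and flag the cells that should receive a situation report. The thresholds and field names are assumptions, not SVI or SCF internals.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class DefectEvent:
    cell_id: str            # production cell the defect was traced to
    defect_type: str        # label assigned by the deep-learning model
    ooc_event_count: int    # matched out-of-control process events


def decide(events: List[DefectEvent], min_defects: int = 5, min_ooc: int = 2) -> List[str]:
    """Return the cells whose process leads should get a remedial-action report."""
    by_cell: Dict[str, Dict[str, int]] = {}
    for e in events:
        stats = by_cell.setdefault(e.cell_id, {"defects": 0, "ooc": 0})
        stats["defects"] += 1
        stats["ooc"] += e.ooc_event_count

    return [
        cell for cell, s in by_cell.items()
        if s["defects"] >= min_defects and s["ooc"] >= min_ooc
    ]


if __name__ == "__main__":
    sample = [DefectEvent("CELL-7", "inclusion", 1) for _ in range(6)]
    for cell in decide(sample):
        print(f"Send situation report to process leads in {cell}")  # the (A)ct step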

With IoT, AI, and cloud now commonplace, and the first Industry 4.0 documents already consigned to the archives, why do such loops look so unusual? With a younger workforce rightly expecting social networks to be intrinsic to their workplace experience, why are these 4.0 smart factory OODA loops not commonplace? Perhaps the answer lies in the words of Matthew Stockwin, Manufacturing Director, Coats.

"AI will come and digitisation is an unstoppable trend, but my view is that its penetration into the deep bowels of manufacturing will take more time than we think."

Stockwin was quoted in 2019, and he attributes the problem to managers failing to learn and adapt and to put themselves outside their comfort zone. They fail to acknowledge that connectivity is where you start a journey that ‘ends in the prediction power of systems to see problems before they occur’, and therefore they cannot advocate for the capital expense of connectivity for connectivity’s sake.

Without a desire to learn and adapt you cannot create the new capabilities needed to remove defects.

But if you are ready to learn and adapt to the new capabilities of combining visual inspection and process telemetry and have your very own 4.0 Smart factory OODA loops, then we are ready to help you.

…without the whole bowel penetration thing.