Thursday 18 April 2019

THE FUTURE OF INDUSTRIAL AUTOMATION


APRIL, 2019
Since the turn of the century, the global recession has affected most businesses, including industrial automation. Here are my views on the directions in which the automation industry is moving.

THE REAR-VIEW MIRROR

Because of the relatively small production volumes and huge varieties of applications, industrial automation typically utilizes new technologies developed in other markets. Automation companies tend to customize products for specific applications and requirements. So the innovation comes from targeted applications, rather than any hot, new technology.

Over the past few decades, some innovations have indeed given industrial automation new surges of growth: The programmable logic controller (PLC) – developed by Dick Morley and others – was designed to replace relay logic; it generated growth in applications where custom logic was difficult to implement and change. The PLC was a lot more reliable than relay contacts, and much easier to program and reprogram. Growth was rapid in automobile test installations, which had to be reprogrammed often for new car models. The PLC has had a long and productive life – some three decades – and (understandably) has now become a commodity.

At about the same time that the PLC was developed, another surge of innovation came through the use of computers for control systems. Mini-computers replaced large central mainframes in central control rooms, and gave rise to "distributed" control systems (DCS), pioneered by Honeywell with its TDC 2000. But, these were not really "distributed" because they were still relatively large clumps of computer hardware and cabinets filled with I/O connections.

The arrival of the PC brought low-cost PC-based hardware and software, which provided DCS functionality with significantly reduced cost and complexity. There was no fundamental technology innovation here—rather, these were innovative extensions of technology developed for other mass markets, modified and adapted for industrial automation requirements.

On the sensor side, there were indeed some significant innovations and developments that generated good growth for specific companies. With better specifications and good marketing, Rosemount's differential pressure flow-sensor quickly displaced lesser products. And there were a host of other smaller technology developments that caused pockets of growth for some companies. But few grew beyond a few hundred million dollars in annual revenue.

Automation software has had its day, and can't go much further. No "inflection point" here. In the future, software will embed within products and systems, with no major independent innovation on the horizon. The plethora of manufacturing software solutions and services will yield significant results, but all as part of other systems.

So, in general, innovation and technology can and will reestablish growth in industrial automation. But, there won't be any technology innovations that will generate the next Cisco or Apple or Microsoft.

We cannot figure out future trends merely by extending past trends; it’s like trying to drive by looking only at a rear-view mirror. The automation industry does NOT extrapolate to smaller and cheaper PLCs, DCSs, and supervisory control and data acquisition systems; those functions will simply be embedded in hardware and software. Instead, future growth will come from totally new directions.

NEW TECHNOLOGY DIRECTIONS

Industrial automation can and will generate explosive growth with technology related to new inflection points: nanotechnology and nanoscale assembly systems; MEMS and nanotech sensors (tiny, low-power, low-cost sensors) which can measure everything and anything; and the pervasive Internet, machine to machine (M2M) networking.

Real-time systems will give way to complex adaptive systems and multi-processing. The future belongs to nanotech, wireless everything, and complex adaptive systems.

Major new software applications will be in wireless sensors and distributed peer-to-peer networks – tiny operating systems in wireless sensor nodes, and the software that allows nodes to communicate with each other as a larger complex adaptive system. That is the wave of the future.
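As a toy illustration of what such node-to-node software might look like – a generic sketch only, not any particular sensor operating system or wireless protocol – each node could broadcast its readings to its peers, with the larger behaviour emerging from many such exchanges:

    # Toy sketch of a sensor node broadcasting a reading to its peers over UDP.
    # Generic illustration; real wireless sensor networks use purpose-built
    # low-power mesh stacks rather than plain UDP broadcast.
    import json
    import socket
    import time

    PORT = 50000  # arbitrary port chosen for this illustration

    def broadcast_reading(node_id: str, value: float) -> None:
        msg = json.dumps({"node": node_id, "value": value, "ts": time.time()}).encode()
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            sock.sendto(msg, ("255.255.255.255", PORT))

    broadcast_reading("node-07", 21.4)  # e.g. a temperature reading in degC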

THE FULLY-AUTOMATED FACTORY

Automated factories and processes are too expensive to be rebuilt for every modification and design change – so they have to be highly configurable and flexible. To successfully reconfigure an entire production line or process requires direct access to most of its control elements – switches, valves, motors and drives – down to a fine level of detail.

The vision of fully automated factories has already existed for some time now: customers order online, with electronic transactions that negotiate batch size (in some cases as low as one), price, size and color; intelligent robots and sophisticated machines smoothly and rapidly fabricate a variety of customized products on demand.

The promise of remote-controlled automation is finally making headway in manufacturing settings and maintenance applications. The decades-old machine-based vision of automation – powerful super-robots without people to tend them – underestimated the importance of communications. Today, this is purely a matter of networked intelligence, which is now well developed and widely available.

Communications support of a very high order is now available for automated processes: lots of sensors, very fast networks, quality diagnostic software and flexible interfaces – all with high levels of reliability and pervasive access to hierarchical diagnosis and error-correction advisories through centralized operations.

The large, centralized production plant is a thing of the past. The factory of the future will be small and movable (to where the resources are, and where the customers are). For example, there is really no need to transport raw materials long distances to a plant for processing, and then transport the resulting product long distances to the consumer. In the old days, this was done because of the localized know-how and investments in equipment, technology and personnel. Today, those things are available globally.

HARD TRUTHS ABOUT GLOBALIZATION

The assumption has always been that the US and other industrialized nations will keep leading in knowledge-intensive industries while developing nations focus on lower skills and lower labor costs. That has now changed. The wholesale entry of 2.5 billion people (China and India) into the global economy will bring big new challenges and amazing opportunities.

Beyond just labor, many businesses (including major automation companies) are also outsourcing knowledge work such as design and engineering services. This trend has already become significant, causing joblessness not only for manufacturing labor, but also for traditionally high-paying engineering positions.

Innovation is the true source of value, and that is in danger of being dissipated – sacrificed to a short-term search for profit, the capitalistic quarterly profits syndrome. Countries like Japan and Germany will tend to benefit from their longer-term business perspectives. But, significant competition is coming from many rapidly developing countries with expanding technology prowess. So, marketing speed and business agility will be offsetting advantages.

THE WINNING DIFFERENCES

In a global market, there are three keys that constitute the winning edge:
  • 1. Proprietary products: developed quickly and inexpensively (and perhaps globally), with a continuous stream of upgrade and adaptation to maintain leadership.
  • 2. High-value-added products: proprietary products and knowledge offered through effective global service providers, tailored to specific customer needs.
  • 3. Global yet local services: the special needs and custom requirements of remote customers must be handled locally, giving them the feeling of partnership and proximity.
Implementing these directions demands management and leadership abilities that are different from the old, financially driven models. In the global economy, automation companies have little choice – they must find more ways and means to expand globally. To do this they need to minimize the domination of central corporate cultures and maximize responsiveness to local customer needs. Multi-cultural countries, like the U.S., will have significant advantages in these important business aspects.

In the new and different business environment of the 21st century, the companies that can adapt, innovate and utilize global resources will generate significant growth and success.



TO KNOW MORE ABOUT ALLIED VISION INDUSTRIAL AUTOMATION, CONTACT MENZEL VISION AND ROBOTICS PVT LTD AT (+91) 22 67993158 OR EMAIL US AT INFO@MVRPL.COM


Thursday 21 February 2019

THE INFLUENCE OF TEMPERATURE ON IMAGE QUALITY IN NIR / SWIR CAMERAS


Within the electromagnetic spectrum, infrared radiation is located between visible light and microwaves. It covers a wide spectrum from 750 nm to 14,000 nm wavelength. It is common to separate it into near-infrared (NIR), short-wave infrared (SWIR), mid-wave infrared (MWIR), and long-wave infrared (LWIR). Although infrared radiation in the SWIR region is not visible to the human eye, it interacts with objects in a similar manner as visible wavelengths.

Most SWIR camera sensors are based on InGaAs material and work like silicon-based CCD or CMOS sensors by converting photons into electrons – so-called quantum detectors. InGaAs sensors are made of indium gallium arsenide photodiodes and silicon-based read-out integrated circuits (ROICs).

The band gap energy is typically smaller in InGaAs photodiodes than in silicon-based pixels. As a result, the dark current is higher at similar temperatures.

INFLUENCE OF DARK CURRENT ON IMAGE QUALITY


Dark current is current that flows (i.e., produces a signal) even when no light is hitting the sensor. It is caused by thermal excitation of electrons in the InGaAs material. The absolute value of the dark current can vary considerably from one sensor to another. Dark current increases strongly with sensor temperature; as a rule of thumb, it doubles approximately every 9°C.

For example, when the same picture is taken twice with the same camera, once at 20°C and once at 45°C sensor temperature, the histograms show very different results. At 45°C the minimum value is higher (0 being black and 255 being white), as is the average value. At 20°C there are no saturated pixels (maximum value 254), whereas at 45°C there are quite a few.
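To put a rough number on that rule of thumb, treating the "doubles approximately every 9°C" figure as exact (an assumption made purely for illustration), the dark current at 45°C is about 2^((45-20)/9) ≈ 6.9 times higher than at 20°C, which is consistent with the raised black level and the saturated pixels in the 45°C histogram. A minimal Python sketch:

    # Minimal sketch: dark-current scaling from the "doubles roughly every 9 degC"
    # rule of thumb quoted above. Illustrative only; real sensors will deviate.
    def dark_current_ratio(t_ref_c: float, t_c: float, doubling_c: float = 9.0) -> float:
        """Factor by which dark current grows between t_ref_c and t_c."""
        return 2.0 ** ((t_c - t_ref_c) / doubling_c)

    print(round(dark_current_ratio(20.0, 45.0), 1))  # ~6.9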

TEMPERATURE IMPACT ON SPECTRAL SENSITIVITY


Sensor temperature also has a strong influence on the spectral sensitivity of SWIR cameras. Lowering the sensor temperature by 40°C (from 25°C to -15°C) shifts the spectral sensitivity by about 25 nm towards shorter wavelengths. This can be of great importance for applications operating at the low or high end of the sensitivity curve. When sensitivity above 1,700 nm is required, the sensor temperature should not be set too low (i.e., not below 20°C). Even though dark current (and noise) increase with higher temperature, the signal might still be better at the higher end of the spectrum.
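Treating that shift as roughly linear (an assumption for illustration only: about 25 nm per 40°C, or roughly 0.6 nm per °C), the expected shift for a given temperature change can be estimated directly:

    # Rough estimate of the spectral-sensitivity shift, assuming a linear
    # ~25 nm per 40 degC trend as described above. Illustration only.
    NM_PER_DEGC = 25.0 / 40.0  # ~0.625 nm per degC

    def cutoff_shift_nm(delta_t_c: float) -> float:
        """Approximate spectral shift for a temperature change delta_t_c (negative = cooling)."""
        return NM_PER_DEGC * delta_t_c

    print(cutoff_shift_nm(-40.0))  # -25.0 nm, i.e. cooling from 25 degC to -15 degC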

SENSOR TEMPERATURE CONTROL IS CRUCIAL


The temperature level influences the spectral sensitivity and the dark current. The dark current has a high impact on image quality (black level and noise). For applications where image quality is important and when operating at the low or high end of the sensitivity curve, temperature control is crucial.

To accurately monitor and control the temperature, all our SWIR Goldeye cameras incorporate three temperature sensors:
  • - inside InGaAs sensor housing
  • - on sensor board
  • - on mainboard
In addition, our cameras provide advanced features to correct and minimize those effects (for example, advanced background correction).
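As a generic illustration of what a background (dark-frame) correction does in principle – and only in principle, since this is not the Goldeye cameras' on-board implementation – a dark frame captured with the lens capped at the same sensor temperature and exposure time can be subtracted from each image:

    # Minimal sketch of dark-frame (background) correction using numpy.
    # Generic technique for illustration; not the camera's built-in algorithm.
    import numpy as np

    def dark_frame_correct(raw: np.ndarray, dark: np.ndarray) -> np.ndarray:
        """Subtract a dark frame taken with the lens capped at the same
        sensor temperature and exposure time as the raw image."""
        corrected = raw.astype(np.int32) - dark.astype(np.int32)
        return np.clip(corrected, 0, 255).astype(np.uint8)

In practice, several capped exposures are usually averaged to build a low-noise dark frame before subtraction.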

To counterbalance the temperature difference between ambient and sensor temperature and to stabilize the set sensor temperature, most of the Goldeye cameras are equipped with a thermo-electric cooling device. Available cooling devices are:
  • - TEC1: single-stage thermo-electric sensor cooling (for example: Goldeye G/CL-033 TEC1)
  • - TEC2: dual-stage thermo-electric sensor cooling (for example: Goldeye G/CL-032 Cool TEC2)
  • - Additional cooling fan (for example: Goldeye G/CL-032 Cool TEC2)
In addition, the Goldeye Cool models enclose the sensor in a nitrogen-filled cooling chamber (for example: Goldeye G/CL-008 Cool TEC1). That protects it from condensation and makes it suitable for environments where condensation is likely to occur (that is, high humidity or high ambient temperatures).
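Conceptually, such a thermo-electric cooling device is driven by a feedback loop that reads the sensor-housing temperature and adjusts the TEC drive to hold a setpoint. The sketch below is a generic, hypothetical illustration with placeholder functions (read_sensor_temp, set_tec_power); it is not the Goldeye firmware or SDK:

    # Generic sketch of a proportional temperature-stabilization loop for a TEC.
    # read_sensor_temp() and set_tec_power() are hypothetical stand-ins, not a real camera API.
    SENSOR_TEMP_C = 30.0  # stand-in for a live temperature reading

    def read_sensor_temp() -> float:
        """Placeholder for reading the sensor-housing temperature."""
        return SENSOR_TEMP_C

    def set_tec_power(fraction: float) -> None:
        """Placeholder for setting the TEC drive level (0.0 to 1.0)."""
        print(f"TEC drive: {fraction:.0%}")

    def stabilize_once(setpoint_c: float, gain: float = 0.2) -> None:
        error = read_sensor_temp() - setpoint_c       # positive: sensor warmer than setpoint
        power = max(0.0, min(1.0, gain * error))      # clamp drive to 0..100 %
        set_tec_power(power)

    stabilize_once(15.0)  # e.g. hold the InGaAs sensor near 15 degC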


TO KNOW MORE ABOUT THE ALLIED VISION INDUSTRIAL CAMERA DISTRIBUTOR IN INDIA, CONTACT MENZEL VISION AND ROBOTICS PVT LTD AT (+91) 22 67993158 OR EMAIL US AT INFO@MVRPL.COM


Thursday 24 January 2019

A LOOK AT THE PROGRESSION OF MACHINE VISION TECHNOLOGY OVER THE LAST THREE YEARS


Machine vision represents a diverse and growing global market, one that can be difficult to keep up with in terms of the latest technologies, standards, and product developments as they become available from hundreds of different organizations around the world.

If you are looking for an example of how fast the market moves, and how quickly trends and new technologies emerge, our Innovators Awards program provides a good reference point. In 2015, we launched our first annual Innovators Awards program, which celebrates the disparate and innovative technologies, products, and systems found in the machine vision and imaging market. In comparing the products that received distinction in 2015 to this past year's crop of honorees, it does not take long to draw some obvious conclusions. First, let's start with the most noticeable difference: the cameras that received awards.

In 2015, five companies received awards for cameras. These cameras performed various functions and offered disparate capabilities, including pixel shifting, SWIR sensitivity, multi-line CMOS time delay integration, high-speed operation, and high dynamic range operation. In 2018, 13 companies received awards for their cameras, but the capabilities and features of these cameras look much different.


CAMERAS THAT RECEIVED AWARDS IN 2018 OFFERED THE FOLLOWING FEATURES:

Polarization, 25GigE interface, 8K line scan, scientific CMOS sensor, USB 3.1 interface, fiber interface, embedded VisualApplets software, 3-CMOS prism design, and subminiature design. Like in 2015, a few companies were also honored for high-speed cameras, but overall, it is evident that most of the 2018 camera honorees are offering much different products than those from our inaugural year.
Two other main categories stand out when comparing 2018 with 2015, the first of which is software. In 2015, two companies received awards for their software – one for a deep learning software product and another for machine learning-based quality control software. In 2018, eight companies received awards for software.

THESE SOFTWARE PRODUCTS OFFERED THE FOLLOWING FEATURES OR CAPABILITIES:

Deep learning (three honorees), data management, GigE Vision simulation, neural network software for autonomous vehicles, machine learning-based desktop software for autonomous vehicle vision system optimization, and a USB3 to 10GigE software converter.

Lastly, the category of embedded vision looked much different in 2018 than it did in 2015. In the embedded vision category—which I am combining with smart cameras due to overlap—there were two companies that received awards in 2015, both of which were for smart cameras that offered various capabilities. This year, however, there were 12 companies that were honored for their embedded vision innovations, for products that offered features including: embedded software running on Raspberry Pi, computer vision and deep learning hardware and software platform, embedded vision development kits, embedded computers, 3D bead inspection, as well as various smart cameras.

Throughout the other categories, there were equal or similar numbers of honorees in both years, but several interesting technologies and applications appeared among the 2018 products. These include a lens for virtual reality/augmented reality applications, a mobile hyperspectral camera, a 3D color camera, and various lighting products that targeted multispectral and hyperspectral imaging applications.

This is all to say that, looking back from 2015 to today, machine vision technology has grown quite a bit. With the rapid pace of advancements, the growing needs of customers and end users, the miniaturization and falling costs of components, and so on, it is exciting to think about what machine vision products in 2021 might look like.

TO KNOW MORE ABOUT VISION INSPECTION SYSTEMS IN INDIA, CONTACT MENZEL VISION AND ROBOTICS PVT LTD AT (+91) 22 67993158 OR EMAIL US AT INFO@MVRPL.COM


Monday 14 January 2019

BITFLOW PREDICTS VISION-GUIDED ROBOTICS TO BECOME MAJOR DISRUPTIVE FORCE IN GLOBAL MANUFACTURING




As the plant floor has become more digitally connected, the relationship between robots and machine vision has merged into a single, seamless platform, setting the stage for a new generation of more responsive vision-driven robotic systems. BitFlow, Inc., a global innovator in frame grabbers used in industrial imaging, predicts vision-guided robots will be one of the most disruptive forces in all areas of manufacturing over the next decade.

"Since the 1960s robots have contributed to automation processes, yet they've done so largely blind," said Donal Waide, Director of Sales for BitFlow, Inc. "Vision-equiped robots are different. Now, just like a human worker, robots can see a specific part to validate whether it is being placed correctly in a pick and place application, for example. Cost savings will be realized since less hard fixturing is required and the robot is more flexible in its ability to locate a variety of different parts with the same hardware."


HOW ROBOTIC VISION WORKS

Using a combination of camera, cables, frame grabber and software, a vision system will identify a part, its orientation and its relationship to the robot. Next, this data is fed to the robot and motion begins, such as pick and place, assembly, screw driving or welding tasks. The vision system will also capture information that would be otherwise very difficult to obtain, including small cosmetic details that let the robot know whether or not the part is acceptable. Error-proofing reduces expensive quality issues with products. Self-maintenance is another benefit. In the event that alignment of a tool is off because of damage or wear, vision can compensate by performing machine offset adjustment checks on a periodic basis while the robot is running.
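To make that hand-off from vision system to robot concrete, the sketch below shows the central step in a generic, simplified form: a part position measured in the camera's coordinate frame is mapped into the robot's base frame through a fixed camera-to-robot transform obtained from a hand-eye calibration. The transform values and names are illustrative assumptions, not any vendor's actual API:

    # Generic sketch: map a part position from camera coordinates into robot
    # coordinates using a fixed hand-eye calibration transform. Illustrative only.
    import numpy as np

    # Hypothetical 4x4 camera-to-robot transform: a 90-degree rotation about Z
    # plus a translation, chosen purely as an example.
    T_ROBOT_CAM = np.array([
        [0.0, -1.0, 0.0, 0.50],
        [1.0,  0.0, 0.0, 0.10],
        [0.0,  0.0, 1.0, 0.75],
        [0.0,  0.0, 0.0, 1.00],
    ])

    def part_in_robot_frame(xyz_cam: np.ndarray) -> np.ndarray:
        """Convert a part position reported by the vision system (camera frame,
        metres) into the robot base frame, ready to use as a pick target."""
        return (T_ROBOT_CAM @ np.append(xyz_cam, 1.0))[:3]

    print(part_in_robot_frame(np.array([0.12, -0.03, 0.40])))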

DUAL MARKET GROWTH

It should come as no surprise that the machine vision and robotics markets are moving in tandem. According to the Association for Advancing Automation (A3), robot sales in North America last year surpassed all previous records. Customers purchased 34,904 total units, representing $1.896 billion in total sales. Meanwhile, total machine vision transactions in North America increased 14.8%, to $2.262 billion. The automotive industry accounts for approximately 50% of total sales.
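As a quick sanity check on those figures (simple arithmetic on the numbers quoted above, nothing more), the robot sales correspond to an average selling price of roughly $54,000 per unit, and the 14.8% growth implies a prior-year machine vision total of about $1.97 billion:

    # Back-of-the-envelope arithmetic on the A3 figures quoted above.
    robot_units = 34_904
    robot_sales_usd = 1.896e9
    mv_sales_usd = 2.262e9
    mv_growth = 0.148

    print(f"Average robot price: ${robot_sales_usd / robot_units:,.0f}")                     # ~$54,320
    print(f"Prior-year machine vision total: ${mv_sales_usd / (1 + mv_growth) / 1e9:.2f}B")  # ~$1.97B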

THE ROLE OF FRAME GRABBERS

Innovations in how vision-guided robots perceive and respond to their environments are exactly what manufacturers are looking for as they develop automation systems to improve quality, productivity and cost efficiencies. These types of advancements rely on frame grabbers being paired with high-resolution cameras to capture image data and convert it into a form that can be processed by software.

BitFlow has responded to the demands of the robotics industry by introducing frame grabbers based on the CoaXPress (CXP) machine vision standard, currently the fastest and most powerful interface on the market. In robotics applications, the five-to-seven-meter limit of a USB cable connection is insufficient. BitFlow CXP frame grabbers allow up to 100 meters between the frame grabber and the camera, without any loss in quality. To minimize cabling costs and complexity, BitFlow frame grabbers require only a single piece of coax to transmit high-speed data, as well as to supply power and send control signals.

BitFlow's latest model, the Aon-CXP frame grabber, is engineered for simplified integration into a robotics system. Although small, the Aon-CXP receives 6.25 Gb/s of data over its single link, almost twice the real-world data rate of the USB3 Vision standard and significantly quicker than the latest GigE Vision data rates. The Aon-CXP is designed for use with a new series of single-link CXP cameras that are smaller, less expensive and cooler running than previous models, making them ideal for robotics.
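The "almost twice" comparison is easy to check with simple arithmetic if the effective real-world USB3 Vision throughput is assumed to be around 3.2 Gbit/s (an assumption for illustration; the 6.25 Gbit/s single-link figure is from the text above):

    # Rough throughput comparison. The USB3 figure is an assumed real-world
    # value for illustration; 6.25 Gbit/s is the CXP-6 single-link rate above.
    cxp6_gbps = 6.25
    usb3_real_world_gbps = 3.2

    print(f"CXP-6 vs. real-world USB3: {cxp6_gbps / usb3_real_world_gbps:.1f}x")  # ~2.0x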

TO KNOW MORE ABOUT THE BITFLOW FRAME GRABBER CARDS DEALER IN MUMBAI, INDIA, CONTACT MENZEL VISION AND ROBOTICS PVT LTD AT (+91) 22 67993158 OR EMAIL US AT INFO@MVRPL.COM