Avid today announced that CRN®, a brand of The Channel Company, is honoring the company with a 2017 IoT Innovators award. CRN’s newest awards program, IoT Innovators, recognizes North American solution providers and systems integrators whose design and implementation of breakthrough solutions place them at the forefront of the IoT revolution.
Avid has built automation and process optimization solutions across multiple industries using many tools and platforms. The system integration company’s uncanny ability to listen to the client’s needs and its reputation for complex problem solving led to the creation of a Trusted Partner program where customers rely on Avid to manage technology as well as long-term process improvement.
“We are excited to be part of CRN’s debut Innovators List,” says Jeff Miller, Chief Technologist for Information Solutions at Avid. “This recognition validates our investment in IoT platforms and services and our partnership with clients to drive continuous process improvement.”
“The solution providers and systems integrators on our first-ever CRN IoT Innovators list are confidently leading the channel into the exciting, opportunity-rich new era of IoT,” said Robert Faletra, CEO of The Channel Company. “Each of these ambitious companies has brought to market a complex and cutting-edge integration of platforms, software tools and technologies, managing the entire process from design to deployment. Their remarkable IoT innovations are paving the way for an unprecedented level of global interconnectivity that will transform the way we live and work.”
The IoT Innovators list was announced today at the IoTConnex Virtual Conference (www.iotconnex.com) and is featured on CRN.com at crn.com/IoTInnovators.
A fast-growing system integrator specializing in industrial automation and information solutions, Avid collaborates with clients to identify and select best-in-class control system automation platforms, design and implement systems, and provide verification and validation documentation and services. Turnkey capabilities include UL panel fabrication and installation and system commissioning. The company has developed exceptional depth in the following process industries: chemical, food & beverage, life sciences, power generation and pulp & paper. Avid maintains long-standing customer relationships that are built on technical acumen, two-way communication and mutual trust. Avid has provided industrial automation and information solutions across the U.S. and globally for 30 years. The combination of domain expertise and adoption of innovative new technologies helps make Avid a top leader in the market. Find out more at www.avid.run.
About the Channel Company
The Channel Company enables breakthrough IT channel performance with our dominant media, engaging events, expert consulting and education, and innovative marketing services and platforms. As the channel catalyst, we connect and empower technology suppliers, solution providers and end users. Backed by more than 30 years of unequaled channel experience, we draw from our deep knowledge to envision innovative new solutions for ever-evolving challenges in the technology marketplace. www.thechannelco.com
Follow The Channel Company: Twitter, LinkedIn
Follow Avid Solutions: Twitter, LinkedIn
If you're trying to troubleshoot or optimize different aspects of your manufacturing process, consider adding a historian. This software is used to store and analyze vital process and industrial data. Historians fall into their own category in the world of industrial software because of the critical role they play in analysis and decision making. Unfortunately, many companies have lots of data but do not use it effectively.
Many companies are looking for ways to determine the root causes of why their plant’s performance is not at its best on a consistent basis. For example, a plant manager may know that the plant is losing money because a process is taking an extra three minutes. However, determining why this process is taking those three extra minutes can prove to be challenging. Using historian software may provide the needed insight into this issue.
Historians are designed to capture and store large amounts of data from many different sources while using disk space efficiently, so that years of data can be quickly retrieved. For instance, OSI's PI Historian can handle millions of points, archive thousands of events per second, and quickly retrieve data from archives of a million gigabytes (many years of data for most users!).
In a process control system, very common items that are historized include temperature, flow rate, pressure, level and other types of analog data. Also common is the historization of digital data, such as the output or feedback states associated with valves, pumps and other discrete control devices. Integration with other software systems, such as Laboratory Information Systems (LIMS), allows the comparison of process data with lab results.
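To make the idea concrete, here is a minimal, illustrative sketch of what a historian does at its core: archive timestamped samples per tag and retrieve a time range quickly through an index. The tag name `FIC101.PV`, the SQLite backing store, and the one-minute sample rate are all assumptions for illustration only; commercial historians add compression, interpolation, and far higher archiving throughput.

```python
import sqlite3
from datetime import datetime, timedelta

# Minimal illustrative historian: one table of timestamped tag values.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE history (
        ts  TEXT NOT NULL,   -- ISO-8601 timestamp
        tag TEXT NOT NULL,   -- point name, e.g. 'FIC101.PV' (hypothetical)
        val REAL NOT NULL
    )
""")
# An index on (tag, ts) is what makes range retrieval fast.
conn.execute("CREATE INDEX idx_tag_ts ON history (tag, ts)")

def record(tag, value, ts):
    """Archive one sample for a tag."""
    conn.execute("INSERT INTO history VALUES (?, ?, ?)",
                 (ts.isoformat(), tag, value))

def query_range(tag, start, end):
    """Retrieve all samples for a tag inside [start, end], in time order."""
    rows = conn.execute(
        "SELECT ts, val FROM history "
        "WHERE tag = ? AND ts BETWEEN ? AND ? ORDER BY ts",
        (tag, start.isoformat(), end.isoformat()))
    return rows.fetchall()

# Simulate an analog flow reading archived once a minute for an hour.
t0 = datetime(2024, 1, 1)
for minute in range(60):
    record("FIC101.PV", 42.0 + minute * 0.1, t0 + timedelta(minutes=minute))

samples = query_range("FIC101.PV", t0, t0 + timedelta(minutes=10))
print(len(samples))  # 11 samples in the first ten minutes (inclusive)
```

The same pattern scales conceptually to digital points (valve states, pump feedback) by storing 0/1 values, and the range query is what drives trend displays and comparisons against lab results.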
There are many advantages to adding historian software to your manufacturing process. They include:
Visibility Across Different Systems/Vendors
If a site has multiple, separate systems, historian software can bring all those disparate systems into one view.
The historian can be used to analyze data for Overall Equipment Effectiveness (OEE) and batch cycle improvements.
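The OEE analysis mentioned above follows the conventional formula OEE = Availability x Performance x Quality, each factor derived from data a historian already collects. A small sketch, using hypothetical shift numbers:

```python
def oee(run_time_h, planned_time_h, total_count, ideal_rate_per_h, good_count):
    """Overall Equipment Effectiveness = Availability x Performance x Quality."""
    availability = run_time_h / planned_time_h          # uptime vs. plan
    performance = total_count / (run_time_h * ideal_rate_per_h)  # speed vs. ideal
    quality = good_count / total_count                  # good units vs. all units
    return availability * performance * quality

# Hypothetical shift: 7 of 8 planned hours running, 630 units made
# against an ideal rate of 100/h, 600 of them good.
score = oee(run_time_h=7, planned_time_h=8,
            total_count=630, ideal_rate_per_h=100, good_count=600)
print(f"{score:.1%}")  # 75.0%
```

In practice the run time, counts, and reject totals would be pulled from historized tags rather than typed in by hand.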
Increased Availability of Data
Efficient Storage of Data and Quick Retrieval when Needed
A historian can store many years of data and make it available for immediate retrieval. Some of our customers have systems that were installed more than 15 years ago and all of the data since the initial installation can be easily retrieved for analysis and viewing. There are some installations where the historian can display historical data better than the control system. In one instance, operators actually preferred pulling up trends in PI Processbook because of the faster performance and the more convenient user interface.
Connectivity with Other Business Systems
By connecting the historian to business systems, it is possible to reduce downtime and more accurately report material consumption.
If you are interested in adding historian software to your manufacturing process, or in using the data you already have more effectively, consider us. We have installed many historian systems, including those from OSI, Rockwell, Honeywell, Wonderware and GE, in a variety of industries, and our teams have interfaced with PLC and DCS control systems as well as a myriad of business systems. We work with a wide variety of industries including life sciences, chemical, food/beverage and consumer packaging, power, and pulp & paper. We understand the type of information your engineering, operations, technical services, maintenance, and business groups need. Big data is the wave of the future. Don't be intimidated; it is possible to use large volumes of data to improve your processes.
Like many in the field of Information Technology (IT), I find that technology shifts before our eyes at a nosebleed pace. Technologies come and go - some technologies are replaced quickly, while others seem to hang around for the long haul. As seen in Automation World, I believe it is important to stay relevant and knowledgeable about new technologies for those of us who are in the field of Industrial IT.
At each shift in my career, from Support Specialist on a global bank's stock trading network, to Network and Security trainer, to virtualized datacenter design engineer, I spent some time thinking about my role and how I could best contribute, often in ways not confined to the title of the role. I would like to think that my hard work has led me to where I am today, as part of an Information Solutions team at Avid, a leader in the field of automation and information solutions. At this point with Avid, I'm asking what I should be learning next, which led me right to the world of Industrial Control Systems, or ICS.
I’m new to ICS. But, contrary to popular belief, being new doesn’t mean being unproductive in a new industry. This is the busiest I’ve been in my entire career, and I’m enjoying it in ways that I didn’t expect. I firmly believe that, in this role, I can help our customer’s Business IT (Information Technology) organization to better understand and support the OT (Operations Technology) design and requirements. Customers in the industrial space need a translator between their IT and OT organizations.
Remember, I come from the Business IT side of the technology world. I've seen how many IT organizations react to individuals or groups trying to run their own networks and servers. IT often feels that it is the de facto expert and should own all aspects of a server's operation. Sometimes it's a lack of trust: the IT group worries that a new server it has not personally built is not properly secured, or that it will become an unmanaged, ad-hoc rogue device, forgotten until it causes a problem.
There are also IT organizations that operate with a bit more hostility toward others, grabbing resources primarily out of fear or to justify their own existence. In such an environment, IT may resent the firewall team and vice versa, private-cloud virtualization teams don't trust public-cloud initiatives, and cybersecurity trusts neither.
Where does this leave OT? In my short time here at Avid, it’s become painfully apparent that OT is misunderstood. IT wants to run the OT networks the same way it runs the business networks. People in IT wonder if it’s possible to run the network on behalf of OT. For my part, I think it’s a completely plausible expectation, but I have not seen it work - yet.
An IT network is a relatively available infrastructure, and there is an argument that a Highly Available (HA) network will fail over and the client-server stream will continue. I agree, but is it OT quality? In many cases, IT-managed networks don't conform to the needs of OT. OT needs the network to be up all the time; rebooting a router to fix a problem is often not an option, and if failover between redundant paths takes longer than 30 ms to 250 ms, the network is not good enough. In a network shared between IT and OT, an employee's large ISO file transfer could create a bottleneck and impede OT telemetry. Of course, all efforts must be made to prevent this.
Similar arguments can be made for virtualization. Could an IT virtualization group run the OT virtual machines? Sure, it could. But it's important to understand that the cornerstone of enterprise IT virtualization is putting lots and lots of VMs on one or a few pieces of hardware to optimize resource sharing. This doesn't work for OT without defining guaranteed uptime, reserved memory and CPU, real-time hitless failover, protection from OS changes (including not patching), and isolation from hostile business and internet networks. Finally, placing OT's VMs on a distant IT network can lead to major problems if the real-time telemetry traffic from the controls network is dropped.
IT security compliance is also not directly compatible with OT at the lower levels of manufacturing. These systems operate in real time, collecting records, controlling equipment, and informing engineers of status on an HMI. Interrupting this flow of traffic, even by accident, could damage property or, worse, harm the humans operating the system. Applying a patch on a live ICS network could impact data collection, controls, and regulatory compliance. IT needs to understand that it could be a year or even a decade before a plant is offline for maintenance and patching. The focus needs to be on ensuring that only secured, authorized access is permitted from the outside, since the inside OT network needs to remain in a steady state for long periods of time.
In other words, it’s important for OT to explain its needs better to get the other groups to listen.
The National Institute of Standards and Technology (NIST) has an excellent publication that defines the core differences between IT and OT. Reviewing it led me to imagine a person standing between the IT and ICS columns. This was my 'a-ha' moment, where I discovered that I can uniquely contribute by helping customers in IT to stop, step back for a moment and review what they are attempting to take on. They often need to decide whether running an OT infrastructure is within their scope, or whether a dedicated group would be best suited to the job. If they are already operating the OT network and it's not working well, a systems integrator can assist in identifying the changes needed to make the endeavor a success.
Another quick note about the NIST 800-82 publication: if you look at Figures 5-1 through 5-5, there is a common Control Server that remains near the controls network. In some designs today, this particular server has been moved out to a virtualization infrastructure.
My concern is that its importance is being overlooked: because it is a server, it gets moved to IT virtualization in the corporate network. Yet this server is the first in the line of real-time telemetry collection for the manufacturing processes. Consider what happens if it is moved across a best-effort network and multiple routers and switches.
Let’s take a look at how the disconnect formed between IT and OT.
For those not familiar with Industrial Control System (ICS) requirements, the whole concept of OT requirements seems backwards compared to rapidly shifting enterprise and consumer technologies. Someone versed in enterprise-level IT networking and security best practices will rightfully get agitated at how crazy OT sounds, at least until we explain why OT is this way and why it is not easy to change without breaking manufacturing functionality.
First, they need to get past the gut reaction, ask questions and, most importantly, listen. Not listening can result in a tough situation where (as in one real customer case) an IT team thought it knew better than those providing the site requirements. It didn't apply the stated requirements, which resulted in dropped traffic across a shared business/production network, and it incorrectly defined VMs on a shared infrastructure. The IT department was expert at enterprise best-effort, shared-resource design, but that design just didn't work for the ICS and had to be heavily modified.
So, my shared words of wisdom today for all IT and OT staff – don’t assume that you are more knowledgeable than the other and listen. In other words, ‘the quieter you become, the more you will hear.’
For our customers, having a seasoned Information Solutions team as part of the solution integration process can have unexpected value, including clarification of any requirements early on during the design process, and a set of educated eyes to identify any potential concerns that may cause issues in the final product.
Work in the Pulp & Paper manufacturing industry is never without unexpected challenges. In order to ensure a successful project, our engineers had to adapt to unplanned scheduling changes in such a way as to provide the customer with a safe, secure startup while the plant was still online.
Meeting the customer's production schedule meant installing a new/modified batch processing management system without having to stop overall plant production. Simulations and extensive testing substantially mitigated the chance for error during this high-risk/high-reward strategy. Although this type of software testing increased upfront labor costs, it provided significant return on the investment. The result was a downtime of one batch cycle, zero wasted raw materials or product, and uninterrupted paper-machine production.
To ensure mutual success, Avid worked with the customer to choose an option that would maximize safety and also meet production demands while minimizing labor cost and material waste. For this project, the customer had an existing Honeywell Experion Batch Management solution which successfully ran its three mixers and supporting equipment. But, plans for a new coating formulation required that the old system be updated. The change introduced a new ingredient, a new dispersing machine, several VFDs, and Ethernet/IP for device control. Both the batch management system and the graphics needed to be updated to accommodate these changes.
Avid's site familiarity and extensive experience in creating feature-rich C&A solutions allowed our contractors to work with operations to identify and incorporate many quality-of-life (QoL) features into the project. At each step, Avid's consultants proved to the customer that the installation could be done safely and within the time window.
Some of the key factors in the success of the project included holding a three-day Factory Acceptance Test (FAT), simulation, virtualization, backup planning and documentation. Given the overall success of this project, an online startup could be a preferred option for future projects.
General feedback from some of the largest companies in the world indicates that they do not feel comfortable storing intellectual property or sensitive industrial data in a public cloud facility. Largely, this is due to fear of that information falling into the hands of competitors.
But, for many reasons, we believe that business and company data is not safe in typical private data centers. Most companies use security policies, two-factor authentication and RSA tokens to secure their systems and data. In practice, however, this offers only a false sense of security. Those measures secure networks only against outside penetration; many other methods of intrusion and security breach remain concerning.
The advent of secure public data centers built by Amazon, Google and Microsoft is shifting the paradigm away from private data centers. These cloud-based data centers are designed with security measures similar to those used to safeguard our nation's private information, and the information is held even more securely.
There are many reasons for this.
Public cloud solutions offer not only data security but also cost savings. To adopt cloud services is a choice many companies will soon face.
As a project manager for a system integration company, I am constantly quoting projects with customers. An example of a typical scenario for a large project would be receiving a bid package, having a walk-down meeting, asking questions, and then working on the proposal. The proposal is then submitted and I anxiously await the call to find out if we're in the running for the final decision. One of the people on that call (and many times the first time I meet this person), is the purchasing agent. Then comes that dreaded question, “How can we get your price down?”
It's the age-old war between purchasing and contractors. No matter how hard we work to make proposals as competitive as possible, the price question always comes up and the battle rages on. We're now time-locked to get a project off the ground, but must begin the fight over rates, T&Cs, and other fine details. It often ends with both sides feeling defeated: the purchasing agent has the very important job of driving costs as low as possible to keep the company profitable, while the contracting company has to perform the work at a profit or go out of business. What this method doesn't account for, however, is the total cost. The battle starts at a single point in time to drive one cost down, rather than purchasing being involved in the entire project execution process to keep the total cost down. I believe the key is for purchasing to be involved in the entire project life cycle.
The first step is to vet potential contractors. This doesn't mean you have an active bid package or project; it simply means purchasing needs to get out and start looking. In general, plant personnel know whether they will need integration help and what size projects they are expecting in the future. The purchasing agent can create criteria for the companies they want serving the plant and then go find them. Then, as projects come up for bid, you know right away who is a valid bidder and who not to waste time and energy on. It's critical to know the difference between being qualified to bid and qualified to do the work. Think about company size, number of resources with the correct skill set, insurance requirements, etc. Getting calls from business developers at integration companies? Now they can be vetted right away. The earlier you get to know integrators, the better positioned you'll be at the bid table to ensure you have the right companies. This also prevents the usual mistake of simply going with the lowest bidder; if they aren't qualified, you can expect a huge cost impact as the design and implementation phases progress.
Next, purchasing agents need to know about projects before they are sent out for bid. One of the biggest factors in every bid package from consultants is risk. There's an old adage in project management: do you want it fast, good, or cheap? Pick two. If the scope is unclear or the schedule is not defined, it adds risk, and the higher the risk, the higher the price. Remember, contractors are trying to run a profitable company as well, and we must protect ourselves from the unknown. Purchasing should get ahead of this dilemma by ensuring the scope is very well defined and the schedule allows for proper execution. If you can't determine this internally, contact one of your trusted providers to perform a FEED/FEL (front-end engineering design / front-end loading) study to establish this baseline scope and schedule. The upfront cost is worth a lot when weighed against unexpected surprises on the back end of projects or higher prices during the bid process.
Lastly, understand the value the integrator brings to the table. It's common practice for the first project an industrial automation firm works on to carry a lower price in order to get in the door with the client, but the purchasing agent shouldn't expect reduced pricing every time. Once the value of the industrial automation firm is realized, a commensurate price should be expected. No one expects to pay more than the services are worth, but you shouldn't expect to pay less for higher skills or more value. It's understandable to ask for a discount from the firm for taking the initial risk of working with it, but once it is a proven partner, a higher price needs to be acceptable to keep projects running on track.
It’s important for purchasing to get involved early with the industrial automation company. Before the project is bid, get to know the company, and understand if they are a good fit. When necessary, let the firm get involved in scope development, base design considerations, and developing the expected project schedule all the way from design to implementation and installation. Purchasing shouldn’t focus so much on the price at the bid table but instead should think more about the total project cost and how developing relationships will result in better lifetime costs for the company. With this approach, not only will you gain a valuable partner, but you will also reach the objectives of saving costs for your company.
We've all had that wonderful experience of getting to the end of a project and stepping back to admire the glory after months of hard work. After all the hours of programming, testing the programming, getting new information from the customer so that it’s necessary to go back and change the programming – the end is in sight. At this point, for most projects, there still remains the Factory Acceptance Test (FAT), where we get to show off all that hard work in front of the customer. During this time, we must summon the fortitude to remain sharp to carry the project over the finish line mistake free. With this in mind, I would like to share a cautionary tale of an FAT that ended much like the Olympic hurdler face-planting on the final hurdle.
For most projects, pre-FAT testing involves simulation that is virtually accomplished through software and not actual hardware. This is all well and good for logic testing, and even HMI testing virtually is fundamentally safe, but when it comes to actual hardware testing, the inadequacies are obvious. Therefore, while our FAT design is largely derived from our pre-FAT simulation testing, we are left to conceive of a functional design to test the hardware. It is in this design phase where we once again must put on our engineering hard hats when it is so tempting to think they are no longer needed.
In this particular situation, I was approaching the hardware FAT much like the software FAT. My thought bubble was saying, "All I need are some screens that quickly show an Input turning on a corresponding Output." The customer could easily see on the HMI that, for each digital output channel, the little circle turned from red to green when the channel was turned on. I was even prepared to point out the channel LEDs on the output module to show the channels turning on and off. At the time, for a simple hardware check, this seemed adequate.
If this were a movie, we would now be at the point of a dramatic flash-forward scene. Picture this: I am standing with the customer and other contractors, and we are ready to start the system up. This is a wastewater expansion project, so we are filling up the sump tank to reach the high-level trip point, and when we do, lo and behold, not one sump pump starts as per the design: both the primary and backup pumps start. Since I had extensively tested the logic and knew that only one pump should be running, and had confirmed that only one pump was being commanded to start, I concluded that the electricians must have wired something wrong.
Let's fast-forward through the frantic troubleshooting that took place next: draining the large tank, tracing and checking the pump wiring, triple-checking the logic, and visually confirming that only the one channel LED was illuminated on the output module. At that point, I had yet to identify the culprit. But then it was meter time, when every system integrator gets to impress the electrical contractors, or plant electricians, and earn some "street cred" for not just being a programming nerd.
Again, if this were a movie, the dramatic music would slowly build as I traced with the meter from the module down to the terminal block, only to reveal the flaw in my hardware FAT design. Although the logic was not turning on the backup pump output channel, and although the output module channel LED was not on, and although my HMI screen only showed one channel at a time turning on during the hardware FAT, the meter proved that both channels were on from the terminal block. So, the culprit was none other than a faulty IFM cable that had a short between the module and the terminal block assembly.
You may be thinking at this point, “Yes, but this was beyond the controller hardware because it was an IFM cable that plugs into the module and extends to the terminal block.” I confess that that was my knee jerk self-defense statement at the time, but as I thought further about the hardware testing design, I began to see how relevant this was to even the most basic hardware architecture.
First of all, the hardware for this project was all contained inside a control panel, and it should have been tested all the way down to where the outside connection was made. Secondly, even if the hardware scope only extended to the local chassis, channel shorts could still be missed if a simple hardware testing design were used that only looked at one channel at a time. After all, before finding the shorted IFM cable, my first thought after seeing both channels on at the terminal block was that possibly a metal filing had gotten between the channels because the electricians had to drill another bottom hole in the panel when it was brought on site.
To bring this cautionary tale to a close, I'll summarize what I learned from this. First of all, I learned to not think of the hardware side in the same way as I do the software side when it comes to testing. Thinking this way led me to oversimplify the hardware testing and rely on testing methods that are better suited to the software side. Secondly, I was reminded that the signal should be tested at the scope endpoint. I needed to determine at what point the signal left my scope and became someone else’s responsibility. This is common sense but can be forgotten when the temptation of quicker and easier, yet less reliable methods are available. Third, I needed to watch out for the dreaded short. Testing only one channel at a time, even with a meter, can lead to missing shorted channels. For digital hardware, taking the time to devise a test for all channels to ensure that no shorts exist may add some time, but could eliminate a costly error.
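The all-channels short test described above can be sketched in a few lines. This assumes two hypothetical test-rig hooks, `write_output` and `read_terminal`, which you would replace with your actual I/O driver commands and terminal-block (meter) readings; the simulated 8-channel module and the fault bridging channels 2 and 3 are invented for the bench self-test below.

```python
def find_channel_shorts(write_output, read_terminal, n_channels):
    """
    Energize one digital output channel at a time and confirm every OTHER
    channel still reads OFF at the terminal block. Returns a list of
    (driven_channel, unexpectedly_live_channel) pairs.
    """
    shorts = []
    for ch in range(n_channels):
        for c in range(n_channels):
            write_output(c, c == ch)        # only 'ch' is commanded ON
        for c in range(n_channels):
            if read_terminal(c) and c != ch:
                shorts.append((ch, c))      # another channel is hot: a short
    for c in range(n_channels):
        write_output(c, False)              # leave everything de-energized
    return shorts

# Bench self-test: simulate a fault that bridges channels 2 and 3,
# like the shorted IFM cable in the story.
state = [False] * 8
def write_output(ch, on): state[ch] = on
def read_terminal(ch):
    if ch in (2, 3):                        # the simulated short
        return state[2] or state[3]
    return state[ch]

print(find_channel_shorts(write_output, read_terminal, 8))
# [(2, 3), (3, 2)]
```

Note that checking one channel at a time with the commanded channel ON is exactly what catches the fault: a single-channel check with everything else ignored, like the original HMI screen, would have passed.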
My final and most important learning is that after all the endurance it takes to get through the initial programming and pre-testing phase of a project, the project is not truly over until the system is installed and performing to the customer’s expectations. The same level of diligence is needed to carry the project over the finish line each and every time.
As seen in Automation World, the days when your company could gain a significant leg up on the competition simply by implementing a Manufacturing Execution System (MES) are over. If your plant is not already reducing waste, identifying bottlenecks, and coordinating data across the corporation, then you are behind the curve and playing catch-up.
Over the past 30 years, advancements and improvements have been made to the simple concepts contained within the general term MES. Many dedicated applications have been developed and many companies solely dedicated to providing MES production process solutions have sprung up.
In addition to a general surge in MES concepts and technologies, there have been efforts to develop segment-specific MES applications targeted at industry-specific needs, and most, if not all, of the major players in those industries have been involved in these technologies for years. However, many companies and industries have not kept up with adopting new technologies and strategies, and several are no longer in business or are not thriving.
One example of an industry that did not take improving technology into consideration is the American newspaper industry. Managing editors at newspapers across the country likely cringed when Craigslist showed up on the scene with a dynamic, ever-changing classified section that was online and free. A major source of income for most newspapers went away in a matter of years. Very few newsprint organizations developed even a simple counter to this threat before it was too late. I’m sure you’ve heard stories about local newspapers across the country that folded because of the advances in technology in these and other areas.
Another example is the American auto industry of the 1970s and early '80s. With a focus on consistency, detailed engineering, and quality control, Japanese manufacturers took the U.S. auto market away from domestic producers. It took decades for domestic carmakers to recover, and many are still feeling the effects of not being prepared and forward-looking.
With new technologies like machine learning for predictive maintenance, IoT sensors, and other cutting edge technologies just beginning to enter the manufacturing space, those companies that are poised to take the next strategic leap forward are those that have already embraced and adopted the concepts of MES/MOM (manufacturing operations management) and advanced data solutions like dashboarding of integrated system data.
Since the introduction of MES in the early 1990s, its concepts and solutions have had plenty of time to spread through manufacturing segments as diverse as food and beverage, pharma, and pulp and paper.
It's hard to name one major manufacturing software vendor that does not have some flavor of MES/MOM offering. Rockwell has Production Centre, Wonderware has MES Operations and Performance, Emerson has Syncade, and so forth.
The manufacturers that have embraced some version of these solutions are the ones prepared to pivot in response to market forces and customer demands, and to move forward and adopt newer, higher-end solutions.
Surprisingly, technology itself is the easiest sector in which to find companies that missed the boat. Many companies either ignored new concepts and strategies in their marketplace or adopted them too late, and they paid heavily for not being prepared. If you’re interested, a simple internet search for “companies that missed the boat” returns over a million results.
With operating margins shrinking and customers on a continuous quest for the best deal, it’s important to protect your company from becoming the familiar mom-and-pop business staring at the groundbreaking for a big-box store down the street.
The key is to find your company’s niche, or its method of reducing cost or improving efficiency, and then create a solid data foundation to build on.
Systems integrators like Avid can help companies catch up to the competition, and surpass it, using solid, proven data acquisition/MES solutions along with advanced data analytics and business intelligence. The time to act is now.
In the United States, 94% of plants miss their scheduled start date after a process control systems upgrade. When schedules slip, expenses add up quickly. For plants operating 24 hours a day, 7 days a week, a single day of lost production can cost $600,000 or more. At that rate, the price of a delayed start-up will surpass your capital investment within a week. Fortunately, with some advance planning, you can ensure a timely start-up and avoid such losses.
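To put those figures in perspective, here is a back-of-the-envelope calculation. The $600,000-per-day figure comes from the text above; the capital investment amount is a hypothetical example, since the article does not state one:

```python
# Cost of a delayed start-up after a process control systems upgrade.
# DAILY_LOSS is the figure cited in the article; the capital
# investment below is a hypothetical example value.
DAILY_LOSS = 600_000            # lost production per day, 24/7 plant
capital_investment = 4_000_000  # hypothetical upgrade budget

# Ceiling division: days of delay before losses exceed the budget
days_to_exceed = -(-capital_investment // DAILY_LOSS)
print(days_to_exceed)  # 7 -- about a week, as the article says
```

Even with a modest budget assumption, the lost-production cost overtakes the capital spend in roughly a week, which is why the article treats schedule protection as part of the investment.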
We collaborate with clients to identify, design, and implement the best industrial automation solutions. We also provide the installation and commissioning expertise you need to “Get Your Plant Back Online, On Time.” Based on that experience, we’ve compiled the following start-up guidelines for a successful automation system upgrade:
Because a process control systems upgrade is a significant investment, you may find yourself looking for ways to reduce start-up expenses. But outsourcing your start-up project to your install contractor, or trying to manage it in-house, comes with risks. Working with a dedicated start-up specialist, on the other hand, allows you to foresee and resolve issues that could threaten your project timeline.
Replacing an existing batch system while providing more transparency means developing a system capable not only of full batch processing but also of running phases manually if the full system becomes unavailable.
This, of course, incorporates the basic concepts of the ISA S88 standard for batch process control. Unfortunately, strict adherence to the S88 standard does not provide all the functionality that is needed: for this implementation at a chemical plant, operators needed the ability to run process phases independently of the unit.
So, to complete this replacement, a system needed to be developed that included all programming levels, but allowed for more manual control of the functions in case the batch executive failed. The system also needed to fit into the graphic and operator interface standards established at the site.
The solution involved moving the configuration to the equipment module level. This provided the ability to individually start and stop the equipment modules without having to start a phase. Not only did this change help accomplish the goal at hand, it also fit in with how the current system already worked.
The new system allows recipe parameters to be set at the equipment level in the event of a batch system failure. But if the batch executive is running correctly, default recipe parameters are enforced, which in turn provides consistent quality. The new operator interface was designed so that the operator can seamlessly alternate between running with or without the batch system.
The end result is that the system is set up for the best results possible in either situation. This type of flexibility is particularly helpful in chemical applications where the ability to react to unforeseen circumstances is critical.
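The equipment-module-level design described above can be sketched in a few lines. This is a minimal illustration with assumed names, not the plant's actual configuration: each equipment module can be started and stopped individually, default recipe parameters are enforced while the batch executive is healthy, and operator-supplied parameters are accepted only in the fallback case:

```python
# Hedged sketch of equipment-module-level control with a manual
# fallback when the batch executive is unavailable. Class and
# parameter names are illustrative assumptions.
class EquipmentModule:
    def __init__(self, name, default_params):
        self.name = name
        self.default_params = default_params  # enforced under batch control
        self.running = False
        self.active_params = None

    def start(self, batch_executive_ok, manual_params=None):
        if batch_executive_ok:
            # Batch executive healthy: enforce default recipe parameters
            self.active_params = dict(self.default_params)
        else:
            # Fallback: operator supplies parameters at the equipment level
            self.active_params = dict(manual_params or self.default_params)
        self.running = True
        return self.active_params

    def stop(self):
        self.running = False

# An equipment module can be started without starting a phase:
agitator = EquipmentModule("AGIT-101", {"speed_rpm": 120})
params = agitator.start(batch_executive_ok=False,
                        manual_params={"speed_rpm": 90})
```

The point of the sketch is the split in `start()`: the same module runs under either mode, which mirrors how the new operator interface lets operators alternate between batch and manual operation.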
Successful batch control implementations allow companies to optimize timing and run multiple batches simultaneously. With batch processing solutions, companies gain quality standardization and improve overall equipment effectiveness (OEE). They are able to increase throughput, reduce the risk of cross-contamination, and reduce the risk of human error, because the automated system takes the place of operator decisions. Systems can be validated and guaranteed to run the same way every time, and to raise failure notifications when needed, giving greater assurance of adherence to standards set by regulatory organizations such as the FDA.
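The overall equipment effectiveness figure mentioned above is conventionally computed as the product of availability, performance, and quality. A minimal worked example, using made-up shift numbers rather than data from any real plant:

```python
# Standard OEE formula: Availability x Performance x Quality.
# All shift numbers below are illustrative assumptions.
planned_time = 480        # minutes in the shift
downtime = 48             # minutes of unplanned stops
ideal_cycle_time = 1.0    # minutes per unit at rated speed
total_units = 400
good_units = 380

run_time = planned_time - downtime
availability = run_time / planned_time                      # 0.90
performance = (ideal_cycle_time * total_units) / run_time   # ~0.93
quality = good_units / total_units                          # 0.95
oee = availability * performance * quality
print(f"OEE = {oee:.1%}")  # OEE = 79.2%
```

Tracking these three factors separately is what makes the metric actionable: a dashboard built on integrated system data can show whether losses come from stoppages, slow cycles, or scrap.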
In some cases, a strict reliance upon the automated batch can increase reaction time, which, in turn, increases risk. This solution combines the advantages of a full batch system with the flexibility required to adequately react to adverse conditions.