Pulp and Paper Canada

Mill Process Optimization

September 1, 2005  By Pulp & Paper Canada


A century ago, optimization of any sort was a problem in pure mathematics, because without modern computing capabilities, there was no way that a mere engineer could perform the necessary computations. By the 1960s, large mainframe computers in universities enabled an explosion of research efforts in areas like operations research, planning and scheduling, and gradient (“hill-climbing”) optimization methods. Only in the 1970s, when the first mill control systems brought plant-floor computers and large volumes of data together, could serious optimization of real-world industrial processes be attempted.

Today, mill process optimization is “front and centre” among management concerns for several reasons. Critical raw materials are either in short supply (fibre) or available only at ever-rising cost (energy). Downstream paper users and final consumers are demanding higher quality and greater performance in the paper products they use. More than one paper mill has had a rude awakening when management realized they could not possibly meet their customers’ quality standards with their existing control systems. Governments and regulatory bodies are also imposing ever more stringent constraints on process operations. Add to all this the emergence of global competition. North America, with its high labour costs, can compete with low-cost producers around the world only if it uses its labour and other resources at optimum efficiency. All these pressures are making corporate managers take a second look at how they can get more out of their assets.

Robert Eamer, a consultant with many years of experience in the industry, has commented on this situation. “Industry managers are coming to realize that for each of the major commodity categories in wood, pulp, and paper,” he says, “there is less than a handful of mills in Canada that can profitably compete in the global markets for these products, and the Canadian cost disadvantage is rapidly growing as new and much more productive capacity comes on stream elsewhere each year. For the majority of our mills, survival will be dependent upon the ability to develop, produce and market specialized or customized products for consumers. This will demand flexibility with frequent, rapid and efficient process changes on the part of the mills intent on surviving — adding a new dimension to the always-critical need for process optimization.”

Approaches to optimization

Not all optimization involves sophisticated mathematical analyses. Good process housekeeping, such as finding and fixing steam leaks and insulating process lines, waste minimization, adherence to good operating procedures, and general cost awareness and control can move a mill a long way towards better operations. Much can also be accomplished through focused engineering studies in specific process areas such as pulp washing and white water recirculation.

That said, the tools available for analyzing entire processes, or substantial parts thereof, have become much more powerful and affordable over the last two decades. For example, process simulation software packages have evolved from command-line driven data manipulation languages to interactive graphic flowsheet configurators and analyzers. Pinch analysis tools are another example, allowing the process engineer to examine and analyze the complete pattern of heat exchange flows in an entire mill. With loop tuning software, a process engineer can evaluate the performance of hundreds of control loops across the plant.

As Paul Stuart, chairholder of an NSERC Design Chair at École Polytechnique focusing on optimization methodologies, has commented, “The data we have in mills is a sleeping giant which, if exploited, could represent a tremendous opportunity for mills. For success, we need powerful optimization tools but, more importantly, we need to assemble the necessary expertise and process understanding. Appropriate process models, complete and well-validated, are essential. So while implementing plant-wide optimization methodologies can potentially provide an important competitive advantage at minimal capital cost, they would require that more of our limited engineering resources at mills be dedicated to process optimization.”

These tools benefit from another recent development: the availability of mill-wide process databases, which can automatically collect and store hundreds or thousands of data points every day. Engineers can now routinely collect and analyze months and years of process data, looking for patterns and trends. The complaint today is no longer too little data, but too much.

Tools for process optimization

Even the operator on the plant floor can play a part in mill process optimization. Process control systems now have interactive graphical displays that provide the operator with an up-to-the-minute view of process status, with alarms and out-of-range variables flagged for attention. Statistical process control (SPC) methods are also designed for use by people with minimal mathematical knowledge. With appropriate training, operators can use SPC control charts to identify variables that are heading out of control and take action to keep them within the desired range. The basic statistical functions in commonly available software packages like MS Excel can also be used by plant-floor personnel to gain valuable insights into the behaviour of key process variables.
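
By way of illustration only, the short Python sketch below shows the kind of three-sigma check an SPC control chart automates; the consistency readings and control limits are invented for the example.

```python
import statistics

# Hypothetical stock consistency readings (%) from a stable reference period
reference = [3.02, 2.98, 3.05, 2.97, 3.01, 3.00, 2.99, 3.04, 2.96, 3.03]

# New readings to screen against the control limits
new_readings = [3.01, 3.06, 2.95, 3.18, 2.99]

mean = statistics.mean(reference)
sigma = statistics.stdev(reference)
upper, lower = mean + 3 * sigma, mean - 3 * sigma

for i, x in enumerate(new_readings, start=1):
    status = "ok" if lower <= x <= upper else "OUT OF CONTROL"
    print(f"sample {i}: {x:.2f}  {status}")
```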

Tools for more advanced, in-depth process analysis are no longer the province of academic researchers, but have become commercially available software packages that are intuitive, interactive and robust enough to be used by mill engineers. They provide the ability to perform regression and correlation analyses, time series and power spectrum analysis, and principal components analysis, among others. With these functions, the engineer can identify anomalous data points, trends, or sensor malfunctions, quantify relationships among process variables, and clarify the underlying dimensions of variation of key process parameters. Other tools enable the engineer to design and perform experiments to isolate important effects from process background noise and draw statistically valid conclusions while perturbing the process as little as possible.
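
A rough sketch of two of these functions, using synthetic data standing in for a mill historian extract, might look like the following; the variable names and relationships are invented:

```python
import numpy as np

# Hypothetical hourly log: steam flow, production rate, and a redundant
# temperature sensor that largely tracks steam flow plus noise
rng = np.random.default_rng(0)
steam = rng.normal(100.0, 5.0, 200)
rate = 0.8 * steam + rng.normal(0.0, 2.0, 200)
temp = 1.5 * steam + rng.normal(0.0, 1.0, 200)
data = np.column_stack([steam, rate, temp])

# Correlation matrix quantifies pairwise relationships among the variables
print(np.corrcoef(data, rowvar=False).round(2))

# Principal components show how many independent dimensions of variation
# actually drive the three measured signals
eigvals = np.linalg.eigvalsh(np.cov(data, rowvar=False))[::-1]
print("variance explained:", (eigvals / eigvals.sum()).round(3))
```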

For serious process optimization, a good process model is a must. Only with a model can the analyst quantify the impacts of proposed changes and track the effects of actions taken at a particular point on the rest of the process, including time lags and recycle effects. In fact, the up-front work of building and validating a model often leads to deeper process understanding and more informed operating decisions; in some cases, a share of the eventual benefits has been realized before the model was even commissioned.

As Neil McCubbin, a pulp and paper industry consultant who has built over 20 models of mill processes, puts it, “I have never completed a full steady-state model of an operating mill without uncovering some change(s) that could be made at trivial capital cost which reduced mill operating costs by enough to pay for the modeling in under a year.” Static models are essentially mass and energy balances; dynamic models add a description of how process variables change over time, which is essential for studying the impacts of different control strategies. By changing inputs, configurations, and operating points, the engineer can run repeated simulations and examine a variety of options for process improvement.
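
As a minimal illustration of what a steady-state balance with a recycle stream involves, the following sketch solves an invented single-stage example by successive substitution; real mill models are, of course, far larger.

```python
# Minimal steady-state mass balance with a recycle stream, solved by
# successive substitution (all figures are invented).
fresh_feed = 100.0      # t/h of stock entering the loop
separation = 0.90       # fraction of loop flow removed as accepts
recycle_guess = 0.0     # initial guess for the recycled rejects stream

for iteration in range(50):
    loop_flow = fresh_feed + recycle_guess
    accepts = separation * loop_flow
    rejects = loop_flow - accepts       # everything not accepted is recycled
    if abs(rejects - recycle_guess) < 1e-9:
        break
    recycle_guess = rejects

print(f"converged after {iteration} iterations")
print(f"loop flow {loop_flow:.2f} t/h, accepts {accepts:.2f} t/h")
```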

The easiest modelling tools to use are those that allow the user to build a process flowsheet graphically on-screen by connecting pre-defined blocks representing pieces of equipment or operations like mixing, phase change or separation. By adjusting parameters in each process block, the engineer can make the block behave like the real equipment. Once such a model is built and calibrated to a base case representing typical mill operation, simulation experiments can be easily carried out by adding, removing or changing process blocks or varying the operating parameters. With repeated experiments, close-to-optimal operating points can be identified, but these tools cannot by themselves find a process optimum.
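
The toy sketch below suggests how such block-style flowsheeting can be expressed in code: each block maps inlet streams to outlet streams, and blocks are chained together. All equipment behaviour and numbers are invented.

```python
# Toy block-style flowsheet: each "block" maps inlet streams to outlets.

class Stream:
    def __init__(self, flow, consistency):
        self.flow = flow                  # t/h
        self.consistency = consistency    # fraction solids

class Mixer:
    def run(self, a, b):
        flow = a.flow + b.flow
        solids = a.flow * a.consistency + b.flow * b.consistency
        return Stream(flow, solids / flow)

class Thickener:
    def __init__(self, outlet_consistency):
        self.outlet_consistency = outlet_consistency
    def run(self, inlet):
        solids = inlet.flow * inlet.consistency
        thick = Stream(solids / self.outlet_consistency, self.outlet_consistency)
        filtrate = Stream(inlet.flow - thick.flow, 0.0)
        return thick, filtrate

stock = Mixer().run(Stream(80.0, 0.04), Stream(20.0, 0.01))
thick, filtrate = Thickener(0.10).run(stock)
print(f"thickened stock: {thick.flow:.1f} t/h at {thick.consistency:.0%}")
```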

If the process can be described as a matrix of algebraic equations, another approach is possible: the use of mathematical techniques such as linear programming which define an objective function (the expression to be maximized or minimized) and find its optimum point automatically. However, these models normally take a higher level of mathematical knowledge to build and operate. Under certain mathematical conditions, they may also become unstable or fail to find a solution. Sometimes matrix models are embedded inside a block-diagram graphical interface to combine the best of both worlds.
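
For example, a two-grade production-planning problem of this kind might be posed, with invented margins and constraints, using an off-the-shelf linear programming routine such as SciPy's linprog:

```python
from scipy.optimize import linprog

# Hypothetical two-grade planning problem: maximize contribution margin
# subject to machine-hour and fibre-supply limits (numbers invented).
# linprog minimizes, so the margins ($/tonne) are negated to maximize.
margins = [-120.0, -90.0]                 # grade A, grade B

# Constraint rows: machine hours per tonne, fibre tonnes per tonne of product
A_ub = [[0.5, 0.3],                       # <= 160 machine hours available
        [1.1, 0.9]]                       # <= 400 t of fibre available
b_ub = [160.0, 400.0]

result = linprog(margins, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
tonnes_a, tonnes_b = result.x
print(f"grade A: {tonnes_a:.0f} t, grade B: {tonnes_b:.0f} t, "
      f"contribution ${-result.fun:,.0f}")
```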

Loop tuning software focuses on the parameters that make control loops more or less responsive and more or less vulnerable to cycling and overshoot. With access to a dynamic model of process response, the software can evaluate how the process will react to changes in parameter settings. Since many loops in a typical mill operate in manual, and 30% of those in automatic have been shown to increase rather than reduce variability, there is plenty of room for improvement!
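
The sketch below hints at what such software evaluates: it simulates an invented first-order process under a PI controller and compares overshoot for two tuning choices. It is an illustration only, not a tuning tool.

```python
# Simulate a first-order process under a PI controller and report the
# percent overshoot for a given tuning (process and tunings are invented).

def simulate(kp, ki, setpoint=1.0, gain=2.0, tau=10.0, dt=0.1, steps=2000):
    pv, integral, peak = 0.0, 0.0, 0.0
    for _ in range(steps):
        error = setpoint - pv
        integral += error * dt
        output = kp * error + ki * integral
        # first-order process: tau * d(pv)/dt = gain * output - pv
        pv += dt * (gain * output - pv) / tau
        peak = max(peak, pv)
    return 100.0 * (peak - setpoint) / setpoint   # percent overshoot

for kp, ki in [(2.0, 0.4), (0.8, 0.1)]:
    print(f"Kp={kp}, Ki={ki}: overshoot {simulate(kp, ki):.1f}%")
```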

Pinch analysis is a technique for optimizing energy flows and usage in a process. Essentially it views the process as a network of heat exchangers and analyzes that network to find a configuration that minimizes external energy requirements and maximizes process heat recovery while remaining consistent with the laws of thermodynamics. Savings of as much as 45% in energy consumption have been realized by this method. In addition to energy economies, pinch analysis can support recommendations for relieving capacity constraints, selecting new equipment, and reducing capital costs for plant construction or retrofit.
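
The core of pinch targeting, the so-called problem table, can be sketched in a few lines. The stream data and minimum approach temperature below are invented, and a real study would involve many more streams.

```python
# Problem-table sketch: minimum hot and cold utility targets for a small
# set of invented streams. Each stream: (supply T, target T, CP in kW/C).
dt_min = 10.0
hot_streams = [(180.0, 60.0, 3.0), (150.0, 30.0, 1.5)]
cold_streams = [(20.0, 135.0, 2.0), (80.0, 160.0, 4.0)]

# Shift hot streams down and cold streams up by half the minimum approach
shifted = [(s - dt_min / 2, t - dt_min / 2, cp, +1) for s, t, cp in hot_streams]
shifted += [(s + dt_min / 2, t + dt_min / 2, cp, -1) for s, t, cp in cold_streams]

# Temperature interval boundaries, hottest first
bounds = sorted({T for s, t, cp, sign in shifted for T in (s, t)}, reverse=True)

# Cascade the heat surplus/deficit down the temperature intervals
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = sum(sign * cp for s, t, cp, sign in shifted
                 if max(s, t) >= hi and min(s, t) <= lo)
    heat += net_cp * (hi - lo)
    cascade.append(heat)

hot_utility = -min(min(cascade), 0.0)
cold_utility = cascade[-1] + hot_utility
print(f"minimum hot utility: {hot_utility:.0f} kW")
print(f"minimum cold utility: {cold_utility:.0f} kW")
```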

Barriers to optimization

Optimizing a mill begins with knowing the values of key process variables. For all the progress in sensor development over the last several decades, there are still substantial gaps in measurement technology, meaning that important variables are not measured and therefore not available for optimization. Wood chips are the feedstock for the entire mill and the foundation of all pulp and paper properties, yet many chip properties like density and brightness often remain unknown because a way has not yet been found to measure them accurately and reliably in a mill environment. If a sensor is available, it may still be deemed too expensive in today’s constrained economic environment, or may not be rugged enough to stand up to conditions in a woodroom or a bleach plant.

The severe reductions in on-site technical personnel that many mills have experienced over the last decade have made it much more difficult to maintain an effective program of measuring and evaluating important variables, evaluating and re-tuning key control loops and continuous process improvement. Many companies have called on suppliers and consultants to fill the gap, but their personnel find themselves supporting multiple mills and multiple projects and often cannot respond as quickly as desired or maintain the level of familiarity with local equipment and processes that used to exist in on-site mill staff. Short-term optimization projects that are undertaken may be abandoned afterwards as the responsible people leave for another project elsewhere.

The “time crunch” experienced by most engineering and management personnel today makes it difficult to maintain process optimization as a high priority in the face of demands to keep the mill running at top speed at all times. Too many mill staff find themselves “fighting fires” on an ongoing basis, with no time or resources to delve deeper into the root causes of process upsets and quality deviations. And corporate financial resources are too often required to fund crisis responses to market changes, rather than building long-term capability via process optimization.

Strategies for process optimization

Any optimization project must first answer the question: what to optimize? Money (in the form of maximum profitability and/or minimum cost) is crucial but not the only consideration. Among possible focus areas for optimization in a mill: standard deviation of process variables, amplitude of cycling and oscillations, settling time after a process disturbance, purchased energy consumption, effluent volume and toxicity, percentage of time the process is outside normal operating conditions, and percentage of product outside quality specifications. Real-world problems often involve multi-criteria optimization, in which a function of two or more of these factors must be maximized or minimized, often with some compromise among the objectives in different areas. It may well be impossible to minimize energy consumption and maximize production rate at the same time.
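
A toy example of such a compromise: scan an operating variable and pick the value that minimizes a weighted sum of two competing penalties. Both cost curves and weights below are invented.

```python
# Weighted-sum trade-off between energy cost and off-spec penalty as a
# function of machine speed (all curves and weights are invented).
def energy_cost(speed):
    return 0.02 * speed ** 2          # rises with speed

def offspec_penalty(speed):
    return (speed - 60.0) ** 2 / 10.0  # sweet spot around 60 in this example

w_energy, w_quality = 1.0, 2.0
best = min(range(30, 91),
           key=lambda s: w_energy * energy_cost(s) + w_quality * offspec_penalty(s))
print(f"best compromise speed: {best} units")
```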

Optimization can be performed “off-line,” that is, without immediate impact on the process, or in real time, “on the fly,” where changes take effect right away. A common off-line optimization exercise is setpoint optimization, where engineers try to determine what setpoint of a process variable will produce the best results in terms of efficiency, quality and stability. Another problem commonly addressed off-line is winder cutting pattern or trim optimization, a trade-off between minimizing waste (narrow trim) and ensuring consistent paper quality across the sheet (wide trim, which eliminates problematic sheet edges).
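
A much-simplified version of the trim problem is: choose roll widths to cut from a fixed deckle so that trim loss is minimized. The brute-force sketch below uses invented widths; production trim optimizers also handle order quantities and many more constraints.

```python
from itertools import combinations_with_replacement

# Toy trim problem: pick roll widths that fit a fixed deckle with the least
# trim loss (widths in cm; all numbers invented).
deckle = 620
roll_widths = [90, 75, 60]

best_pattern, best_waste = None, deckle
for n in range(1, deckle // min(roll_widths) + 1):
    for pattern in combinations_with_replacement(roll_widths, n):
        waste = deckle - sum(pattern)
        if 0 <= waste < best_waste:
            best_pattern, best_waste = pattern, waste

print(f"best pattern: {best_pattern}, trim loss: {best_waste} cm")
```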

Real-time optimization adds several new and challenging tasks to those addressed off-line. Data reconciliation involves finding the “best fit” between observed process values and the requirements of mass and energy balance and physical feasibility that all processes must obey. A process unit cannot have greater than 100% efficiency, even if your data say so. Automatic loop tuning can adjust control loop responses to maintain top performance in the face of changes in incoming raw material or factors such as refiner plate wear. Dynamic scheduling algorithms can help the mill adjust to incoming rush orders or equipment breakdowns.
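
A minimal data reconciliation example, assuming three flow measurements around a splitter and a single mass-balance constraint (all figures invented):

```python
import numpy as np

# Three measured flows must satisfy F1 = F2 + F3, but the raw readings do
# not quite balance. Adjust each reading minimally, weighted by its variance.
measured = np.array([100.0, 61.0, 41.5])        # F1, F2, F3 (t/h)
sigma = np.array([1.0, 0.5, 0.5])               # sensor standard deviations
V = np.diag(sigma ** 2)                         # measurement variance matrix

A = np.array([[1.0, -1.0, -1.0]])               # mass balance: F1 - F2 - F3 = 0
residual = A @ measured                         # imbalance in the raw data

# Classical weighted least-squares reconciliation (Lagrange multiplier form)
correction = V @ A.T @ np.linalg.solve(A @ V @ A.T, residual)
reconciled = measured - correction.ravel()

print("raw imbalance:", residual[0])
print("reconciled flows:", reconciled.round(2))
print("check balance:", (A @ reconciled).round(6))
```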

Optimization can take place in various time frames. It has been suggested that an optimally functioning process has in fact been optimized at four levels: basic stabilization (control intervals less than one minute), coordination (one minute to two days), production scheduling (two days to one month), and long-range planning (one month to one year). Clearly the tools used to achieve optimal performance at these different time scales will be different and may well be used by different people (process engineers for the shorter time scales, mill or corporate management for the longer ones). Short-term optimization provides the foundation for longer-term efforts: corporate planning objectives cannot possibly be achieved if short-term process variation is not well-controlled.

Summary

Process optimization seldom has the urgency of headline-grabbing events like mill closures or corporate quarterly deficits, or of the daily challenges of operating a mill. However, the best longer-term strategy for survival may be to step back and look at improving the capability of that mill to provide top-quality product safely and efficiently at minimum cost, and with minimum impact on the environment. The tools and approaches mentioned here are just some of the possibilities that exist for undertaking that effort to the long-term benefit of the Canadian industry.

Diana Bouchard is an industry consultant with 26 years of experience in the fields of statistical data analysis, process modelling and simulation, expert systems and process optimization.

