Abstract: The discrete wavelet transform provides a numerically efficient method for decomposing basis weight and moisture data into multi-resolution components carrying frequency-related information. Wavelet decomposition allows noise filtering without loss of signal detail. Wavelet-packet analysis has been shown to give a more accurate diagnostic assessment of controller performance. An analysis of the improved quality and reduced waste that can result from optimized roll cutting based on wavelet-packet analysis is also given.
Paper machine data-analysis methods using wavelets and wavelet packets have been shown by Nesic [1] to be effective in filtering two-dimensional scanned data to remove noise while retaining all significant process variations. While wavelet decomposition restricts this multi-scale decomposition to binary frequency-band subdivisions at each stage, Coifman [2] showed that wavelet-packet methods remove this restriction and allow more complete control over the frequency ranges available at each level of decomposition.
The thresholding techniques used to remove noise components from the transformed data also result in a substantial reduction in the number of coefficients required for an accurate description of the signal. The resulting signal compression allows more efficient transmission of high-quality images over data networks and reduces the archival storage demand for process diagnostics.
Actuator spacing at the headbox of the paper machine is closely related to the range of spatial wavelengths that the cross-machine direction (CD) control system can influence. A software wavelet toolbox has been developed by Nesic [3] to provide an improved operator display of two-dimensional sheet variations, and to guide the operator by displaying those components of the CD variations within the spatial bandwidth of the controller. The presence of such controllable variations is an indication of the effectiveness of the control, and can be quantified in a control performance index.
A wavelet packet W is a square-integrable modulated waveform, well localized in both position and frequency. As shown in the following equation, it has three parameters: a scale parameter j (resolution level), a time-localization parameter k (translation level) and an oscillation parameter p:

W_{j,p,k}(x) = 2^{-j/2} W_p(2^{-j}x - k)   (1)
In the wavelet packet framework, the projections on the scaling function and the wavelet are recursively decomposed to obtain a binary tree. Figure 1 shows the filter-bank implementation of the discrete wavelet packet transform. Note that in the wavelet decomposition procedure, the generic step splits only the approximation coefficients into two parts at each stage. In wavelet packet decomposition, however, both the approximation and the detail coefficients are decomposed into two parts.
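The filter-bank recursion can be sketched in a few lines of NumPy. This is an illustrative sketch, not the toolbox implementation described above: it assumes orthonormal Haar filters and signals of dyadic length, whereas a production implementation would use longer wavelet filters and careful edge handling.

```python
import numpy as np

def split(c):
    """One Haar filter-bank stage: low-pass and high-pass, downsampled by 2.

    Returns (approximation, detail) coefficient vectors of half length.
    Assumes len(c) is even.
    """
    c = np.asarray(c, dtype=float)
    approx = (c[0::2] + c[1::2]) / np.sqrt(2.0)
    detail = (c[0::2] - c[1::2]) / np.sqrt(2.0)
    return approx, detail

def wavelet_tree(c, depth):
    """Plain wavelet decomposition: split only the approximation branch."""
    details = []
    for _ in range(depth):
        c, d = split(c)
        details.append(d)
    return c, details

def packet_tree(nodes, depth):
    """Wavelet-packet decomposition: split every node at every level."""
    for _ in range(depth):
        nodes = [half for c in nodes for half in split(c)]
    return nodes
```

For a signal of length 8, `packet_tree([x], 3)` yields eight subband vectors covering the frequency axis in equal bands, whereas `wavelet_tree(x, 3)` leaves the upper bands unsplit; both are orthonormal, so signal energy is preserved.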
Wavelet-packet decomposition provides increased flexibility because of the larger number of bases available for decomposition. The projection of a signal of length N onto wavelet-packet components produces a tree with a total of N log N coefficients. A signal of length N = 2^M can be decomposed in as many as 2^N different ways. To reduce the complexity and redundancy, a design objective must be used to choose a best representation of the original signal. The best-basis algorithm proposed by Wickerhauser [4] is a widely used procedure. This algorithm finds the most economical tree structure for representing a given signal; since the number of non-zero coefficients is minimized, the method is widely used for compression. The concept of entropy is used to estimate the information concentration in a signal, and several possible entropy definitions are discussed in [4]. Before each decomposition, the algorithm calculates the entropy of the child nodes and compares it to that of the parent node to decide whether further splitting is warranted. If the parent node has lower entropy, the decomposition is not carried out; if the children have lower entropy, the decomposition proceeds.
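The entropy-based pruning just described can be sketched as follows, again assuming Haar filters and using the non-normalized Shannon entropy as the additive cost function; the function names are illustrative.

```python
import numpy as np

def entropy(c):
    """Non-normalized Shannon entropy, -sum(c_i^2 * log(c_i^2)),
    one of the additive cost functions used in best-basis selection."""
    p = np.asarray(c, dtype=float) ** 2
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def split(c):
    """One Haar filter-bank stage (approximation, detail)."""
    c = np.asarray(c, dtype=float)
    return (c[0::2] + c[1::2]) / np.sqrt(2.0), (c[0::2] - c[1::2]) / np.sqrt(2.0)

def best_basis(c, depth):
    """Return the leaf coefficient vectors of the lowest-cost subtree.

    A node is split only when the children's total entropy is lower than
    the parent's -- the comparison described in the text.  Because the
    recursion optimizes the children first, the comparison is against the
    best achievable child cost.
    """
    c = np.asarray(c, dtype=float)
    if depth == 0 or len(c) < 2:
        return [c]
    a, d = split(c)
    children = best_basis(a, depth - 1) + best_basis(d, depth - 1)
    if sum(entropy(n) for n in children) < entropy(c):
        return children  # children are cheaper: keep the split
    return [c]           # parent is cheaper: prune the subtree
```

The leaves returned always partition the coefficient set, so total length and energy are preserved whatever tree is selected.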
Other issues in wavelet-packet analysis include the removal of noise components through thresholding, and the resulting compression of the data sets. The two-dimensional data describing sheet properties also require special handling, as do edge effects. These issues are discussed at greater length in Jiao [5].
Controller performance is assessed using a performance index that relates the noise-free overall profile variance to the variance of the controllable component. It is critical that an accurate estimate of the controllable component be available at this stage. However, once the scanned data resolution has been set, the set of frequency bands into which the data can be divided by wavelet analysis is also set. Once the wavelength λ of the original data is determined, wavelet analysis only permits separation at wavelengths corresponding to 2^n·λ. Significant errors in assessing controller performance may thus be present if the controller spatial bandwidth is not close to one of these pre-set bands. With wavelet-packet analysis, both the detail and the approximation components of the profile are decomposed, with the result that the wavelength division can be at any value n·λ. Greater precision in dividing controllable from uncontrollable components is therefore possible.
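To see why this matters, compare the frequency-band edges the two decompositions make available. The following is a small sketch assuming uniformly sampled data with Nyquist frequency f_N (the function names are illustrative): plain wavelet analysis offers only octave edges, while a depth-d packet tree offers edges on a uniform grid, so a controller cutoff can be matched far more closely.

```python
def wavelet_band_edges(f_nyquist, depth):
    """Octave (dyadic) band edges from plain wavelet analysis:
    f_N/2, f_N/4, ..., f_N/2**depth."""
    return [f_nyquist / 2 ** n for n in range(1, depth + 1)]

def packet_band_edges(f_nyquist, depth):
    """A full packet tree of the same depth divides [0, f_N] into
    2**depth equal bands, so edges fall on a uniform grid."""
    width = f_nyquist / 2 ** depth
    return [k * width for k in range(1, 2 ** depth)]
```

With f_N = 1 and depth 3, the wavelet edges are {0.5, 0.25, 0.125}, while the packet edges are every multiple of 0.125 up to 0.875; the dyadic edges are a sparse subset of the packet grid.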
The paper machine profile measurements contain various frequency components. Relating the different frequency ranges to the process is discussed in Cutshall [6], where the variation is classified as:
1. Short-term variation: <1s period;
2. Medium-term variation: 1s to 200s period; and
3. Long-term variation: Longer than 200s period.
Short-term variation starts where formation leaves off (wavelengths of about 100 mm) and includes all wavelengths up to the length of paper made in one second. Short-term variation is primarily affected by hydraulic pulsation, hydraulic stability and equipment vibration. Medium-term variation is primarily affected by blending, flow stability and fast control loops. Long-term variation is primarily affected by system stability and slow control loops.
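The classification above is a simple threshold rule on the period; a direct transcription (function name illustrative):

```python
def classify_variation(period_s):
    """Classify a variation component by its period in seconds,
    following the three ranges quoted from Cutshall in the text."""
    if period_s < 1.0:
        return "short-term"   # hydraulic pulsation, stability, vibration
    if period_s <= 200.0:
        return "medium-term"  # blending, flow stability, fast loops
    return "long-term"        # system stability, slow loops
```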
After wavelet decomposition to a suitable level, the signal is decomposed into different resolution components that correspond to different wavelengths. Based on this, the process variations can then be divided into categories associated with the controllable and uncontrollable wavelengths and noise. If there exist controllable components in the profile, improved control actions are required to remove those variations. Here the CD control bandwidth is assumed to be of wavelengths above two actuator spacings. This is an optimistic assumption that may be modified in those cases when the response to an individual actuator adjustment is known.
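Extracting the controllable (long-wavelength) component then amounts to decomposing to the chosen level, zeroing the detail bands and reconstructing. A Haar-based sketch under the same simplifying assumptions as before (dyadic length, orthonormal Haar filters, no edge handling):

```python
import numpy as np

def split(c):
    """One Haar analysis stage (approximation, detail)."""
    c = np.asarray(c, dtype=float)
    return (c[0::2] + c[1::2]) / np.sqrt(2.0), (c[0::2] - c[1::2]) / np.sqrt(2.0)

def merge(a, d):
    """Exact inverse of the Haar split."""
    c = np.empty(2 * len(a))
    c[0::2] = (a + d) / np.sqrt(2.0)
    c[1::2] = (a - d) / np.sqrt(2.0)
    return c

def controllable_component(profile, level):
    """Keep only variation at wavelengths of 2**level samples and above:
    decompose `level` times, discard all detail bands, reconstruct.

    Choosing `level` so that 2**level data boxes correspond to two
    actuator spacings approximates the controllability assumption in
    the text; this is an illustrative separation, not the toolbox code.
    """
    a = np.asarray(profile, dtype=float)
    for _ in range(level):
        a, _ = split(a)
    for _ in range(level):
        a = merge(a, np.zeros_like(a))
    return a
```

A flat profile passes through unchanged, while a profile alternating at the Nyquist wavelength has no controllable content at all.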
After separating the controllable and uncontrollable components, the potential improvement in control that can be achieved for a given actuator spacing can be quantified in terms of the following performance index.
Let σ²_pr be the variance of the de-noised profile, and let σ²_contr be the variance of the controllable component. The performance index is defined as

C = σ²_pr / (σ²_pr - σ²_contr)   (2)
If C = 1, perfect control is achieved, meaning that all controllable variations have been removed. C < 1 is not possible, since σ²_contr < σ²_pr. C > 1 means that controllable variations are still present in the profile. The performance index provides a straightforward quantitative assessment of the process operating condition, and it can be calculated on-line as soon as the scan data are available. Together with the profile decomposition plots, it gives the operator a quick and effective assessment of the manufactured reel and of the control performance.
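Equation (2) translates directly into code. A sketch with hypothetical names, using the sample variances of the de-noised profile and of its controllable component:

```python
import numpy as np

def performance_index(denoised_profile, controllable_part):
    """C = var(profile) / (var(profile) - var(controllable component)).

    C == 1 means no controllable variation remains (perfect control);
    C > 1 means controllable variation is still present.
    """
    v_pr = float(np.var(denoised_profile))
    v_contr = float(np.var(controllable_part))
    if v_contr >= v_pr:
        raise ValueError("controllable variance must be below profile variance")
    return v_pr / (v_pr - v_contr)
```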
Industrial scanned data are used to illustrate the approach. The on-line scanner data consist of 130 scans and 685 data boxes. The data were calibrated by separate off-line analysis of both MD and CD samples using a TAPIO Analyzer. The wavelet decomposition level that separates the CD controllable and uncontrollable variations is level 3. By choosing the most appropriate wavelet-packet decomposition tree, a more accurate estimate of the controllable variance is obtained. The packet decomposition tree is shown in Fig. 2. This results in an estimate of the controller performance index that differs significantly from that given by wavelet analysis, as illustrated in Fig. 3.
In Fig. 3, perfect control corresponds to a performance index value of 1, while values greater than 1 indicate the presence of controllable variations. Note that the wavelet analysis gives a significantly greater estimate of the performance index than the wavelet packet decomposition. In this case, the wavelet value is an unnecessarily pessimistic performance estimate. In other cases the estimate may be optimistic.
TRIM LOSS OPTIMIZATION
The trim-loss problem arises when large stock is cut into smaller sections whose dimensions are determined by customer requirements. In this case, the jumbo paper reel is slit into individual rolls that meet inventory needs. Inefficient cutting results in a large amount of trim loss, so material and production resources are wasted or must be recycled. This paper does not develop new optimization methods; it modifies and applies a simple optimization model. Paper quality is taken into account during optimization by dividing the reel into usable and unusable sections based on the measured sheet quality. The approach thus builds a connection between sheet-profile data analysis and production optimization by the paper manufacturer.
First, the paper quality is assessed on the basis of the wavelet data analysis. The usable sections of the paper reel are identified, and the trim-loss problem is solved while avoiding the unusable sections of the jumbo reel. For illustrative purposes in the example considered, acceptable product quality is represented by those portions of the sheet for which the basis weight estimate is within two standard deviations of the mean value. The Lingo modelling language and solver [7] is then used as a tool for solving the trim-loss optimization problem, subject to the requirement that only acceptable product is to be sold. The approach assumes that the slitter settings may be adjusted to avoid unacceptable sections of the jumbo reel.
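The actual optimization is solved with Lingo. Purely as an illustration of the underlying pattern-selection idea, a brute-force search over slitter patterns for a single jumbo width can be sketched as follows; the names and the simplifications (ignoring unusable regions, roll length and demand minima) are ours, not the paper's model.

```python
from itertools import product as cartesian

def best_pattern(jumbo_width, roll_widths, max_counts):
    """Enumerate slitter patterns (counts of each roll width) that fit in
    the jumbo width and return the one with minimum trim loss.

    Brute force is adequate at this scale: a handful of roll widths and
    small per-product maximum counts.
    """
    best, best_waste = None, jumbo_width + 1
    ranges = [range(m + 1) for m in max_counts]
    for counts in cartesian(*ranges):
        used = sum(c * w for c, w in zip(counts, roll_widths))
        if used <= jumbo_width and jumbo_width - used < best_waste:
            best, best_waste = counts, jumbo_width - used
    return best, best_waste
```

For a 100-unit jumbo width with roll widths (30, 45, 20) and maximum counts (3, 2, 4), the search finds the zero-waste pattern of two 30-unit and two 20-unit rolls. The real formulation adds demand minima, multiple patterns and the avoidance of unacceptable sections, which is why an MILP solver is used in practice.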
In recent years, the trim-loss problem in the paper industry has been solved with the aid of linear programming combined with heuristic rules to handle the non-linearities and discrete decisions. A survey of methods for solving the non-linear problem can be found in Hinxman [8], whose taxonomy and terminology are used here. More recently, Sweeney and Haessler [9] proposed a two-stage sequential heuristic for solving one-dimensional cutting-stock problems in which both the master rolls and the customer orders have multiple quality gradations. The objectives are to minimize trim loss, avoid production overrun and avoid unnecessary slitter set-ups. At the first stage, slitting decisions are made for the non-perfect master rolls; at the second stage, an LP model is solved for the remaining demand using the available perfect master rolls. In most cases, practical formulations of the trim-loss problem are restricted by the requirement that the solution method handle the entire problem, so only a sub-optimal solution is obtained. Harjunkoski and Westerlund [10] formulated the trim-loss problem to minimize the raw material used, as well as processing time and production waste, and compared different ways of posing it as an integer linear programming or mixed-integer linear programming problem.
Here, several different application software tools are used together to accomplish the trim optimization. One is Lingo, which provides the optimization-modelling environment and optimization solver. Another is the Wavelet Toolbox, which implements the wavelet filtering of the sheet property profile to evaluate the paper quality.
The roll order information, such as roll width and minimum and maximum demand, is supplied by the user. To integrate the different programs, an application interface for the input information has been designed in Visual Basic, as shown in Fig. 4. This information is passed to the Lingo solver through a Visual Basic script. The optimization is solved by the Lingo solver and the results are saved in data files and in Microsoft Excel spreadsheets through Object Linking and Embedding. Several command buttons allow the user to select data sets, optimize them under various conditions, and export the results.
Three cases are considered and, once again, an industrial data set is used to illustrate the approach. Three different cut-roll widths are specified, together with minimum and maximum demand for each product; all rolls are assumed equal in length. The first case corresponds to roll production without taking quality issues into account: rolls containing unacceptable production must then be discarded after cutting. In the second case, the same jumbo reel is divided into rolls after first determining the low-quality regions, and the optimization avoids these regions. In the third case, it is assumed that an improved CD control design has been introduced so that 60% of the controllable variations have been removed; the improved jumbo reel is then subjected to optimized cutting. As an example, Fig. 5 shows an optimized cut in which the trim has been set to avoid including any unacceptable production (dark bands) in the rolls.
Using the optimization results, the total amount of reject paper is calculated and the three schemes are evaluated in terms of absolute reject rates. Comparison of these three optimization schemes is shown in Table I.
In Table I, the number of rolls produced for each roll category is listed, and the total waste is calculated. It can be seen that the number of rolls produced under Scheme 1 is highest. However, because paper sheet quality has been ignored, a substantial number of rolls of unacceptable quality need to be discarded. The total waste will therefore be the largest among the three schemes.
For Scheme 2, the paper quality is taken into consideration and the unacceptable areas are bypassed during cutting. The total waste is now 28%, much less than the 61% of Scheme 1. The total waste for Scheme 3 is the least since, after improving control, the quality of the entire reel is improved and the amount of unacceptable product is reduced, thereby reducing trim loss. This illustrates the potential benefit of improved control in terms of paper savings during trim-loss optimization.
The project reported here had as its primary aim the improvement of paper machine productivity through increased insight into operating conditions.
The use of wavelets and wavelet packets in paper machine data analysis was shown to be effective in determining the controllable and uncontrollable components of the cross-machine response. Process monitoring and control performance assessment were achieved by first separating the controllable and uncontrollable variations in the cross-machine direction profile using multi-resolution analysis. The performance and the control potential of the system were then evaluated using both wavelet and wavelet-packet analysis, and the results compared. These analyses have the following advantages:
1. Wavelet transforms can represent paper machine process data economically and also provide excellent operator visualization.
2. Wavelet methods provide high compression that allows efficient data storage and transfer between mills.
3. Wavelet-packet analysis can achieve an accurate separation of CD controllable and uncontrollable variation and control performance assessment.
4. A control performance index can be calculated on-line to provide the operator with a quick assessment of the process.
The processed paper machine profile was then used for trim-loss optimization in roll cutting. The method discussed here, which takes the quality of the paper sheet into account during optimization, was used to demonstrate the potential economic benefits of optimized cutting. Three different optimization schemes were discussed and compared, and the potential savings through optimization after improved control were analysed. Optimized cutting was found to improve the saleable fraction from 39% to 72% of the paper produced; a further improvement, to 85% utilization, is available if improved CD control can be realized. Clearly, there is scope for major improvement in product quality and for a reduction in recycled broke.
The financial support of the Natural Sciences and Engineering Research Council of Canada and the Network of Centres of Excellence of Canada is gratefully acknowledged. Zoran Nesic provided expert assistance in applying the wavelet toolbox.
1. NESIC, Z., DAVIES, M.S., DUMONT, G.A. Paper Machine Data Analysis and Compression Using Wavelets. Tappi J. 80(10): 191-204 (1997).
2. COIFMAN, R., MEYER, Y. Signal Processing and Compression with Wavelet Packets. Progress in Wavelet Analysis and Applications. Toulouse, France: Editions Frontières, 77-93 (1992).
3. NESIC, Z., DUMONT, G.A., DAVIES, M.S., BREWSTER, D. CD Control Diagnostics Using a Wavelet Toolbox. Proc., CD Symposium, IMEKO, Tampere, Finland, Vol XB, 120-125 (1997).
4. WICKERHAUSER, M.V. Lecture on the Wavelet Packet Algorithm. Department of Mathematics. Washington University (1991).
5. JIAO, XUEJUN. Paper Machine Data Analysis and Optimization using Wavelets. MASc Thesis, Department Electrical and Computer Engineering, University of British Columbia (1999).
6. CUTSHALL, K.A., ILOTT, G.E., ROGERS, J.H. Grammage Variation - Measurement and Analysis. Grammage Variation Subcommittee, Process Control Committee, TS, CPPA, Montreal (1988).
7. LINGO: The Modelling Language and Optimizer. Chicago: Lingo Systems (1998).
8. HINXMAN, A.I. The Trim-loss and Assortment Problems: A Survey. European Journal of Operations Research 5:8-18(1980).
9. SWEENEY, P.E., HAESSLER, R.W. One-Dimensional Cutting Stock Decisions for Rolls with Multiple Quality Grades. European Journal of Operations Research 44:224-231(1990).
10. HARJUNKOSKI, I., WESTERLUND, T. Different Formulations for Solving Trim Loss Problems in a Paper-converting Mill with ILP. Computers in Chemical Engineering 20 (Suppl): 121-126 (1996).