AIM Artificial Neural Networks

1. Introduction

In this section of the course we are going to consider neural networks. More correctly, we should call them Artificial Neural Networks (ANNs), as we are not building neural networks from animal tissue. Rather, we are simulating, on a computer, what we understand about neural networks in the brain. During this course, however, we will use the terms neural network and artificial neural network interchangeably.

We start this section of the course by looking at a brief history of the work done in the field of neural networks. Next we look at how a real brain operates (or as much as we know about how a real brain operates). This will provide us with a model we can use in implementing a neural network. Following this we will look at how we can solve simple logic problems using a neural network, and in doing so we will discover the limitations of such a model.

As with other parts of the course I have used AIMA (Russell, 1995) where possible. In addition, some of the material is based on (Fausett, 1994), which is a good introductory text to neural networks, as is (Aleksander, 1995). You might also take a look at (Davalo, 1991) and (Callan, 1998). Many of the seminal papers (for example, McCulloch/Pitts, Minsky/Papert, Widrow, Hebb) can be found in (Anderson, 1988).

2. Research History

McCulloch and Pitts (McCulloch, 1943) are generally recognised as the designers of the first neural network. They recognised that combining many simple processing units together could lead to an overall increase in computational power. Many of the ideas they suggested are still in use today. For example, the idea that a neuron has a threshold level, and that once that level is reached the neuron fires, is still the fundamental way in which artificial neural networks operate.
The McCulloch and Pitts network had a fixed set of weights, and it was Hebb (Hebb, 1949) who developed the first learning rule. His premise was that if two neurons were active at the same time then the strength of the connection between them should be increased.

In the fifties and throughout the sixties many researchers worked on the perceptron (Block, 1962; Minsky & Papert, 1988, originally published in 1969; Rosenblatt, 1958, 1959 and 1962). This neural network model can be proved to converge to the correct weights, if weights exist that will solve the problem. The learning algorithm (i.e. weight adjustment) used in the perceptron is more powerful than the learning rule used by Hebb. The perceptron caused great excitement at the time, as it was thought to be the path to producing programs that could think. But in 1969 (Minsky & Papert, 1969) it was shown that the perceptron had severe limitations, which meant that it could not learn certain types of functions (i.e. those which are not linearly separable).

Due to Minsky and Papert's proof that the perceptron could not learn certain types of (important) functions, research into neural networks went into decline throughout the 1970s. It was not until the mid 80s that two people (Parker, 1985; LeCun, 1986) independently discovered a learning algorithm for multi-layer networks, called backpropagation, that could solve problems that were not linearly separable. In fact, the process had been discovered in (Werbos, 1974) and was similar to another algorithm presented by (Bryson & Ho, 1969), but it took until the mid eighties to make the link to neural networks.

3. The Brain

We still do not know exactly how the brain works. For example, we are born with about 100 billion neurons in our brain. Many die as we progress through life, and are not replaced, yet we continue to learn. Although we do not know exactly how the brain works, we do know certain things about it.
We know it is resilient to a certain amount of damage (in addition to the continual loss we suffer as we get older). There have been reports of objects being passed (if passed is the right word) all the way through the brain with only slight impairment to the person's mental capability. We also know what certain parts of the brain do. We know, for example, that much information processing goes on in the cerebral cortex, which is the outer layer of the brain.

From a computational point of view we also know that the fundamental processing unit of the brain is a neuron. A neuron consists of a cell body, or soma, that contains a nucleus. Each neuron has a number of dendrites, which receive connections from other neurons. Neurons also have an axon, which goes out from the neuron and eventually splits into a number of strands to make connections to other neurons. The point at which one neuron joins another is called a synapse. A neuron may connect to as many as 100,000 other neurons. A simplified view of a neuron is shown in the diagram below.

Signals move from neuron to neuron via electrochemical reactions. The synapses release a chemical transmitter which enters the dendrite. This raises or lowers the electrical potential of the cell body. The soma sums the inputs it receives and, once a threshold level is reached, an electrical impulse is sent down the axon (this is often known as firing). These impulses eventually reach synapses and the cycle continues. Synapses which raise the potential within a cell body are called excitatory. Synapses which lower the potential are called inhibitory. It has been found that synapses exhibit plasticity. This means that long-term changes in the strengths of the connections can be formed depending on the firing patterns of other neurons. This is thought to be the basis for learning in our brains.

4. The First Artificial Neuron

Much of this section is taken from (Fausett, 1994).
As mentioned in the research history, McCulloch and Pitts (1943) produced the first neural network, which was based on their artificial neuron. Although this work was developed in the early forties, many of the principles can still be seen in the neural networks of today. We can make the following statements about a McCulloch-Pitts network.

The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero). For the network shown below the activation function for unit Y is

    f(y_in) = 1 if y_in >= θ, else 0

where y_in is the total input signal received and θ is the threshold for Y.

Neurons in a McCulloch-Pitts network are connected by directed, weighted paths. If the weight on a path is positive the path is excitatory, otherwise it is inhibitory. All excitatory connections into a particular neuron have the same weight, although differently weighted connections can be input to different neurons. Each neuron has a fixed threshold. If the net input into the neuron is greater than the threshold, the neuron fires. The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing. It takes one time step for a signal to pass over one connection.

A sample McCulloch-Pitts network is shown above and some of the statements can be observed. In particular, note that the threshold for Y has to equal 4, as this is the only value that allows it to fire, taking into account that a neuron cannot fire if it receives a non-zero inhibitory input.

Using the McCulloch-Pitts model we can model logic functions. Below we show and describe the architectures for four logic functions (the truth tables for each function are also shown).

    AND          OR           AND NOT      XOR
    X1 X2  Y     X1 X2  Y     X1 X2  Y     X1 X2  Y
    1  1   1     1  1   1     1  1   0     1  1   0
    1  0   0     1  0   1     1  0   1     1  0   1
    0  1   0     0  1   1     0  1   0     0  1   1
    0  0   0     0  0   0     0  0   0     0  0   0

AND Function

As both inputs (X1 and X2) are connected to the same neuron, the connections must have the same weight, in this case 1. To model the AND function the threshold on Y is set to 2.
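A McCulloch-Pitts unit is simple enough to sketch directly in code. The following Python is a minimal, illustrative implementation (the function names are my own, not from the notes); the weights and thresholds are the ones described in the function descriptions: every unit has a threshold of 2, AND uses weights 1 and 1, OR uses 2 and 2, and AND NOT uses an excitatory 2 and an inhibitory -1.

```python
def mp_neuron(inputs, weights, threshold):
    """A McCulloch-Pitts unit: fire (output 1) if the weighted sum
    of the binary inputs reaches the threshold, otherwise output 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Every unit below has a threshold of 2, as in the architectures in the text.
def AND(x1, x2):
    return mp_neuron([x1, x2], [1, 1], 2)

def OR(x1, x2):
    return mp_neuron([x1, x2], [2, 2], 2)

def AND_NOT(x1, x2):   # X1 AND NOT X2
    return mp_neuron([x1, x2], [2, -1], 2)

def XOR(x1, x2):
    # Two-layer construction: (X1 AND NOT X2) OR (X2 AND NOT X1).
    return OR(AND_NOT(x1, x2), AND_NOT(x2, x1))
```

Applying each function to the four input patterns reproduces the four truth tables above; note how the single inhibitory connection in AND NOT keeps the unit below threshold whenever X2 is on.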
OR Function

This is almost identical to the AND function, except that the connections are set to 2 and the threshold on Y is also set to 2.

AND NOT Function

Although the truth table for the AND NOT function is shown above, it deserves a small explanation as it is not often seen in the textbooks. The function is not symmetric, in that an input of 1,0 is treated differently to an input of 0,1. As you can see from the truth table, the only time true (a value of one) is returned is when the first input is true and the second input is false. Again, the threshold on Y is set to 2, and if you apply each of the inputs to the AND NOT network you will find that we have modelled X1 AND NOT X2.

XOR Function

XOR can be modelled using AND NOT and OR:

    X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)

(To prove it, draw the truth table.) This explains the network shown above. The first layer performs the two AND NOTs and the second layer performs the OR. Both Z neurons and the Y neuron have a threshold of 2.

As a final example of a McCulloch-Pitts network we will consider how to model the phenomenon that if you touch something very cold you initially perceive heat. Only after you have left your hand on the cold source for a while do you perceive cold. This example (from Fausett, 1994) is an elaboration of one originally presented by (McCulloch and Pitts, 1943).

To model this we will assume that time is discrete. If cold is applied for one time step then heat will be perceived. If a cold stimulus is applied for two time steps then cold will be perceived. If heat is applied then we should perceive heat.

Take a look at this figure. Each neuron has a threshold of 2. This, as we shall see, allows us to model this phenomenon. First though, remember that time is discrete, so it takes time for the stimulus (applied at X1 and X2) to make its way to Y1 and Y2, where we perceive either heat or cold. Therefore, at t(0) we apply a stimulus to X1 and X2. At t(1) we can update Z1, Z2 and Y1.
At t(2) we can perceive a stimulus at Y2. At t(2+n) the network is fully functional.

Before we see if the network performs as we hope, let's consider what we are trying to do. Input to the system will be (1,0) or (0,1), which represent hot and cold respectively. We want the system to perceive cold if a cold stimulus is applied for two time steps. That is

    Y2(t) = X2(t - 2) AND X2(t - 1)    (1)

This truth table (i.e. the truth table for AND) shows that this model is correct.

    X2(t-2)  X2(t-1)  Y2(t)
    1        1        1
    1        0        0
    0        1        0
    0        0        0

We want the system to perceive heat if either a hot stimulus is applied, or a cold stimulus is applied (for one time step) and then removed. We can express this as follows

    Y1(t) = [ X1(t - 1) ] OR [ X2(t - 3) AND NOT X2(t - 2) ]    (2)

This truth table shows that it gives us the required result.

    X2(t-3)  X2(t-2)  AND NOT  X1(t-1)  OR
    1        1        0        1        1
    1        0        1        1        1
    0        1        0        1        1
    0        0        0        1        1
    1        1        0        0        0
    1        0        1        0        1
    0        1        0        0        0
    0        0        0        0        0

So, if we are convinced that we have the correct logical statements to represent the hot/cold problem, we now need to convince ourselves that the network above represents the logical statements. The figure of the network shows that

    Y1(t) = X1(t - 1) OR Z1(t - 1)    (3)

(compare this to the OR network we developed earlier). Now consider the Z1 neuron. This is how it is formed

    Z1(t - 1) = Z2(t - 2) AND NOT X2(t - 2)    (4)

(again, compare this to the AND NOT network above). Now Z2; this is simply

    Z2(t - 2) = X2(t - 3)    (5)

If we take formula (3) and substitute in formulas (4) and (5) we end up with

    Y1(t) = [ X1(t - 1) ] OR [ X2(t - 3) AND NOT X2(t - 2) ]    (6)

which is the same as formula (2), showing that our network (Y1 anyway) works correctly (we have already proved that formula (2) works by a full analysis using the truth table). We can perform a similar analysis for Y2 and show that Y2 in the network acts in the way we developed above (formula (1)). You should do this to convince yourself that this is correct.
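Another way to convince yourself is to simulate the network step by step. The sketch below is my own illustrative Python; the connection weights are inferred from formulas (3)-(5) and the figure (all thresholds 2; X1 and Z1 feed Y1 with weight 2, Z2 and X2 feed Y2 with weight 1 each, Z2 feeds Z1 with weight 2, X2 inhibits Z1 with weight -1, and X2 feeds Z2 with weight 2, giving the one-step delay).

```python
def mp_neuron(inputs, weights, threshold=2):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def simulate(stimuli, settle=2):
    """stimuli is a list of (x1, x2) pairs, one per time step, where
    x1 = hot applied, x2 = cold applied. Returns one (y1, y2) pair per
    step: y1 = heat perceived, y2 = cold perceived."""
    z1 = z2 = 0
    perceptions = []
    for x1, x2 in stimuli + [(0, 0)] * settle:   # let signals propagate
        y1 = mp_neuron([x1, z1], [2, 2])         # Y1 = X1 OR Z1
        y2 = mp_neuron([z2, x2], [1, 1])         # Y2 = Z2 AND X2
        new_z1 = mp_neuron([z2, x2], [2, -1])    # Z1 = Z2 AND NOT X2
        new_z2 = mp_neuron([x2], [2])            # Z2 copies X2, one step later
        z1, z2 = new_z1, new_z2
        perceptions.append((y1, y2))
    return perceptions
```

Running simulate([(0, 1)]) (cold for one step, then removed) eventually reports heat and never cold; simulate([(0, 1)] * 4) reports cold while the stimulus lasts; and simulate([(1, 0)]) reports heat immediately, as the informal description requires.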
If you still don't believe it, there is a spreadsheet available from the course web site that implements this network, so that you can see that it works as we expect.

5. Modelling a Neuron

To model the brain we need to model a neuron. Each neuron performs a simple computation. It receives signals from its input links and uses these values to compute the activation level (or output) of the neuron. This value is passed to other neurons via its output links. The input value received by a neuron is calculated by summing the weighted input values from its input links. That is

    ini = Σj wj,i aj

An activation function takes the neuron input value and produces a value which becomes the output value of the neuron. This value is passed to other neurons in the network. This is summarised in this diagram and the notes below.

    aj   : Activation value of unit j
    wj,i : Weight on the link from unit j to unit i
    ini  : Weighted sum of inputs to unit i
    ai   : Activation value of unit i (also known as the output value)
    g    : Activation function

Or, in English: a neuron is connected to other neurons via its input and output links. Each incoming neuron has an activation value and each connection has a weight associated with it. The neuron sums the incoming weighted values and this value is input to an activation function. The output of the activation function is the output from the neuron.

Some common activation functions are shown below. These functions can be defined as follows.

    Stept(x) = 1 if x >= t, else 0
    Sign(x) = +1 if x >= 0, else -1
    Sigmoid(x) = 1 / (1 + e^-x)

On occasion an identity function is also used (i.e. the input to the neuron becomes the output). This function is normally used in the input layer, where the inputs to the neural network are passed into the network unchanged.

6. Some Simple Networks

We can use what we have learnt above to demonstrate a simple neural network which acts as a logic gate.
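The neuron model of Section 5 can be written down directly. This Python sketch (the names are mine) computes ini as the weighted sum of the incoming activations and applies an activation function g, with the step, sign and sigmoid functions as defined above.

```python
import math

def step(t):
    """Returns the step function with threshold t: Stept(x)."""
    return lambda x: 1 if x >= t else 0

def sign(x):
    return 1 if x >= 0 else -1

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def unit_output(activations, weights, g):
    """ai = g(ini), where ini is the sum over j of wj,i * aj."""
    in_i = sum(w * a for w, a in zip(weights, activations))
    return g(in_i)
```

For example, unit_output([1, 1], [1, 1], step(2)) reproduces the AND neuron from the McCulloch-Pitts section.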
The diagram below models the following truth tables.

    AND:  Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 0 0 1
    OR:   Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 1 1 1
    NOT:  Input:   0 1                          Output: 1 0

In these networks we are using the step activation function. You should convince yourself that these networks produce the correct output for all the allowed inputs.

You will notice that each neuron has a different threshold. From a computational viewpoint it would be easier if all the neurons had the same threshold value and the actual threshold was somehow modelled by the weights. In fact, it is possible to do exactly this. Consider this network. It is the same as the AND network in the diagram above except that there is an extra input neuron whose activation is always set to -1. The threshold of the neuron is represented by the weight that links this extra neuron to the output neuron. This means the threshold of the neuron itself can be set to zero. You might like to work through this network using the four possible combinations of x and y and convince yourself that it operates correctly. The advantage of this method is that we can always set the threshold of every neuron to zero and, from a computational point of view, when training the network we only have to update weights and not both thresholds and weights.

7. Types of Network

The simple networks we have considered above only have input neurons and output neurons. Such a network is considered a one-layer network (the input neurons are not normally counted as a layer, as they are just a means of getting data into the network). Also, in the networks we have considered, the data only travels in one direction (from the input neurons to the output neurons). In this respect they are known as feed-forward networks. Therefore, we have been looking at one-layer, feed-forward networks. There are many other types of network. An example of a two-layer, feed-forward network is shown below.

The Perceptron

The name perceptron is now used as a synonym for single-layer, feed-forward networks.
They were first studied in the 1950s and, although other network architectures were known about, the perceptron was the only network that was known to be capable of learning, and thus most of the research at that time concentrated on perceptrons. The diagram below shows examples of perceptrons.

You will see, in the left hand network, that a single weight only affects one of the outputs. This means we can make our study of perceptrons easier by only considering networks with a single output (i.e. similar to the network shown on the right hand side of the diagram). As we only have one output we can make our notation a little simpler: the output neuron is denoted by O and the weight from input neuron j is denoted by Wj. Therefore, the activation function becomes

    O = Step0( Σj Wj Ij )

(Note, we are assuming the use of an additional weight to act as a threshold, so that we can use a Step0 function rather than Stept.)

What can perceptrons represent? We have already seen that perceptrons can represent the AND, OR and NOT logic functions. But does it follow that a perceptron (a single-layer, feed-forward network) can represent any boolean function? Unfortunately, this is not the case. To see why, consider these two truth tables.

    AND:  Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 0 0 1
    XOR:  Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 1 1 0

We can represent these two truth tables graphically (where a filled circle represents an output of one and a hollow circle represents an output of zero). If you look at the AND graph you will see that we can divide the ones from the zeros with a line. This is not possible with XOR. Functions such as AND are called linearly separable. It was the proof by Minsky & Papert in 1969 that perceptrons could only learn linearly separable functions that led to the decline in neural network research until the mid 1980s, when it was shown that other network architectures could learn these types of functions.
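The single-output perceptron, with the threshold folded into an extra weight on a fixed -1 input, is small enough to test exhaustively. The sketch below is my own illustrative code: a brute-force search over a coarse grid of weight triples finds weights that realise AND but none that realise XOR, which is exactly what linear inseparability predicts (the grid, step size and function names are assumptions, not from the notes).

```python
from itertools import product

def perceptron(weights, inputs):
    """O = Step0 of the weighted input sum; weights[0] is the
    threshold, carried by a fixed input of -1."""
    s = -weights[0] + sum(w * i for w, i in zip(weights[1:], inputs))
    return 1 if s >= 0 else 0

def representable(truth_table):
    """Search a grid of weight triples (-2.0 to 2.0 in steps of 0.1)
    for a setting that reproduces the whole 2-input truth table."""
    grid = [g / 10.0 for g in range(-20, 21)]
    for w in product(grid, repeat=3):
        if all(perceptron(w, inp) == out for inp, out in truth_table.items()):
            return True
    return False

AND_TABLE = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR_TABLE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
```

The search succeeds for AND (e.g. weights 1.5, 1, 1 draw a line separating (1,1) from the other three points) and fails for XOR, no matter how fine the grid, because no single line can separate its ones from its zeros.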
Although only being able to learn linearly separable functions is a major disadvantage of the perceptron, it is still worth studying, as it is relatively simple and can help provide a framework for other architectures. It should, however, be realised that perceptrons are not limited to two inputs (e.g. the AND function). We can have n inputs, which gives us an n-dimensional problem. When n = 3 we can still visualise the linear separability of the problem (see diagram below). But once n is greater than three we find it difficult to visualise the problem.

8. Learning Linearly Separable Functions

With a function such as AND (with only two inputs) we can easily decide what weights to use to give us the required output from the neuron. But with more complex functions (i.e. those with more than two inputs) it may not be so easy to decide on the correct weights. Therefore, we would like our neural network to learn so that it can come up with its own set of weights. We will consider this aspect of neural networks for a simple function (i.e. one with two inputs). We could obviously scale up the approach to accommodate more complex problems, providing the problems are linearly separable.

Consider this truth table (AND) and the neuron that we hope will represent it.

    AND:  Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 0 0 1

In fact, all we have done is set the weights to random values between -0.5 and 0.5. Applying the activation function for each of the four possible inputs to this neuron actually gives us the following truth table,

    ???:  Input 1: 0 0 1 1   Input 2: 0 1 0 1   Output: 0 0 1 0

which can also be represented graphically. The network obviously does not represent the AND function, so we need to adjust the weights so that it learns the function correctly. The algorithm to do this follows, but first some terminology.

Epoch : An epoch is the presentation of the entire training set to the neural network.
In the case of the AND function an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1]).

Training Value, T : When we are training a network we not only present it with the input but also with the value that we require the network to produce. For example, if we present the network with [1,1] for the AND function, the training value will be 1.

Error, Err : The error value is the amount by which the value output by the network differs from the training value. For example, if we required the network to output 0 and it output a 1, then Err = -1.

Output from Neuron, O : The output value from the neuron.

Ij : Inputs being presented to the neuron.

Wj : Weight from input neuron (Ij) to the output neuron.

LR : The learning rate. This dictates how quickly the network converges. It is set by experimentation. It is typically 0.1.

The Perceptron Training Algorithm

    While epoch produces an error
        Present network with next inputs from epoch
        Err = T - O
        If Err <> 0 Then
            Wj = Wj + LR * Ij * Err
        End If
    End While

Note : If the error is positive we need to increase O; if the error is negative we need to decrease O. Each input contributes Wj * Ij to the total input, so if Ij is positive an increase in Wj will increase O, and if Ij is negative an increase in Wj will decrease O. The weight update above achieves exactly this, and is often called the delta learning rule.

Perceptron Learning - An Example

Let's take a look at an example. The initial weight values are 0.3, 0.5 and -0.4 (taken from the example above) and we are trying to learn the AND function. If we present the network with the first training pair ([0,0]) from the first epoch, nothing will happen to the weights (due to multiplying by zero). The next training pair ([0,1]) will result in the network producing zero (by virtue of the Step0 function). As zero is the required output there is no error, so training continues. The next training pair ([1,0]) produces an output of one.
The required output is 0, therefore the error is -1. This means we have to adjust the weights, which is done as follows (assuming LR = 0.1):

    W0 = 0.3 + 0.1 * -1 * -1 = 0.4
    W1 = 0.5 + 0.1 *  1 * -1 = 0.4
    W2 = -0.4 + 0.1 * 0 * -1 = -0.4

Therefore, the new weights are 0.4, 0.4, -0.4. Finally we apply the input [1,1] to the network. This also produces an error, and the new weight values will be 0.3, 0.5 and -0.3. As this presentation of the epoch produced an error (two in fact) we need to continue the training and present the network with another epoch. Training continues until an epoch is presented that does not produce an error.

If we consider the state of the network at the end of the first epoch (weights = 0.3, 0.5, -0.3), we know the weights are wrong, as the epoch produced an error (in fact, the weights may be correct at the end of an epoch, but we need to present another epoch to show this). We can also produce a graph that shows the current state of the network. We are trying to achieve the AND function, which can be represented as follows.

The current linear separability line can be drawn by using the weights to draw a line on the graph. The two points where the line crosses the I1 and I2 axes can be found as follows:

    I1 point = W0 / W1
    I2 point = W0 / W2

Note : As discussed above, W0 actually represents the threshold, so we are dividing the threshold by the weights. That is, I1 = 0.6 and I2 = -1. This line is shown (roughly in the right position) on the above graph. It is clearly not in the correct place, as it does not divide the ones from the zeros correctly.

If we continue with the training until we get no errors from an entire epoch, the weights would be 0.4, 0.4, 0.1. If we plot these weights on the graph (again, the line is roughly in the correct position) we can see that the linear separability line is now in a valid position, and this can be confirmed by checking that the neural network is producing the correct outputs.
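The training walk-through above can be reproduced in code. This is an illustrative sketch (the names are mine), with two stated assumptions: exact fractions are used so that weights that pass through 0.0 compare cleanly, and the neuron fires only on a strictly positive net input, which is the reading consistent with the final weights 0.4, 0.4, 0.1 quoted above.

```python
from fractions import Fraction as F

def output(weights, inputs):
    # inputs[0] is the fixed -1 bias input; weights[0] carries the threshold.
    # Assumption: fire only on a strictly positive net input.
    net = sum(w * i for w, i in zip(weights, inputs))
    return 1 if net > 0 else 0

def train(pairs, weights, lr=F(1, 10), max_epochs=100):
    """Delta rule: Wj <- Wj + LR * Ij * Err, presented epoch by epoch
    until a whole epoch produces no error."""
    for _ in range(max_epochs):
        error_in_epoch = False
        for inputs, target in pairs:
            err = target - output(weights, inputs)
            if err != 0:
                error_in_epoch = True
                weights = [w + lr * i * err for w, i in zip(weights, inputs)]
        if not error_in_epoch:
            break
    return weights

def intercepts(w):
    """Axis crossings of the decision line -W0 + W1*I1 + W2*I2 = 0."""
    return w[0] / w[1], w[0] / w[2]

# The AND training set; each pattern has the fixed -1 bias input prepended.
AND_PAIRS = [((-1, 0, 0), 0), ((-1, 0, 1), 0), ((-1, 1, 0), 0), ((-1, 1, 1), 1)]

weights = train(AND_PAIRS, [F(3, 10), F(5, 10), F(-4, 10)])
```

Starting from 0.3, 0.5, -0.4 this converges after a handful of epochs to 0.4, 0.4, 0.1, and the first epoch's intermediate weights (0.4, 0.4, -0.4 and then 0.3, 0.5, -0.3) match the worked example; intercepts([F(3,10), F(5,10), F(-3,10)]) gives the 0.6 and -1 crossings used for the graph.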
You might like to do this for weights of 0.4, 0.4 and 0.1. Unless I set some coursework that involves you building a neural network, a spreadsheet available via the course web site will show you how a perceptron can be implemented in a spreadsheet. This will allow you to experiment with it (for example, showing that an XOR type problem can never be learnt).

9. References

Aleksander, I., Morton, H. 1995. An Introduction to Neural Computing (2nd ed). Chapman and Hall

Anderson, J.A., Rosenfeld, E. (eds). 1988. Neurocomputing: Foundations of Research. Cambridge, MA: MIT Press

Block, H.D. 1962. The Perceptron: A Model for Brain Functioning, I. Reviews of Modern Physics, Vol 34, pp 123-135. Reprinted in Anderson and Rosenfeld, 1988, pp 138-150

Bryson, A.E., Ho, Y-C. 1969. Applied Optimal Control. New York: Blaisdell

Callan, R. 1999. The Essence of Neural Networks. Prentice Hall, ISBN 0-13-908732-X

Davalo, E., Naim, P. 1991. Neural Networks. The Macmillan Press Ltd, ISBN 0-333-54996-1

Fausett, L. 1994. Fundamentals of Neural Networks: Architectures, Algorithms and Applications. Prentice-Hall, ISBN 0-13-103805-2

Hebb, D.O. 1949. The Organization of Behavior. New York: John Wiley & Sons. Introduction and Chapter 4 reprinted in Anderson & Rosenfeld, 1988, pp 45-56

Le Cun, Y. 1986. Learning Processes in an Asymmetric Threshold Network. In Disordered Systems and Biological Organization (Bienenstock, E., Fogelman-Soulié, F., Weisbuch, G. (eds)), NATO ASI Series, F20, Berlin: Springer-Verlag

McCulloch, W.S., Pitts, W. 1943. A Logical Calculus of the Ideas Immanent in Nervous Activity. Bulletin of Mathematical Biophysics, Vol 5, pp 115-133. Reprinted in Anderson & Rosenfeld, 1988, pp 18-28

Minsky, M.L., Papert, S.A. 1988. Perceptrons, Expanded Edition. Cambridge, MA: MIT Press. Original Edition, 1969

Parker, D. 1985. Learning Logic. Technical Report TR-87, Cambridge, MA: Center for Computational Research in Economics and Management Science, MIT

Rosenblatt, F. 1958.
The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, Vol 65, pp 386-408. Reprinted in Anderson and Rosenfeld, 1988, pp 92-114

Rosenblatt, F. 1959. Two Theorems of Statistical Separability in the Perceptron. Mechanisation of Thought Processes: Proceedings of a Symposium held at the National Physical Laboratory, November 1958. London: HM Stationery Office, pp 421-456

Rosenblatt, F. 1962. Principles of Neurodynamics. New York: Spartan

Russell, S., Norvig, P. 1995. Artificial Intelligence: A Modern Approach. Prentice-Hall

Werbos, P. 1974. Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. Ph.D Thesis, Cambridge, MA: Harvard U. Committee on Applied Mathematics

AI Methods - Graham Kendall - 03/04/00
/2瘖(uyY)ˮ!BF}@0k^ާu=Hޖeu5ZA$ FeyXkY3p\5lal7%\s3wMmT&[F陏(Y>BwV0b< U)&"k'~RYRXLa>&U{ja-t Ƹ0ow~)*%' ̳ ,fO)STNw@g,p麧4BͽB),S $)>^Lrc1jPN0KNrH{ 08!F8.f-BycSWTj/rx0_!3ЃJ;p24'a6@8Ź:'\l#5TJ9 o@R\7 2ۧ!8,ΦnۚdEi(I*a_h6&\Jǂ )Sˮl69eww\H6 r NFJ)qc]\@^Њx^ՙ3xiϽlUrZ&ۺ+ı~ 3mb]X4$Lyj&y[̰SD`s7l&LQzGۛ/v;IzK u xHZ {Ka xp&-~c }l_! BZ줓N:mqsK xoheքY/IOv`uwSŒ6~X)%IN1 fttE>? 1Ä=9oo8%3ofÌ`+vpŮ3~[+fGۧ\:餓N:DШ[IENDB`nt|z[=PNG  IHDRENgAMAPLTEU~ pHYsodIDATxQhgV]LZ;qF"NuC)}1C */mXa"i 1I`So} Jd z ڣXhYaYݝGI\ڐroߙFf_n/mg0ΐ2 p6f"0[[jaӖfaF@ ̢yޥfiᐡ$@fmeRp;5f9ȇJr}YzQuzT,(5Ui pe3 XmLl•lLqʵ1ۄxW¹on,pq7vp蠍4Xi߄a:K?;pi0xKM!N1{  vk,w8BS_pIgS/EۇLgSǰmB^1iR jv"Q cUj(fO2dyR8`vD׸gcaSskF;ORhKk0&jхE1ps9 {^sBX Fn/^$0 uSdF leSYk%^@{PY8X7GU6*:UNV+Vh SADfd VU4mK3SU6&H†qFٖTp-]/Y5N3E3 1I̅jZ ' XqNqMLsYG1nRפ8ʏ!1PLPdZqS1 DžPLD1ҽ^0l3 { T qB1>ȱ"#KewlQ0g<}T\ǙJ0ކ .J,g0cp]p|V@euBWdЃsru QLs3XT]=^6BnhMsXO&c9.7j6\1>jd&H4A%K} JYZi ) ׂўX~<ɸ=3h@0а1A"sd &UPo'jQ8 &OdN'tMy ['d{?=V4MƇ'Î+jΥXߢTј軀f4zWՙ4C焩2R)\d苹qb.p3>6gAp eǭ ?1zSqXȕJs7;} cͬ\ 6=x}\n|ĉTꎷ1N^AD5:ZRakз '@0\*0)Ǡ K-I8M"kIx,sYgLF>طyf)N^6Wŏw޶oF'`]pҦIYͨQ9&~ѕձ9}Xǝ1)sF[;kSMe_7#:ၾ q#r[EP7_TYR2&}mοh.}yߤ_yŤ-[h%=boYݽo?E[a ʎF+ #f½2c \dqi0<;s,>eú`ikcd%m㩞zA?qDb.ŀ?T˿)ẇ37x֙WS5c/G1waDIԹA<u*.VŨs <3 :Ij|YGj:,@VGpwCH)07ZR}3o2K0& >3qA|/mܒW&˘į =: {91+/q<^ "&;,(diSIrbl%#^pض/C0 ;a#S2)!e8uG1ufbTl>s7稻:ю9K#P`ΡI(@kt- Fۘ"5i[х}J݆OK3eOƈv JIȿa S5:(Z[P$9Cs rȨ+v,E1+ْr*}HRԝ*9C )2[DSjh&⵨sB$ -Z<yF:H1[W"E9. XSHVDrۘ~K|@|.F:NΙzK}}$ '1t:vzڵ¸v4H3ί|fG\ߝk~{@C\xNAƿYH$VX[PZXc<HBg, x^B:xή{HP]&;3;oR@=[N$ 28W%ɮ3ȋH"{68=Jc\T;-jCtB4M[~޼lkb@"ͅ[\$ɱUfF|Gz}s{_dp8lKk^5Z::deQwޜ[ʪq}dTT:0 g=S,R5sѽ?$Z<a:gpKo^Oϙߊr DdVC& 0  # Ab a'fd-\Xg Dn_ a'fd-\XPNG  IHDRV6r5gAMAPLTEU~ pHYsj IDATx\kv:4&`hcI3$vh:8[1}5-6D98UXrQ%a4cMgh^-AF隯իWՃPiV? yې3yY@!gXl!C@ۆ$6$D>m+!Kmg0C؇هKCbKuȟ!~mugunmC>CeDۆ=jZtmȊuFXZV؅ȶ0}2v! 
K¤,- s..̠b}] 3$,- ,̘YUvd]+nd]Ua:Ua: eiUnKH s`2t0c+U4LY)̪wH$6 9U/?|w/@qLXq&LKK%kC4/mr&LK'v/+U0H,=fjyLܰC.IHG S5yIJU[rҍ$+kSC:K!*9KQQ8.ChdaSy!d dc̿jM"&Ν:E?CWDB&ALNl5SUd,jrFeX=T*jG'6TRQ4eJ'?N@*m"o~ Sl2G1HEM=7jT%{ pϸzonw@`PQFX 4(UhKx*ٜJ1MOs9 }ɲ%o$z $K3a2cSiKP\ECTY Y"&+POP}9wꙁtXKՐd1,Nk9܆8˶ z~I%Ka )L Locդ=r])A T %˜iMĜa");N|Aa"aX kH籠ZXLQɂ2VK=e侕H9S.LA`%싉 aA`پtFXX6oFgrU3hqm1`ڔdrPTy,D#lCl{%0f,!z,Rgbm(JNBzhCD쵕GyJ^shEuZ+F!}IAF1{=bM(ڸt@ 8_A\e,E:2Y%!㗙 |n!UҡYϠ"0jK天hgq)OQ2XVFX.>ey\eGS*&ڒv)$QjKYbyX=c(z|Z.b.K,ےmMe7 Hq`/u9|e#H/ڒFhKMux @uʒ/ <q뒇Bl lȈUQ#K%uSMZG4~l?~Nh/?dn?fqK#0dy:"ڴ{?td9[=?|K%dǝ'͋Ő7G^/o(v@?\>mz@-}lmXo\h!vzk²Ӆ*!O=~~u?џ+<{/wՇ Ë:->__=~O~~@֖n8KEp ?ʈBڵ?I1gA@C\xNAƿYH$VX[PZXc<HBg, x^B:xή{HP]&;3;oR@=[N$ 28W%ɮ3ȋH"{68=Jc\T;-jCtB4M[~޼lkb@"ͅ[\$ɱUfF|Gz}s{_dp8lKk^5Z::deQwޜ[ʪq}dTT:0 g=S,R5sѽ?$Z<a:gpKo^Oϙߊr$Dd0  # Ab|z[=|5 d  !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~)(Root Entry9 F0!I@VData  VWordDocument8 bObjectPool;P?V@V_949909131 FŭVŭVOle CompObjMObjInfo "',012345689:;<=>?A FPBrushPBrushPBrush9q FMicrosoft Equation 3.0 DS Equation Equation.39qOle10Native pOle10ItemName_949740949 FLV0ӰVOle pBMvp>(V8p?`0 `1x~ ~p `6>1c~0q61cÇ338c0؜68'1a~3c0|`6?1 `>38g0>6?1a~30>?1x~ ~03񟟾x߇?0 ``3<a0 0``s`0 0`s`0 0@p@00@@ ~gp8n7^3>uW 8 ]U~p8WUU|UU_uUU|8`]UUW0WUUU_`uUUUU`>5UuUUUW0?pU_UUU~8`UUUUU` ?<  Uw}g;0?0UUUUUU| `?`0U_UUUUUWp@|? 
@UWUUUUUU_ ` UUUUUUUU ?UuUUUUUUWp@08uU]UUUUUUU~@` @5UUWUUUUUUUU8=0>UUUUUUUUU_`UUUuUUUUUUUU|0 0 UU]UUUUUUUUU`  |pUUUWUUUUUUUUU_ UUUUUUUUUUUU~pUUUuUUUUUUUUUW80UU]UUUUUUUUUU UUWUUUUUUUUUU}?@>uUUUUUUUUUUUUUUW``05UUUuUUUUUUUUUW~>00 <UUUU]UUUUUUUUUUuU` p|pUUUWUUUUUUUUUUuU_  UUUUWUUUUUUUUU_UU|8UUUUUUUUUUUUUUUUW0p?`UUUU}UUUUUUUUUUUU_`0UUUUU_UUUUUUUUUU]UUU0cUUUUWUUUUUUUUU]UUW UUUUUUUUUUUUUWUUU~3 0 UUUUUUUUUUUUUWUUU  > 5UUUUU?@0@UUUUUUUUUUUUUWUUUUx0 p` UUUUUUUUUUUUUUUUUU @ ?@ UUUUUUUUUUUUUUUUuUUUW|UUUUUUUUUUUUUUUUUUUU\@0  @0UUUUUUUUUUUUUUUUuUUUp `p `8UUUUUUUUUUUUUUUUUUUU?p0UUUUUUUUUUUUUUUuUUWC `?UUUUUUUUUUUUUUUuUU\`@`@xuUUUUUUUUUUUUUUUuUUp0=` `0~85UUUUUUUUUUUUUUUuUU00x|5UUUUUUUUUUUUUUUUUW UUUUUUUUUUUUUUUuU\  8@ UUUUUUUUUUUUUUUUUpp ?`8UUUUUUUUqUUUUUuU8` 0 UUUUUUUUuUUUUUUW 0`s}ww5UUUUUu\8 0 p0 UUUUUUUuUUUUUup8``8UUUUUUUqUUUUUU?!uUUUUUUUUUUUUUUw` &@85UUUUUUUU]UUUUU\ @p08 ``5UUUUUUUUUUUUUp0`?UUUUUUUUUUUUUU 0 UUUUUUUUU}UUUW 8  @pUUUUUUUUUWUUU\ @8=cUUUUUUUUUWUUUp UUUUUUUUUUuUU  p 8UUUUUUUUUUUWp  UUUUUUUUU_U\ 8 uUUUUUUUUUUpp5UUUUUUUUUU 80 p7UUUUUUUUUU UUUUUUUUUU\ ;  uUUUUUUUUUs  8]UUUUUUUUU`` p UUUUUUUUUW  UUUUUUUU\8UUUUUUUUp 80 ]UUUUUUU uWUUUUUUW `x5UUUUU\085UUUUUpUUUUU w;s UUUUW UUUU\8UUUUp?UUUUUUUUUUWUUUU\8uUUUUp`5UUUU05UUUWUUU\ 8 UUUp UUUUUWUU\8UpUuVd 5X405` ?p?0?0?    CompObj fObjInfoEquation Native  g_949740567 FYVYV"K` O=Step 0 WjIj j " FPBrushPBrushPBrush9qOle  CompObj MObjInfoOle10NativeS      !"#$%&'*;:,-./0123456789RT=>?@ABCDEFGHIJKLMNOPQSUXYZ[\]^_`abcdefghijklmnopqrstuvwxyz{|}~SBMR>(R?3>1'3c1c16c3&c3c1c16c1?c3c1c~1c1?c~wsw1gw;c;ws ~?~s?>?gߟ <7qI0Iqa@IH @@IHx@@IH@8@IH@"@H@3̐@\x@@@@@@@0$'$'$&c2>g$&fd>g$$"fFc2$$$$ fFfd$$"BD"$$$$BD$$$$"BD"$$$$BD$$$$"BD"$$$$BD$$&fbBD"&fd$BD$$%BFb%.BFd$ B B. 
@ @ @ @pP20@@@ @ ` @   3 31@ 1@1`{pp@ @ @ @ @ @ @ @ @ `8p?X?\/N'G#C!A >@ @` w0;>8 p8>À>`pp088~8 |}Up`8|U`0`pUXp88UUX p8?UUXUUXUUXpp0p08Up8q`p8> uU 8?_>q`pp8>pp8?p p8@@ @ `~q8@p 0;`@8 `@p ?8?0@  8@ @ ?s@ @'@p>?`?|<p?_0??C/s @ #@!?0@p 8`@< 8|Ϗ`@ x@ ppy0p< 8?>p>?p`8`x0>0?8p`>wz;<?Uxp>>`U`xp80?UUp???UUP{??UUX????UUP?UUpp?0U`p0|88>`?Upwz;8|~q p|688>8lp|?ppp|`88>8 p|@1 ?q@c @ p|@ 8~8@8 @p!π8@#pG 0/_p?s?x0`?@<p>@p??@'s@#@!p@ q8@<  8>@p 8|@8 p;?p0 }sp|`p`8|p8|8pp 8pp 8|p 8>@p0|0?8q8`p;` 8`p8>08q88pps  ? 8?0s?08`|qp;0``|<`p|x9py `| 8<~00p|`0`>p|`@8 >=W@0 |;z@`  U`@ >ꪰA8 pUpC0 8񪪸G`!08?UUXN'`\/UUXX??p?UUX` ?@ 8UUX@ @  ?Up@ 0@8 `U`@0 >z@` 8=W08` À80`8| 3|Àn|πOle10ItemName_949740400!  FIV`VOle CompObjM FPBrushPBrushPBrush9q FPBrushPBrushPBrush9qObjInfoOle10Native+Ole10ItemName_949738106 F`V`VBM>(7p <@a  |x  L<8pp8 `   0 0p `p@08x@8<@8  D |fm܀8@B`&0#88!pf 8rp1??8`p30p` 8 p??`p00p?| 3p8`88xp p <8g`p1 8|0`?`0 ?` 8p80C``O0A;a!8 ?p?r p10p8s`pp@8@?@D fm 8܀|p@0p<xp p8c  0 0`p x8<>| B0Lhb&Ole CompObjMObjInfoOle10Native <* FMicrosoft Equation 3.0 DS Equation Equation.39q"K in i =W j , i a jj "*BMn*>(.0*?p ?q090`910`q0`q0`3a0`930q3?1?0@```r r01@01@@1@`yp808xp<pp<?<?><pxx88x8p<y8;0 0p``80``00``00` 0 ??!"@"  `? `   "c8 8"@c "@`0 00` 0````0`800p ?8q?  8 ppp<<xp<8<@>@`>p?<p8?<px@```r r01@01@@1@`yp8Ole10ItemName_949734765,#FVVOle CompObj"$fObjInfo%Equation Native g_949733397( F~V`VOle   FPBrushPBrushPBrush9q FPBrushPBrushPBrush9qCompObj'*!MObjInfo#Ole10Native)+WsOle10ItemName$"!sBMs>(A\sx0x` `~c`  ? `0c `|0??                         @ @@ @@ 3@@ +@ 1 1 X  v A r ` `      @ @ 8 ? x???? ?  x?  ?    >        p p p 8 8 8 8 8                                  | >     ? @ @@ @@ @ @@ @@ @@ @@ @ @                            0 r @ 0?~c~a~>0~<<~<p<``` _9494043802&. F@V@VOle %CompObj-0&MObjInfo( FPBrushPBrushPBrush9qOh+'0Ole10Native/1d=Ole10ItemName)_9493971584 FV`:VOle *`=BMV=>(=q8 Ì`S0 L@ d`! 
&`!ـ F'`DMDb&y8 l` @   <@D@F `?ᙀa=Ì`` ?cQa!L@1a!!d`!ı&`w1ىDЄF'`1Ȉ lHb&1 ` @ 8< p8p><p0|?8p>Oy@@>@8?@?px@|@?@@xx@@|8@??@x>@@0@~>>8@?8~@p<@x@>@@x?@<~@p<x@#>8@368@FFx@< fl0@"( @8280<@p;< @8<66<`@<0`@ @@ x@@8<` @p0@`8@@@`< @`@ @ @ p8@ 88@ 88@8@8@@~@?>@~ p@x| p@` `p@ `p@00`p@px08@8 8@l  8@\08@<x Lș@8lM͈@xfL@pp&DD`@8x"\`p@  x@F<@<pf@V@>`@>p`@|G"@c|@<|g`<?@=>"c@A@` c@~8< >@x@~@@<>@~@p@8?@?<|@@@@<x@p@8G<<8?8 @?| p=c"pp? <0 Á AxA  p8c #Ϟp<8@ G 0> Àz< Cp0 c@8? c ` 3 q 1 1 1 9 ? 5 ~s   `CompObj36+MObjInfo-Ole10Native57ĎOle10ItemName.BM>(h`0>"zBc"8㈈B3A" < HB!" < HB9 f HBC" f ?HBc"B|b1>"~`"""c|">@```` 0000      ` 00 0 0?p pf1p xB0H@`p8 xr `p̂>  `̂ 0b10>?|  >`8 `@  ` `x``0` 0``0<0p00`00:0`0`0`jp `<z00 ` x0 ꪫ `| p03 z  p :z00j0p`>`ꪪ``:p0ꪪ `zzꪪ183n`ꪪꪪr H$ @ 0 ꪪ #` & ꪪ $``#0ꪪ0`0@0j0``0 @`:000`À`x0<0  `  ` pp0<0:0p0ꪪ0 0p>p 8``ꪪ0`@>8`?< |pz8`ꪮ`< ?꺪0 008a80ꪪ`?ꪪ UW ꪪUU} <UUx  UWUX  0UUXpꪪꪪ``5UUpj0UW:>  `0 008?00x~0`0` |~  ꪼ |`ꪪx` ? 0`ꪪ `8|A|8< <@ 8A1:< A `81 A !! A b< A j` c10 |_|p  <  88` zax8 ꪪ`0p0j>0 0``:`0`:z0` pja8  8` p0 :`0?pf1 xB0@j xr z`̂> 8ĵ ` b1<>:>?f0 : ~08pxx80| 088p `>pp?? ` @ `s`?` 00 `0? 0?0`p|>8`8p<`8>=hG A |@8>`D "A @ Ȁx``Lj$ A @H`` `$ A @0` 0 ?$ |A @0 8 1$ A @Ȁ0 1f0 3c1`Ȁp  G=c|_=|X8        8:p0;88`j`p0`0 80j `p@  `j``8pp0 8`=`  0008 06~`pf`0`|`p 0 À` pp>  <0 p <`p`88`8`00 0`p8`  ~ :0 0> ?Op" 1Fa8 xA D0HB! ` 0A D HC! 00A  HA 0kxA D H@! 
q0Hc1 F1@f!;0؄>/ſG| 8` @  8p p 00z`00z p0x|8f;18x{8xzpx       #p%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijklmnont|z[=PNG  IHDRENgAMAPLTEU~ pHYsodIDATxQhgV]LZ;qF"NuC)}1C */mXa"i 1I`So} Jd z ڣXhYaYݝGI\ڐroߙFf_n/mg0ΐ2 p6f"0[[jaӖfaF@ ̢yޥfiᐡ$@fmeRp;5f9ȇJr}YzQuzT,(5Ui pe3 XmLl•lLqʵ1ۄxW¹on,pq7vp蠍4Xi߄a:K?;pi0xKM!N1{  vk,w8BS_pIgS/EۇLgSǰmB^1iR jv"Q cUj(fO2dyR8`vD׸gcaSskF;ORhKk0&jхE1ps9 {^sBX Fn/^$0 uSdF leSYk%^@{PY8X7GU6*:UNV+Vh SADfd VU4mK3SU6&H†qFٖTp-]/Y5N3E3 1I̅jZ ' XqNqMLsYG1nRפ8ʏ!1PLPdZqS1 DžPLD1ҽ^0l3 { T qB1>ȱ"#KewlQ0g<}T\ǙJ0ކ .J,g0cp]p|V@euBWdЃsru QLs3XT]=^6BnhMsXO&c9.7j6\1>jd&H4A%K} JYZi ) ׂўX~<ɸ=3h@0а1A"sd &UPo'jQ8 &OdN'tMy ['d{?=V4MƇ'Î+jΥXߢTј軀f4zWՙ4C焩2R)\d苹qb.p3>6gAp eǭ ?1zSqXȕJs7;} cͬ\ 6=x}\n|ĉTꎷ1N^AD5:ZRakз '@0\*0)Ǡ K-I8M"kIx,sYgLF>طyf)N^6Wŏw޶oF'`]pҦIYͨQ9&~ѕձ9}Xǝ1)sF[;kSMe_7#:ၾ q#r[EP7_TYR2&}mοh.}yߤ_yŤ-[h%=boYݽo?E[a ʎF+ #f½2c \dqi0<;s,>eú`ikcd%m㩞zA?qDb.ŀ?T˿)ẇ37x֙WS5c/G1waDIԹA<u*.VŨs <3 :Ij|YGj:,@VGpwCH)07ZR}3o2K0& >3qA|/mܒW&˘į =: {91+/q<^ "&;,(diSIrbl%#^pض/C0 ;a#S2)!e8uG1ufbTl>s7稻:ю9K#P`ΡI(@kt- Fۘ"5i[х}J݆OK3eOƈv JIȿa S5:(Z[P$9Cs rȨ+v,E1+ْr*}HRԝ*9C )2[DSjh&⵨sB$ -Z<yF:H1[W"E9. XSHVDrۘ~K|@|.F:NΙzK}}$ '1t:vzڵ¸v4H3ί|fG\ߝk~{IDATx1#5o %^%'Y9J$H"hR ŇNʼn~'DAbK"zˌMxe|=xQ&Wmc۶ms Q[C䖟P{G,e#VXs!WPCؓuJp'DsHzc"Op"m W q G1DwXjo"ŵJvC؟%n!}}甂oHƽ[6ę mbֺ9dnhnI@2$^9.̓?WT7bz5ó:nGQ<0e90KLD\kQ\s ,LX1Jk ,L8)=ޱ~8B\j B {>V6XHhKMKliN̦J56Wښm@s&<:'ZI+~ŎpW;Nb a)]ωb#{W(KOxtx.=?yN.Yܕmq_ E];SjqT:6sV+ 3_実&Uc>יs~jt Jv}tϽgUpcZ d8)V5╽g.&!c `y{s:LR7|ԍU>tB|(9N*nJP/_ x} ku+2CyߥLR䱜qNc[9S/~KmgzW7zl5S`3?_Wvr&u1 ZIZ[T"{QWwu߱'Pci:uL,zJ֜Neۓש0[kq{tfݝ;K~~-,0d}B>ޫ,4ӵQf 4pֽT'OsO% ߛ{cʇMNs 2$ *]=yW0'\"/_{8IENDB`=Dd. <  C A? b0yF}(pfB#dn0yF}(pfBPNG  IHDR.ÂgAMAPLTEU~ pHYsodIDATxAoDgɪ% qxU/}NpEʡ?ďxRgr3"<)d]zמqZW '*|;;;;kN:,@^S_ NRR\aÜ?qVO9̷ikLbd:m G:y+̏֘Q_@-0༊r3UGCL֔%ХGbN>HbDiY^Ƽ┓֘;ƗҳSʌZ*McH41Nd#UDkb|;&H;:˫BN@!6N+L8rYZ {tuB( ̠`3{9!JSL . 
/2瘖(uyY)ˮ!BF}@0k^ާu=Hޖeu5ZA$ FeyXkY3p\5lal7%\s3wMmT&[F陏(Y>BwV0b< U)&"k'~RYRXLa>&U{ja-t Ƹ0ow~)*%' ̳ ,fO)STNw@g,p麧4BͽB),S $)>^Lrc1jPN0KNrH{ 08!F8.f-BycSWTj/rx0_!3ЃJ;p24'a6@8Ź:'\l#5TJ9 o@R\7 2ۧ!8,ΦnۚdEi(I*a_h6&\Jǂ )Sˮl69eww\H6 r NFJ)qc]\@^Њx^ՙ3xiϽlUrZ&ۺ+ı~ 3mb]X4$Lyj&y[̰SD`s7l&LQzGۛ/v;IzK u xHZ {Ka xp&-~c }l_! BZ줓N:mqsK xoheքY/IOv`uwSŒ6~X)%IN1 fttE>? 1Ä=9oo8%3ofÌ`+vpŮ3~[+fGۧ\:餓N:DШ[IENDB`XDd<  C A? 2$ybs>)d`!$ybs>  @CjxNAp{pH@ $ZkGͩL 9 %Ž`q.{'lv;3]BH,iYX$0Vڲ8#+$[rUrx)fqRj@GW!/] :jjg\϶^4 ɁKp8ﴚPOw=.~ҿ<[풭{^z|qj[߭y*鯳%RSt%WrпO=p~w^["}܋u+?yy܎>.=c^_ƚ;A=bج7\uԲ&ŗ:oDdA+0  # Ab]үֱ݊ !,dn]үֱ݊ PNG  IHDRAz:gAMAPLTEU~ pHYs+XIDATxMOFpbnͪQދJߢGG8U]Hi)C=CSKzCuJ ZJ/1g}'2̌q `0 n4).Z2I$ .Th0^)IAA"K\ ,a\RrpGĹ P?轹|z875>>-HO¡? xơ$PYpGqGqyK| ;wwT;rh<מ.vjos.LmW202h'Qpqu4xwk8^/9H;ml1K<6C!K⏘p9c!N 88#8s'/ ~ᆱ8E"K\NV_d; qmnP7ף5ݙ^"%N: 1Ĝg@LWKO`# H*N jW_ o-h V][)+%rnAnǔqG%viv 85CeC!x֊n/6ø1~R ķA\{~_J N_{?l8tFBi_~`}>> pLKQVRwW Qrt֐uGC%K@Z+u9FBaDIGB\!ax^ A:1hK<6sd=v_,T%_` c}[ -z/f87%~=%D0%>g'hƷfO,߆E?nN}_eR&WFyCL+DۇH$-a pi vA*Z%%p{;k9 ,f,Æ.3y@$Ȳ̷f#@#U37,5 ; j7mԌi˩QJF~Q~3^c#K846C8sDU`5H hϠEU=)8akl>q+vS!LelF]C%7FA 6͡-؈uH#իQ,аTJ`3>A*Q۱ Q0*N( 6v$@- y9d-Rz^ȗ} O^ȑA@-Q S !DG'XF$DK**O BGPS/P ig#X9d Zm!@^!s~ ѲnJM (ZRMNT$WcBovڈCۈ T4:8sC@Q*!OٌpeC_Q kwdؒ K@HȊW[)O?c 6N[z0j14AIF40ŗ8IqṂ uR!2i.; kIy;ĽJ6DLQD|-X<^3 H u"riS0sau}i!bNJOvg߄lpBT2S~ FYHfBpsM£ pP{m7&dw D.pt/5ܒ OD-"ܬ@吞 [[ɑ7Qj<6BwN Vlm 7ĺ〄 NAu]$ NDŽ4,ȉQ2BAp]Bh*| ëjBhӫ3+Y Āk") %$Q!-9$j!I{2uđuȄ'e-EBB~-kq5D@&AF BwFnsH ^\$qRˆt$혐%T=IƃtV 1Bhur<U NÎ t$H7Iwj8 O7D4W\zނ$1FB!TT 6KրIe=  EYaDEﶬ%ps.ԘB8ISxslc8xu#ߓs݉ (_[i#>Qx?XŅ}mZY݂n0/ޫ &-9{suA%!QS@97 zi#!RRGw#O'8F!w}:{b>)  vVN6^]I 炜,MG|21drx'<3g䍧OpmO.=3dR -ڒV擩Le*ST/?{hIENDB`[Dd3'n0  # Abō\7)o;g Ljnō\7)o;g LPNG  IHDR gAMAPLTEU~ pHYs.>*IDATxMlGvU==ɘ&fFk00-'{۶"0ئ(C"EӐ|TY\x!8PH NtuWucf$ӿ__ Ld"+10xPG= >z&&룇{dY{L3?J)jΏ 1@10ΎCnc>GĿ=ُmcΐ1@1@t> i=Au`Pq8zi#gB* M= vtC/8=8Gis5"E9apVt:bE->*p}&:Ef}=+sAGpC7evqeRJ/ī0*ed?JJvK-s:x$j9)P0R |82JFDE ]FV0{ _/e_Y;/?˯>xYO̊ts{so\+$ˈtaݿw[kӍrbvd>}grWZfƛX4M Ni]R%BL5ؽv!^ 5O"?qM|ŋbgoawzn;_޿}Fq(сx!,5BӟnH(|B 
-ĸlЅ|m'0TͻY@087FQej4bb瘅# ě_&?*\v *op=Fv S=]B% G=d6!n=072Bq+ʔ*dfIѡ ̡䑖xiʋ)5d轟u P GDySp7({xGjۿ8zޛwH(+v 21׸c0}܏x/>S[h&̣y=x7V?j]n(k PѨCnOM۽(S 5dYP梅 ~_nD͛&}ik)g3<ְ(Ā>{=}r?\88X~))Q1&A#4綿{I"'[7Hߪ.M>{Qshա:#ׂbBNJ3 B (.$]N@EVXP1p/^5 pZD-(sNb$m+XCb_mIՁÐAQlW8jC!41SN ;)ok׀6_Zm̪ѯ<t>FV?ڐ]gfS9|E:Yh5/]{CWjslTK*4Wf):^/ҡ"}+E$\J6 yAS> edu)OM TWt]ȨujZ /GF%!B6T+bȑ )OJzz*SuTc jrl R[\4Sv\S\-,iKVRgMyt: nqҩ:_Vơ*jŽu%7wyyr@Mk;}Rֱ xy)$Du|}O@N Rcb=^?ț|K~Om87ؿg]Ay6IEeBvwY? uo/d5g׷o]NFA% c/y,K"յ$ϟ4`Ol B-$ $́nqE@Cs&Y|)ɇfA9]M_7(s8L3`~Et7/r5fJf )t#+Ntʛa6s$ ,f̑-چ L-JL6d*Q!܀ij |9!u[nv};y7H㷂H2]$/ϬXӛSV||?m񋩦"FԧK׏|=JAa) HUdI$GyBeƓ̈́+@C_&ߑ&ET6wuI`Ιũ+tsm'63L!&Zc' J]qq\3Qӗ7+5D *Z }ߌ'﵏WHq^)b:X1ͺKOp:Lmq;)й_5PK?Xۍ3 -7rMu>M`RhҐp͕]Q<Żm B$E.1f~kTq]')hb?uƓu:FClB)xPQ$ tiEC˷=Ի h[a|I-Jh,gIb xqA$<^V^ TMM^I Ka'dIU5)L 'I3D;QVM@Zth \hjВyBڸi %P23\k $YLXfARjfOKt/gEˎt3qS'+e: (1P{Y_'S$O设#YAxTu_QV /o@=mWWXK)vf0蟏$bT4”|u-+Ypt/SVQDSlYr$TLY}GՔVA[TrzDu80Ayd~|tmXfV('B18 RbC}Q}RM4`{)75Yqk]25wQDǗQ+>@t"gNRmKJiUI!LA4X2 w#}ȟ%m:S_/DZDcva]楗Y[g͟)ڃlrmUm8͟cݴƌ|FB(vn{XDi1ȔSdJCC>8hU;dXTdݙ4W 'K?7@xCELB|<+PӔ@79>ܭ:ux t=.~XH1˝З[ڠSR4AYj>k•P,>ة;$PŶM?xx"MA wMPu%،ipeCRAWmuM)_U]mEӰvr5eH GSRsrj 8PD*C/' ܃T MES:/: h:d?/r@* LӼ|h̢7{xD U"alv{neysZ5EPY3P/UXZPT[W=h~*TEӴ8Ta鯿<Wtnaw^gyP>8ղ\^io2QSXT\5N?N[+~SL|@ÑfhڦT?)[  38RkJ;g$>3G$Ώ;BҐkT޳x# W$05pv6Ѵ!%vT"I+ԯi .o?bT=$j bJքXxJZPa@6+~5!''OG]fŸX[YGx1穥@@vl<!ga.gt a 6XFyWEBX5[.0v9`-A)6%(c ulaXko #= H}~ڟk-~jjO2(P 79 mhV]m@r,4OaPL8*\hs3jŠ<aZ>)5\3+vڊ K!/q<7MU b6ڌ DqK>Sbk“>nHϙ`M_3M\lck\iRU'Obo+7<|Y ֙0M"hGB{/BSA牟 Eb\Oմ//IPBqL@+rk+Z%:  L_jy*|{Q'ߟD?ΓYț9pnM*gՇ)di.xP+ʆ/x۔?U\fIGJ! 
%73!Z#}Α *B傷G5hxWv4-U.x`pք\vmj/'{@傷~h\<~k:i9ꨠ?UB-WK t4V7#jnhsGZ.H?7\%Y:10Od"._g) :z5/W\ůF ӍRՑx,VBy}LW(} Q@+q B/z@3s0o]D_\}}PtʋFzܝeY-9aԜ/QMq@'2Ld"D&2oDlIENDB`1Table$ SummaryInformation(:/DocumentSummaryInformation87CompObj@j   ,8 T ` l x:Search problems, their implementation and how to evaluatetearGraham Kendall,rah Notes.dotdagxk87Microsoft Word 8.0e@4W\@< 숿@ =@V U՜.+,D՜.+,|8 hp  CS - Nottingham University+Xij :Search problems, their implementation and how to evaluate Title 6> _PID_GUIDAN{B4DB7EAC-7085-11D1-8824-00C0DFAA07B5} [$@$NormalmH @@@ Heading 1$<@& 5CJKH>> Heading 2$<@& 56CJ88 Heading 3$<@&CJ<< Heading 4$<@&5CJ66 Heading 5 <@&CJ:: Heading 6 <@&6CJ22 Heading 7 <@&66 Heading 8 <@&6< < Heading 9 <@& 56CJ<A@<Default Paragraph Font,@,Header  !, @,Footer  !8Y8 Document Map-D OJQJ4OB4Head01 $x 5CJ(mH 0OB0Head02x 5CJmH &B@B& Body Text0B0Head03x 6CJmH   %)-048=AEJOTY^chmv(,/37>GNRUXav  2?MNQUX[^adjptx|#l |{zwusrqpongdba\[ZWRQPM   <:9876543+(%">?AGHW  %)-048=AEJOTY^chmv(,/37>GNRUXav  2?MNQUX[^adjptx|      !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\]^_`abcdefghijk#l Y+0HK_mn}op#q;QYiw{f> 0"]"o""""""""""0$v*r--/&/J/16!;7;N;e;u;;;i>FGH!H?ABCDEFGIJKLMNOPSTUVWZ[\]^`abcdeghjklnopqrsuvxz|~: "8,0a;GpOpQc@no#q=@HRX_fmty} *!#%aqsv:::::::::X  MNb$]үֱ݊ "b$a'fd-\Xg 2$rߢ߇e b$ō\7)o;g Lj$b$o %qV 9 92$؞@=y!u>-tj2$$ybs>VDb$3 أJ#Qu5 UZEb$0yF}(pfBKb$|z[=|Q$2$"rSTwZJyb`@^M0 ^(  Z  s *A ??H  C A >? H  C A =?  H   C A <? H   C A ;?@ 2  42   Br`   C : 8 :42   n 42  A u'42  2 ZB  S DK)ZB  S DZB  S D'3`  C 9 _ 9`  C 8 )Ks 8`  C 7 < 7`  C 5TI 5`  C 6 6`  C 4T 4ZB  S DBH  C A /?H  C A .?H  C A -?8N % & L Z M 3 ,% p ,TB N C D = ZB O S ZD gZ P 3 +az @  +Z Q 3 ** p *Z R 3 )+ . )42 S n H2 T #  _c42 U @.42 V #  LZ W 3 (P/ (TB X C D5ZB Y S ZD4t#5Z Z 3 'n Mi 'Z [ 3 &7#5& &Z \ 3 %8" % %H2 ] # { rH2 ^ # ".l#H2 _ # #M$42 ` N&Z a 3 $!  $Z b 3 #L" #ZB c S Dl 8H d C A "?@ = ! 
B2 f  Mvf g S !gC !B2 h  yB2 i  Ly+B2 j  =  `B k c $DO -`B l c $D`B m c $D+7f n S  nc  f o S o-Vw f p S p G  f q S q _M  f r S r  f s S s_ `B t c $DM !f u S ua N #. v Z w 3  - . TB x C Dm l%m -ZB y S ZDn -  -Z z 3  $ >& Z { 3  -. Z | 3 #]% 42 }  $ G%H2 ~ # u%%H2  # Y,]-42   ,W {-N ".  Z  3  - . TB  C Dm l%m -ZB  S ZDn -  -Z  3  $ >& Z  3  -. Z  3 #]% 42   $ G%H2  # u%%H2  # Y,`-42   ,W {-Z  3  " $ Z  3 L,. ZB  S D %*N ".  Z  3  - . TB  C Dm l%m -ZB  S ZDn -  -Z  3   $ >&  Z  3  -. Z  3 #]% 42   $ G%H2  # u%%H2  # Y,`-42   ,W {-Z  3  " $ Z  3 L,. ZB  S D#s-N  #  T  # K" T  # Q T  #  + `     #    42     T  # o [ ` h e # { x42  h eT  # 3Q   3`     #  42     T  # 2o [ 2`     # 5#42     T  # 1o [ 1ZB  S D qZB  S D GGZB  S D b8p@ 7  If  s *?7  x  0W?` J W2` 3  #  Z  3 S   SZ  3 R   RZ 3   3 Z  3 Q 1  QB2   3 Z 3   ] Z  3 P 1  PB2   3 Z 3     Z  3 O 1  OB2   3 ZB  S D } @ ZB  S D {  ~N ok +C M f  s *?ok +C x  0V?p@Z! V2` 3  #  "UZ  3 N   NZ  3 M   MZ 3   3 Z  3 L 1  LB2   3 Z 3   ] Z  3 K 1  KB2   3 Z 3     Z  3 J 1  JB2   3 ZB  S D } @ ZB  S D {  d@ 7nF Kf  s *?7nFx  0U?P  U&` qw4 # ~ T  # I >  IT  # H   H` 3  # q T  # G 1  GB2   3 ` 3  # q 4T  # F 1  FB2   3 ` 3  # wcT  # E 1  EB2   3 ZB  S D ZB  S D < @ } +U Lf  s *?} +U x  0T?$V T ` 2(^ # ?**T  # Ds DT  # C{^ CT  # BF#$' BT  # AF#$n AT  # @ @T  # 0C 0` 3  # J!ST  #  1  B2   3 ` 3  # Y!AT  #  1   B2    3 ` 3   # %2(T   #  1   B2    3 ` 3   # ST  #  1   B2   3 ` 3  # }T  #  1   B2   3 xB  <D?';xB  <D?J$xB  <D?f!U%rxB  <D?!C%&xB  <D?(h(xB  <D?22@ F%U( @`  z  # 0 *T  # g3 m \  gB2    z  `  z  # 0 @$&Z  3 f3 m \  fB2     z  `  z  !# q Z " 3 e"3 m \  eB2 #   z  `  z  $# $&Z % 3 d%3 m \  dB2 &   z  `  xx '# Y"Z ( 3 c(k iZ cB2 )   xx ` /V *# J# "&Z + 3 b+M bB2 ,  /V  - xB*CoDEFo_P ( *@   }  . xBrC DEFYC ViMr@    / xBCDEFi&,@  2S $ `B 0 c $D$ `B 1 c $D%% `B 2 c $Dl%;l% ` 3 C a3Q  ( a` 4 C `4$ \& `f 5 S _5OF _f 6 S ^6i!  ^f 7 S ]7z!/" ]f 8 S \8]$ & \f 9 S [9$H& [f : S Z:>!" 
Z ; xBCpDEF<4pO fj\U@  & F( f < S Y<N&U( YZ > 3 h>"q%G hZ ? 3 i?"$%M& i~ A BjGaqHIh[JKL3MN>  j\@  M K E TB BB C D / ?TB C C D ? TB D C D M K~ G BkGHlHfI XJKLpMNK|   k~ H BlGoHAI XJKLpMNK|  lB S  ??0AGH2"#01l%))))0136c8=/?ACGKcLY}\#lsw ^4 tMg#s tI/d tL}$U tK/nF t@"1tAAstHK( tGx} tE #[t 4g 4 4  4b<1 tp] 4Ey44Lv td|$45b tv= txX t t OLE_LINK8 OLE_LINK7 OLE_LINK6 OLE_LINK1 OLE_LINK4 OLE_LINK5 OLE_LINK3 OLE_LINK2ijj~jjjjj$lijjjjjjj$l]g QZ]bdm2;@Etxz~lv g m p v , 2 7 ?  v 0 6 FM{txN]p-"2"""######//_1i111111111 2 2 2 2)2*2&4+4568 8>> ?+???@@AAAAAAuB|BDDDDDDFFFFTG]GiGuGGGPPGPIPfPhPRR/R1RNRPReRgRRRRRRRRRS)SYY\\\\^%^^^__$_(_*_3_9_<_E_S_______$`&`/`5`7`;`o`x`y``````aa'a.aaa(b1b9b;bEbHbwbbbbbbbbbbc c!c*c2c7ccccccccccccccc ddddxe{eeeeeeeeeeeffffffffffHgUgVg\ggg h i i$lU \ KL`oHYrsH"P":&<&e&g&&&&&''O)R)A+D+f+m+++,,P,S,,,,,r0z000111111 2 2L2M2e2t2;3C3&4,4E4J4e4m44455;;]?l?@@@@AAuB|BCCGGHHKKMM*N.NPPQRTTJVMVxVyVWWWXYZ_ZeZ\\^^9_<___PaZaccnfwfCgVg h i iiiiiii^j_jajbjdjej$lGraham Kendall?C:\WINNT\Profiles\Administrator\Desktop\011 Neural Networks.docGraham Kendall?C:\WINNT\Profiles\Administrator\Desktop\011 Neural Networks.docGraham Kendall?C:\WINNT\Profiles\Administrator\Desktop\011 Neural Networks.docGraham Kendall?C:\WINNT\Profiles\Administrator\Desktop\011 Neural Networks.docGraham Kendall?C:\WINNT\Profiles\Administrator\Desktop\011 Neural Networks.docGraham Kendall4C:\TEMP\AutoRecovery save of 011 Neural Networks.asdGraham Kendall4C:\TEMP\AutoRecovery save of 011 Neural Networks.asdGraham Kendall<C:\windows\TEMP\AutoRecovery save of 011 Neural Networks.asdGraham Kendall_C:\My Documents\Training & Courses\Lecture Courses\G5BAIM\Nott Handouts\011 Neural Networks.docgxk_D:\My Documents\Training & Courses\Lecture Courses\G5BAIM\Nott Handouts\011 Neural Networks.doc Z[ nq# %* > 0'? +l@ YiD M]T Ur\ C?f hh. hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( hhOJQJo( M]TUr\%*nq#YiDZ[C?f>+l@0'? 
@UgUgܫUgUg(#lP@PP<@GTimes New Roman5Symbol3& Arial5& Tahoma;Wingdings"qhԚ"D&BCfW U+!20dXiAg@C:\Program Files\Microsoft Office\Templates\AI Methods\Notes.dot9Search problems, their implementation and how to evaluateGraham Kendallgxk  FMicrosoft Word Document MSWordDocWord.Document.89q
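The threshold neuron these notes describe — a unit that fires once the weighted sum of its inputs reaches a threshold level — can be sketched in a few lines. This is an illustrative sketch only: the notes give no code, and the names (threshold_unit, weights, inputs) are invented for this example.

```python
def threshold_unit(weights, inputs, threshold=0.0):
    """Return 1 if the weighted sum of the inputs reaches the threshold, else 0.

    Computes in = sum_j W_j * I_j and applies a step function at `threshold`,
    in the style of a McCulloch-Pitts unit.
    """
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0


# Classic example: with weights (1, 1) and threshold 2, the unit fires only
# when both binary inputs are on, i.e. it computes logical AND.
print(threshold_unit([1, 1], [1, 1], threshold=2))  # fires: prints 1
print(threshold_unit([1, 1], [1, 0], threshold=2))  # does not fire: prints 0
```

Changing the weights and threshold yields other linearly separable functions (e.g. threshold 1 gives OR), which is exactly the class of functions Minsky and Papert showed the single-layer perceptron is limited to.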