PC Shopping Assistant
Using Case Based Reasoning to help customers find products

baylor wetzel
Artificial Intelligence and Knowledge Based Systems II
Graduate Program in Software
University of St. Thomas
St. Paul, Minnesota, USA
12.15.01

Table of Contents - Summary

1 Overview
2 Technology Background
3 The PC Shopping Assistant Application
4 Application Limitations
Appendix A: Overview of The Selection Engine
Appendix B: Tools Used
Appendix C: Data File Formats

Table of Contents - Detailed

1 Overview
  1.A Project Background
  1.B Licensing and Intellectual Property Restrictions
  1.C Overview of Retail
  1.D Purpose of the PC Shopping Assistant Application
2 Technology Background
  2.A Approaches to Product Recommendation
  2.B Case Based Reasoning and the PC Shopping Assistant
  2.C Limitations of Case Based Reasoning
3 The PC Shopping Assistant Application
  3.A Screen Flow
  3.B Screen Captures
    3.B.1 Start Up Screen
    3.B.2 Query Screen
    3.B.3 Results Screen
  3.C Graphical Batch Viewer
4 Application Limitations
  4.A Performance
  4.B Code Quality
  4.C Dynamic User Interface Support
Appendix A: Overview of The Selection Engine
Appendix B: Tools Used
  B.1 Language and IDEs
  B.2 Environment
  B.3 Libraries
  B.4 Data
  B.5 Object Modeling
  B.6 Packaging
Appendix C: Data File Formats
  C.1 Data File
  C.2 Query File

Diagrams, Pictures and Tables

Screen flow
The start up screen
Query screen
Query screen - no breakpoints
Results screen
Batch screen - data tab
Batch screen - query
Batch screen - data breakdown
Batch screen - work area
Batch screen - results

1 Overview

1.A Project Background

The PC Shopping Assistant application and this accompanying paper were created to satisfy the requirements of CSIS636T Artificial Intelligence & Knowledge-Based Systems, the second semester artificial intelligence course in The University of St. Thomas computer science graduate school.
The focus of this course is on developing expert systems. Prior to this semester, I had created a general-purpose Case Based Reasoning (CBR) engine named The Selection Engine. The goal this semester was to use the engine to build a realistic and useful application. Given my background in architecting large e-commerce retail systems and evaluating artificial intelligence tools for use in retail, I decided to build a retail system. Specifically, I chose to build a product search system that used CBR to guess at what product a customer was searching for. More detail is provided in 1.D Purpose of the PC Shopping Assistant Application.

Although not a goal of an expert systems course, I decided for personal reasons to build a system that was, in many ways, production quality. That means that, while no one is likely to confuse this application for a polished commercial product, substantial effort was put into making sure that the core technology and architectural decisions resulted in a system that could be easily and quickly converted into a production or commercially viable system.

1.B Licensing and Intellectual Property Restrictions

The PC Shopping Assistant and the underlying CBR engine were written in their entirety by baylor wetzel. The systems rely on a very small amount of code (I believe one nice but relatively unimportant routine) developed by others and released as open source. This application is released without restriction with the single stipulation that you can't go around claiming you wrote it. The code, either in part or in its entirety, can be used by anyone for any purpose, which includes using it in a commercial application without acknowledgement or compensation to me.

The Selection Engine was developed for fun. The PC Shopping Assistant and this paper were created for a class. No one is promising this code to be perfect, and the focus was on academic issues, not production ones. Further, The Selection Engine was written by one person over three months while the PC Shopping Assistant was created by one person in four months, so you obviously shouldn't consider this to be perfectly polished, bug-free, heavily documented, performance tuned, infallible, commercial-quality code. But you probably knew that already, didn't you? If you use this code and it causes you to burst into flames or develop cancer, the law probably won't let you sue me. If, however, it does, you will be sorely disappointed by how much money you'd get. Pretty much just a junker nine-year-old car, some comic books and my karate DVD collection. God, my life's sad.

1.C Overview of Retail

Most people are familiar with the concept of a store and the products sold in them, but I think it's worth making a few explicit observations about the different categories of products and the special issues related to each one. Some products are configurable, others are not. Some are important, others are for fun. Some are stand-alone and others require other products to work. Some come in multiple versions while others have only one model. Some have numerous competitors while others have none. Some products are well known, others unheard of. Some products are easy to use, others require assistance. Some products are judged by their features, others by qualitative, subjective criteria. Each of these distinctions factors into how a product is sold, marketed, stocked and how the sales force handles it.
The primary question a retail customer has is "what should I buy?" And this is one of the things that AI, acting as a virtual sales person, can help with.

Many products are differentiated by features. Examples include cars, DVD players, washing machines and televisions. Cars are judged by seating capacity, cargo room, horsepower and price. They are also judged by subjective criteria such as how cute, sporty or regal looking a car is, but quantitative factors are generally the most important. DVD players are judged by the types of media (DVD, CD, VCD, CD-R, etc.) they can play, output connections (component, s-video, etc.) and price. More subjective criteria such as manufacturer reputation and appearance play a role, but normally only as tie breakers. If DVD players are radically different in capability and price, as they once were, aesthetic concerns are of minimal importance. If DVD players are commodities, as they are now, more importance is placed on soft criteria. Even then, most cars, DVD players, washing machines and televisions tend to look and act relatively similar.

On the other extreme are products that are almost identical in features and price but differ substantially on qualitative criteria. In this category are many high-volume leisure products such as CDs, books and movies. It is normally meaningless to recommend one book over another because it is 20 pages longer or to suggest one movie over another because it is 15 minutes longer.

Many products fail to work without required accessories. Most retail customers purchasing complex items want to purchase solutions, not parts. When a customer says he wants to buy a dryer, he typically means he wishes to have a working dryer in his house. That can mean purchasing a gas dryer, a dryer vent and a clamp to attach the two. A TV satellite system requires a satellite dish, a receiver, cables and possibly a telephone line extender, all sold separately. A car stereo requires a head unit, mounting kit, either RCA cables or speaker cables, possibly a speaker cable to RCA adapter, possibly one or more RCA Y cable splitters and perhaps a line driver. It is uncommon for a customer to know every single item they need to make their primary purchase work. They normally must rely on the sales associate, and it is common for a sales person, especially in low-paying, high-turnover stores, to not know which related items a customer needs.

Many products rely on services. Most cell phones, many personal video recorders (TiVo, UltimateTV, etc.), some network appliances (WebTV, etc.) and all digital satellites require monthly subscriptions. The majority are proprietary to the specific device, so when customers consider purchasing equipment they must also consider the associated services.

Shopping is frequently an unpleasant experience. Here's a list of some of the most common complaints:

Dependency on sales person
Product complexity
Product variety
Sales person knowledge (either no information or wrong information)
Sales person availability
Product availability
Sales process time
Inability to test products
Product support

Many of these are problems that can be ameliorated by the intelligent application of technology (and, of course, with non-technical solutions such as better floor staff, more staff and better processes).

Retail companies make money by selling products, but that doesn't always translate into profits. Retail companies, like most companies, must deal with large amounts of information.
It is not at all uncommon for a store to sell an item for less than it cost because the store does not know the true cost of that item. If that seems odd, consider this scenario. A lighting store sells lamps. It buys 100 lamps for $50 each to resell at $75. Will the company make money on the sale? The answer is, maybe. In addition to the cost of the lamp is the employee cost (sales staff, stockers, loss prevention, customer service, store manager, etc.), facilities cost (rent, electricity, maintenance, shopping carts, cash registers, etc.), storage cost (warehouse rent, warehouse electricity, transportation cost, warehouse staff, forklifts, etc.), ordering costs (buyer's salary, ordering systems, transportation, accounting), customer service costs (returns, restocking, etc.) and marketing costs (advertising, coupons, etc.). On top of that are opportunity costs (money you lose by spending it on one thing over another; if two lamps cost $50 and you can sell lamp X for $75 and lamp Y for $85, selling lamp X results in a gross margin of $25 versus $35 for lamp Y; if you choose to stock lamp X instead of lamp Y, each sale has an opportunity cost of $10, which is the extra money you would have made had you invested your money differently) and carrying costs (it's possible that you will not sell all 100 lamps, leaving you to absorb the $50 cost of each unsold one).

So how do retail companies make profits? By increasing sales, increasing the quality of sales (selling more profitable items) and holding down costs. These can be done by:

Cross selling (as a general rule, profits are substantially higher on accessories than they are on core products; a DVD player might carry a 5% markup while the cables for hooking it up have a 100% markup)
Better marketing (more focused marketing to keep down advertising costs and identifying and understanding the most profitable customers to increase sales)
Better inventory management
Better supply chain management
Adaptive pricing (finding the proper price for the store's location based on proximity of competitors and changing pricing based on pricing events such as holidays)
Improving product presentation (product location and display)
Improving loss prevention

Most of these issues are fairly well understood and addressed by the market. As an example, several large companies (SAP, Manugistics, Retek, Peoplesoft, etc.) sell products that concentrate on inventory and supply chain management. Most of these use ideas from the AI and statistics fields: hierarchical task networks for supply chain, clustering and rule induction in data mining, constraint-based reasoning for product configuration, etc. Where the market has done less well is in customer-facing systems, especially in the area of product selection. Determining which items to buy can still be a confusing, wasteful and frustrating experience. Some advances were made in this area, primarily in collaborative filtering (addressed later), thanks to the popularity of Web-based stores. The two primary reasons, in my opinion, are that Web-based stores do not typically have live sales people there to walk customers through the sales process and that investors were throwing large sums of money into the e-commerce market. While these advances are appreciated, product selection capabilities are still unnecessarily limited and shopping is still, more often than not, a headache.
This paper and the project it describes focus on the product selection problem and, in particular, how case based reasoning can be used to help shop for certain types of items.

1.D Purpose of the PC Shopping Assistant Application

In the summer of 2001, I wrote a general-purpose case based reasoning engine (described later), which I made freely available to the world. Interest in it was more than I expected (I expected none). Within weeks of releasing it, it was being tested and/or used by people (primarily academics) in New Zealand, Sweden, Ireland, China, India, Portugal and the United States.

The Selection Engine, the extremely unimaginative name of the CBR engine I wrote, was worked on, off and on, by one person (me) over the course of three months. At the end of the summer, it looked like a program that had been written by one person in his spare time, so it was no surprise that I received several questions about how to use it and what it could do. In the Fall of 2001 I enrolled in an independent study AI 2 class and decided to use the class to create a sample application that illustrated some of what the engine can do and how it could be used. I also decided to build a detailed graphical batch viewer application that could help me and others understand and debug the CBR process. Here is an excerpt of my project proposal:

"My goals are three-fold. First, I hope to develop a realistic CBR-based application. Specifically, I intend to build a sales advisor system that helps computer shoppers determine which products to purchase. The sales advisor will be targeted to the e-commerce environment and will be implemented as a stateless Java applet. The success of the application will be judged on the accuracy of its recommendations, although attention will also be paid to other aspects of expert systems, most notably system maintenance. My second goal is to investigate issues in data representation and to illustrate ways to model data that make application development easier. It is my belief that substantially more deployed expert systems fail because of data representation than because of the underlying expert system technology. My third goal is to understand those features that make a CBR engine successful. I am the author of The SelectionEngine, a highly portable CBR engine hosted on SourceForge, a site for open source projects. While the engine appears to be functional when used within a test harness, no attempt has been made to use the engine in a real-world application. The needs of the proposed sales advisor application will quite possibly lead to changes in the underlying CBR system."

Given that I had four months to work on this, these goals seemed realistic. Unfortunately, I forgot to account for several factors, most notably that I'm lazy and spend my time playing games and looking for work. The resulting application was less than I had hoped for. But since this paper is being written for Dr. Bennett, my professor, it's worth noting that it's still a pretty good application.

The application, as promised, helps customers purchase computers. The goal was to have the computer application emulate how a sales person might act. That meant asking simple, easy-to-understand questions and then making a recommendation. At its most simple, the computer would ask the customer to rate, on a scale of one to five, how important price and performance were. The shopping assistant application would then recommend a computer along with a list of several alternates should they not like the recommended system.
Although it might not seem like it, several weeks were spent designing different user interfaces in an attempt to find the proper balance of power and simplicity. Although many designs were drawn up, I ended up only having time to implement one interface. I chose to implement the power user interface since it is more useful in testing the system internals and because it best exercises the goal of a dynamic user interface (discussed in the architecture section). With this interface, a customer who knows a bit about computers can specify that it's important that he get something that's fast (the system uses relative values, so you'd specify "fast" rather than "1.4 GHz", which is both easier to understand for non-computer professionals and makes system maintenance easier, as discussed later in the architecture section), he would prefer that it doesn't cost much, it would be nice to have a DVD player, it must have a CD burner and, if at all possible, no Dells.

Also implemented was a detailed graphical batch viewer. This application was similar to the test harness included with the original Selection Engine but made it easier to view application details and added the concept of a data breakdown, which is where each numeric value in the data set is linearly plotted along a line (rounded to specified percentage breakpoints for grouping purposes). The breakdown helps the user (in this case, me) understand the spread of each trait and the proximity of data points, which is useful in eyeballing the results of the similarity (nearest neighbor) computation. This application is intended for use by developers, not customers, and so little effort was spent making it pretty.

2 Technology Background

2.A Approaches to Product Recommendation

While customers frequently ask sales people for help in finding a product, the ways in which they ask vary substantially. Some common examples:

Where are the 1" to 3" PVC pipe step-up adapters?
Do you have "Gin & Juice" by The Coup?
Which CD had the song "Oscillate Wildly"?
Where can I find the ink cartridges for an HP DeskJet 692C?
Do you have the latest John Grisham novel?
Can you recommend a good movie?
I really liked Half-Life. Do you have any games like that one?
I want to surf the Web. What kind of computer is right for me?
I have $200 and want to buy a camera. Which one should I get?
What would an 18-year-old want for his birthday?
OK, I have the satellite dish, is there anything else I need to buy?
I like classical music, have a small apartment and own a Sony receiver. What else should I buy?
Eventually, I want to build a competition car stereo system, but I only have $800 right now. What should I upgrade first?
I really wanted the Samsung 19" monitor, but it's out of stock. Should I try another store, another model or special order one?

Most of these questions can be handled by a human sales associate. Not always well, but they can at least be answered. Most computer systems, both on the Web and in the stores, have problems unless the customer knows an exact product ID or the value of a significant trait such as the product's name. Although unfortunately not widespread, several methods for answering at least some of these questions have been developed.

A low-tech option popular on many Web sites is the drill-down search. If the customer was looking for a 36" TV, he might first click on Home Electronics, then Entertainment, then TVs, then 36"-56" and finally see a list of TVs in that category.
This doesn't help answer most of a customer's questions, it's easy to get lost (where would you look for a portable MP3 player: in mobile electronics, stereo or computers?) and it can take a while to navigate through all the menus. It also puts a burden on the marketing department to categorize data and maintain those categories (which the marketing department often does anyway since businesses are often organized along category and subcategory lines). Still, it remains popular with Web designers for its complete lack of algorithms, making it a no-brainer to implement.

Another approach that is somewhat common is to use manually built cross-sell tables. A database table would hold a list of products to recommend if a customer purchased a specific product. For example, if you bought a camera, a related-items table might tell the system to recommend batteries, film and a carrying case. This approach can give fairly good recommendations but requires a large amount of data entry and is prone to data maintenance errors.

An automated approach that became popular during the rise of the Internet store is collaborative filtering, which is based on a variety of statistical techniques. In CF, a computer divides customers into a defined number of groups. Clustering is based on the types of products each customer buys and the similarity of their buying patterns to other customers. The result is a number of groups, some of which might be filled with people who enjoy action games and action movies, others containing people who buy romance novels and Barbie dolls and still others who buy beer and diapers. Once the computer system determines which group you fall into, it recommends items that other people in your group have purchased. As an example, if you purchased a computer and several action movies, the computer might determine that other people who purchased the same products you did also liked action games and would then recommend a few to you. A slight variation keeps no information on you. Instead, you tell it one product you like and it tells you others that you might want. If you tell the computer "I like Ani Difranco CDs", the system might determine that people who buy Ani Difranco CDs also buy CDs by Dan Bern and Alexia Lutz and recommend those. Collaborative filtering is best used for taste-based products such as music, books and movies.

Another taste-based approach is model-based recommendation. Two items are determined to be similar based on a set of traits they share. For movies, the traits might include the actors, producers, director and genre. When a customer asks for a movie recommendation and states that she liked The X-Files, the computer would look for other movies that were science fiction, starred Gillian Anderson, were written by Glen Morgan and directed by Chris Carter. The system might then recommend The One, The X-Files TV series, The Lone Gunmen TV series, the Millennium TV series, Princess Mononoke and Hell Cab. Although there's still a fair amount of debate on CF vs. model-based recommendation systems, in my personal opinion, which is always right, model-based recommenders for taste-based products are of limited value and are often not worth the effort it takes to enter all the necessary information.

Model-based recommendation can also be used on feature-based, as opposed to taste-based, products. A customer might state that he wants a TV that has a universal remote, is larger than 27" and has s-video inputs. This is the same as case based reasoning, the approach used in this paper.
How do CBR and model-based recommendation differ? I'm not sure they do. In practical use, most CBR systems I've seen tend to compare products against a specification whereas model-based recommenders (which I've only seen used for taste-based items such as movies) compare products against a specific, existing item.

Another approach often grouped with CBR is constraint-based reasoning. This type of reasoning is predominantly used for configuring complex products. A model is built of a given product and dependency and constraint rules are created. Consider a construction system that determines the cost of building a room in your basement. The room would be the primary model. It would have, as requirements, walls and at least one door. Once given measurements, the model determines how much sheetrock is to be used to build the walls. The sheetrock would require a certain number of boards for framing and mud, tape and paint for installation. If the room were designated a bedroom, the computer model might require at least one electrical outlet and an egress window and might suggest a cable outlet and wiring for a ceiling light. Constraint-based reasoners are very useful for configuring complex products and can let a customer know which parts they might need but have not yet purchased. Beyond suggesting necessary parts, though, constraint-based reasoners are not really designed for product selection. Most constraint-based reasoning systems are large, expensive and complicated and are targeted at expert users at manufacturing companies. Simplistic, normally homegrown, solutions have, however, caught on in popularity with Internet computer retailers, where constraint-based reasoning is used to catch configuration problems such as a customer adding five PCI-based products to a computer that only has four PCI slots or choosing an AGP graphics card for a computer that does not have an AGP slot. Some people have tried to use constraint-based reasoners to generate recommendations, but these people are typically idiots and their results are expectedly sub-par.

A simple yet interesting recommendation approach is interactive querying, referred to by Robin Burke, who created the restaurant recommender Entree (which, after bouncing around the University of Chicago, Northwestern University, University of California at Irvine and Recommender.com, appears to be missing), as a collaborative and knowledge-based recommendation system, which he sometimes called FindMe systems. The idea is simple: recommend a restaurant and then give the user seven buttons to press: Less $$, Nicer, Change Cuisine, More Traditional, More Creative, Livelier and Quieter. Pressing a button brings up a new recommendation and the user can once more tweak the criteria. Underneath the covers, Entree uses collaborative filtering to categorize the restaurants and case based reasoning to find the best matches. When multiple restaurants equally match the search criteria, CF is used to break the tie.

2.B Case Based Reasoning and the PC Shopping Assistant

The PC Shopping Assistant uses The Selection Engine, a general-purpose case based reasoning engine that relies on a dynamic, brute-force, k-nearest neighbor algorithm. The nearest neighbor algorithm decides how similar two items are by, oddly enough, using a variation of the Pythagorean theorem. Conceptually, it plots all the items on a graph and then determines which item is closest to what you're looking for (for more detail, see Appendix A: Overview of The Selection Engine).
The closer the item, the more similar it is. The most similar item is considered to be the best match. There are other names for this such as sparse matrices and vector space models (which, to the best of my limited knowledge, are pretty much just CBR), but the concept is pretty simple.

So how is CBR different from your standard, everyday SQL statement? First, SQL requires an item to meet each and every specified criterion. Second, SQL doesn't handle missing data very well. Third, SQL returns a Boolean set: either an item matches or it doesn't. SQL does not rank the items and state that some are good matches and others are simply OK. Fourth, SQL does not support the concept of weighting (obviously, since it does not rank results). If SQL could do these things, a SQL statement might look like:

    SELECT rank = 3 points for texture, 2 for spicyness, 1 for the rest
           *
    FROM recipes
    WHERE recipe contains a couple of the following 5 criteria {
        spicyness around low AND        (ranges from very_low-very_high)
        texture around crunchy AND      (ranges from very_soft-very_crunchy)
        meat = pork AND
        vegetables prefer none AND      (ranges from 0%-100%)
        cooking_time around 20
    }

The result set might contain, in order, sweet and sour chicken (very_low spicyness, crunchy, chicken, 10% vegetables, cooking time 25 minutes, rank = 85% similar), spicy pork (high spicyness, crunchy, pork, 0% vegetables, cooking time 20 minutes, rank = 75% similar) and kung pao chicken (very_high spicyness, soft, chicken, 30% vegetables, cooking time 15 minutes, rank = 35% similar). Unfortunately, SQL does not do anything close to this. Some pieces can be approximated using ranges (SELECT * FROM items WHERE (cost > 500) AND (cost < 1000)) and like statements (SELECT * FROM customer WHERE (last_name like 'ande%')), but only very poorly. CBR and nearest neighbor, luckily, do this pretty well.

As with all things AI, there are numerous variations of CBR, with most of the modifications made for performance reasons. An obvious example is pre-computing distances offline and then using the cached distance information. Another variation on the nearest neighbor theme is to use cut-off criteria to filter out those items that are unlikely to be good matches, saving the system from having to fully compute the distance to those items.

The Selection Engine trades off performance for flexibility. SE allows users to search by a subset of the criteria. If you don't want to specify a value for hard drive size or amount of RAM, you don't have to. Those traits will not be used in similarity computation. This gives the user more flexibility in their search, but to support this flexibility the distance calculations must be dynamic, meaning that they cannot be pre-computed. Assuming it takes longer to perform the calculations than it takes to access cached data (which is normally true, especially when in-memory database management systems are used), this obviously slows things down.
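To make that dynamic, subset-of-the-criteria matching concrete, here is a minimal illustrative sketch of a weighted nearest-neighbor score. The class and method names are my own for this example, not the actual Selection Engine API, and it assumes trait values have already been normalized to a 0-1 range.

import java.util.List;
import java.util.Map;

// Illustrative only: these names are not the actual Selection Engine API.
class SimilaritySketch {

    // One user-supplied criterion: a target value (normalized to 0.0-1.0)
    // and a weight saying how important the trait is.
    static class Criterion {
        final String trait;
        final double target;
        final double weight;
        Criterion(String trait, double target, double weight) {
            this.trait = trait; this.target = target; this.weight = weight;
        }
    }

    // Score one item (trait name -> normalized value) against only the
    // criteria the user chose to specify; anything else is ignored.
    static double similarity(Map<String, Double> item, List<Criterion> criteria) {
        double weightedSquares = 0.0;
        double totalWeight = 0.0;
        for (Criterion c : criteria) {
            Double value = item.get(c.trait);
            if (value == null) {
                continue; // missing data is skipped, not disqualified
            }
            double diff = value - c.target;
            weightedSquares += c.weight * diff * diff;
            totalWeight += c.weight;
        }
        if (totalWeight == 0.0) {
            return 0.0; // nothing to compare on
        }
        // Pythagorean-style distance, folded into a 0.0-1.0 similarity
        return 1.0 - Math.sqrt(weightedSquares / totalWeight);
    }
}

Because the criteria list can change with every query, nothing in a sketch like this can be pre-computed, which is exactly the flexibility-for-performance trade just described.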
There is a lot of debate in the CBR community about automated case adaptation. This is when a CBR system finds the closest matches and then modifies them until they are an exact match. A classic example (which I'm stealing from the Caspian CBR system) is a recipe system that can't find an exact match for a recipe. Suppose you want sweet and sour pork but the best the system can find is sweet and sour chicken. A set of adaptation rules can tell the system how to convert chicken recipes to pork recipes: swap the ingredients, cook a little longer, turn the heat down a little, remove any tomatoes in the recipe, etc. The big question has been whether CBR researchers should put any effort into adaptation. Why? Because normally someone has to manually write the adaptation rules, and many CBR researchers hate the thought of someone having to do manual work (CBR systems are supposed to be automatic, relatively maintenance-free, learning systems, and with the exception of adaptation rules, they are). Also, it hasn't exactly been easy to get adaptation rules to work yet. Many systems that propose adapted cases must have those cases reviewed by a human prior to being accepted. This author thinks adaptation rules are neat but should be considered a separate field from CBR. The Selection Engine and the PC Shopping Assistant do not use adaptation rules.

It is my personal opinion that the key to a good CBR system is the design of the user interface. While I have not seen many production CBR-based applications, it is my belief that the most common approach is to give the user a list of all traits and ask them to specify the values for each one. The problem is that users often don't know what many, if not most, of the traits mean or what realistic values are. Another problem, in my opinion, is that a human must decide which traits to include in a CBR system, and humans frequently choose the wrong ones. Many traits hint at what is valuable but are not, in themselves, valuable. Let's use shopping for a PC as an example. Customers typically want a computer that is fast. Or a computer capable of doing digital video. Or playing games. They do not want a computer that has a 1.4GHz CPU or 512MB of memory or a 10,000 RPM hard drive. They don't want to know what a gigahertz, megabyte or revolutions per minute is. They just want something that works. Most customers put up with specifying a hard drive size and chip architecture because they have no choice.

There are a few solutions to this problem. One is to use relative values. Rather than asking a customer if they want a 1.4GHz CPU, ask them how fast they want the computer to be. The PC Shopping Assistant asks the user how fast he wants his computer to be on a scale of one to five. That translates into slowest, slow, average, fast and fastest. In my experience, the average customer is comfortable answering whether they want the fastest computer currently for sale or just an average computer. The same customer is less impressed with questions about GHz. It's worth mentioning that relative values also make system maintenance easier (more about this later).

A second option is to use roll-up attributes. Rather than asking the customer what type of CPU and how much RAM they want, ask them how fast they want the computer to be. PC performance is a factor of many things including CPU clock speed and architecture, amount and access speed of RAM, hard drive seek and throughput rate and a host of other factors. Rather than asking the user to fill in all that information, ask the user how fast he wants the system to be. If the user says Above Average speed (4 in the PC Shopping Assistant), a (most likely hard-coded) formula in the system can convert that single number into specifications for a variety of traits. For example, the system could set CPU to value 4 weight 5, RAM to value 3 weight 4 and hard_drive_seek_time to value 3 weight 3.
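As a rough sketch of how such a roll-up formula might look (the class and trait names here are hypothetical, chosen to mirror the example above rather than taken from the actual code):

import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of a roll-up attribute, assuming a simple
// Criterion(trait, value, weight) shape.
class RollUpAttributes {

    static class Criterion {
        final String trait;
        final int value;   // relative value on the 1-5 scale
        final int weight;  // importance on the 1-5 scale
        Criterion(String trait, int value, int weight) {
            this.trait = trait; this.value = value; this.weight = weight;
        }
    }

    // Expand a single "how fast?" answer into criteria for several traits.
    static List<Criterion> expandSpeed(int speedRating) { // 1 (slowest) .. 5 (fastest)
        List<Criterion> criteria = new ArrayList<>();
        criteria.add(new Criterion("cpu", speedRating, 5));
        criteria.add(new Criterion("ram", Math.max(1, speedRating - 1), 4));
        criteria.add(new Criterion("hard_drive_seek_time", Math.max(1, speedRating - 1), 3));
        return criteria;
    }
}

Calling expandSpeed(4) produces the CPU value 4 weight 5, RAM value 3 weight 4 and hard_drive_seek_time value 3 weight 3 criteria from the example above.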
A third, and more direct, option is to use meaningful numbers. Rather than guessing at which components contribute to system performance, run a benchmark on the system. For PC performance, numerous benchmarks exist including WinMark, Quake2, 3Dmark and GLMark. The downside of this approach is that vendors rarely supply this information, requiring the buyers or their assistants to run the benchmark tests. While undesirable, this is not a crippling amount of data acquisition and entry. The possibility of entering bad data exists, but anyone who has ever worked with wholesalers knows that bad data is already the bane of retail. Certainly not a good situation, but one most companies' marketing departments are equipped to handle.

2.C Limitations of Case Based Reasoning

Case based reasoning is good for finding those items most like the specified criteria. This is useful for finding primary items but is not nearly so good at cross selling and recommending accessories as collaborative filtering is. Nor is CBR very good at recognizing when other items are required, a task constraint-based reasoners excel at. While a CBR system, with the proper design, can be made to tell you that the HDTV you're buying requires a special receiver and the game you're buying only works on the PlayStation, it takes a fair amount of work and carries a higher than average risk of failure. In this instance, CBR is a square peg in a round hole.

As mentioned before, CBR can be used to make taste-based recommendations. The problem is, it's hard to get it to make good recommendations. To be fair, I have yet to see a collaborative filtering recommendation system that I'm even modestly impressed with, but I believe that, in the long run, CF has a much better chance of making good recommendations than CBR. The problem with CBR is that a human has to do the design, and it's because of humans, in my jaded yet highly accurate opinion, that most computer systems fail. Picking the right traits to model is a difficult task, a problem that CF doesn't have to deal with.

It is worth mentioning one interesting experiment. Graduate students at the University of California at Berkeley have built a music jukebox named the Ninja Jukebox (http://www.cs.berkeley.edu/~mdw/proj/jukebox/). It contains a CBR-based search engine. You tell it the name of a song you like and it finds similar sounding songs. The CBR piece is fairly standard: it's a simple brute-force nearest neighbor search with some pre-search pruning done for performance reasons. What differentiates the Ninja Jukebox search engine from other CBR systems is the choice of traits. Each song has 1,024 features. The features represent musical concepts such as rhythm and tempo. When a song is added to the jukebox, a custom-written signal analyzer analyzes the song for musical information and writes the data to a profile record. The search is done against these profiles. The authors report good success with jazz and classical music, less so with other types.

3 The PC Shopping Assistant Application

3.A Screen Flow

[Figure: Screen flow]

This would have been a fairly long section in which I showed you the role-based user interface with customized interfaces for novice users, knowledgeable users and power users as well as usage-oriented searches with usage blending (multiple profile matching), feature-oriented searches, quick hit searches and interactive tweaking of results. Unfortunately, many of these designs refused to jump off my copious pages of drawings and implement themselves. Lazy designs. So what we have is two very lonely screens.
On the Advanced Query screen, the user describes the perfect computer. The Results screen shows the results. The Main Menu is a simple group of buttons allowing the user to decide which screen to run, assuming that more choices actually existed. This screen, in its current form, would not exist in a production application. The Batch screen loads the data from data.txt and then executes the query in query.txt. The screen has five tabs. The first tab shows the data that was loaded. The second tab shows the query that was executed. The third tab shows the ranges (max, min and intermediate values, rounded to the nearest 10% mark) for each value. The fourth tab is a general display tab; currently, the system displays the results of running the data through the filter engine. The fifth tab shows the results. The Batch screen is meant to be used as a debugging and educational aid. Details as to the function of each screen are given in 3.B Screen Captures.

3.B Screen Captures

3.B.1 Start Up Screen

[Figure: The start up screen]

This is the start up screen. Not much to discuss here except that this would have led to the Settings screen, which would have let you set the file names for the data and query files, let you determine whether the query screen showed the max and min values and various other settings. This was not implemented due to time considerations and because a Settings window, while nice, was not central to the purpose of this project.

3.B.2 Query Screen

[Figure: Query screen]

The Query screen is the most important screen in this application. Although it looks fairly conventional, this screen went through nearly a dozen designs in an attempt to be as powerful yet easy to understand as possible. Each trait (manufacturer, processor speed, price, etc.) is in its own panel. For each trait, the user can specify preference criteria (Prefer) and filter criteria (Require). All preference criteria are of the form Value/Weight. For numeric data, the value is chosen by picking a number from one to five, which represents lowest (1), low, average, high and highest (5). The labels under these radio buttons either say Low and High or give the actual minimum and maximum values, depending on a switch set in the code (see the figures Query screen above and Query screen - no breakpoints below). It is worth pointing out that the min and max values are always accurate and up to date. The system determines these values by reading the data. They are not hard coded in the code or in any settings file. Data can be added or removed and the meaning of lowest, average and highest will adjust itself. This was done to ease system maintenance.

The weight (Importance) is set in a similar manner. The user is asked, on a scale of one (Low) to five (High), to rate how important this trait is to him. A trait of low importance still affects rankings. To prevent a trait from having any impact, the Prefer checkbox should be unselected.

The filter (Require) criteria act like a traditional SQL statement. Items must match these criteria in order to be included in the rankings. Filters for numeric data are made up of operators and values. The operators are the standard SQL and expression operators: =, !=, >, >=, < and <=. The values to filter on are selected from a drop-down box. This shows the user what values are currently valid. Currently, this piece is a bit bizarre as the values listed are from the breakpoint computations.
The breakpoint computations take each value between the minimum and maximum values present and round it to the nearest 10% marker. As an example, if the min and max values were 0 and 100, the values 8 and 12 would be rounded to 10. This limits the number of items in the list to 11 (0%-100%), which is useful in preventing the list from growing too large. Approximate values are also appropriate for operators such as > and <. Unfortunately, they aren't good for = relationships as the rounded values might not exist in the data set. Further, users who know what the data should look like will complain when seeing that 112 and 130 are valid values for the amount of RAM in a personal computer (they aren't). In my personal opinion, the current design for filter values is pretty poor and should be replaced, although I'm not exactly sure how. Since filter criteria are absolute, there is no weighting.

The query screen also handles string and Boolean values. Both string and Boolean preferred values are listed in drop-down list boxes, although the Booleans could just as easily be represented as radio buttons or check boxes. The list box was used because I felt the screen already had too darn many radio buttons and check boxes. As expected, the string values to display are determined dynamically by the system. The filter criteria for string data are the same as for numeric data except that the only listed operators are = and !=.

The filter criteria for Boolean data are interesting, at least in the fact that they were redesigned and implemented several times. The only control on the filter line is the Require check box. If checked, the filter value (true or false) is read from the preference line. This was done to prevent the user from preferring one value and requiring the opposite. It was also influenced by the author's desire to have as few controls on the screen as possible (there are plenty as it is). This probably isn't the optimal design. There should never be a need to have both a preference and a requirement criterion for a Boolean value: after filtering, all remaining data will have the preferred value, making it a meaningless selection criterion.

This is a pretty cluttered, data-packed screen, which I'm not happy with. The consolations are that this was meant to be a power user screen and that this screen is less busy than the product ordering Web pages for Dell, Gateway and Micron (a substantial amount of time was spent gathering and analyzing the Web pages of these companies to identify design ideas, both good and bad).

This screen was intended to be the second most flexible/powerful screen. The most powerful screen allowed, on paper, for the user to add criteria and operators rather than modifying a pre-filled form. The current query screen has some limitations that the expert screen hoped to resolve. First, the current query screen, unlike the batch process, does not allow negative preference criteria. For example, while you can say that you prefer machines made by Micron, the query screen does not allow you to say that you prefer to avoid Dells. The query screen could be modified to handle this by adding an operator list box/check box to the string preference line, but this was not done because I felt it substantially complicated the screen while providing functionality that was only infrequently needed. Second, the query screen does not allow you to set multiple criteria for a single trait.
The only preference-based example I can think of where this would be useful is to say that you prefer to avoid Dells, HPs and Compaqs. In other words, it is useful for soft filtering of data. For hard filters, single trait / multiple criterion support is useful for defining ranges, for example, price greater than $500 and less than $2,000. This query screen does not support it, in part because of user interface complications but mostly because The Selection Engine does not support this functionality. A third thing you can't do is set a per-criterion weight (applicable only to multiple criteria / single trait situations). For example, you cannot say that you mildly dislike HPs but strongly dislike Dells. Again, this is functionality that The Selection Engine does not support.

[Figure: Query screen - no breakpoints]

3.B.3 Results Screen

[Figure: Results screen]

This screen should be self-explanatory. The results of the product search are displayed here. Currently, all traits are displayed. Data is stretched or squished to fit on the screen; there are no horizontal scroll bars. While The Selection Engine supports k-nearest neighbor retrieval, this screen currently shows rankings for all items that were not filtered out. A close observer will notice a bug in the above screen shot. Item 10 has a 5% similarity. This is, of course, impossible. This bug does not show up in the batch runs and has not been identified and corrected in the graphical version as of the time of this writing (1.14.02). In a production application, it is expected that a user would select one of these items and be able to view additional data about the item or add it to their shopping cart. This functionality was left out because of time considerations and because it was not central to the purpose of this project.

3.C Graphical Batch Viewer

The Batch window is used for understanding and debugging the selection process. When the Start button is pushed, the system loads data from data.txt and a query from query.txt and then processes the query. The data file defines both the data and meta-data. The meta-data defines the name and data type of each trait. Any number of traits (string, Boolean, float and integer) can be added and any trait name is legal so long as it is unique. The data file makes no assumption about the domain. While these screen shots show the retail PC data used for this application, it could just as easily be recipes, cars or cell phones. The Batch window is driven completely by the input data files. This means that it, too, is domain independent. All screens in the batch viewer are read only.

[Figure: Batch screen - data tab]

The Data tab shows what data was loaded. This data comes from the Items collection, not the data file. The collection, rather than the file, is displayed to verify that the data was properly loaded into the system.

[Figure: Batch screen - query]

The Query tab shows which criteria will be used to perform the search. As with the data tab, this data is taken from the actual criteria objects rather than from the query text file.

[Figure: Batch screen - data breakdown]

The Data Breakdown screen shows the spread of values for each trait. For Boolean values, the data breakdown is simple. In the example above, CD-RW has only two values, true and false. In the loaded data set, 60% of the items have a CD-RW and 40% do not. The numeric examples are a bit more complicated. For cost, the smallest value is 500 and the largest value is 3,500. Since these values are used to determine distance in the CBR engine, they are plotted on a scale from 0 to 100 where 0 is the smallest value and 100 is the largest. We are, in essence, normalizing the data. To make analysis for a human (well, me anyway) easier, the data is divided into 11 groups, the breakpoints. All values are placed into one of these 11 groups by determining their exact position and then rounding to the nearest breakpoint (all of which here are at 10% boundaries; the number and placement of breakpoints is controlled by constants in the code). Why group the values? First, to make them fit on the screen. Second, because it makes it easier to see the distribution and clumping of values.
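A minimal sketch of this normalize-and-round computation appears below; the class name and constant are my own for illustration, not the actual batch viewer code.

import java.util.TreeMap;

// Illustrative sketch of the data breakdown grouping described above.
class DataBreakdown {

    static final int BREAKPOINT_PERCENT = 10; // 11 groups: 0%, 10%, ..., 100%

    // Normalize a raw value to 0-100 between the trait's min and max,
    // then round to the nearest breakpoint.
    static int breakpointFor(double value, double min, double max) {
        if (max == min) {
            return 0; // only one distinct value for this trait
        }
        double position = (value - min) / (max - min) * 100.0;
        return (int) (Math.round(position / BREAKPOINT_PERCENT) * BREAKPOINT_PERCENT);
    }

    // Percentage of items falling on each breakpoint, e.g. {0=30, 50=50, 100=20}
    // would mean 30% at the cheapest price, 50% near the middle and 20% at the top.
    static TreeMap<Integer, Integer> breakdown(double[] values) {
        double min = Double.MAX_VALUE, max = -Double.MAX_VALUE;
        for (double v : values) { min = Math.min(min, v); max = Math.max(max, v); }
        TreeMap<Integer, Integer> counts = new TreeMap<>();
        for (double v : values) {
            counts.merge(breakpointFor(v, min, max), 1, Integer::sum);
        }
        TreeMap<Integer, Integer> percentages = new TreeMap<>();
        counts.forEach((bp, n) -> percentages.put(bp, 100 * n / values.length));
        return percentages;
    }
}

For the cost trait above, 500 would map to breakpoint 0, 3,500 to breakpoint 100 and a $2,050 machine to breakpoint 50.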
For cost, the data breakdown shows that 50% of the systems are near the median price (although none are that price exactly). 30% of the systems have the lowest possible price. There are no systems between the lowest price and the average price. This information should help the developer recognize that if the user asks for a low cost PC (in the query screen, by selecting two on the scale of one to five, which translates to a percentile value of 25%), he will most likely end up with a system with an average price. The query will return the same value as if he had asked for an average priced computer (in the query screen, a value of three, corresponding to a percentile value of 50%) and will produce results similar to if he had asked for an expensive computer (in the query screen, a value of four, which corresponds to a percentile value of 75%). The purpose of this screen is to give the developer a quick and easy way to guesstimate what a query should return without having to always work the nearest neighbor calculations out manually (a non-fun process, I can assure you).

[Figure: Batch screen - work area]

The Work Area tab displays all the information that didn't fit anywhere else. Currently, this means displaying the names of the traits and the results of the filter step.

[Figure: Batch screen - results]

The Results tab is identical to the results screen returned by the interactive query screen. All traits are displayed and all items are sorted in rank order.

4 Application Limitations

4.A Performance

Given that the goal of this project was to investigate the functionality of case based reasoning, almost no effort was made to performance tune the code. No caching is done. There are several places in the code where data is loaded from disk multiple times in a row. Several full-scan loops are embedded inside other full-scan loops, resulting in O(n²) or worse performance. Some of the collections are cloned (a deep copy) unnecessarily. In short, the code is a mess. Having spent an inordinate amount of my career performance tuning complex applications, I believe I am qualified to recognize poorly tuned code, and this code is among the worst I've seen.

It is worth mentioning that this code was written in Java using Java's Swing windowing library. Java is fairly slow while Swing is just a slovenly pig. If I were to build this as a production application, I would most likely have used JSP. Having developed applications using the JRun, Tomcat and WebLogic application servers on modestly configured machines, I firmly believe that a JSP version would be substantially faster than the Swing windows used here. The code was written in such a way as to make porting to JSP simple. The application is stateless, there are no risks of race conditions or concurrency issues, the query screen uses controls (list box, radio buttons, etc.) available in most windowing libraries and many of the screens (most notably the results screens) were developed in HTML.
The application was not developed in JSP because it was my intent to distribute it to the widest audience possible and not everyone has a JSP-capable application server installed or available.

4.B Code Quality

The code suffered from numerous rewrites of the hacker variety. While an actual object model and screen mock-ups were developed before the code was written, the code still shows the effects of numerous, rapid changes. Much of the code looks like it was created out of bandages and duct tape. The interfaces vary from class to class (not all classes ended up supporting Comparable or implementing clone, and key generation, internal data structures and iterators vary slightly between classes). The UI was rewritten repeatedly to accommodate custom-developed UI objects and helper classes. Some of the methods are dozens of lines long. And method-level documentation is sparse (although documentation for important code decisions is fairly complete).

4.C Dynamic User Interface Support

I'm rather proud of how dynamic, flexible and configurable the system is. One goal, dreamt up mid-project, was to be able to dynamically generate the entire interface. Much of the infrastructure to do this is in place, but it was not fully completed. The PC Shopping Assistant application contains a class named TraitPanel. This is a graphical (Swing) container that understands traits and search criteria, displaying the appropriate controls for the trait's data type and able to tell the underlying program what values were set by the user. In theory, you could read in the meta-data from the data file and dynamically generate a TraitPanel for each trait. While this should work, it has not yet been tried.

There are several items that must be completed before the interface is fully dynamic. First, the existing code hard codes the traits to display (the TraitPanel class, luckily, can be set up within the IDE's GUI builder). Second, the display configuration functionality was not completed. This was to be another file (display.txt) that controlled which traits to show, the labels to display for numeric values (the text above and below the value radio buttons) and numeric formatting (in the form of C-style sprintf format strings). Third, any decent dynamic UI system should support user-supplied layout templates, which this application does not. Finally, a bug was discovered late in the testing process in which numeric values formatted for display could not be used accurately in selection criteria (specifically, they can result in a NumberFormatException). The current version of this application has temporarily removed the number formatting for this reason. Not a show stopper, but it is annoying.
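To illustrate the idea, and only the idea, here is a rough sketch of generating a query form from trait meta-data. The Trait class, method name and control choices below are assumptions made for this example; they are not the real TraitPanel code.

import javax.swing.*;
import java.awt.GridLayout;
import java.util.List;

// Hypothetical sketch of a dynamically generated query form.
class DynamicQueryPanelSketch {

    static class Trait {
        final String name;
        final String type; // "integer", "float", "boolean" or "string"
        Trait(String name, String type) { this.name = name; this.type = type; }
    }

    // Build one row of controls per trait read from the data file's meta-data.
    static JPanel buildQueryPanel(List<Trait> traits) {
        JPanel query = new JPanel(new GridLayout(traits.size(), 1));
        for (Trait t : traits) {
            JPanel row = new JPanel();
            row.setBorder(BorderFactory.createTitledBorder(t.name));
            if (t.type.equals("boolean")) {
                row.add(new JComboBox<>(new String[] {"true", "false"}));
            } else if (t.type.equals("string")) {
                row.add(new JComboBox<String>()); // values filled in from the loaded data
            } else {
                // relative value and importance, both on the 1-5 scale
                row.add(new JSlider(1, 5, 3));
                row.add(new JSlider(1, 5, 3));
            }
            row.add(new JCheckBox("Prefer"));
            row.add(new JCheckBox("Require"));
            query.add(row);
        }
        return query;
    }
}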
Appendix A: Overview of The Selection Engine

The following is an article explaining The Selection Engine, the CBR engine that the PC Shopping Assistant uses. It was originally to be published in the January 2002 issue of the Java Report but the magazine was cancelled before the article could be published. It is included here as background for the PC Shopping Assistant.

What is Case Based Reasoning?

A human walks into a car dealership and tells the salesman about his perfect car. It would be red, have four doors, a large trunk, 300 horsepower, side airbags, four wheel drive and would cost $20,000. Knowing his perfect car probably doesn't exist, he asks the dealer to show him the car that is closest to what he described. It doesn't have to be an exact match, it just has to be close. If the customer had asked to see all cars that had more than 275 horsepower and cost less than $22,000, we could have told him that with a simple SQL statement. Had we found no cars, we'd have asked him to change his search criteria. Had we found a dozen cars, we would have handed him the list, perhaps sorted by price or horsepower, and asked him to find the one he likes best. But, being a typical human, he didn't ask for a set of acceptable items, he asked for our "best" item, our closest match. And how we'd do that in SQL isn't exactly obvious.

There's a field of artificial intelligence that talks about the best match rather than exact matches. It's called case based reasoning, a field that wants to build expert systems that reason by memory rather than rules. As an example, suppose we wanted to examine a patient's blood for diseases. In a typical expert system, the data is run through a set of rules and a diagnosis is made. In a CBR expert system, the system searches its memory (the case base) to see if it's seen this situation before. If it finds a similar situation, it uses that situation's diagnosis. Many argue that human thought, from diagnosing blood to playing chess to driving a car, is based more on recognizing patterns than on memorizing rules. If true, CBR seems more human-like than rule-based systems. But that (and other CBR issues such as adaptation rules) is for the AI people to worry about. This article isn't about expert systems or cognitive modeling. It's about writing a search engine that returns best matches rather than exact matches, so that's the part of CBR we're going to talk about here.

When To Use Similarity-Based Matching

Suppose we wanted to find all items that matched at least seven of ten criteria. With similarity-based matching, we can find items that match most of our search criteria but not all of it. Suppose some of our data is incomplete - some of the fields we use in our search criteria were left blank. With similarity-based matching, we can find items even when data is missing. What if we wanted to see all items that were around a certain price? With similarity-based matching, we can find items that are close to what we want without having to specify a hard cut off. What if we wanted to sort our items by a combination of attributes rather than on a column by column basis? Similarity-based matching makes it easy to rank items. And suppose certain attributes are more important than others. With similarity-based matching, we can assign weights to each attribute, allowing some data attributes to be more important than others.

I'm sure you can think of plenty of examples of where this would and would not be useful. You obviously wouldn't use similarity-based matching if you knew exactly what you were looking for and had that item's "key" - a part's part number, a person's social security number, an order's order number, etc. Similarity-based matching is meant to be used when you don't know what options you have or when you want to have your items ranked. So when would you use similarity-based searching? If we were searching for a plane flight, we could ask to see all flights that left around 10:00 on Friday, cost around $500 (the cheaper the better) and, if possible, had no layovers.
If we wanted a restaurant recommendation, we could ask what the best restaurant would be if we wanted a medium-priced restaurant, someplace that got good reviews, French and Italian are good, you're not in the mood for Mexican but Chinese or Thai would be great. When it's time for a movie, you can say that you're in the mood for a suspense or action movie, maybe something directed by John Woo or Robert Rodriguez, movies with the CIA or some other spy agency would be nice and perhaps starring Bruce Willis, Chow Yun Fat or Jet Li. Real world systems have used case based reasoning to help lawyers find cases, TV viewers find interesting TV shows, customers find the perfect computer, help desk workers find answers (by far the most common application), technicians configure X-ray machines, credit card companies find good credit risks, auditors find fraud, mechanics maintain aircraft and music lovers find songs. Which leads us to a short aside - as with SQL, you can only search on those fields that you define. Often, the trick to a good search or expert system is in the feature extraction, not the search algorithm. A good example is the University of California at Berkeley's Ninja Jukebox (http://www.cs.berkeley.edu/~mdw/proj/jukebox/). Several graduate students built a search engine that allows you to find songs that sound similar to a specified song. The search part of the problem (a brute force k-nearest neighbor algorithm, which we'll talk about in a minute) was simple. The tricky part was in figuring out how to describe a song so that you could search on it. The group wrote a series of automated feature extractors that take an MP3 and convert it to 1,248 feature dimensions such as frequency, amplitude and tempo. That's pretty important since, obviously, the search is only as good as the database description of each song, along with the proper weighting of each attribute. CBR doesn't solve the feature extraction problem nor does it by itself solve the weighting issue - you're still responsible for those parts. If you want to create a great search engine, start with good data. As they say, garbage in, garbage out.

The Nearest Neighbor Algorithm

OK, time for the big words. The class of selection algorithms used by most CBR products is called nearest neighbor. Consider an item that has two attributes, price and performance. If you felt like it, you could plot all the items on graph paper. On the bottom (x-axis) might be price, say from 1-10 to make it easy, while the side (y-axis) would have performance numbers (again, for simplicity, let's make the range 1-10). You want to find the item that has the highest performance (10) and lowest cost (1). On the graph, what you're searching for would be plotted at (1, 10). So far so good. Unfortunately, no system has a price of 1 and a performance of 10. What you do have are items like item A with a price of 4 and performance of 4. On a graph, we'd plot that, obviously, at (4,4). We also have items B at (7,7), C at (10,9) and D at (1,6). If we plotted them on the chart, it might look like Figure 1.

Figure 1

We want to find the item that is closest to our ideal item at (1,10). So, which item is closest to what we're looking for? If you look at the graph, D probably looks the closest. Which is easy for us to say - we have eyes. But how does the computer know which is closest? Although I am a rabid math-phobe, it's time to venture into the murky waters of math by remembering high school geometry and the infamous Pythagorean Theorem.
Remember that one? It says that a² + b² = c². It's used to calculate the length of the longest side of a right triangle. So let's take item A. Draw an imaginary triangle with the hypotenuse (the diagonal side) from Target to A. We draw a straight line down from Target and left from A. Where they intersect, (1,4), is the right angle portion of our triangle. So now we have a right triangle with the points (1,10), (4,4) and (1,4). Using some pretty simple math, we know that the length of side A (the vertical side) of our triangle is 6 (i.e., 10-4) while side B (the horizontal side) is 3 (i.e., 4-1). Side C is the diagonal line between Target and A. Using the Pythagorean Theorem, the length of side C is:

a = 10 - 4 = 6
b = 4 - 1 = 3
a² + b² = c²
6² + 3² = c²
36 + 9 = c²
c² = 45
c = 6.71

Technically, we get the same answer if we skip the phantom third point (1,4) and just subtract the two remaining points, which means (1,10) - (4,4). The calculations look like:

a = 10 - 4 = 6
b = 1 - 4 = -3
a² + b² = c²
6² + (-3)² = c²
36 + 9 = c²
c² = 45
c = 6.71

So now we know that item A has a distance of 6.71. If we run the other items through the same process, we get B's distance as 6.71, C's distance is 9.06 and D's distance is 4. Since D has the shortest distance, it is the most similar and thus our top choice. What does a distance of 4 mean? Nothing by itself - the range of numbers (the maximum distance) changes with each problem. So how do we convert it to a percentage? First, figure out the maximum distance. With ranges of 1-10 for performance and 1-10 for price, the maximum distance is from (10,10) to (1,1). Again using the Pythagorean Theorem, we come up with a distance of 12.73. So how similar are item D and our target item? The amount of difference is (4/12.73), which is 0.31, and since similarity is just one minus that (1 - 0.31), the degree of similarity is 0.69, or 69%. A's similarity is 47%, B is also 47% and C is 29% similar. OK, that was an easy example. With only two attributes, we could use the Pythagorean Theorem just as we remember it from our early school days. What happens when an item has numerous attributes? Short answer - rather than use a triangle with two lengths, A and B, use some multi-sided object with however many sides you need. Or, using our simpler way, we skip the whole diagramming thing and just subtract all our points from one another. Say, for example, that our items have price, performance, reliability and size, again using the simple 1-10 ranges. We want to find an item of price 1, performance 10, reliability 10 and size 1, which as a graph point would be (1,10,10,1). Item A might be (4,4,4,4), B might be (7,7,2,8), C might be (10,9,6,10) and D might be (1,6,8,3). We aren't going to show a graph here since I'm not very good at drawing four-dimensional objects. What is the distance of A from what we're looking for? Running through the calculations, we get:

(1,10,10,1) - (4,4,4,4) = (-3, 6, 6, -3)
a² + b² + c² + d² = e²
(-3)² + 6² + 6² + (-3)² = e²
9 + 36 + 36 + 9 = e²
e² = 90
e = 9.49

Since a quick calculation shows that the maximum distance is now 18, we know that item A is 47% similar to what we're looking for. Now let's talk about our last little algorithm bit - weighting. Suppose that, in the above example, performance is more important than price. We can invent a scale of one to five, where five means really important. Let's say we weight performance highest (weight=5), price high (weight=4), reliability low (weight=2) and size lowest (weight=1).
To use these values, multiply the initial numbers, which for our perfect item were (1,10,10,1), by their weights, which here are (5,4,2,1). That's it. Using really simple multiplication, we end up with a new, weighted scale where our perfect item is located at (5,40,20,1) and item A moves from its previous (4,4,4,4) to the new weighted position of (20,16,8,4). To see how near a neighbor item A is now, let's compute the distance:

Target position = (1,10,10,1) * (5,4,2,1) = (5,40,20,1)
Item A position = (4,4,4,4) * (5,4,2,1) = (20,16,8,4)
(5,40,20,1) - (20,16,8,4) = (-15, 24, 12, -3)
a² + b² + c² + d² = e²
(-15)² + 24² + 12² + (-3)² = e²
225 + 576 + 144 + 9 = e²
e² = 954
e = 30.89

We calculate the maximum distance as before, this time using weights. It looks like:

Max value = (10,10,10,10) * (5,4,2,1) = (50,40,20,10)
Min value = (1,1,1,1) * (5,4,2,1) = (5,4,2,1)
(50,40,20,10) - (5,4,2,1) = (45, 36, 18, 9)
a² + b² + c² + d² = e²
45² + 36² + 18² + 9² = e²
2025 + 1296 + 324 + 81 = e²
e² = 3726
e = 61.04

That means that the similarity of item A to our target item is (1 - (30.89/61.04)) = 49%. As a note, things haven't changed much - A was a 47% match in our two-attribute example, a 47% match in our four-attribute example and now a 49% match in our weighted four-attribute example. Enough with all the addition and multiplication. Let's take a slight detour and talk about nearest neighbor terminology. If, on every request, you manually compute the distances and return the best match, that's a brute force nearest neighbor search. A k-nearest neighbor search returns the k closest items, meaning that if you ask for the top five matches, you get the top five matches (k=5). The whole brute force thing is pretty simple to understand and implement, but you are doing a fair number of calculations on each call. I believe that many, if not most, CBR researchers are spending their time in performance tuning. The most common, as well as obvious, approach is to pre-calculate all of the distances and cache those in memory. This approach trades flexibility for performance. Why does this limit our flexibility? The pre-calculation routine normally assumes that all attributes should be used in computing distance and that all attributes carry the same weight. You could tell the pre-calculation routine to use a pre-defined subset of attributes and define weights for each field, but those decisions are frozen at the time the similarity relationships are pre-calculated. In other words, you can't change the attributes and weightings on the fly. That's perfectly OK for some applications and a problem for others. In this article, I want to show you how to build a user-driven search engine, and since the user is in control of the attributes and weightings, we're going to skip building a pre-calculation routine and do everything dynamically.

Implementing CBR

Before we start looking at code, I want to talk a little bit about an open source project named The Selection Engine. Once upon a time, I needed a CBR tool to help with a couple of applications I was responsible for. Unfortunately, I couldn't find a freeware or shareware tool I liked and couldn't easily get my hands on a demo of a commercial product. After a little research, I realized that CBR wasn't all that complex and that I could write a tool to do what I needed pretty quickly. Two weeks later, I had everything I needed.
For me, that meant an engine that could do standard data filtering (a la SQL), compute similarity, compute percent similarity, handle weightings, sort data by percent similarity, handle searches on subsets of attributes, be generic enough to deal with any arbitrary data sent to it, be easily integrated into larger applications, work with Java and be stateless. And since I knew from the beginning that I wanted to share this code with others, I wanted the engine to work with whatever environment other people had, which led me to create generic data loading and display managers. By default, The Selection Engine reads the data, meta-data and search query from text files and sends the output to stdout, which should make it usable by anyone. It's expected that the person using the engine will replace those pieces, perhaps replacing the text file loader with something that reads from an Oracle database or VSAM file and perhaps replacing the display manager with something that generates HTML or perhaps creates a GUI using Swing. I leave the customization of the I/O to the reader. The implementation of a CBR system I'm going to talk about here comes from The Selection Engine. However, this article is about CBR, so I'm not going to spend a lot of time here talking about the parts of The Selection Engine that aren't nearest neighbor related. That includes the filter engine, the file parsing routines, the stdout-oriented display manager and the engine's object model. If you're interested in knowing more about those pieces, or if you want to help improve the tool, feel free to visit the engine's Web page at http://selectionengine.sourceforge.net. OK, with that out of the way, let's talk about the basic pieces of our CBR engine. The most obvious part, of course, is the Java class named SimilarityEngine, which, as you probably guessed, computes the similarity of a collection of items to a specified target. SimilarityEngine relies on a handful of objects, seen in Figure 2. The call to SimilarityEngine's main method, computeSimilarity(), takes the objects Items, SimilarityCriteria and SimilarityWeights. Each Item has a collection of Traits while the Items collection has a link to the meta-data collection TraitDescriptors.

Figure 2

Items is a collection of Item objects. It's pretty much a standard collection. An Item is generic and represents anything you want to be able to search for - a product, a song, an image, a decision, a customer, etc. Since Item is supposed to be able to represent any type of item, its attributes are defined at run time rather than compile time. Specifically, Item contains a Traits collection which, obviously, contains several Trait objects. A Trait object is basically just a name/value pair. A separate object, TraitDescriptors, plays the meta-data role. It is a collection of TraitDescriptor objects. There is one TraitDescriptor for each trait an Item object will have. A TraitDescriptor, like the Trait class, is basically just a name/value pair. In this case, value is the data type of the attribute. The Selection Engine recognizes integers, floating point numbers, strings and booleans. It's worth noting here that all of the items in the previous paragraph (Items, Item, Traits, Trait, TraitDescriptors, TraitDescriptor) exist in The Selection Engine to help make the engine generic and reusable.
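To make the name/value idea concrete, here's a rough sketch of what such generic trait classes might look like. This sketch is mine, not The Selection Engine's actual source (the real classes also deal with keys, iteration and type conversion, and their details differ), but it shows why a single Item class can describe a PC, a song or a customer:

// Illustrative sketch only - not The Selection Engine's actual source.
// A Trait is a name/value pair describing one attribute of an item.
class Trait {
    private String name;
    private String value;   // stored as text; the descriptor says how to interpret it
    public Trait( String name, String value ) { this.name = name; this.value = value; }
    public String getName()  { return name; }
    public String getValue() { return value; }
}

// A TraitDescriptor is meta-data: the trait's name plus its data type.
// (The type constants here are assumed; only TYPE_FLOAT and TYPE_INTEGER
// appear in the listings later in this article.)
class TraitDescriptor {
    public static final int TYPE_INTEGER = 0;
    public static final int TYPE_FLOAT   = 1;
    public static final int TYPE_STRING  = 2;
    public static final int TYPE_BOOLEAN = 3;
    private String name;
    private int dataType;
    public TraitDescriptor( String name, int dataType ) { this.name = name; this.dataType = dataType; }
    public String getName()  { return name; }
    public int getDataType() { return dataType; }
}

// An Item is nothing more than a bag of traits, so its attributes are
// defined at run time rather than compile time.
class Item {
    private java.util.ArrayList traits = new java.util.ArrayList();
    public void addTrait( Trait trait )  { traits.add( trait ); }
    public java.util.Iterator iterator() { return traits.iterator(); }
}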
In your particular application, where you know at compile time what classes, attributes and attribute data types you will have, you'll probably replace Item with the specific class you want to use. Doing so should have minimal impact on how the engine functions, so the rest of this article should still be helpful. So that covers the Items, the first of three arguments we pass to SimilarityEngine's computeSimilarity() method. The collection should contain every item that we want to compute similarity on. In The Selection Engine, the items are first run through an object called FilterEngine, which filters out items we know we don't care about. As an example, if the user doesn't want to see any items that cost more than $100, the FilterEngine can get rid of those choices before they are passed to the SimilarityEngine. In practice, most people will probably use SQL to filter out obviously unsuitable items before they are scored by the SimilarityEngine. The other two arguments passed to computeSimilarity() are SimilarityCriteria and SimilarityWeights. SimilarityCriteria is a collection of SimilarityCriterion objects. A SimilarityCriterion object describes a simple relationship made up of an attribute, an operator and a value. The Selection Engine recognizes three operators. ~ means "around", % means "prefer" and !% means "try to avoid". The first is used with numbers, the latter two with strings and booleans. The distinction is that numbers have a degree of similarity while strings and booleans in The Selection Engine either match or don't. Let's do a few quick examples. For the similarity criterion "spiciness ~ 10", the value 7 might be a 70% match. For the similarity criterion "taste % 'spicy'", a value of 'sweet' would be a 0% match. While The Selection Engine is case insensitive, it does not currently do degree of similarity calculations on strings or use similarity-oriented matching algorithms such as Soundex, Metaphone or Levenshtein distance. At this point, some of you might be asking what the difference is between "taste % 'spicy'" and "taste = 'spicy'". In The Selection Engine, % is used by the SimilarityEngine while = is used by the FilterEngine. If an item fails the "taste = 'spicy'" criterion, it is rejected and will not be passed to the SimilarityEngine. If an item fails the "taste % 'spicy'" criterion, it lowers the degree of similarity but does not prevent the item from receiving a similarity score. This applies to both strings and booleans. For numeric fields, the SimilarityEngine recognizes two special values, [MAX_VAL] and [MIN_VAL] (again, case is unimportant). Both are, obviously, relative values rather than absolute values. You can do CBR without using relative values, but relative values make CBR maintenance easier. The SimilarityEngine translates relative numbers into absolute values by first looping through every item passed to it so it can determine the max and min values for each of the item's attributes. The third object passed to computeSimilarity() is SimilarityWeights, which is a collection of SimilarityWeight objects. SimilarityWeight is yet another name/value pair, where name is the name of an attribute and value is its weight. Weight can be any integer value. The weights are not percentages and do not need to add up to 1 or 100. By default, all attributes have a weight of 1. Remember that weights only affect similarity operators (~, %, !%), not filter operators (=, !=, <, >, <=, >=). To review, let's look at a sample query. We visit an Internet retailer to buy a computer.
We like Alienware computers but would rather not buy an HP and absolutely will not buy a Dell. The price should be low, and we absolutely cannot spend more than $1,000. We really, really want a fast computer and a big hard drive. And a DVD player would be nice. How would we write that as selection criteria? Using The Selection Engine's query format, which is pipe delimited and uses c and w to signify constraints and weights, the query might look something like this:

c | Vendor | % | Alienware
c | Vendor | !% | HP
c | Vendor | != | Dell
w | Vendor | 1
c | Price | ~ | [MIN_VAL]
c | Price | <= | 1000
w | Price | 1
c | HD | ~ | [MAX_VAL]
w | HD | 4
c | DVD | % | TRUE
w | DVD | 1
c | cpu_benchmark | ~ | [MAX_VAL]
w | cpu_benchmark | 5

Note that, to search for fast computers, we search against a benchmark score (cpu_benchmark) rather than a CPU name (ex. - Intel PIII 800) or megahertz rating (ex. - 800). That's because a benchmark might accurately represent system performance while the name of the CPU or MHz rating most likely will not. While this has nothing to do with the technical aspects of the nearest neighbor algorithm, I mention it here because, as has already been mentioned, garbage in, garbage out. In my experience, poor design (i.e., database and object modeling problems) is responsible for far more problems than technological issues. So we know we pass Items, SimilarityCriteria and SimilarityWeights into the computeSimilarity() method. Let's talk about how the computeSimilarity() method works. A pseudocode representation is shown in Listing One. The source code for computeSimilarity() is shown in Listing Two. First, we need some information on our data set. Read through each item and determine the max and min values for each numeric attribute. We also need the range for each attribute, but we can get that by subtracting the smallest value from the largest. The code is shown in Listing Three. Next we need to turn our similarity query into something (a series of points) which we could, in theory, plot on a graph. We do that by normalizing each numeric value in the search criteria. Normalizing means that we want to express the value as a percentage (0 to 1) of the possible values. We do that by subtracting the minimum value possible (which we found when we calculated our data set statistics back in step one) and then dividing it by the range of possible values (i.e., the maximum value minus the minimum value). As an example, if our target price was $2,000 and the prices ranged from $1,000 to $3,000, the normalized value of our target price would be (2,000 - 1,000) / (3,000 - 1,000) = 0.5. After we get the normalized value, we convert it to a weighted value by multiplying each normalized value by the weight for that attribute. The code for this is in Listing Four. Next we calculate the maximum distance possible for any item. We'll need this number to calculate percent similarity later. We know that the normalized value of the maximum value is always one, so the weighted values are just the weights. We convert these points to a distance by squaring each point, summing them and then taking the square root, just like the good old Pythagorean Theorem. The code is shown in Listing Five. That completes our prep work. Now it's time to rate each item. As with our search criteria, we start by normalizing and weighting the numeric values. The source for this is in Listing Six.
We then compute the distance by subtracting each of the item's attributes from the target values (which gives us a delta), squaring the deltas, summing them and then taking the square root of the sum. Dividing the distance by the maximum distance gives us the percent difference, and subtracting that from one gives us the percent similarity. The code to do this is shown in Listing Seven.

Listing One

Gather data set statistics (max and min values)
Create normalized values for what we're searching for (the "perfect" item) by:
    For each search/similarity criterion
        normalized_value = (value - min) / (max - min)
        weighted_value = normalized_value * weight
Determine the max distance by:
    For each search/similarity criterion
        sum += (weight * weight)
    distance = sqrt(sum)
For each item
    Score the item for each search criterion by:
        For each search/similarity criterion
            normalized_value = (value - min) / (max - min)
            weighted_value = normalized_value * weight
    Compute distance by:
        For each score
            delta = target.criterion.weighted - item.criterion.weighted
            sum += (delta * delta)
        distance = sqrt(sum)
    percent_similarity = 1 - (distance / max distance)
Rank items by sorting on percent_similarity

Listing Two

public SimilarItems computeSimilarity( Items items, SimilarityCriteria criteria, SimilarityWeights weights ) {
    DataSetStatistics statistics = new DataSetStatistics( items );
    SimilarityCriterionScores targetValues = getTargetValues( items.getTraitDescriptors(), criteria, weights, statistics );
    float maxDistance = getMaxDistance( criteria, weights );
    SimilarItems similarItems = new SimilarItems();
    Iterator itemList = items.iterator();
    while (itemList.hasNext()) {
        Item item = (Item) itemList.next();
        SimilarityDescription descriptor = new SimilarityDescription();
        descriptor.setItem( item );
        SimilarityCriterionScores itemValues = normalizeValues( item, criteria, weights, statistics );
        float distance = computeDistance( targetValues, itemValues );
        float percentDifference = (distance / maxDistance);
        float percentSimilarity = (1 - percentDifference);
        descriptor.setPercentSimilarity( percentSimilarity );
        similarItems.add( descriptor );
    }
    similarItems.rankItems();
    return similarItems;
}

Listing Three

private void buildStatistics( Items items ) {
    Iterator itemList = items.iterator();
    while (itemList.hasNext()) {
        Item item = (Item) itemList.next();
        measureItemTraits( item );
    }
}

private void measureItemTraits( Item item ) {
    Iterator traitList = item.iterator();
    while (traitList.hasNext()) {
        Trait trait = (Trait) traitList.next();
        String traitName = trait.getName();
        int dataType = item.getTraitDataType( traitName );
        if ( (dataType == TraitDescriptor.TYPE_FLOAT ) ||
             (dataType == TraitDescriptor.TYPE_INTEGER) ) {
            TraitStatistics traitStats = this.get( traitName );
            float value = trait.getValue().toFloat();
            traitStats.addExample( value );
        }
    }
}

Listing Four

private SimilarityCriterionScores getTargetValues( TraitDescriptors traitDescriptors, SimilarityCriteria criteria, SimilarityWeights weights, DataSetStatistics statistics ) {
    SimilarityCriterionScores normalizedValues = new SimilarityCriterionScores();
    Iterator criteriaList = criteria.iterator();
    while (criteriaList.hasNext()) {
        SimilarityCriterion criterion = (SimilarityCriterion) criteriaList.next();
        String criterionID = criterion.getID();
        String traitName = criterion.getFieldName();
        int traitDataType = traitDescriptors.getDataType( traitName );
        SimilarityCriterionScore score = new SimilarityCriterionScore( criterionID );
        normalizedValues.add( score );
        float position = 0;
        if ( (traitDataType != TraitDescriptor.TYPE_FLOAT ) &&
             (traitDataType != TraitDescriptor.TYPE_INTEGER) ) {
            switch( criterion.getOperator() ) {
                case SimilarityCriterion.OPERATOR_SIMILAR:     position = 1; break;
                case SimilarityCriterion.OPERATOR_NOT_SIMILAR: position = 0; break;
                default:                                       position = 0; break;
            }
        } else {
            TraitStatistics stats = statistics.get( traitName );
            float max = stats.getMaximumValue();
            float min = stats.getMinimumValue();
            float range = stats.getRange();
            TraitValue traitValue = criterion.getValue();
            float value = 0;
            if (traitValue.toString().equals( MAX_VAL_INDICATOR )) {
                value = max;
            } else if (traitValue.toString().equals( MIN_VAL_INDICATOR )) {
                value = min;
            } else {
                value = traitValue.toFloat();
            }
            position = (value - min) / range;
        }
        score.setNormalizedValue( position );
        float weight = weights.get( traitName );
        float weightedValue = (position * weight);
        score.setWeightedValue( weightedValue );
    }
    return normalizedValues;
}

Listing Five

private float getMaxDistance( SimilarityCriteria criteria, SimilarityWeights weights ) {
    float sum = 0;
    Iterator criteriaList = criteria.iterator();
    while (criteriaList.hasNext()) {
        SimilarityCriterion criterion = (SimilarityCriterion) criteriaList.next();
        String fieldName = criterion.getFieldName();
        float weight = weights.get( fieldName );
        weight *= weight;
        sum += weight;
    }
    float squareOfSummedDeltas = (float) Math.sqrt( sum );
    return squareOfSummedDeltas;
}

Listing Six

private SimilarityCriterionScores normalizeValues( Item item, SimilarityCriteria criteria, SimilarityWeights weights, DataSetStatistics statistics ) {
    SimilarityCriterionScores normalizedValues = new SimilarityCriterionScores();
    Iterator criteriaList = criteria.iterator();
    while (criteriaList.hasNext()) {
        SimilarityCriterion criterion = (SimilarityCriterion) criteriaList.next();
        String traitName = criterion.getFieldName();
        int traitDataType = item.getTraitDataType( traitName );
        String criterionID = criterion.getID();
        SimilarityCriterionScore score = new SimilarityCriterionScore( criterionID );
        normalizedValues.add( score );
        float position = 0;
        if ( (traitDataType != TraitDescriptor.TYPE_FLOAT ) &&
             (traitDataType != TraitDescriptor.TYPE_INTEGER) ) {
            String value = item.getTraitValue( traitName ).toString();
            String targetValue = criterion.getValue().toString();
            if (value.equals( targetValue )) {
                position = 1;
            } else {
                position = 0;
            }
        } else {
            float itemValue = item.getTraitValue( traitName ).toFloat();
            TraitStatistics stats = statistics.get( traitName );
            float min = stats.getMinimumValue();
            float range = stats.getRange();
            position = (itemValue - min) / range;
        }   //--- if dataType = ...
        score.setNormalizedValue( position );
        float weightedValue = (position * weights.get( traitName ));
        score.setWeightedValue( weightedValue );
    }
    return normalizedValues;
}

Listing Seven

private float computeDistance( SimilarityCriterionScores targetValues, SimilarityCriterionScores itemValues ) {
    float sum = 0;
    Iterator targetValueList = targetValues.iterator();
    while (targetValueList.hasNext()) {
        SimilarityCriterionScore targetScore = (SimilarityCriterionScore) targetValueList.next();
        SimilarityCriterionScore itemScore = itemValues.get( targetScore.getID() );
        float targetValue = targetScore.getWeightedValue();
        float itemValue = itemScore.getWeightedValue();
        float delta = (targetValue - itemValue);
        float squaredDelta = (delta * delta);
        sum += squaredDelta;
    }
    float distance = (float) Math.sqrt( sum );
    return distance;
}

Tools Used

All tools used in the final version of the PC Shopping Assistant application were free. Most were also open source. It was not the intent of this project to use nothing but free software, but the free and especially the open source software proved superior to their commercial counterparts. Some commercial tools were used in the beginning of this project but were replaced relatively early. The one task for which i did not find a decent (i.e., functional and easy to use) free tool was deployment. That one feature alone made me wish i had used a commercial IDE, as most commercial Java IDEs come bundled with automated package builders and deployment tools.

Language and IDEs

Java J2SE 1.3.1. Java 1.2 features used heavily (Swing and collection classes), minimal use of Java 1.3 features (just JLabel HTML support i think)
JBuilder 5.0.296.0 (primary)
JCreator 1.52.01 (secondary; excellent IDE, no GUI builder)
Early development also done with Forte and Cafe

Environment

Java application. JSP/servlets were not used for portability/distribution reasons
Developed, tested, crashed and repeatedly rebooted on Windows 98
CVS on Linux (hosted by SourceForge) used for source control. Kind of. Setting up CVS on SourceForge was decidedly non-trivial and getting gnu's WinCVS to work was a pain. For those reasons, source control was not actively used

Libraries

PrintfFormat - an sprintf simulator from Allan Jacobs, Sun Microsystems. Used for some string formatting, although not used extensively due in part to formatting errors that cropped up when using the ' flag (localized thousands separator). Since the code is provided, i could fix this, but i was lazy and never got around to it
TableMap and TableSorter - Swing table extensions by Philip Milne. Used in an attempt to make the horrible Java JTable class simple and useful for displaying output. It didn't work
DynamicallyModifiableTableModel - a subclass of AbstractTableModel by me. Another attempt to make a nice display object in Java. Why don't most languages make an easy way to do simple data display? Makes me miss PowerBuilder and its datawindows. While this approach worked, it was eventually abandoned in favor of displaying HTML in JLabel and JEditorPane objects
TraitPanel - not really a library, just a subclass of JPanel i made that had a lot of little properties i needed. Specifically, it tracked button groups and check boxes specific to that panel (a simple version of the query screen had 13 button groups containing 65 radio buttons) and the item that the panel described (most controls store this info in the actionCommand property, but the JPanel class doesn't have that). Why build this class?
Aside from making it much easier to generate a query from a collection of panels (or a huge collection of controls), it was pretty darn necessary to make possible my eventual goal of generating the entire GUI dynamically from run-time loaded meta data. 'Cause hey, who likes building, hard coding and maintaining GUIs anyway?

Data

Pipe delimited files. Databases were not used for portability/distribution reasons. Why pipes? Why not? i could just as easily have used any other text file format - tab delimited files, colon delimited, comma delimited, space delimited, XML, fixed length, etc. i used pipe delimiters because i thought pipes were prettier than tabs and commas; because pipe delimiters required less coding than XML and fixed length records; because they were more compact than fixed length and XML; because they were easier to maintain than space delimited, fixed length and XML; and because pipe delimited files perform waaaaay better than XML

Object Modeling

Pieces of paper. The ultimate object-modeling tool. Anyone who uses anything else is an imbecile and should be shot repeatedly. Any employer who refuses to interview object modelers without heavy Rational Rose experience deserves the idiots they get. Anyone who claims that they enjoy working with any Rational tool (or worse, claims to have RUP experience, which is the PM equivalent of a snipe hunt) is lying and should be decapitated immediately
White boards. My house is covered in them. You can never have enough white boards
ArgoUML 0.8.1. Used to draw pretty pictures of objects (class diagrams only; the other UML stuff is pretty worthless) after they have been modeled on the back of scratch pieces of paper. ArgoUML (also called Argo/UML) is yet another super-duper free (BSD license) open source piece of software from the fine folks at UC Berkeley. It's kind of like a UML expert system, or, according to its tagline, it provides "cognitive support for object-oriented design". It does all the standard stuff like code generation, and it has the nifty ability to make suggestions on how to improve your models (assuming your models can be improved, which never happens to a super genius like me *cough* *cough*). Me, i just used it to make pretty class diagrams and save them as GIF files
Early documentation also done with Rational Rose (not a free tool, despite its amateurish appearance; i started with this because i had it installed and i have way too many years experience in it), Visio (both Visio's block diagrams and its iffy UML support in version 5) and TogetherJ (it supports design patterns and is a lot nicer than Rose, but it's a java app, so it, like every other java app except maybe ArgoUML, is painfully slow). All three of these tools cost money, although, personally, i think only TogetherJ is worth spending money on. But in the end i decided to use ArgoUML because i just liked it better (money wasn't an issue as i own and use all three tools professionally). Remember, though, none of these tools holds a candle to scratch pieces of paper

Packaging

jar. As in the command line jar tool that comes with the JDK. Distribution was a major pain in the ass. My version of JBuilder (the free version) does not contain a jar builder (my version of Cafe Enterprise did, but i only have that at work, not at home, and this is a non-work assignment so...). i tried to use Cannery, a free jar building tool, but, despite a lot of effort, i never got it to work.
So i wrote my own manifest file (it took quite a while to figure out the directory structure jar expected), manually verified my dependencies, manually built the archive and spent way too much time manually testing bad archives (path problems in packaging multiple packages caused me most of my headaches)

Data File Formats

Data File

#--- A dummy data file of PC numbers to test the SelectionEngine
#--- If this were ever used in real life, it'd probably be in a DB for the
#--- app to read from and in XML if a vendor emailed/ftp'ed it to us
f=s:maker |s:model |i:cpu|s:chip name|i:ram|f:hd|f:cost|f:winmark|s:video card |f:q2 |b:cd-rw|b:dvd|b:AGP slot
d=compaq |Presario 9228 | 733 |celeron |256 |20 | 500 | 1 |Intel3D AGP | 24 | false |false|false
D=compaq |Presario 9257 | 900 |celeron |512 |20 | 600 | 2 |Intel3D AGP | 35 | false |false|false
D= dell |Incendiary 2000|1400 |athlon XP |1024 |100 |2600 | 10 |GeForce III |221 | true |true |true
d= hp |Pavilion 9420 |1100 |duron |256 |30 |1000 | 6 |GeForce II MX |130 | true |true |true
D= hp |Pavilion 9200 | 950 |Pentium III|256 |30 |2000 | 4 |PowerVR Kyro II |120 | true |false|true
d=micron |millenia 1800 |1800 |Pentium IV |1024 |80 |3500 | 10 |GeForce III |186 | true |true |true
D=micron |millenia 1200 |1200 |athlon |512 |75 |3000 | 9 |GeForce III |170 | true |true |true
D=emachine|e1300 |1300 |Pentium III|128 |60 |1800 | 7 |ATI Radeon 8500 |160 | true |false|true
d=emachine|e1200 |1200 |celeron |128 |30 |1500 | 4 |ATI Radeon 7500 |108 | false |false|true
D=emachine|e600 | 600 |celeron |64 |18 | 600 | 3 |S3 Savage 4 | 66 | false |false|true

Query File

#--- This is the criteria we use for selecting items
#--- Operators:
#---   = != > < >= <=
#---   ~   around; for trying to be close (+/-) to a specific number
#---   %   prefer; for soft-matching on strings and booleans
#---   !%  rather not; for soft-matching on strings and booleans
#--- Values:
#---   [MAX_VAL]  maximum numeric value in this dataset for this field
#---   [MIN_VAL]  minimum numeric value in this dataset for this field
#---
#--- Format:
#--- First field is type. Type is w for weight or c for constraint
#--- When type = c, the format is
#---   type | field | operator | value
#--- When type = w, the format is
#---   type | field | weight
#---
#--- Weight is an integer, higher is better
#--- The weight can be any integer, but i personally use 1 to 5
#--- Weights *only* apply to similarity measures (~, %, !%), *not*
#--- to absolute filtering criteria (=, !=, >, <, >=, <=)
#---
#--- Everything should be case insensitive
#--- i don't like HP, but it's not that important
#--- but i absolutely won't buy a Dell
c | maker | !% | hp
c | maker | != | dell
w | maker | 2
#--- This is going to be a game machine, so i really want a fast video card
c | winmark | ~ | [MAX_VAL]
w | winmark | 5
c | q2 | ~ | [Max_Val]
w | q2 | 5
#--- i only have $3,000, so it absolutely *cannot* cost more than that
#--- i figure $1,500 is a good price for a fast game machine
#--- so let's try to find a machine around that price
c | cost | <= | 3000
c | cost | ~ | 1500
w | cost | 4
#--- A big hard drive wouldn't be bad
c | hd | ~ | [MAX_VAL]
w | hd | 1
#--- And i wouldn't mind having a DVD player
c | dvd | % | y
w | dvd | 1
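As a closing illustration, here's a rough sketch of how a query file in this format might be read. This is hypothetical code rather than the application's actual loader (the file name and class name are made up) - it simply skips the #--- comments, splits each remaining line on pipes and separates the c (constraint) rows from the w (weight) rows:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

// Hypothetical reader for the pipe-delimited query format described above.
// Rows beginning with "c" become constraints (field | operator | value),
// rows beginning with "w" become weights (field | weight), and lines that
// are blank or start with # are ignored.
public class QueryFileSketch {
    public static void main( String[] args ) throws IOException {
        List constraints = new ArrayList();
        List weights = new ArrayList();
        BufferedReader reader = new BufferedReader( new FileReader( "query.txt" ) );  // assumed file name
        String line;
        while ((line = reader.readLine()) != null) {
            line = line.trim();
            if (line.length() == 0 || line.startsWith( "#" )) {
                continue;                       // skip blanks and #--- comments
            }
            List fields = new ArrayList();      // element 0 is the row type, the rest are the row's fields
            StringTokenizer tokens = new StringTokenizer( line, "|" );
            while (tokens.hasMoreTokens()) {
                fields.add( tokens.nextToken().trim() );
            }
            String rowType = (String) fields.get( 0 );
            if (rowType.equalsIgnoreCase( "c" )) {
                constraints.add( fields );      // c, field, operator, value
            } else if (rowType.equalsIgnoreCase( "w" )) {
                weights.add( fields );          // w, field, weight
            }
        }
        reader.close();
        System.out.println( constraints.size() + " constraints, " + weights.size() + " weights" );
    }
}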
2 3 7 8 < = B C H I N O R S k l n o e f j k o p t u y z } ~          H*d  )*-.lm|~   ^ " 9 T m v     : d {        WX fg^     |3D 0DIK]bsz~MQc g s y      !@!O!f!u!!!!!!!"""CJOJQJhmHnHuCJOJQJ^JhmHnHuj42UmHnHuCJOJQJ^JhmHnHuCJOJQJhmHnHuCJOJQJ^J 5CJ$\H*Bg|"}"$$''++--..00 3&3;3R3a3^$a$"""""""""###$$$$$$%%2&>&z&&''7'J'O'a'f'w'y''''''\(](n(o((())) *N+]+d+s+++++++++++5,E,],l, -0-P-Y-^-g-,.<.///*/D/U/_/o/000000000000000CJOJQJhmHnHu`00002222x44666666667,777<@H@x@WCXCdCGGG_J`JmJQQQSSYYYT\jjkk,l1l"m'mwr{ruuuv>$%&'>?GH jUjUmHnHu 56\] 6>*]6]56>*\]CJOJQJ^JCJ CJOJQJCJOJQJaJhmH nH u5CJOJQJhmHnHuAa3}33333334*4667788J<K<==;@<@H@x@@@A ^``^AFAeAAAAAABEBpBBBBBB+CWCXCdCCCC*DUDDp^p^^` ^`DDDDEDEbEEEEEFPFTFFFFG8GZG]G_GzG|GGGGGGGGH~ҁ%FG  !H$ !$HI]^fgijpqstxy0JmHnHu0J j0JU mHnHu jU#0/R / =!"#$% 1h/ =!"#$% 1h/ =!"h#$% 1h/ =!"#$%}DyK _Toc535692424}DyK _Toc535692425}DyK _Toc535692426}DyK _Toc535692427}DyK _Toc535692428}DyK _Toc535692429}DyK _Toc535692430}DyK _Toc535692431}DyK _Toc535692432}DyK _Toc535692433}DyK _Toc535692434}DyK _Toc535692435}DyK _Toc535692436}DyK _Toc535692437}DyK _Toc535692438}DyK _Toc535692439}DyK _Toc535692440}DyK _Toc535692441}DyK _Toc535692442}DyK _Toc535692443}DyK _Toc535692444}DyK _Toc535692445}DyK _Toc535692446}DyK _Toc535692447}DyK _Toc535692448}DyK _Toc535692449}DyK _Toc535692450}DyK _Toc535692451}DyK _Toc535692452}DyK _Toc535692453}DyK _Toc535692454}DyK _Toc535692455}DyK _Toc535692456}DyK _Toc535692457}DyK _Toc535692458}DyK _Toc535692459}DyK _Toc535692460}DyK _Toc535692461}DyK _Toc535692462}DyK _Toc535692463}DyK _Toc535692464}DyK _Toc535692465}DyK _Toc535692466}DyK _Toc535692467}DyK _Toc535692468}DyK _Toc535692469}DyK _Toc535692470}DyK _Toc535692471}DyK _Ref535573243Ddf  C BA*temp\screen flow.gifb&nBƽZq1n&nBƽZqPNG  IHDRN$tRNSbKGD#2 cmPPJCmp0712Om(IDAThAkA`{'hKд ҄rlh5 {%'-,8H9Nx,n(H֝qݙb ª|>>21868!  %.e2g-6B걩l8B^咽 k0l` OEзRvBa)NɊ'ShbNl0+&13h 8W)8ҿSr|!0$4SH+&:_Z{P㠕EkQ 9:p \Z-m$DwKY$ H" teHD 3@vJ~ y-IsBq\AzID{n~/5kk91ʐlH~$EҪKO}ݓPBlLq+Gn HYBRo0#D0jӣDEb#rH!uxD )6mSc Z갲 ;O(96#p8yKK$,"Y>fyW:"G"oԋoA\:˓"Ȼq=S&#>Bb]K*ou/gN'>`vCK>Z$H2n3.(Z㕲o:+[UaɄF)8`GZsAxުwN"dH>؜DŨ͉A6'Fhsb$v07'ȑmNF͉HdJ$ѬQ)sZ#U PIENDB`Dd \  C 8A main screen.gifbOے}SK> n Oے}SK>PNG  IHDR^PLTEcccEXbKGDH cmPPJCmp0712Hs rIDATx^]: dܷ/% K ]s7_XXŗ?w-dH@8^ĄݦD$^ De__ o 7U^|w2 -D1"aE#+W- M3BE-H$;XQ>g2#KHa E6<\~0[.0c~Az$#G %`IK M@8 7t܄ɈKD9)C&S?$iKK+#1IL#(JBAeI]եV29jE-Qv ~N%iM[ ?]P>}ğOrv ":F? kT$~}p };oR|'ry(Q qgs(h\8=m`YFD^GȺ&'n'0L7ºH $YDx'p=']'Q'6SWPll&_.m"BS+BJ37ZlghJ5H i9BNxB8U# ïykLOKl߅)j uvo}H_ ]VsyJ?zuv+-=Oeq.FjyDIGH|>E_qSO/%ă3z<{$RY*J:G*T\L]#RiE|V|$/gh ?uZ:k}/q8'0#U ƒk12u(sqFoFL2tqRi,Fȴ},~Os76<$}sf>G,Ps+s4S'`KJ>N2FYq4 $a4rŊ­g2B>".lCxHHkaA2>:Ѕyqئ0cD!ii!E'D72!Nj\*\36phG-px%'֯פxѸtj(+:vcG{]!&M1:?<~w%;eU@Hqe-YϴQNRI- c|H~l!Zv>0Rw1R.H ٛ^G+Ð8 oFYtkk/iXU[/LYp|@Յ,f^%3\IUZVǔRZ ICZk!iIk)$Mi-tAҕ*HZCZk Hk $.i TC&:퓖IOqW-}gg[tJ<s I&JˉC*74;b"'z)L?GU)'{8lbڝPΈq'✽^K)6G-̎]DFf[9,L ț^wgLxGBaUٙνVȕyq[l  ^k/oX.Π߁;+^ye@S'DI#>5ϫ4^Osm2!CU9BOMFvX Y0ͩ4E#Cٯ <3%Lr ('sv+UT$|Q(,l^fonr0|2~Njge -E.lMa7,bpXm]?_Ţv*hzJlg/rGq$r5Rw)J֖g$9%3+aA0FfxK^D>#("(&|M<bL.aQ e,9L|BȄtO+f׉}H¬I+M&3p"7 nQA$U8p ysxxnZ`,RRc |A$l!b^ԄkqPqȘlB))Q ra%{6|%XJR!l31#ebS؄hKTO@} hGjcauHlcF>M($.+xIfZtVT6PL!,C3OBZI $FN2R3!!hS(-a3J I +!Po-uMfD@wc"J rI_ E.EW-+ʵX{<>pl[v;۳QŞ_߿64Pez<+^)ՕzrYޥ/4&=#lnx S|6LnLR\.m-)NVsuJjCTs7 g) ̅_8I%fǖ'(|z5 % c2Cã4P6x!@HJ-7$mQ,fu0$V A6J?[5{sYSD;Jphv2 GIz᭹Avޟk.8"4Ã}L8:@ai[j|(|NSI ܠ5J+XgWaO↊_\ T~vxki.OЂ;YoAM^ÏHrșp4K]]7[R9JZ_zP=nsÆjr8w\ek+̅ ~ Ϡ.ICp/7c R|? 
f>HU$4P Y 2,}+~Aas!c -[e2\ 5Ip} h<>GˏS5霵3_S-.PkpCxr;Dv5<65aǣpѩ٧,^Pꞈ/uVsZC7lCa\@-;[2z ,p7l (3tAA]SKv;m?Tz;?(0|3MXo5$Y|Kj8dB!Q48%?!;fqW\lpöp֕قӭ~ !*hQpXo;D2 Õ1$r@L=js4YkutL7ЦSu7z}-M*2 UcD)0֐tBPKO"3 ohAB`4Y+!DUZDFfB N?9s*; ғX,c@!HV i F)t$&'l8qtUW'䤔 PHv@Ot|ua]d+,8)t7@QUڧ(8oxETE iDp7(-.7"]p ˹>uѮ#Zܐz8 v!Bs]H冨cؤZV,ר( P 5g5K)a%r;;"TC6u;_mF[d nKYBF[ll4d)Χ(0d5΢8J, V(SYv47ӯ a䴡X=O8j ȍk{obc!,nu=Bn}C1Oxppt05-iMXYY}-g`UPOya(ק\ Wz&>@Pn1厨(Y[n|,k$xoC݌,#,r"7d=J)W;W@. 5'dۈTuo rCq*ƕ,e9IT眭j];vvZr !7Vi6Qx3}`J[׎* q:770G H/.SaNb9Oym<7 Q8~ɶ2/e7pGnTrBB|Oc+cR䴟 8OΨP4Bg7hܐvkL2Z/# #YA=WjJy} )6b2,m |9 l0EEb nrmeN5 x#c עeh[ {0 W0eufptB܀Le";'ek؂B_ nm+/ [6g)~]b 3Pw봛2ɒ#n1.Hᆓ↰2CJ oh[,r%&ɳ+o=AENճJ4h5{ cVq.< '@P.up <Ыgu% X'; ' Iɦ(IlGoz z|1pxH#yXMp'%lptn87n(YkbP]ֹ'1ݻW g CXg$ာ"yK+lsCRqL'h۟_l(8PtoWe+RO 7Ty puݬprn\U~YJTp٨ _S sɽܧtIkt*?Syϳԯ:;'b̺c/>nC|tou^d{"V!UY1ftl"W^]OQ+8/;P^ka f%H 7XTqi8;+7#HjfT|#*ŁQ]t^5zwH<˰7ϐ3hU8 ?Sʁ@(83җrCsp@hp PTwA~l(B(!Ijt!kIu"7\pg]$r  '4@"%HJً݂rC_%׮=_R/56ٿpm>\!SY523/Np@ )*nvȹ!#6&!opG$/7,t4u@bu#0%$.iD0yOnh) @JGA}ruyBʪJ↚ .!E"d8`+獢lnuvS}͜W[nIӒ- ;T{&c~X[Y32#9)bU:ظ$v+)@`I587A3/7VpkQq+qþ%BuȮ٘S*P h5ɓlL,EY6\'ݸW8o]KJ,h;Jh35[ \'5ތ6,POh|jjNI՚5͸!R(m; > y/RB⣤~JnU]}VHEa۬.jRsJsɸjwxdMs}W)-_(3lxCXtO-hS|E4H%Z-|qGPB|*П|HnTF\i-톴qv9L9heݴ { 67gyc:nڕW9أ=YWj⛘C4W Ըìm@.5Pl~u( [h`gfE@. o ]>J( Ph[BU<-l7Zú@$A*٪`а>*-V^2|tϨԸ|lL&$HE -BKq'"\k )a(qT ڗpP@)$7= #8BNS ASt-sOX 09=Ը!Da}O 6 !A5OqC9S#fiIL9^KrlP[3N) d7} J]u'Y ELjԅ~S7i ա:Iz_?=j"MJObhC9y%y'(&ǓrQa]P*`HH|-PlH'Ti0KdH\̽ݝPZr{R( Z@JoEM76"G'pQܐU¬g >⪦&v# h@3qFJ$"/ANhcD,ۧkC'Es5PBhH2B H)o}1`(STl<ƫfB:Jvk BT!onC6R7hMUjAHk52ℂ/{,Ol0XY {vYf (>\٨2h C-jr˹ Mf/rCp$^Y/F4Qּ jRS%7}b1-(R l@̓&HbgTF7%7TRHn?hX{1u1 %6,Ii @jFw!9,ZrCݒZh /TUMT; Oh5hS\>Ѐcih0\dh@b@8Y E.ͬD757Qh8$;j)ήL*r`B܉Xd~ cRg!EQ,X:Ӛ]+X+ u)ubV5tSyA.R۸c;EbMpxve ^nX fwpVpznXO4|s1SeT̉z' nKT~P5.l2c!pW\U8bت9j n`ji#(xP _Y^`s)p.5$#j1zCv#f!,-C4(rYn8roolѯw-T͂ҕ!9.TUhK빁2WUC4&/ق_6٪\l(9?c#o;`8u+ )NLZd E70mfyyeٌx\61G] E |vNϭ١H'Ժ@)N˽]h+ N!Es҅1i;\F YfU-"ߋsL-@U'(Fnؒ]oޱ\ L,ncsٴԪGcT9k]9~-e. X: qѸxC? ,T ɣ̹a*ܐ6A.a.r{Z b? 1A!yS@pjLcj%Գ…~) sCe\N :08$|go9w؝jn}VJwVz|u VQ%{mN_Yf :r $W{`"}0Ål.7`M.r+o嵈r*:ArY֧S#QKkH‹!s]3S0[?릚5VμS9=E 5FĎ8eN(Q?O~NGa&[ _F-xH)dk5 %%Vq\ﯯ'rli kش~28PQ "N>w\Sy2O-cvO({)|]ѳεR>]Lez~ xj ^~|!&<ŕs#_ r( ~:!{zt0`` BYh3fӁٞrs̟UŤ %P8I:dNft@ ejX'rC7\ 륬 `e淂QPv78] gqQn8 S_h}8 nDuv7r4иR^<4l;FC dvrsB.}ahpȂ! .0×䆪!kY4 h ?9EpbiE3Ւ(Oe~7d!YqA gCؓ"vPI;4.[WJ EIyY^}# sA(S7Fa/Iv;1܄(\QrCi•\/çsa.!uy oE=Nt~(1 o &=EkN{|J)9#r2*oU,o v=lx(_d\+ŌJCV~̂S.7ᆸ㙲zomڦ] jɖfgo H0zo_\w~{CK떩Jރq]2{ lZ.ҥk/@vKkp,pbvT2v_b#̅VRwM.z"-K0;{Eԓ}!Z>d~Gj>P)N|(u<Ί ` ~(7Pi~޶Uz>ebF2r#@+B8{Ć1WÓioSifQ/RىC~<~/N-5z@#ǩ]8AHI^\JTm=5i|eD)1=WHhd痖/Z.7l"Z!3qfp} pY< Fw =ph`G-]BГVJqt]3܀ =ƣ4itxMtL2-bj(Sؤaj.m `i. jа}  J`vU-|OjQF/7ĭp=\0Z!$4s0pr8̦i"rӃ~&pº 8 JX߭>TR=0 Fa/ iG; eQ$: )(@,れƹV9ڂQ%FpCx|2:)|#Í?XHbFpWL` X) Kb!'<3:w;D MձUS^n'LJFs`PʄѤb# T! XWj1Аyn]cGbz8VI%Pfj(,1y: Q @= z(. 5&"moSيH]m4|Qĝdn c dSE0YAAő!@Eʴ/TE!j%' ckkrd÷1:o񐮒!n(JZCf㴜 | 'ChT6o# }NaT{F*= L}ݭO'[I>|\"nISJsfHwDCąa~v:Qӛ4& dPJ8% !׽S}[&,nHUKj8nC MsdOW^rCʸ!zUph /T6. -!> SӮ/%Ncj(:ehQj& Ca&bY`d(Z_QADAX=;qS)^F`"Rݵ۪0qK=-E,Lo~E MH8(0s3lۭ0һ!u(JmYֈfL;1ZnqUur8 cĈB1p1 w鏆`*M(n03QvL83NPzgQLN+`KZ|;^ Y7kuK\;qj8HRІ$؂"t] g:w$RMN%{kRkp)%X,n 2W,Bp7t4;V2zpP֥΂, Fed{kt_ns4 F:+ϡHd#FpLōʠOQP=h";:t>J@`N†xG8dZ? 8|7HG)W4a.4 j&<䒰%[冤ql1JVgܥ' ne~AU,rB،t!JiGBsCjj @,T;WRQB ڙf:w\ fv>Ј1Jvxa`6"j=~x'n7>i]n G(_KZ<ъ).sm[8|n6~h. 
!W+,/Gman qJHlEѵ(Sռe` 藣Rt[6^f!mn!w+>>@JܑΧ-ؖݵ3ͪhePؔAUAMٛvƓAm鶅70NRzWA2Q%nHF.H_7``RR+†c,#AoϽ`+|v"nH΀y4X' GԔOQ/ *nPB lc]HZ>0 A [)~L)ܠT _ed[pR ǜCq v}Q6zIb c?CJTmhMXf S6Â~/|ܮ't세Ns#f",nd [{/YA†!c$0L$;_'ebwXU vvzS+r=8U(7ly5A8śy&ϭ Nfpt ٘=`qCk,MN(2Mj9~ZH:X~\Ƅ!L m٘ji𴈩u߲KubGڞ`}D0,{4(8#K){YKߍ ŶZp'܀ct?+fQ]HEqRarskZW"7 s970ݔj!@@AdtZUا!-)4=XaY@fOx;IcMr*;Lp(*`HmxC&ЀGéHܠЕM~CIy&!kETnw )?89yϟ>gǾgo=UФ61;Q r4;p  [RFbǸ!!"M º3epF 8,C31o7jgmY C6P"Bf [FF*P+w'qzaKcd4+"P(F8ꇱvcܠʡ)TcM4IϻME.a0~2F}V"b"l5X ˓+ ̓VB:Go.;`iu64\ —;g W("'9T0I(EL=(쥗&J49 2HpW!Bl_;&w:K!.DAI)\;QD;g#HDXLtN)MB*8#'réKG7hpV|72SJFfBܐt@rFD ,nMQ uX,nH\ڒNܰjƘb)4mw@0!pX%i7 s&Y<ѽ!0W;:y"z 5~d+Voy);'Tڮ uʓZʫ74$T4&[OHr0h% [XJL<:vXnk'KC8 Y G j1&rõI"8|,7Ȗw=v8 eÈ>Àu'1YMp᝸aD킃h2,Zvs^"Cp o XW4ItZF $jH~D:5U=ܐ&H‡LnRìmBcwZ#s1j|Qtѐ?פa WLHgg'wtD2oGgjFNs#OHЀ1&GN99!'6V=v(F}NnZ-hR. ONv `h\v{|FFYV;s RnP*V77zC(4{4L~婞Jt@!$G+zQֽZk"1nh>!&1B n& h dD=o TWT#ԊŽ%?P)zP*(E[.i.V"[Sa{$rNƓ?4kSi)VSajyNaT8/HhWĨIus,R 5 OHp0΂,O@d}pZYbY .\O\:rѸ4 5vsNʥ ,ZmXcR 3*Z,g" Tb`Iw_tb1PR@" @SI#t;E0Z}\l1P!FM:=x?5:!wP;n$n.Au2bS(;tֽW37lSl2!}JIG @X CvH7*R'''ǜV&1@(RQvLX(#"t!{ ˡЍw9ސ/YG`oz"{jd).`'x]Qfio4@j#dI@(HAXNj1ϒC6V)h|#7e5*\6C,8A Idܐ̜R傷vFs_M5/KYGxC ۫dn-66ԁnzk|s ti_+.[-CCmY#T62pi^q\F"Ko G)~u6W1 z+$"?F[X3nh7B#zA vK#򲸁Xͫ'al F$ L2 };0HA}j#e@^rC+ @^=/VP;pE(#IBw,:3(eٗB !j#'rÐ2 ";V= g/$|]]<8,Āǡ@~_|TgDժ2f`e3eVTm!j)+5wyx`j8  ~8'7> Utns+P9o ja xʇ:n;ph+a7 t4(p7Lqo8<ܰJe2:8v7,䆖)=!i+vt2ѯ~wn>Wxm8'9ߝ&+~ ~Ps9ۣհsU!+ :u=/<˥|Ϧ^F<ERS'k5Nhehݓ9L߿Ŷ]g3qqC2)jtethvzL3CO$vIyq`l* xW?0%zXcUن 7 ΙҖ&`팭uR6oisˣ8ٚ 6ϽaWq:{Yr}`̥@@Klpngb:-1gᔸ7||[ےhK\7>XӋk)5BI9q 鞑tOMe$YA3mNh;]&{Hr5n%9b!)6W &7x(B)9oHS }* n8pn`zOsҠ;%n`y'\L=iwoSTL ε: 9ωx|θp@KOR{16 XY(Ҡ)9ψ~{ zV̆ es}"N+jNS↰#4oWrHJc"ڜMA|psNܠz*^wSc"&ҧ(('<)n@7(. z=blL_g :v!k":K9DSv0fu"QN3q9OKyGCGiЧ?_Mrύx8S\r(ŧ>d5l`-qﯯ۟'xo>ELi /@pt|:~ɊhaJEPXP]>~ m`w8d}C`1ypɽ{5Px۽4S(&0Z |-++t ޾Q4h}~&ko)fpygùed R{0> YE}L*$3Z8qzxm\QzUUA'x~Z4~`v}:>,u9Q`6˵ ؟GF8E< ҈t×ٗrCrT&s4$`챍bAZ~DnAi4)!FEjx%MpaGF Y iv& q4979"v{;ri0X@:z7`8sZu;bqV̛4x+7"G1Sq*Gw]xpE\[;,rCk8t$9:ƗUa斣I@hF$t!"KlLFG#I<)n"|`ʖ+y0n83t#I>Z7웨[4;ÀmoU> uno wAE'<#KWpC6,:áP~g8*읹W#@S; "1;:_v:;s7_k|WDyhT#yk0^Ҙ%'''zY * ``1>%Nl E~p`n ON%F٤[o2-55nǝmS㾈ƶ3`SeP 0`'O66ȓ.:t_Qv#d'7,ర V"Չ MAYffȈge-˪LdLJ_S{p޷\Xk"\t5,IܰU1rgbf qxMvX9Xrlv*CFoZ B lQnRl(#.O@ 烲01GH9:մc5nh[&]EpCo %pGdWݎ7Gmq@dQ\Ⱥjg"[P^,62 ,AFJ4 m:@OR.ԥ<0r0<iqAz(|^SA˜p+ q+ܠ-@#? 5 >0,qC"S]dh30^܀ׄ*p#t|l \koWG%;!j`7DkJ<<˿S>V?9y˟3}94QK BԸ P΀Uo7RS?͇|[4W[h4!=ٞ~;?'7H (HO!A0@Ti*GP[+7+D_astWsAs m.[e$Zt vá i$")n P7V?hpX @!DFN(3v44B8eCܝܐá۩/NRcuGA(6ƷKZ$!C42{z r/N0?AİY Ur}|4paqCrFzO^e: qCI8ZT/eŒ~hLrzu>m^;D4Sm>quCR)ߐqx?@ܮog$ 9pF䒜&nV0P7 ]ߠyzWyˬ6n -[e)C"ǩ^g.5d"e{rIKbʶ[HŃ Re3Sʆ_+ -Gs;UzTʇ䙀Xyq2xwnYx>eIxᆇz0 *.ੇzpPȱȧOa85MOİ>Ś]fvH=pCέ 9 3VZV77Q+ 5 lOl7سMLk\8%N&l\q0ynHy#F/Ƕ\q/ b |(b%~r>/^to9q734D߀˖ir˘7X"Зk9whOyln+O^?DZx 6?=HOZ<z9L30'y uݲWݙPܝ -O,!lֈ/ 6OU 'k`Yvڞ^_|/>>F/ҦY|w-`аs6&*C`֯|#'jc;Г ǤN&i97d#7z>Y ܻz]E?w% cOwcrffx=u޳P"at1= HWnx:oډLnh\9q =\)1A#1XXbxjy\4|ÄxpΒ#YFaGq;`a`q{4qa1ϴ4̛"INfHISJ>V9ظh2+ZO$3~,7L8tn2Up |^p> ]WX!CFI& f*cFK <Ǖ<0 -2GM:\kDø! 
-/ 0yᓸ:?}:}k qp1v&<֒1I7=pC- }#`p?]d~#K3y vnxP>v!>1= _ǽ[iZ VaWePT85f۠(ry Cw1$eHl}exSH <'Fٹ3Sy%Hy;gN$8iXLOڝ-a\27}[ kNNMv.3xJc98>Xڝm2zmgJS*\ זNMx36H\Zc+-$]KD>n]/?#ua; Nqo`N `qC 6TX|cy!"v7#]@D ~NN] {.v-7tJ 47KL:pCOobJ5K48urPiX$3ĝ* Y #ވ.0yː~NFN TN(d+OLa)xkf VI0^U9nX>䕬ܠZs Ռc乥$[n8} 470Vg|!Ng:ٷR7Sy_m>$ 7)ܰdoO~w#!5{ndTnH۹Uv!r3nbr-´AJb9l7\Y&6xy4| 7}Yb6iny}Ҹ!!0^V2vn@c`|Zƶ|bcqC?P*$'=p8gpC?CC ,-N#OaLmh_պ7hKZFn& ]N.h|9㺕iצSSDI lcI|΋Έx OK>] ~eCPމЪ\>'Z&ϋrJN|~~~}|/;qt~mON/@D_l/x.wJ #,jy{ja4}*z=wJ{j#cvh;К/Pp]Ӽx8ph<6 /20/G"oK(eK3(}e@aD$Sָʑgm,#YOq!/b3'TsSzbEҌ"fǃjJ;n=E{ 3ə_;r#]- D'wf$=7X" ?xH"ȱsRD#NS,=9!ѧyŽvrwc >ړq}../&-7h'w8&2&8WDJ$'!d%ܢF>) U쿸em6;]=`3ni#c n L$4t4P Sgэ~E4H4<-`:Ok]=U5G>Qǀ ^5akaU([aQ[H I34 H Rشa931ᜳ#xZOD 581x=& @!X+npqAh I=YZ1\-)x)(+9tL%DԦWE1A`f#hZ7hkuUĵq H["|{Qpp>*j#[X3hG#߿iDDX4k1tƵL3an"`Ie ItO0>1Qa7=$_Vmc$.QO(\ㆠ~B2_KWlD~.xL@|mD>A 1Q@4ƢH2{]qDnq1]t!0< V=/DNIcE`!H#ЀL8>#JS3\nsXd,P->|~A996NNOpj(˖i5$f€3l$ IhaǔM$kCpZ'|ojW"">b㋬]`RS\O'x4z36 Xדe[b`:ZYEadOwhlK[ՖА*r5]>*0%j):;wsC7o'7``(Xn,b7Tty!%4T'Ս=ExۍhOf6*0!8J9 *>nP< cy6W۳p+Qhp ,u7,RUgq=nrjS;CI!Zj-ًqCbhが v|2 d5lFHϮi^>@ު- H Ʌ*t nQŶ%7}r!%܌$FY1^X5INP CFUFlFQʐ2_k%eS82MwzpU]wᆵ3p( 7L/S kzi(u{Zȱ7P3J7tr.*2Zpa,uA`8Ӵ="]O4l({3n˿t؆6<u|;n@#J[lj|xSbϑ.s7f7oj5e 53:+gC`$k%s 7,E*$on>\+۠Sr Sm@Jmw;0o3a.REx]##ctaY%C=iC1'<;Cs!9.Y䫘Ik텕ψL. h=w_Z[kY|[i&gnYA1)Q*ј2v).n`[fyr0-[iӄI`dGis,-gו&>H )loMPSt 3&$f"l͏90GpXKӡihsAjNn|gIw&dʖ'DΊ sC ]gwzJΩ(.[vPdXBCNM' zhA`pq4 hK@>86!-ʙ} 7X"@a7R(gҟ{! #U&|02 h@ľ7M'7jE-TUW#ܐG"X&W7Bkd=szIkXFS (~܀TFHd՝woקhwVun5L$y \>9F|;L>^Oúih |,2qQz7zK'-. O㬞Nۿ!y SE]+H Wa]c"u;I؜ﯯ'S55Nύ !-0:ebvnxG10,Z`XhʎdC60<]^Gy D4ή}$+,H <=-E6_7<تp춚`Bx  4) PJƟr ѯ玃,v -pݲ5xH ^iQAz7J /b5:n-^hR[6=͚ %R} 0Aiy8}%Ps[,T˱$@keX4rCneC5{X^ K-inX} Kw#c]Z`K7 }Q dn4xLs5fᆸ#~!l#MMWf27d8~#n]?r/H\=Dh?]CNTl67؋SDHǍ Si n1iJI^侃s0n8bɶLN5nekQ'8.<|iB$M4FQK1s.6Q/Z[t >Ŕٶט!>hʦ ݬ>ƶ ߎިO/ q( >R!ㆪucX ܎ިO p>ұyO!`py} LnZ,*qCB.o 7SsJ%mjmItq̨v3m*LS6{ UF#liԲǴRt}+/7A)^ unjR]SXJ`p ϣ v4l?vY~+.ύ\Sz!V[Fgk;V߯c^\o|*x lCIENDB`ZuDd(#.  C \ADquery screen - no breakpoints.gifbty?]8{JxbtԠnZty?]8{JxPNG  IHDRX ȾPLTEccccc-=bKGDH cmPPJCmp0712HspIDATx^](=$@xHjg;3m2 B6ϿoϿoZ\;;߮51sRvVW)WX '+G`yrڰ 5?WVr3 Uqzy~s1?CE@XX<&r`dG耊 חAM2Հ~F ) XGĘĨAHN&fip[,a+a8$ B2~V.mXR Ju=&C%X N50XM.D oX6yP(QW 5jc1fFWgk @AQ(<#^!n|x'u4:D3K[SƱbBƮk@Xl<0@l=p04v'Ȼ<)@!yx^MPVz86w^Z~~vOϝk`Q ~Wߝk` rO(c{DAX.XZ K[,~g hh:C \GSG=r[PP(q9등6Ye% / HTa%OY?U)6XJ4X92nk*w Jf- >Mal B:J:K彃pAuSV yC,IfWSNk2YT3Rre9]K Wm-*b5x4[,{N2\Ayyq4Pp[H1]GQZ"uǟ>`ZXiBLHE dG:߹SRE3358#d,& 2sw<a|uXr'hhQD% ̠t]S:L/k%<_QN'XgB&Z-8@C|)XϣU@x(? ~JUt\oԶ\? 5V!CAU1XLD<`K(h.hFkM!•%-STG:YEL#zBjyz yGa)|Շ. !Ej.!/bS''&TQ *)! 
檉xWl|F+IcCܡ@FM#e!zf?X̮F qhtE oe0I4&‹jQq pVHJ)Ziϗ2/G:- A>*PXytar Ydeg 3{l*=ƒuICcaHQߘ@P3!/eYYi p1YEn/^'о|m_]^d{EKZ{Zݒ:78N+(~ $DŽ_O9/JM$0pC`6SigfL J(L7+CĥZ3X'a[-O_h;=i\ȆdS Y skpy_((@Uk|89:ܜ7lQj+ .UG~e0*FW'W~@֟_o8Ñ߿z+ׅ 7Kõ8k>Z?J*HPB"KJWH"$bk]|9}x `1'i50VILcn`X~5i?w|a9cUĠyEPg^|Tf%>߇"c:tOh+UxKc4o\ߥCY ㏭ ☻6_^>=!#*Ŷ}3ĽYXUV20^E@D.'1V{#ڴ?_I Ydm*<Qf ko}3IO<2oqv`mZN~rS3S Xv:`p@+Ez~p_ddm23y u >q9V*e`d=-"3*T, @VD)"빌++D=E\]I }u=a.$]?NY F ^u㒜\$ ДZ1J^{0L[kI)%C[łvVRb݃.;.w" `B=C5py1NQ.6_7aXuԞI6 (0BCEꙀ~y9H51@t>/a,,Dd$ئ.b=c#~Iaڙr X~͓.P" :fD$=fNPR5pC1L̷ X~zȪ2LHHihXΨ$,ߡb3|(t1V m!*cQw}S«$ҭUH)/@bGVҙQIJD RCHR w-=2blZ [exƂ{v/;R"$s??%_0ͥX5Lx$uDcZhqdbqv`}IP@ E4dy@`!H]u" Mfm :k#v('d,,x}=*D >$HY?Ų,yPE/F`,#[wЈs9ˇD{ aXc}X8j#B[큯󛱪=Ђ2ͻJ#(XNI{ `$.`1c%Ce)1eXEpbdX²x:p/M](726 elR{mFb0W?r*>a+F[XA>ƒJ$d[a+*Ebc *oyАk$H=g8XKɬu+ ^>2TS58Zkm3`(g:Ug41ˀb?cU!僯*We~"Cc{&YoFE_'#kD=Q*d4;ϼyʭ{ŞК4iQ:$"B!.PC!mqf],NGAiInRFY #+wuj55tUupO!X0V-nd=g7e,5hO0ȰJyH&K?quGî,F '1V P=.Z8П-c.n3V*cgñRyboK"#6&o.>[eeq,{SIagmp@b!ފzn8YaQv2D;iGUХ^g*\XjWhٔxMzl[kf hߠ +4\ojT<6QhoRq̈́| U} xyt-UhQ0f,&|w2+B'Z ^ZP6cq$aI51'Wz9j%t+Z/t<Šk+\:X܉Š~|&Z6ʧƥB g, 9/"@witfZ A2bo'@yP cDŽpAƊ +PYyu:nlFٺ ʷacb pW:bܫQo}1VIR%).zDZJXTLS q&hv-6.ʝUlq+$H;E2mJ0V_ MTiH=݌XGZ 9aOFf}܅1/uu]8.3djq?c/pɑ݇ KrH bj:K)~[_y c!L( jɡBڛ*My*uuUVH:j ƻW'^mr՟YQ k3"cu>gг΢vSi0s4܌5v{ T-ۙ Lpw]1*X qa*2t*&؃L{߮%5p4__eNj;G@ֱ5$cȸv @>>c,x|%™Nṽ/ª3`mH)ý]n2@ܚ,3sIژYo7 <̇nv>rY~ 6CU,*.>j+}vM'r3׹m( G;+3WQ)hIw=78>JΟוP+qF,^",j=x><8?Ga@I dCuy$ rJی%8bBBE<{:Gu\XWDRHWD,,!'9yR~xr2u`U7e"i zY&b66j[txE=DR*Q4 <,a_cob<5I} $,d5_cL|S0&+6艬XHO?DTC rr|_Zh"7P )E dPQf>ܻl&SDϦ z~e~yU^pNtR LVk[C7>Ҡ5Y{7\Yv.Ǝ|4e,V}ɧ!k3[ 7c gw3Vΐ =<O%s Eʀ YTpvi-ϵ=U$VlRMz(IKF_)a c~r{@+Gz~<͉EgA~' )L玃3OP%^h^n$}3-g6/0쉕`,Ss+qLP%좓mrv&Hl4G Xf櫞brh1D%]+_Xr:0D$ *XwrymƊ^ތ5yw0PiJ9#Z;??+DTԂ}v+LBl}]UgIk7LSRgEmD g>(:V=r,٭0)`6ӕȺ܀1̷ Cq!8|V.^F-1r'ɩr ﯏Dftn$Z[A;w'P*E/0^3ߠ>yS/YȲY;sL[$Pz~̨ZNa/ǷjM\KX~ǵ:_Xp#_X]Ȱ;ã|Sn0\^v-U8xbIH13h y'8ω8b,2ߵW~;`xȀEȖ5 ^zi½mtEƂIt6_f+ZMNvƯuWWԬuUܢU?fnGA5I8 `Fw=O""WK42_-r O7!ƶw墌c5 $ St M2Fwat2H>ͺe2Z1/U,NC|cxS"NX> =nmLnvL=Ԯ,a?K7X3Zh-YFȨӥxm =T"PHJ A&ȖXĨ@衧8)3-.8z CZ*[HkK{:?#1+ ~9.p !TLX!rrH0CfS[L@z } #DP*[ .a IB58 K'#y@*jjEҍ'^[[YT11p҇ GkmJfy(Rc2؉^\Dޭ.#&Z-d* ,V E'KU`,>H]uEX# 6,EȊ*>{|;`@.5 HlR-]ĘZ,23 3 iD͕[  d1"J ,zDfhq1a'Ӻ#c5`sVC8d7n&eJ半4:}^X9*0Vd< YY x+5VPAƚhG kU5`VkSUMX^ze{NLHTT3 t(,lHѮۀkHckޱ )n| 7W/7dBG*-lּA&Pmq0!; )͙b7Ȭ޶Zܒ4037 D@jyPjoCgE{cI(l GVqdjyodJTՌur[g Y#'JZB7c?Ҡ3HKm2ی`:9Oo3k _-[/h3}6gPnJRrX[gś9#c̺"@˺{r-h /323Vn\ePщ98\i/^ضX'VZ eT<b)g 4+~#'&lBtxS†םK^f9wGkK4@r8Tη7&Xtۤڅ'f⏰k8C=g_nLMKM'[X[B7PNɁuybu\rSƺDciM5hX$ -9Էi87K {9aUEp'h, v#Ǧ8c٠Yp /n,L.TjaUcMpB& HMeOD95JB s?XsCaIX}Fc95K{Jr^v X1BBC4E:G֜Bjms+4^dg15WWCF,,!;{R`gră=t=Țiu{mZ)mbx!s;{YCuҸH9HW}(1V_.zi<.'\ Wޔ." "X)>Wp4Loxoyl鸳{25KTydqgz÷x|2n%婞\ѦpHeUW{;oTd} %Xmb7lXEL@m"ϑ"i6~μqX߮%5p__g+Tþ5uv X.2]5>I,>:^L'eMU]aEB6}"{,GԧFs=3ř7Lp 1^;fY BxV O? 
<"U:VHSg>rr/'-4vI"E X*Z7{ ((ldrIV2$Ԉ#^/DގMUkg?2$0&SL) kG_ɸbX7>Ҡ55Td},c1d_ Ef,Ŗ@Y,l&bkl8{Ƌ6=Ya,Fim 5fHѣǴ2 }tBV(5U-n<.嬷0/?Wz~SXU0J17=衢Lhd _X\}ʑ$6drT+7hb8> )+玃V.OP%^h^꬜$`_Zo~ʆ!p lcw0 Y9ĕ[8&luɶY9IPEn6#r;l',,r-&eUR,0I,>j3hـ0V[N'%9N0u0V!&<rq3Vf!R`:m\)s$]k?{8Tj>&Uy!_ -Ԫ,uZ{J[\H( Hs,Ղatb0)Hd6ӕȺN)q4?g?\DAZbROSX]ݨ֦kC3QhU;~fb,GcDp0P lx~ D=g?Hy֊M-^/fV|Wzcna@Ejӗ@BY-A5.cʙ臛7fr`VM0u{ϓ<_iKv{юM5 j0W=5 F V~0)Y,ן V=ƽu@)".;Qb csy:`9leuF!JjDy,b ~]..XVO:'"]S2K tZbC6i-@:ǬiA'rZBw&1 PHJ A?hK5,YX'g8ʑ^bN\Tu YуT^⿉zt.s<䝚0G6ѣsow3a,Vݞ[Ad#" EhXn @^`1w{hkeb1( ٪1Ѐ UQ\iXA4)!R빞 RDVs0 ;yf!KT<$wk,ce jB=5WhNЧR5Ē͙YZh5 ת =$ s(1T%;kMӹFc pbj]V ź̃5&lۏY J>ZYY׃^X"+ JK*AsVzd&'ckޟF }5cܖo cbIu3x5sR\al( t-XfC6c=0~b_:V^7U󫌨3θܔ喱^wjՍefo%_MXͱ#Cɭ$NYβm^|mnUefm+]Kq 98lJ`O5x/gS7 F#β՜OƷ #=h:+0)fiB0VaC pRe_2e[яZ "׋]a5y[%JDXw[`")^e9'/X2lE 2ZU4+,8&GU;ew]i{ӊ,f3{.T>/X\S9J4re}7K&Jtޞ %{MҲ+q=uKT$-S;ݍI |y5_ Y0sۋ "!-$+ǿdm)c]21Vn*X$reKh :3:$XqV2!4&Gmw6{h,R#9XBYp GdV>!Pm*cъ >ƚ;>mc!Tg,{*%9,IPD1j,Qq?s?X掳/z Z[Fx1͜x'%jHcQ.k&"cp橌IIۋ-_q9i,jɵmH_I䠽Ʊ΃,T-^1?t5C,ɓ^A门fĖ4j_4.[X |rY+kx.T_h,Y$PJ iYp)ƚdck8.Rߝ%p7~(iujid 7+ڔzX*d=ZOIGXtO3N5OBnN%mfPzn59g^wgXRG8I5QX;o"50_yti7Yy7w?6#!W,&dҝ6yup1#d4a8~|l|ōٓ1p^aR2zUl 6Vݰ~'rsj]+N Fo۾g+u > Iw~ct ?`ô%>]Y`3gsYaᵞ_TW_0WTqAIĠO>Y?XGҀ!mα:wGrBCHgOPCĆ4QcX²ȑȓ#83$"6Z!0ewdhAIw%?لea$?R*4 Q[\"/:cX2`$-Xg8ud״1QyL aؚ;*Uɮ)R]4ֆ* 18_NZh@Of?ewXܻ) ^ 08 C]6ف'X*~ cyHN8xiDzEBw~dڝ ~.#9QrP;:lzӤP7 0 Ðx ezF>+QoZbR垗t u.nݍDk׵s惡z+E\PNa.]7hl&`9i0ͿhkXYI_#CTͷ~o0HUz~w)ٟ p]2Xa28+\$WGpk=^)~VKPͧˢØrf3v;c޽FիQ:2sh: 6;X$vY mM;6!Vqb U:Xz9[H# f QgPjFwXŶǯuWuz[j. . 1V\GM!j'zB/:V ^^!\?0"uN[FdE i+$WְVg!$`QQ_ppXWZ`,Zt% cyX8veloC+xO|HL^q1]xXצ/l4XR~f`v5KT zHl%kuqNFzbH(a(6ZZ.ef,pb`\h>ffy(R*1QY\K u5'{{" Hт,= ̂G ƂL1EDȸBce؊R+=HU|Hg4JcO:?@_X~"#%rmrw1V1pm {]IVS%EhX5ruvI,|cI0tl׉6lR⤘ iJW2uJjzXyb=^JY 35郱^ PyHA H`~ی#jׂ&MHHA&xte|2m"6҇Rë\űp #վl%Xu.:9aKHi@nurlAEI/f^%E k\ gkD+JeXd|>c /SCsuvK"+֥`JGqW뺸s*pjex]w2֢ʉt^;sf,ZBW.t]h4n)%}=XSq0cVO n]V }7TsER/k)2kŇٷNXqM>'f[Ռ3gF;kx;Es|YSÙ6.*FWk;qL"?\iBd/aVK[QӶsʐpbI3&K sƿWh,@oݠ $єR(nc2ַ_Z&[+i8s܈Vbِۃ%ZXV |GqwSƺFc1`Si'5ZHZd{tx _( IQnkQBcq$W􃗋i5h8˿l~;YPXt:+ilBc}WWj`L9!Wq YsEdt1dq,N~K+XYܟN61#(9!fo,`ξq63T!8iGǜ}/XX1 ~Uz.k Y7ek6+sY\ƙ-^mhufcYu8V[KO#Kȸ*hS)51%FWh|`f}};M_f.j3+jCR:` M?}T"7ƏLDX߮%5p__g+Tþ5Lv }G[wqu 윗U]wͽN=*=#y~G޵NlBup蛥 &l j}GCa8~4GERaSjV:{JciRNfI($rm3WQ魣=i.*'e(+9;DW+?X㌀Y(%",j=><8|I>)*0(3):$1蓅A֏24`ȡ+usl3rHZp£GӈgOPCh H EcXXVB@4Ơ>Qf]XпMH139dلebUUԝ9MV?Rc<% ѣ>F1f(_&iЯi2]> =#o=3kXJ_cL|Pw1-E;zgƒU<<"Mi4b9rr/'-4D A^2HwGu ޠJ<$i;e愵5j2$Hiz)HT}ADU:+#$q^㶺*a7bd}(c ZZ2 ǟe,Ql>pf,NEYotԗ!vޡXnՌb9ARdBje@vĞTpɶٟoa_~b 5b`bnzQ9H:Kd5fvP4%uY9A)lN,v:fWoJ^w/az8\5YMzzS6K` ;T,-frq R][&H6Ggw0CS',p-듍*))╓X&lU TIrUcsѳF`bILy zh܌kHH0{c J 9=3AfSD9 uWu2`L^V mܻV¥ jzEbI "hrh1;MoW~wfp-qHˣ#a\,"NF`Qb[ه\e, OXGbJ[Y,x4?O!@¯n]N\5_.R.Wo$A'D JLcdp $6~+AX"Rtޡx`D GBXtFRkXϝKut'YKGj/d3È;".Y o!l_C1Txdm:l}h_eKZ"y0U+4<@4 c]T|,HbA $Ufy%hhH'#ـc0T]gm7x}Y|StSBI60 [ c ʘH(ʀc` ެ~M.Fa$0I)Q` ɋ]B&?Z1.bSFJfC2 B![6Df^a  0iFgOQ2_&#B=f-v< Mg jW}%V+燐<ȢFDVKR:1  g,ρ bc1V; A0ZT1$U]XިHv_X}`/ZI<.@I 3 +yHAH`By:3b,VU'Ql5Zpޗ^e|vAXZrEᦱ£!w5tj#ÿF2 ӻ+ ی`:Ll3u(YUCu?Dyr'ک2qmJ=θb륌喱^wjՍefoo)O+}V5[fF>+7c2x$~9n$BbuXXJ,_NX/.g+Vf^w k3y+G~Jp2NXro*;~yz3NVxߨsTXM-ubK~Vž\yK|Ug9w׫9%U[$D&ֱNK90([DfcCs-*Ux̅{2a\fa2 x^w!%O|O:n@7ew[^fߓXNҥ"#l9֦Ӊ}1 Jh?ŒޚbjRB8ܔ.Xcڠ94/L ծC} / #03S0KA|&MaSBX|Pg&&š3 ]9QjUc" Earu% :t feOtz\ ߼7abk4VSQ/DTwxLmB cddkrM<+d &q:S$wQ?!G֜8cAc7M]JsBc}WW5Pfy!b2s4BMjF&8kr8L@ch~XrȪFu' 0V F5=;])GGrY+*n0GNwb<[YGvsjF4 e3ߥhL G\ƙ-^mh:3uVOd5%dKSzhSAҍ*ߺW{ a{s ^JN{C̟CwЖS*X d:"ϑI3ZsμJ5&H50ql%jw߮55 Ek`^yҾ꽫+CQݟsڱojcb`3v^}X^[٫ S^W3#Yn_bp㮏Zc> kȂ?f+Q ^ow; oo~=⻻r B$8K[ᅌHA]BT]& XDzJ7eGRFSbTE* uk*YObDҔREB$ /6-,z.c]I3 I>#|v음 73,1V,'lt gbU4P 
[Embedded screen capture: results screen.gif]
[Embedded screen capture: batch - data.gif]
[Embedded screen capture: batch - work area.gif]
[Embedded Excel chart "Price Performance": Price (x-axis) vs. Performance (y-axis), plotting points A, B, C, D and Target]