Road Map for Defining and Constructing the LEAD System
Kelvin K. Droegemeier
2 May 2004

1. Identify 5 canonical meteorological research problems and 5 canonical meteorological education problems that are beyond reach today and that LEAD will provide the means to address. Collectively, they will serve as the targets to which all activities in LEAD will point. The ability to address these problems (not simply via demonstration but in a rigorous and complete manner, and not only by those in LEAD but by others in the community using LEAD as a deployed system) will serve as the metric for measuring the success of LEAD.

2. Specify the meteorological and other data sets that will be part of the functional LEAD system, starting with the 7 principal sets shown in bold:

- Surface: METAR (NWS/FAA sites)
- Surface: Mesonets (Oklahoma Mesonet, West Texas (Texas Tech), MesoWest (Utah), MADIS (FSL))
- Surface: NWS Surface Cooperative Mesonet (not yet available)
- Surface: Lake and Ocean Buoys
- Surface: National Lightning Detection Network (NLDN)
- Upper Air: Rawinsondes
- Upper Air: NOAA Forecast Systems Laboratory Wind Profiler Demonstration Network (WPDN)
- Upper Air: Commercial aircraft data (ACARS from FSL)
- Upper Air: GPS Slant Path Water Vapor and Precipitable Water data
- Radar: NEXRAD Level II
- Radar: NEXRAD Velocity-Azimuth Display (VAD) Mean Wind Profiles
- Radar: NEXRAD Level III Raster Products
- Satellite: GOES East and West Visible
- Satellite: GOES East and West IR (10 micron)
- Satellite: GOES Sounder data
- Satellite: Polar Orbiting Sounder data
- Satellite: QuikSCAT winds
- Stream Gauge: NWS River Forecast Center Stream Gauge Data
- Radar/Gauge/Satellite: Surface precipitation estimates
- Satellite/Model: Soil Moisture composites
- Model: NWS/NCEP Global Forecast System (GFS) gridded analyses and forecasts
- Model: NCEP/NWS Eta Forecast Model gridded analyses and forecasts
- Static or semi-static data, including land-surface characteristics, soil properties, and terrain elevation

3. Identify the atomic functional tasks of LEAD that will serve as building blocks for the entire system. From these, begin constructing the principal elements of the workflow, including canned workflow templates that users can employ as building blocks for more sophisticated capabilities.

- Locate data from among the 20 or 25 mesoscale data sets (real-time streams or archival information, including observations as well as gridded analysis and model output) in the base LEAD system by specifying any one or a combination of attributes
- Add data sets and streams, and associated cataloging data and meta data, not contained within the base LEAD system
- Select one or more data sets
- Define storage resources, locally or across the grid
- Move data to the selected storage resources or operate on them remotely (e.g., using visual area networks)
- Make available to others, with appropriate authentication procedures, catalog information, and interchange technologies, any data chosen by the user
- Identify and allocate computing resources for on-demand, pre-scheduled, or use-when-available modes of operation
- Read data using semantic interchange technologies. This capability will be provided for all data sets within the base LEAD data system, and tools will be provided for users to develop additional interchange capabilities for other data sets, including those generated by the user and made available for use by others.
- Operate on data using a wide variety of application tools, including the WRF model and associated components, the ARPS Data Assimilation System (ADAS, including as a front end to WRF), the ADaM data mining engine, and the IDV visualization tool
- Orchestrate the above tasks, via a graphical interface, with workflow that can change dynamically with time (see below) and that monitors the status of each task as well as related resources (e.g., network, data feeds, computational and storage systems)
- Trigger resources (tools, data sets, data streams, observing systems, network actions) with workflow in response to the outcome of other processes, with full re-entrant capability
- Manage data (store, move, make available to others) and meta data using cataloging services

4. Identify a sequence of meteorological research and education scenarios that extend from the very simple to the very complex, and use them to create a series of system generations that can be organized functionally, i.e., by the task being performed (such as accessing a data set) or via the elements of the system being engaged (such as the use of a tool like IDV). Each of these can be mapped into sequences of work within each sub-system to form a series of building blocks and test beds, the end goal of which will be solution of the canonical problems. Out of this will come the architecture for each sub-system, timelines and tasks, as well as other elements such as workflow templates. The starting point can be the scenarios indicated below:

Scenario #1: After selecting a geographic region using the GUI, locate and access all temperature observations from the ground up to 5 km altitude that have a reporting frequency of less than 30 minutes, and begin ingesting them on an ongoing basis starting immediately.

Scenario #2: After selecting a geographic region using the GUI, locate and access all NEXRAD radial wind and reflectivity data, and all GOES visible satellite data, between 12 and 18 UTC on 1 January 2004. Decode them using semantic interchange technologies for use in user-developed software.

Scenario #3: Determine whether output from a rumored high-resolution tornadic storm simulation, produced by a postdoc at the University of Oklahoma, is available and, if so, whether any restrictions exist regarding its use.

Scenario #4: Obtain Level III NEXRAD radar reflectivity data, and GOES-10 visible satellite data, valid closest to the current time over the continental United States, and visualize them on the same polar stereographic map projection. Repeat this procedure but add all available wind profiler and surface observations.

Scenario #5: Obtain the most recent Eta forecast model grid for the continental United States and overlay the 500 mb temperature field with the 300 mb absolute vorticity field for the 6-hour forecast. Then, six hours later, overlay the forecast with surface, upper-air, and GOES-10 satellite observations valid at the same time.

Scenario #6: Continuously ingest Level II NEXRAD radar reflectivity and velocity data from all available radars in the continental United States for the next 8 weeks, and store them on a remote system for an upcoming study that seeks to relate thunderstorm frequency to the efficiency of the national airspace system.

Scenario #7: Locate and access all available surface, upper-air, and satellite data over the eastern half of the United States, closest to the current time, and use ADaM to identify any relationships between the surface temperature and upper-level moisture fields.
Scenario #8: Locate and access all available surface, upper-air, and satellite data over the continental United States, closest to the current time, and use ADAS to create a gridded analysis for use in visualizing the temperature and moisture fields in 3D. Then apply ADaM to identify any signals between the surface wind and upper-level cloud fields.

Scenario #9: Locate and access all available surface, upper-air, and satellite data over the continental United States, closest to the current time, and use ADAS to create a gridded analysis for initializing a 24-hr WRF forecast run on demand across the grid. Apply IDV automatically to generate animation sequences of the forecast output as it is being produced and post them to the MyLEAD portal.

Scenario #10: Locate and access all available surface, upper-air, wind profiler, commercial aircraft, and satellite data over the continental United States, closest to the current time, and use ADAS to create a gridded analysis. Apply ADaM to the analysis to determine whether a contiguous line of thunderstorm cells exceeding 200 km in length is present anywhere in the domain. If so, launch a 24-hour CONUS WRF forecast at 24 km grid spacing, and a nested 3 km grid spacing forecast over the line itself, across the grid when resources become available.

Scenario #11: Continuously ingest Level II NEXRAD radar reflectivity data from all available radars in the continental United States and apply ADaM to generate meta data on echo size distribution, echo-to-echo spacing, storm height, and cell lifetime. If any individual thunderstorm comes within 100 km of O'Hare Airport, use ADAS to create a gridded analysis for initializing a 24-hr WRF forecast run on demand across the grid. Catalog the results for access only by forecasters at American Airlines.

Scenario #12: Continuously ingest radar reflectivity and radial velocity data from the Oklahoma CASA test bed and use feature detection algorithms to identify low-level convergence lines. If such lines are found, refine the scanning strategy to sample them at high temporal resolution.

Scenario #13: Continuously ingest radar reflectivity and radial velocity data from the Oklahoma CASA test bed and use feature detection algorithms to identify low-level convergence lines. If such lines are found, begin ingesting GOES-10 satellite data and use ADaM to determine whether clouds are growing in the vicinity of the line. If so, begin ingesting all surface, upper-air, and radar data in a region 500 x 500 km around the line, bring them into ADAS, and use the resulting gridded analysis to launch a WRF forecast, on demand, across the grid. If the forecast shows storm development near the line, create an ensemble of 100 forecasts using perturbed initial and boundary conditions and process the output using ADaM for statistical comparison to ADAS analyses, which are being generated every 15 minutes using streaming data. (A minimal sketch of this kind of conditional triggering appears after the scenario list.)

Scenario #14: Set up the orchestration needed to run a 48-hr WRF forecast each day for the entire semester for use in a Synoptic Laboratory class. This includes specifying all input data and the flow of work into ADAS, then into WRF, with the output visualized automatically using IDV and placed on the class web site. It also includes scheduling computing resources across the grid for use at the same time each day. Configure the workflow so that, if a data feed or other system component fails, the entire system will restart. Catalog and store the output for use by the Introduction to Meteorology forecasting contest.
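The scenarios above, particularly #10 and #13, hinge on workflow that reacts to what the data show rather than following a fixed script. As a point of reference only, the following is a minimal Python sketch of that kind of conditional triggering, written with stub functions; the names (detect_convergence_lines, run_adas_analysis, launch_wrf) and the placeholder detection rate are hypothetical and do not represent actual LEAD, CASA, ADAS, or WRF interfaces.

    # Hypothetical sketch of the conditional, event-driven workflow in Scenario #13.
    # Function bodies are stubs; none of these names are actual LEAD interfaces.
    import random
    import time

    def detect_convergence_lines(radar_scan):
        """Stand-in for a feature-detection algorithm (e.g., ADaM-style mining)."""
        return random.random() < 0.1          # pretend a line is found ~10% of the time

    def run_adas_analysis(observations):
        """Stand-in for an ADAS gridded analysis."""
        return {"analysis_of": observations}

    def launch_wrf(analysis, ensemble_members=1):
        """Stand-in for launching WRF on demand across the grid."""
        print(f"Launching WRF: {ensemble_members} member(s) from {analysis}")

    def workflow_cycle(region):
        radar_scan = f"CASA reflectivity/velocity over {region}"   # streaming test-bed data
        if detect_convergence_lines(radar_scan):
            obs = f"surface/upper-air/radar obs, 500 x 500 km around the line in {region}"
            analysis = run_adas_analysis(obs)
            launch_wrf(analysis)                          # on-demand deterministic forecast
            launch_wrf(analysis, ensemble_members=100)    # perturbed-IC/BC ensemble if storms develop

    if __name__ == "__main__":
        for _ in range(3):      # three cycles; the scenario calls for one every 15 minutes
            workflow_cycle("central Oklahoma")
            time.sleep(1)

The essential point is that each stage can enlarge the workflow at run time, which is what distinguishes these scenarios from a fixed, pre-scripted forecast pipeline.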
The system generations should collectively feed into the following year-by-year capabilities.

Year-1 (by September 2004)
- Access and read, using ESML, selected cataloged or streaming observations (METAR, Rawinsonde, ACARS, NEXRAD Level II and III, GOES Visible/IR Satellite, and Eta Model Grids) from the Grid and Web Services Test Beds (GWSTBs)
- Configure and launch single-grid WRF forecasts and/or ADAS analyses manually, via basic orchestration and a GUI within the MyLEAD Portal, using local (GWSTB or other) or Grid resources (manually allocated), with orchestration handling all execution elements and data flows
- Store WRF and ADAS output automatically, on remote resources or the GWSTBs, using the initial version of the MyLEAD Virtual Work Space
- Visualize WRF and ADAS output, and raw observations on the GWSTBs, manually using IDV via the LEAD portal
- Analyze real-time WRF and ADAS gridded output (ensemble or single forecasts) using existing capabilities within ADaM

Year-2 (by September 2005)
- Utilize the WRF mass-coordinate version of ADAS for analyses, with an option for using the height-coordinate version
- Access and read the previously listed data from the GWSTBs using ESML, along with Wind Profiler, Surface Characteristics, Terrain, Water Surface, and Mesonetwork observations
- Configure and launch geographically-relocatable WRF forecasts (one-way multiple nesting, including ensembles) and/or ADAS analyses manually using local (GWSTB or other) or Grid resources (manually allocated), with orchestration handling all execution elements and data flows
- Catalog WRF and ADAS output automatically and store it locally via the MyLEAD Virtual Work Space
- Visualize WRF and ADAS output automatically using IDV via the LEAD portal
- Analyze WRF and ADAS output using new capabilities within ADaM, via the MyLEAD Portal

Year-3 (by September 2006)
- Locate, access, and read all data from the GWSTBs and other servers within the data cloud (e.g., THREDDS, OPeNDAP) using ESML, including experimental radar data from the CASA Oklahoma test bed as well as V-CHILL
- Configure and launch geographically-relocatable WRF forecasts (two-way multiple nesting, including ensembles) and/or ADAS analyses automatically using local (GWSTB or other) or Grid resources automatically allocated, with orchestration handling all execution elements and data flows
- Catalog WRF and ADAS output automatically and store it locally via the MyLEAD Virtual Work Space
- Visualize WRF and ADAS output automatically using IDV via the LEAD portal
- Analyze WRF and ADAS output using new capabilities within ADaM, via the MyLEAD Portal
Year-4 (by September 2007)
- Locate, access, and read all relevant mesoscale meteorological data within the Data Cloud via the MyLEAD Portal and ESML for use within LEAD or user-specified tools
- Use the MyLEAD Virtual Work Space for all work, including storing and making publicly available codes, data, and results
- Configure and launch WRF forecasts and/or ADAS analyses automatically, on demand in response to weather or user inputs, using the MyLEAD Portal with fully automated scheduling on the Grid and streaming observations
- Use WRF, ADAS, or other algorithm output to change manually the configuration of research radars
- Use advanced capabilities in ADaM to analyze, compare, and discover, particularly as an automated process within the workflow stream that can trigger events based upon findings
- Use IDV to visualize WRF, ADAS, or other information

Year-5 (by September 2008; final year of grant)
- Locate, access, and read all relevant mesoscale meteorological data within the Data Cloud via the MyLEAD Portal and ESML for use within LEAD or user-specified tools
- Use the MyLEAD Virtual Work Space for all work, including storing and making publicly available codes, data, and results
- Configure and launch WRF forecasts and/or ADAS analyses automatically, on demand in response to weather or user inputs, using the MyLEAD Portal with fully automated scheduling on the Grid and streaming observations
- Use WRF, ADAS, or other algorithm output to change automatically the configuration of research radars in response to weather or other decision drivers
- Use advanced capabilities in ADaM to analyze, compare, and discover, particularly as an automated process within the workflow stream that can trigger events based upon findings
- Use IDV to visualize WRF, ADAS, or other information

5. Identify fundamental barriers and the basic research in meteorology and computer science that will need to be performed to create the LEAD system.

6. Define the basic LEAD architecture as sub-systems and specify the functional requirements of each.

User Sub-System

The MyLEAD Portal is defined to be the primary, though not exclusive, point of entry into the LEAD system. For example, a user may wish to operate on a model data set created within LEAD but without using any LEAD tools. The Portal must be very intuitive from a user perspective and allow users to become self-educated very quickly. Its layout should be highly functional and visually appealing and contain the following (see http://redhook.gsfc.nasa.gov/~imswww/pub/imswelcome/ for an excellent example):

- Brief overview of the LEAD concept, project history, target audience, and the capabilities enabled, via short narrative case examples (tutorials) as well as graphics. This should be a very exciting and visually appealing section that will engage users and encourage them to do interesting new things.
- Link to the LEAD web site.
- "Hot News" section/box for updates on the latest features, possibly including an alert box status message.
- Link to an acronym list, including search capability.
- Link for educators, the page for which includes links to a broad variety of meteorology educational resources online. It also should contain a few scripted workflows that instructors can run on canned data sets.
- Link for students (possibly segmented by grades 6-12 and collegiate), the page for which includes links to a broad variety of meteorology educational resources online. It also should contain a few scripted workflows that students can run on canned data sets.
- In the case of teachers and students, LEAD might create classroom accounts on the TeraGrid for non-intensive experimentation.
- The Portal should itself serve as an educational resource for meteorology and computer science, providing users with a "behind the scenes" look at how the portal works.
- The Portal should utilize streaming video for its tutorials, welcome messages, and other user-oriented elements.
- An "Other Links" list pointing to related IT projects in the geosciences, especially the atmospheric sciences.
- Registration information (as guest or registered user; handling of forgotten passwords) and related account access information.
- Topical FAQ list (system issues, data issues, user software issues).
- Links to the various major system components (e.g., tools, data, workflow engine).
- Logout button.
- Link for submitting user feedback as well as questions.
- NSF logo and grant number.
- Statement of use policies (e.g., privacy) to avoid misuse and to deal with legal issues regarding liability for use.
- Service for subscribing to user updates and special notices.

The Portal must allow other tools and user resources to be added to the overall system, including the workflow engine. A basic set of command primitives should be the foundation of the Portal, e.g., LOCATE, DIRECT, ACQUIRE, VIEW, BUILD. Options must exist for saving command sequences, including complete sets of workflow, for re-use. This is particularly important for dynamic workflows, where one may wish to re-create a particular scenario that was built on the fly in response to the weather.

The MyLEAD workspace is actually a collection of services, both local and remote, to which users have access from within the Portal. The Portal itself is simply a view into the services. Multiple instantiations of MyLEAD may exist, and they could be organized into groups, i.e., a hierarchy.

The Portal will adopt the CAPS/ARPS geo-reference GUI (see below) to allow users to easily define regions for computation, data acquisition, etc. Users will have the option of including in the base map icons for the locations of the following: NEXRAD radars, upper-air balloon launch sites, wind profilers, surface stations, and operational model grid domains.

When first receiving access to the Portal, users will submit a profile (e.g., contact information, area of interest, URLs) that will become part of a global user directory. This will facilitate collaboration and the sharing of LEAD resources.

Data Sub-System

Users should be able to specify data requests, or catalog their own data, using one or any combination of the following descriptors (a minimal sketch of such a request appears below):

- Collection/Generation System (e.g., operational radar, geostationary satellite, buoy, model)
- Data Category (e.g., wind, precipitation, moisture)
- Physical Quantity (e.g., dry bulb temperature, absolute humidity, radial velocity)
- Data Set Name (e.g., NEXRAD Level II, ACARS, AQUA, Eta Grid #214)
- Location in the Physical System (e.g., ground level, below ground)
- Temporal Frequency or Range (e.g., < 5 min, < 1 hr, < 6 hrs)
- Geographic location (using the GUI shown above)

A complete set of descriptors for all 20-25 data sets within the base LEAD system must be made available within the Portal. Ontologies will connect each attribute. The figure below shows the initial 7 LEAD data sets (far right) and the attributes that will be used to build the catalog and meta data sets.
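To make the descriptor-based request concrete, the following is a minimal sketch, assuming a hypothetical DataRequest structure; the class, its field names, and the example bounding box are illustrative only and do not correspond to an actual LEAD query interface.

    # Hypothetical illustration of a descriptor-based LEAD data request.
    # The class and field names are placeholders, not an actual LEAD API.
    from dataclasses import dataclass
    from typing import Optional, Tuple

    @dataclass
    class DataRequest:
        collection_system: Optional[str] = None    # e.g., "operational radar"
        data_category: Optional[str] = None         # e.g., "wind"
        physical_quantity: Optional[str] = None     # e.g., "radial velocity"
        data_set_name: Optional[str] = None         # e.g., "NEXRAD Level II"
        location: Optional[str] = None              # e.g., "ground level"
        max_interval_minutes: Optional[int] = None  # temporal frequency, e.g., < 30 min
        bounding_box: Optional[Tuple[float, float, float, float]] = None  # region from the GUI

    # Scenario #1 expressed with these descriptors: all temperature observations,
    # surface to 5 km, reporting at least every 30 minutes, over a GUI-selected region.
    request = DataRequest(
        data_category="temperature",
        location="surface to 5 km",
        max_interval_minutes=30,
        bounding_box=(33.0, -100.0, 37.0, -94.0),
    )
    print(request)

Any combination of fields may be left unspecified, mirroring the requirement that users can query by any one descriptor or several in combination.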
- The data system must be aware of the location of all servers and feeds that provide access to the 20-25 data sets composing the base LEAD framework.
- Semantic interchange technologies must be available for each of the 20-25 base LEAD data sets, with options for users to develop their own for other data.
- Users must be able to create catalogs and meta data descriptors for each of their own data sets, in some cases building from global catalogs. For example, if a user wishes to download 2 weeks of NEXRAD radar data from the THREDDS server at Oklahoma, the user catalog may be built using the THREDDS descriptors as a starting point.
- Users must be able to allocate storage for working with data. This task crosscuts with orchestration and the portal.
- Controllable devices, such as the CASA Doppler radars, must be linked to the data system so that tools can interact dynamically with them.
- Users must be able to monitor all data feeds and examine their catalog holdings, preferably using graphical tools.

Grid and Web Services Test Beds

Hardware
- Each site will define its local testbed node hardware configuration.
- Testbeds will run Red Hat Linux version 9.
- Storage requirements are site specific; each site will determine its own local storage requirements.

Applications
- Testbed sites are not required to support the same applications and may provide different services during each phase of the project. Testbed sites will define the software configurations for their sites, and these configurations will also be documented on the LEAD website. Each application identified should have its minimum prerequisite software (libraries, etc.) defined:
- LDM/IDD
- ESML
- THREDDS
- IDV
- ADAS
- WRF
- ADaM

Middleware
- Testbed sites will use the same middleware components, so these components must be identified with correct version numbers so that all sites are developing to the same platform.
- Development will begin with Globus Toolkit release 3.0.2, the latest release of the Globus Toolkit, which is OGSA compliant.

Data
- Identify local requirements for data sets and formats at each testbed site.
- Will any remote data stores be available to testbed sites in Scenario 1?

Administration
- NCSA will provide a Certificate Authority for client-side certificates. Server-side certificates will be provided locally.

Orchestration Sub-System
- Users must be able to script workflow (i.e., design a sequence of tasks) using a graphical interface (e.g., along the lines of AVS, or the new system developed by the University of Texas at Austin). Icons for all tools and processes must be available.
- A library of command primitives will link all tools and processes together and serve as the foundation for actions, e.g., LOCATE, DIRECT, ACQUIRE, VIEW, BUILD. (A minimal sketch built around these primitives appears at the end of this document.)
- A number of simple workflow templates will be available, e.g., identify and access data; add an operation; add an operation plus storage allocation; etc.
- The use case scenarios will serve as a guide for building these templates.
- Users must be able to save and edit workflow, including dynamic workflow, at various stages in the execution process.
- Workflow must be re-entrant to support iterative processes.
- Workflow must be dynamically extensible, i.e., the workflow must be able to expand with time based upon outcomes of calculations, user input, or observations.
- Workflow must be fault tolerant, allowing automated restart after the failure of any component.
- Users must be able to monitor all stages of the execution stream (preferably via graphical interfaces).
- Users must be able to determine the availability of computing and storage resources on the grid.
- Users must be able to allocate resources (security, authentication) and submit jobs for execution on demand or in a pre-scheduled fashion. Options must exist for workflows to be repeated at regular intervals specified by the user.
- Users must be able to estimate the execution time of static workflows and specify limits on resource usage and application performance. For example, a particular workflow may involve a weather forecast that must be completed within the next hour.

Tools Sub-System
- All base application tools (ADAS, ADaM, IDV, WRF) must be accessible via the portal from the GWSTBs.
- Associated with each tool will be a description, a tutorial, and a canned data set.
- Version control must be available for each of the tools so that users have a history of what is available and what was used for a particular problem (workflow must include logging of code version numbers).
- Pre-compiled versions of each tool should be available for multiple computing platforms.
- Source code for each tool will be made available to the extent possible.
- Users must be able to add other tools.
- Interfaces must be available so that each base LEAD tool can communicate with each of its counterparts.
- Users should be able to work with tools outside of the Portal.
- Each tool must be fault tolerant.
- The workflow engine must be able to monitor the status of each tool.
- Tools must be able to send commands to controllable observing systems as well as to other tools, via interfaces to workflow, to effectuate dynamic operability.
- Semantic interchange technologies must be available for every format required or produced by each of the base tools.
- Users must be able to create their own semantic interchange technologies for tools they add to LEAD.

7. Create a series of end-user focus groups, including those in education, research, and operations. Engage them in discussions about the LEAD vision and prepare them to take part in early tests of the technology.

8. Test and evaluate capabilities within the focus groups, with formal mechanisms of feedback to the LEAD team.

9. Deploy and support.
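As a closing illustration of the command primitives and simple workflow templates called for in the Orchestration Sub-System, the following is a minimal sketch of an "identify and access data, add an operation, add storage allocation" template. The primitive functions and their arguments are hypothetical placeholders, not actual LEAD orchestration calls, and in the deployed system such a template would be composed graphically rather than scripted by hand.

    # Hypothetical sketch of a workflow template composed from the LOCATE /
    # ACQUIRE / DIRECT / VIEW command primitives named in Section 6. The
    # capitalized function names mirror the primitives; all are stand-ins.

    def LOCATE(descriptor):
        print(f"LOCATE: searching catalogs for {descriptor}")
        return ["metar_feed", "nexrad_level2_feed"]      # stand-in result

    def ACQUIRE(sources, storage):
        print(f"ACQUIRE: moving {sources} to {storage}")
        return f"{storage}/working_set"

    def DIRECT(data, tool):
        print(f"DIRECT: running {tool} on {data}")
        return f"{tool}_output"

    def VIEW(product):
        print(f"VIEW: visualizing {product} (e.g., with IDV)")

    def simple_template(descriptor, tool, storage="grid://remote-store"):
        """Template: identify and access data, add an operation, add storage allocation."""
        sources = LOCATE(descriptor)
        data = ACQUIRE(sources, storage)
        product = DIRECT(data, tool)
        VIEW(product)

    if __name__ == "__main__":
        simple_template({"data_category": "temperature", "max_interval_minutes": 30}, tool="ADAS")

Saving such a template, then allowing its steps to be edited, repeated on a schedule, or extended at run time, is what the orchestration requirements above describe.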