From Filtered Push Wiki

Mapper Work Plan

This page summarizes the work proposed for developing the Mapper component.

This work has been superseded by development of the Annotation Processor and Drivers.

Initial tasks

  • Set up an instance of Specify6/MySQL (aka SP) collection database on a Linux system accessible from the network. This system will represent cases where some business rules must be enforced at the application level.
  • Set up another collection database that is not Specify (aka NS) and not MySQL, on a different machine accessible to the network. This system will represent cases where no business rules are enforced at the application level. It may include a small, custom GUI for data entry and other interactions. The database schema and GUI will be completely under our control and will support multiple variations.
  • Set up a mock FP "network" system in which we can fake FP services as needed to test the Mapper and run test scenarios.
  • Create a minimal Specify plugin where we can explore interactions with Specify and create custom user interfaces.
  • Develop a minimal FP read-cache application, including a (relational?) database for storing collection records, and a simple GUI for viewing cache contents, deleting unneeded records, and requesting records from FP.
  • Talk to Rod Spears about whether he favors the plugin approach we are discussing (rather than us writing, with his help, SQL that works directly with the Specify database), to make sure the more complex solution is necessary.
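The minimal FP read-cache described above could be sketched as follows. This is a hypothetical illustration, not the actual implementation: the class name `ReadCache` and the representation of a collection record as a Darwin Core term-to-value map keyed by a record GUID are assumptions made for the example.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch of the minimal FP read-cache: stores Darwin Core
// records (term -> value maps) keyed by record GUID, and supports the
// operations the GUI would need: store, fetch, and delete.
public class ReadCache {
    private final Map<String, Map<String, String>> records = new HashMap<>();

    // Store a record retrieved from FP, copying it defensively.
    public void store(String guid, Map<String, String> dwcRecord) {
        records.put(guid, new HashMap<>(dwcRecord));
    }

    // Fetch a cached record, if present.
    public Optional<Map<String, String>> fetch(String guid) {
        return Optional.ofNullable(records.get(guid));
    }

    // Delete an unneeded record; returns true if something was removed.
    public boolean delete(String guid) {
        return records.remove(guid) != null;
    }
}
```

A real implementation would back this interface with the relational database mentioned above rather than an in-memory map.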

Proposed test scenarios demonstrating the capabilities of the Mapper

  1. In response to a message (from mock FP), retrieve records from either SP or NS that meet certain criteria, using a long-running mapper service associated with the database, and return them represented in Darwin Core.
  2. Request records from FP, and store in local cache.
    • For the NS case, use a simple GUI associated with the local cache.
    • For Specify case ??
  3. Assist in the creation of new collection records in SP or NS, using records from each possible combination of the resources below:
    • Records in the local database.
    • Records retrieved via live queries to FP, which may use live queries to other databases.
    • Records in the local cache, filled previously by queries to FP.
  4. Insert the records created in scenario 3 into SP or NS.
  5. Handle incoming annotations (Darwin Core record "patches" with justifications).
    • For the NS database, automatically propose SQL for applying the patch to the database, and optionally perform the queries.
    • For SP database ??
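Scenario 5 for the NS case could be sketched as below. This is a hypothetical illustration: the class name `PatchToSql`, the Darwin-Core-term-to-column mapping, and the table and key column names are all assumptions made for the example; the real Mapper would derive the mapping from its schema configuration.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of scenario 5 (NS case): turn an annotation "patch"
// (Darwin Core term -> proposed new value) into a proposed SQL UPDATE
// statement for the NS database. Terms without a known column mapping
// are ignored.
public class PatchToSql {
    // Assumed Darwin Core term -> NS column mapping.
    private static final Map<String, String> DWC_TO_COLUMN = Map.of(
        "dwc:scientificName", "sci_name",
        "dwc:locality", "locality");

    public static String proposeUpdate(String table, String keyColumn,
                                       String keyValue, Map<String, String> patch) {
        String sets = patch.entrySet().stream()
            .filter(e -> DWC_TO_COLUMN.containsKey(e.getKey()))
            .map(e -> DWC_TO_COLUMN.get(e.getKey()) + " = '"
                      + e.getValue().replace("'", "''") + "'")
            .collect(Collectors.joining(", "));
        return "UPDATE " + table + " SET " + sets
             + " WHERE " + keyColumn + " = '" + keyValue + "'";
    }
}
```

The proposed statement would be shown to a curator for review before optionally being executed; a production version would use parameterized statements rather than string concatenation.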

Work in progress

  • Schema for SQL Server database
  • MyBatis configuration sample
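A MyBatis mapper configuration of the kind referred to above might look like the following. This is a hedged sketch, not the project's actual configuration: the namespace `org.filteredpush.mapper.SpecimenMapper`, the `specimen` table, and its column names are assumptions made for the example.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN"
    "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<!-- Hypothetical MyBatis mapper: selects a collection record by GUID and
     aliases columns toward Darwin Core term names. -->
<mapper namespace="org.filteredpush.mapper.SpecimenMapper">
  <select id="selectByGuid" parameterType="String" resultType="map">
    SELECT sci_name AS scientificName,
           locality
    FROM specimen
    WHERE guid = #{guid}
  </select>
</mapper>
```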