Last year I wrote up a summary of Morsulus’s process for applying updates from the monthly LoARs to the Society’s O&A database, and then more recently I put together a high-level visual overview of the context in which the Morsulus herald does his work.
Building on that, I thought it might be useful to apply the same visual style to the monthly update process itself, giving people a graphical roadmap of the data flow before they dive into the step-by-step technical nitty-gritty.
All of the information flows outlined above are part of the single step labeled “Morsulus-Tools” on the global roadmap diagram.
This entire process is actually repeated three times for each month’s LoAR, corresponding to the proof-pass-1, proof-pass-2, and final release cycles. In addition to allowing other senior heralds to review and provide feedback on decisions, the proof-pass rounds also give Morsulus an opportunity to put the LoAR through a dry run and report any small issues that come up for correction in the later cycles.
In each of those cycles, the following steps unfold:
- Morsulus makes a copy of the previous month’s O&A SQLite database; all changes are applied to this copy, leaving last month’s database untouched so that the results of the proof passes can be discarded without lasting effect.
- Morsulus receives a zip file from the Silver Staple herald containing a collection of XML files for the given month’s LoAR.
- The xml_to_actions script is used to parse all of the XML files and output a single pipe-delimited text file that contains all of the individual actions specified in that LoAR — mostly registering new names and armory, but also releases, transfers, reblazons, and other rarer administrative actions. If any problems emerge, the XML files or resulting actions file can be manually patched as a temporary workaround and the errors are reported back to the Sovereigns.
- The apply_actions script applies the actions file to the SQLite database copy. Again, if problems emerge at this step, the file is manually patched and the errors are reported back to the Sovereigns.
- Next, the dump_db script extracts the new blazons — those which have not yet been indexed with armorial descriptions — and writes them to a new file.
- Then the index program is run, which provides a graphical user interface in which Morsulus reviews each new blazon and constructs the corresponding armorial descriptions.
- When that indexing is complete, the xlate.pl and oldcheck.pl scripts make adjustments to the armory descriptions and ensure that no errors have crept into the data, and then the merge_descs script migrates the new armory descriptions into the main SQLite database.
- When all of the data has been processed, the new_run_checks script scans the main SQLite database to check for errors or inconsistencies.
- Finally, the dump_db_all script extracts all of the necessary public data and exports it as a pipe-delimited text file which can be copied to the live web server and used to serve O&A requests.
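To make the bookkeeping in the first few steps concrete, here is a rough Python sketch of the start of a cycle: copying the previous month’s database and reading the pipe-delimited actions file. The script names above are real, but their internals aren’t shown in this post, so the function names, file layout, and column handling below are illustrative assumptions rather than the actual implementation.

```python
import shutil
from pathlib import Path

def start_cycle(prev_db: str, work_db: str) -> None:
    """Copy last month's O&A database so a proof pass can be
    discarded later without touching the original file.
    (Hypothetical helper; the real process may differ.)"""
    shutil.copyfile(prev_db, work_db)

def parse_actions(path: str) -> list[list[str]]:
    """Read a pipe-delimited actions file like the one produced by
    xml_to_actions. The real column layout isn't documented here,
    so we only split each non-blank line into its fields."""
    rows = []
    for line in Path(path).read_text().splitlines():
        if line.strip():
            rows.append(line.split("|"))
    return rows
```

Each row would then be handed to the equivalent of apply_actions, which interprets the action type (registration, release, transfer, reblazon, and so on) against the working copy of the database.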