On this page I post details of talks I’ve given to the software engineering community.
Travis Thieman and I delivered a talk to the CTO School meetup in NYC covering our experience migrating GameChanger’s build and deploy pipelines from being heavily based on Chef to one based around Docker.
This presentation is split into two main sections. The first covers why GameChanger, as a fast-growing startup, identified a need to replace its existing Chef-based deploy model with one that reduces deploy-time risk and allows its engineering team to scale.
The second section is a high-level walkthrough of the new GameChanger deploy pipeline based around Docker.
One of the biggest selling points of MongoDB is its ability to directly persist arbitrary object structures without requiring the developer to navigate issues like building an ORM layer. However, this flexibility comes at a price: creating meaningful test data that adheres to these more complex structures can be much more involved.
At GameChanger we observed that developers typically had to write large amounts of test data setup boilerplate to perform an effective test against a MongoDB-dependent function, disincentivizing them from writing rigorous tests. So we created Monufacture - a Python test data generation framework for MongoDB that makes setting up test data a breeze.
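To give a flavor of the factory-based approach, here is a minimal sketch of the general pattern: defaults are defined once, and each test overrides only the fields it actually cares about. The names below are purely illustrative and are not Monufacture's actual API.

```python
# Hypothetical sketch of factory-style test data generation for
# MongoDB-shaped documents; not Monufacture's real API.
from datetime import datetime, timezone


def game_factory(**overrides):
    """Build a game document with sensible defaults, overridable per test."""
    doc = {
        "name": "Test Game",
        "created_at": datetime.now(timezone.utc),
        "teams": [{"name": "Home"}, {"name": "Away"}],
        "score": {"home": 0, "away": 0},
    }
    doc.update(overrides)  # each test states only what it cares about
    return doc


# A test exercising score logic no longer repeats the whole document:
game = game_factory(score={"home": 3, "away": 1})
```

The point is that the boilerplate lives in one place (the factory), so a test that only cares about the score doesn't have to restate teams, timestamps, and every other required field.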
In this talk (originally given at the MongoDB Meetup in NYC) I break down some of the motivations and design decisions behind Monufacture, demoing its functionality and giving some tips on how to write effective tests of your MongoDB-dependent code.