This is the first post in a series starting today: Thrashing Code Metal Monday’s picks for heavy tunes to code to. Ya know, if you can handle it.
With that introduction, may the thrashing code begin!
This post collects my thoughts on the design and architecture of a data generation service project I’ve started, called Data Diluvium. I’m very open to changes, new ideas, or completely different paradigms around these plans altogether. You can jump into the conversation thread. What kind of data do you often need? What systems do you want it inserted into?
Breakdown of Article Ideas:
- Collected Systems API - This API service idea revolves around a single request that accepts a schema for a particular database type, an export target to insert the data into, and the amount of data to generate. The service then kicks off that data generation and responds with a confirmation of what it received.
- Individual Request API - This API service idea (thanks to Dave Curylo for this one, posted in the thread) revolves around generating data on demand, with a dedicated endpoint for each type of random data.
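To make the two ideas above concrete, here’s a rough sketch of what calls against each style might look like. Everything here is an assumption: the endpoint URLs, the field names in the payload, and the data types are all hypothetical, since Data Diluvium’s actual API hasn’t been built yet.

```shell
# Collected Systems API: one request describing schema, export target,
# and row count (all field names are hypothetical placeholders).
cat > request.json <<'EOF'
{
  "schema": "postgresql",
  "export": "postgresql://localhost:5432/testdb",
  "rows": 10000
}
EOF

# The call itself might look like this (endpoint is made up, so it is
# left commented out rather than executed):
# curl -X POST https://data-diluvium.example.com/api/generate \
#      -H "Content-Type: application/json" -d @request.json

# Individual Request API: one endpoint per random-data type
# (again, hypothetical URLs):
# curl https://data-diluvium.example.com/api/name
# curl https://data-diluvium.example.com/api/address

# Show the payload we built:
cat request.json
```

The design difference is worth noting: the first style is fire-and-forget bulk generation confirmed asynchronously, while the second returns individual values synchronously per request.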
Alright, time to dive deeper into each of these.
- Part 1 - Setting up a GCP Container Cluster
- Part 2 - Working with a GCP Container Cluster
- Part 3 - Setup of Drone.io on a GCP Container Cluster - Currently being written
I set out to create a container cluster to work with in Google Cloud. These are my notes from that effort. On the heels of this article I’m also putting together notes on getting Drone.io fully set up with an appropriate domain name and the like for full production-grade work. For now, here’s the lowdown on the steps I took to get educated on and informed about setting up and using a Google Container Cluster.
Here’s the first exploratory script I ran, based on the instructions here.
gcloud container clusters create working-space \
  --zone us-central1-a \
  --additional-zones us-central1-b,us-central1-c
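Once a create command like that finishes, a few standard gcloud SDK subcommands confirm the cluster exists and wire up kubectl. These are real gcloud commands, but they require an authenticated session against a real project, so the sketch below just prints them rather than running them; the cluster name and zone match the create command above.

```shell
# Follow-up commands to verify the new cluster and connect kubectl.
# Printed rather than executed, since they need gcloud auth and a project.
cmds=(
  "gcloud container clusters list"
  "gcloud container clusters describe working-space --zone us-central1-a"
  "gcloud container clusters get-credentials working-space --zone us-central1-a"
  "kubectl get nodes"
)
printf '%s\n' "${cmds[@]}"
```

The `get-credentials` step is the one that writes the cluster’s endpoint and credentials into your kubeconfig, which is what lets `kubectl get nodes` talk to the cluster afterward.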