Spawn Labs

Dynamic Media Firewall Service

The Dynamic Media Firewall Service provides an API that enables the RendezvousSvc (which mediates negotiations for a play session) to create and delete firewall rules as clients start and stop play on cloud appliances.

One of my goals was to avoid maintaining rules state outside of iptables itself. (Restoring state after a shutdown, or syncing it to a fallback firewall, would have been more fragile.) To make this work, I used a strategy I called Chain Reference: rather than inserting rules directly into the default iptables chains (PREROUTING, POSTROUTING, FORWARD, etc.), I created chains of rules which could then be easily deleted as a group.

Here, then, are the iptables transactions involved in starting a play session:
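(The exact rules aren't reproduced here; the sketch below uses an invented chain name, addresses, and ports, and runs the commands through ProcessBuilder the way a low-level executor might. It is a minimal illustration of the Chain Reference pattern, not the service's actual code.)

// Hypothetical sketch of the Chain Reference pattern for one play session.
// Chain name, IPs, and ports are made up for illustration.
import java.io.IOException;
import java.util.List;

public class ChainReferenceSketch {

    private static void iptables(String args) throws IOException, InterruptedException {
        List<String> cmd = new java.util.ArrayList<>();
        cmd.add("iptables");
        cmd.addAll(java.util.Arrays.asList(args.split("\\s+")));
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IOException("iptables failed: " + args);
        }
    }

    /** Start of a play session: build a dedicated chain, then reference it. */
    public static void startSession(String sessionChain) throws Exception {
        // 1. Create a chain just for this session's rules.
        iptables("-t nat -N " + sessionChain);
        // 2. Add the session's rules to that chain (illustrative DNAT rule).
        iptables("-t nat -A " + sessionChain
                + " -p udp -d 203.0.113.10 --dport 40000 -j DNAT --to-destination 10.0.0.21:40000");
        // 3. Reference the chain from the built-in chain so traffic flows through it.
        iptables("-t nat -I PREROUTING -j " + sessionChain);
    }

    /** End of a play session: remove the reference, then delete the chain as a group. */
    public static void stopSession(String sessionChain) throws Exception {
        iptables("-t nat -D PREROUTING -j " + sessionChain); // drop the reference
        iptables("-t nat -F " + sessionChain);               // flush the chain's rules
        iptables("-t nat -X " + sessionChain);               // delete the now-empty chain
    }
}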

The entry point for interacting with the service is the "Access" layer (a group of classes for receiving CLI, HTTP, UDP, and RabbitMQ messages). Behind that is the DynamicRuleAPI class, which defines the available use cases for modifying the "dynamic" rules (as opposed to the static ones). As its toolset it uses DynamicRuleActions, which in turn calls upon the IptablesRuleExecutor to handle the iptables calls at the low level.
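The actual signatures aren't shown here, but the layering could be summarized with a hypothetical set of interfaces (the method and parameter names are mine, not the service's):

// Hypothetical interfaces sketching the layering; names are illustrative only.

/** Use cases exposed to the Access layer (CLI, HTTP, UDP, RabbitMQ). */
interface DynamicRuleApiSketch {
    void createPlaySessionRules(String sessionId, String clientAddr, String applianceAddr);
    void deletePlaySessionRules(String sessionId);
}

/** Mid-level actions: translates a use case into a sequence of rule operations. */
interface DynamicRuleActionsSketch {
    void createChainWithRules(String chainName, java.util.List<String> ruleSpecs);
    void deleteChain(String chainName);
}

/** Low level: actually invokes iptables for a single rule specification. */
interface IptablesRuleExecutorSketch {
    void execute(String iptablesArgs) throws java.io.IOException;
}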

Testing the Dynamic Media Firewall Service

To validate the service (and the underlying firewall machine), I need to send standard traffic through the media firewall (MF) as if 320 client-to-appliance connections were active. At the same time, I want to actually play on one additional client-to-appliance setup and confirm that the play experience is good while the traffic test is underway.

To do this, I set up a number of computers, each running a process with a configurable number of threads representing MockAppliances (MA) and their associated MockClients (MC). Each thread sends and receives the appropriate bi-directional traffic through the MF and back to itself (so, 640 threads in total).
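A rough sketch of how that fan-out might be wired up, assuming an executor and stand-in MockClient/MockAppliance runnables (the class bodies, pool size cap, and timeout here are invented for illustration):

// Hypothetical sketch: launch one MockAppliance and one MockClient thread per
// simulated connection. The Runnable bodies are stand-ins only.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class TrafficFanOutSketch {

    /** Stand-in for the real MockAppliance: would stream media-like UDP traffic. */
    static class MockAppliance implements Runnable {
        private final int connectionId;
        MockAppliance(int connectionId) { this.connectionId = connectionId; }
        @Override public void run() { /* send/receive appliance-side traffic here */ }
    }

    /** Stand-in for the real MockClient: would stream client-side traffic. */
    static class MockClient implements Runnable {
        private final int connectionId;
        MockClient(int connectionId) { this.connectionId = connectionId; }
        @Override public void run() { /* send/receive client-side traffic here */ }
    }

    public static void main(String[] args) throws Exception {
        int connections = 320;                                                // simulated play sessions
        ExecutorService pool = Executors.newFixedThreadPool(connections * 2); // 640 threads

        for (int i = 0; i < connections; i++) {
            pool.execute(new MockAppliance(i)); // each pair talks to itself through the MF
            pool.execute(new MockClient(i));
        }

        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES); // arbitrary cap on test duration
    }
}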

I track packets during the test, and at the end, evaluate performance based on packets lost, on time, late, etc.
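One way the "lost / on time / late" tally could work, assuming each packet carries a sequence number and its send timestamp (the threshold and field layout below are invented for the sketch, not the real test harness):

// Hypothetical per-receiver packet accounting.
class PacketStatsSketch {
    static final long LATE_THRESHOLD_MS = 50;   // invented for the sketch

    private long onTime;
    private long late;
    private long lost;
    private long expectedSequence;              // next sequence number we expect

    /** Called by a receiver for every packet that actually arrives. */
    synchronized void record(long sequence, long sentAtMillis) {
        long delay = System.currentTimeMillis() - sentAtMillis;
        if (delay <= LATE_THRESHOLD_MS) {
            onTime++;
        } else {
            late++;
        }
        if (sequence > expectedSequence) {
            lost += sequence - expectedSequence; // a gap in the sequence counts as lost
        }
        expectedSequence = Math.max(expectedSequence, sequence + 1);
    }

    synchronized String report() {
        return "on time=" + onTime + ", late=" + late + ", lost=" + lost;
    }
}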


The multi-threading needs to be really efficient to simulate so much traffic.

Clients and appliances must be instantiated and readied before all of them are unleashed at roughly the same time, coordinated by receiversReadyIndicator (a CountDownLatch).

Here is the startup method:

private void test() throws Exception {

    // prepare
    prepareClients();
    prepareAppliances();
    sortStreamsAndReceivers();

    // start
    startReceivers();                 // spawns a new thread which blocks
    receiversReadyIndicator.await();  // wait for all receivers to be ready
    startStreams();                   // blocks main until streams terminate
    Thread.sleep(postStreamSleepMs);  // wait for final packets to arrive at receivers

    // test is over once the thread continues here
    shutdownReceivers();

    doReport(false);
}
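For context, a receiver thread that participates in that handshake might look roughly like this. The socket handling and buffer size are invented; only the CountDownLatch interplay mirrors the startup method above:

// Hypothetical receiver: signals readiness via the shared CountDownLatch before
// blocking on its socket, so startStreams() only runs once every receiver is listening.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.concurrent.CountDownLatch;

class ReceiverSketch implements Runnable {
    private final DatagramSocket socket;
    private final CountDownLatch receiversReadyIndicator;
    private volatile boolean running = true;

    ReceiverSketch(DatagramSocket socket, CountDownLatch receiversReadyIndicator) {
        this.socket = socket;
        this.receiversReadyIndicator = receiversReadyIndicator;
    }

    @Override
    public void run() {
        byte[] buffer = new byte[1500];
        receiversReadyIndicator.countDown();      // tell the test we are ready to receive
        while (running) {
            try {
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);           // blocks until traffic arrives
                // record sequence number / timing here for the end-of-test report
            } catch (Exception e) {
                break;                            // socket closed by shutdownReceivers()
            }
        }
    }

    void shutdown() {
        running = false;
        socket.close();                           // unblocks the receive() call
    }
}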

Android Gamestick Client

The "Gamestick" (internal name) was an HDMI dongle with a 1GHz TI OMAP 4430 Dual Core chip. The 8GB SD card has Android 4.0.4 (ICS) installed. Bluetooth connects it to a game controller. Wifi to the intartubes. The side-USB (2) is for development purposes, and the micro-USB on the end (3) gives it power (and for development purposes allows the dev PC to connect for ADB, debugging, logcat, etc)

Our game browsing and streaming app is what we call the Gamestick Client.

I wrote at least half of the app. Two of the pieces are:

  • CatalogService: Handles the periodic update of the locally cached game catalog from the cloud and stores it in a SQLite db (see the sketch after this list).

  • SpawnPlayerActivity: The foreground activity responsible for negotiating the play session with the cloud servers, displaying the video/audio streams, and sending the control inputs.
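As a sketch of the CatalogService idea only (the endpoint, table schema, refresh interval, and class names here are all invented, and the work is shown on the main thread purely to keep the sketch short), a periodic refresh into SQLite on that era of Android could look like:

// Hypothetical sketch of a periodic catalog refresh on ICS-era Android.
import android.app.Service;
import android.content.ContentValues;
import android.content.Context;
import android.content.Intent;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.os.Handler;
import android.os.IBinder;

public class CatalogServiceSketch extends Service {
    private static final long REFRESH_INTERVAL_MS = 60 * 60 * 1000; // invented: hourly

    private final Handler handler = new Handler();
    private CatalogDbHelper dbHelper;

    private final Runnable refresh = new Runnable() {
        @Override public void run() {
            // 1. Fetch the catalog from the cloud (HTTP call omitted in this sketch).
            // 2. Write each game row into the local cache.
            SQLiteDatabase db = dbHelper.getWritableDatabase();
            ContentValues row = new ContentValues();
            row.put("game_id", "example-game");       // placeholder values
            row.put("title", "Example Game");
            db.insertWithOnConflict("catalog", null, row, SQLiteDatabase.CONFLICT_REPLACE);
            // 3. Schedule the next refresh.
            handler.postDelayed(this, REFRESH_INTERVAL_MS);
        }
    };

    @Override public void onCreate() {
        super.onCreate();
        dbHelper = new CatalogDbHelper(this);
        handler.post(refresh);
    }

    @Override public void onDestroy() {
        handler.removeCallbacks(refresh);
        super.onDestroy();
    }

    @Override public IBinder onBind(Intent intent) { return null; }

    /** Local cache of the game catalog. */
    static class CatalogDbHelper extends SQLiteOpenHelper {
        CatalogDbHelper(Context ctx) { super(ctx, "catalog.db", null, 1); }
        @Override public void onCreate(SQLiteDatabase db) {
            db.execSQL("CREATE TABLE catalog (game_id TEXT PRIMARY KEY, title TEXT)");
        }
        @Override public void onUpgrade(SQLiteDatabase db, int oldV, int newV) {
            db.execSQL("DROP TABLE IF EXISTS catalog");
            onCreate(db);
        }
    }
}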

The biggest challenge, however, was the build and deployment pipeline. It required simultaneously incorporating these:

  • Android

  • Maven (and android-maven-plugin)

  • Artifactory

  • maven-release-plugin

  • Jenkins

  • Injecting the Maven version and Jenkins build number into the AndroidManifest

Any two or three of these are pretty easily accomplished. But getting all six to work together, while supporting developer builds (in Eclipse or IntelliJ IDEA), Jenkins builds, and Maven deployments and release builds, is a tough job.
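One small piece that is easy to show is the last step: once the build has injected the version information into the AndroidManifest, the app can read it back for display. The sketch below assumes the Maven version landed in versionName and the Jenkins build number in versionCode; the real project may have wired it differently.

// Hypothetical helper: read back whatever the build injected into the manifest.
import android.content.Context;
import android.content.pm.PackageInfo;
import android.content.pm.PackageManager;

public final class BuildInfo {
    private BuildInfo() {}

    /** e.g. "1.4.2 (build 137)" for display on a Help screen. */
    public static String describe(Context context) {
        try {
            PackageInfo info = context.getPackageManager()
                    .getPackageInfo(context.getPackageName(), 0);
            return info.versionName + " (build " + info.versionCode + ")";
        } catch (PackageManager.NameNotFoundException e) {
            return "unknown";
        }
    }
}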

Here is the simple Help screen, proudly showing our Maven version and Jenkins build number: