Integrating with Unity

Now that Kinesis is set up and the correct IAM permissions are in place, you are ready to ingest data from your game into your analytics pipeline.

  • In Unity, open the EventsManager.cs script. This script batches event data for in-game events and sends it to Kinesis. The batch size is set to 4 records for this workshop, but in production that value should be much higher. The script creates events when the app loads, when a player signs up for an account, signs in, makes a transaction, and more.

  • First, find the streamName variable and set it to the name of your Kinesis data stream.

private static string streamName = "Peculiar-KDS";

If you are using the Game Analytics Pipeline solution, another important change is to set the application_id to that of the Game Analytics Pipeline, which you can find in the Outputs section of the CloudFormation template. That change is out of scope for this workshop.

Let’s summarize what the code in this script is doing. We are initializing the kinesisClient with our Cognito credentials so we can put records into the Kinesis stream.

private static AmazonKinesisClient kinesisClient =
    new AmazonKinesisClient(CredentialsManager.credentials, CredentialsManager.region);
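The CredentialsManager class is part of the workshop project. As a rough sketch of what it provides (the identity pool ID below is a placeholder; use the one from your own Cognito setup), it might look like:

```csharp
using Amazon;
using Amazon.CognitoIdentity;

// Sketch only: the real CredentialsManager in the workshop project may differ.
public static class CredentialsManager
{
    public static RegionEndpoint region = RegionEndpoint.USEast1;

    // Unauthenticated Cognito credentials, scoped by the IAM role configured earlier.
    // The identity pool ID here is a placeholder.
    public static CognitoAWSCredentials credentials =
        new CognitoAWSCredentials("us-east-1:REPLACE-WITH-YOUR-IDENTITY-POOL-ID", region);

    // Set after the player signs in; used below as user_id in each record.
    public static string userid;
}
```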

A record is also created for each event, which contains data including the event id, event type, event name, timestamp, and more.

Dictionary<string, object> record = new Dictionary<string, object>()
{
    { "event_id", event_id },
    { "event_type", event_name },
    { "event_name", event_name },
    { "event_timestamp", current_time },
    { "event_version", "1.1.0" },
    { "user_id", CredentialsManager.userid },
    { "session_id", session_id },
    { "event_data", event_data }
};
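As mentioned earlier, records accumulate until the batch size (4 in this workshop) is reached before they are sent. A simplified sketch of that batching logic (helper names and the Put_Records signature are assumed, not the exact EventsManager code):

```csharp
// Simplified batching sketch; names are illustrative, not the exact workshop code.
private const int BATCH_SIZE = 4; // small for the workshop; use a larger value in production

private static List<Dictionary<string, object>> recordBatch =
    new List<Dictionary<string, object>>();

private static void AddRecord(Dictionary<string, object> record)
{
    recordBatch.Add(record);
    if (recordBatch.Count >= BATCH_SIZE)
    {
        // Flush the full batch to Kinesis and start a new one.
        Put_Records(new List<Dictionary<string, object>>(recordBatch));
        recordBatch.Clear();
    }
}
```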

Now let's look at the Put_Records method. In this method, a PutRecordsRequestEntry is created for each record; it represents a single record in the PutRecords request.

List<PutRecordsRequestEntry> formatted_records = new List<PutRecordsRequestEntry>();

The data is converted to JSON using the Newtonsoft package to serialize dictionaries.

string jsonData = JsonConvert.SerializeObject(rec, Formatting.Indented);
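The serialized payload is then wrapped in a PutRecordsRequestEntry and added to the batch. The exact construction in EventsManager may differ; a typical version (the partition key choice here is illustrative, and this fragment needs the System.IO and System.Text usings) looks like:

```csharp
formatted_records.Add(new PutRecordsRequestEntry
{
    // The serialized event becomes the record payload.
    Data = new MemoryStream(Encoding.UTF8.GetBytes(jsonData)),
    // The partition key determines the target shard; keying on session_id
    // (an illustrative choice) keeps one player's events on the same shard.
    PartitionKey = session_id
});
```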

Finally, data is sent to Kinesis using the PutRecordsAsync SDK call. This writes multiple data records into a Kinesis data stream in a single call.

Task<PutRecordsResponse> responseTask = kinesisClient.PutRecordsAsync(new PutRecordsRequest
{
    Records = formatted_records,
    StreamName = streamName
});

PutRecordsResponse responseObject = await responseTask;
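Note that PutRecords is not all-or-nothing: the call itself can succeed while individual records fail, for example when a shard is throttled. A defensive check (not part of the workshop script) could look like:

```csharp
// PutRecords can partially fail: FailedRecordCount reports how many
// entries were rejected even though the call itself succeeded.
if (responseObject.FailedRecordCount > 0)
{
    for (int i = 0; i < responseObject.Records.Count; i++)
    {
        PutRecordsResultEntry entry = responseObject.Records[i];
        if (entry.ErrorCode != null)
        {
            Debug.Log($"Record {i} failed: {entry.ErrorCode} - {entry.ErrorMessage}");
            // In production, re-queue formatted_records[i] and retry with backoff.
        }
    }
}
```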

The code in this Unity project and the data schema also work with the Game Analytics Pipeline, so you can apply what you learn in this workshop to the official solution.

  • Save the code and test it by playing the game. Sign in, purchase some wizard hats, and try buying coins. Every action generates an event that is added to a batch of records, which you can see in the Unity console. Click around until you see that data has been sent to the stream successfully.

  • You can even view stream metrics in Kinesis. Go to your Kinesis Data Stream and select the Monitoring tab. You can view metrics such as put record success and put record latency from the producer applications (in our case, the game engine).

You can also view metrics like get records latency and get records success from the consumer applications (in our case, Kinesis Data Firehose).

You can also view metrics in Kinesis Data Firehose, such as records read from Kinesis Data Streams, bytes read from Kinesis Data Streams, S3 data freshness, records delivered to S3, and more.

  • You can view your data in S3 by navigating to the S3 bucket. In the AWS Management Console, go to Amazon S3 and search for the bucket you created. This workshop uses the bucket peculiar-wizards-data-lake; your bucket name will be different because bucket names are globally unique.

  • Click through all the partitions – the data should be organized by year, month, and day. You will see that you have data in your bucket now!

  • You can download the files and open a text editor to view the contents.

Now, there is not much data yet because you are only one player! In the next section, we will simulate a lot more data being generated from many players.