In the gaming industry, users are often allowed to open accounts and make monetary transactions; they can exchange real money for in-game currency. Players want to be able to add or withdraw funds quickly, and they also want to know how their balance has changed over time. Having this information available helps our customers better understand their users’ behavior so they can identify marketing opportunities and improve sales predictions.
In this blog post, we’ll cover how to build a virtual wallet service. This service can be used by gaming companies to manage players’ account balances, including virtual currencies. We’ll show you how to minimize operational overhead and maximize scalability by using a serverless approach that uses Amazon API Gateway, AWS Lambda, Amazon Quantum Ledger Database (QLDB), Amazon Kinesis Data Streams, and Amazon DynamoDB.
You will find sample code to demonstrate this architecture and instructions on how to deploy it to your own account here.
Overview of solution components
API Gateway is a fully managed service that makes it relatively easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. We will use it as the front door and the only way for external systems to interact with our wallet service. Lambda is a serverless compute service that lets you run code without provisioning or managing servers. We use it to run the code for our backend system. Lastly, to store data from users’ accounts, we use Amazon QLDB. QLDB is a fully managed ledger database that provides a transparent, immutable, and cryptographically verifiable transaction log owned by a central trusted authority.
Traditionally, these systems have used relational databases as systems of record; however, QLDB has features built for this type of use case. It uses optimistic concurrency control and serializable isolation to ensure that no one can modify account balances while updates are performed, preventing possible race conditions. Because it’s a ledger database, it contains the history of all modifications. So, there is no need to build complicated audit logic into our application or risk an administrator modifying transaction data.
Use cases and API design
At the very least, our wallet service must implement the following APIs: AddFunds, WithdrawFunds, GetFunds, and CreateWallet.
GetFunds will look up the current value in the database and return it to the caller. Although not covered in this blog post, the results of GetFunds can be cached using an in-memory database or API Gateway’s caching mechanism.
WithdrawFunds will start a transaction on QLDB, select the current amount in the wallet, and remove the requested funds if they are available. Because the code will run inside a QLDB transaction, it is not possible for another process to modify the amount in that wallet while our transaction is running. The same principle applies to AddFunds. When AddFunds and WithdrawFunds are successful, they will return both the previous and the current balance on the account.
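The balance check at the heart of WithdrawFunds can be sketched as a pure function. This is an illustration, not the sample code from this post; in the real service this logic runs inside a QLDB transaction, as noted in the comments, and the names and minor-unit convention are assumptions.

```python
class InsufficientFundsError(Exception):
    """Raised when a withdrawal exceeds the available balance."""

def withdraw(balance: int, amount: int) -> dict:
    """Core WithdrawFunds logic.

    In the real service this runs inside a QLDB transaction (read the
    balance, then UPDATE it), so optimistic concurrency control
    guarantees no other writer changes the wallet in between.
    Amounts are in minor units (e.g., cents) to avoid float rounding.
    """
    if amount <= 0:
        raise ValueError("withdrawal amount must be positive")
    if amount > balance:
        raise InsufficientFundsError(f"requested {amount}, available {balance}")
    # Return both balances, matching the API contract described above.
    return {"previousBalance": balance, "currentBalance": balance - amount}
```

AddFunds follows the same pattern, with the sign of the adjustment reversed and no availability check.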
CreateWallet will create a new document in QLDB only if one doesn’t already exist.
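The "insert only if absent" check can be sketched the same way. The names here are hypothetical; in QLDB, the existence lookup and the INSERT would both run in a single transaction so that concurrent calls cannot both insert.

```python
def create_wallet(existing_account_ids: set, account_id: str) -> dict:
    """Build the initial wallet document for a new account.

    In QLDB this corresponds to a SELECT for the accountId followed by
    an INSERT, executed inside one transaction, which makes wallet
    creation idempotent under concurrency.
    """
    if account_id in existing_account_ids:
        raise ValueError(f"wallet already exists for {account_id}")
    return {"accountId": account_id, "balance": 0}
```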
To keep things simple in the following example, we have one table (Wallets) and only store the balance as a QLDB document. You may add other attributes to the table (such as different currencies, country/region, etc.). However, keep in mind that documents in QLDB can never be truly deleted, which may be significant to your business in terms of regulatory compliance. Other attributes, such as the transaction ID and timestamp, are built in by QLDB and do not need to be kept as document attributes.
CREATE TABLE Wallets;
CREATE INDEX ON Wallets (accountId);
INSERT INTO Wallets VALUE {'accountId': 'ABC123', 'balance': 0}; -- sample values
An index is required to enable lookups on the accountId field.
Query transaction history from DynamoDB
Showing users a history of their transactions is a common use case. You could query that information from QLDB directly using the History function. However, using a separate data store to serve that information is a more scalable approach. In this case, we take advantage of QLDB’s integration with Kinesis Data Streams and stream our transaction data to a DynamoDB table.
Kinesis Data Streams is a massively scalable and durable real-time data streaming service. DynamoDB is a key-value and document database that delivers single-digit millisecond performance at any scale. It’s a fully managed, multi-Region, multi-active, and durable database with built-in security, backup and restore, and in-memory caching for internet-scale applications. Using these services, we make transaction history available via an API (GetTransactionHistory), as shown in Figure 1.
Every time a document revision is committed to the QLDB journal (UPDATE, INSERT, or DELETE), QLDB produces a revision record that contains the document and transaction metadata. We use the accountId as our partition key and the transaction timestamp from QLDB’s metadata as our sort key in our DynamoDB table (as shown in Figure 2). This enables simple lookups without increasing load on our QLDB database.
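With accountId as the partition key and the transaction timestamp as the sort key, GetTransactionHistory becomes a single DynamoDB Query. The following is a minimal in-memory sketch of that access pattern; real code would use boto3’s Table.query with a KeyConditionExpression, and the attribute names are assumptions.

```python
def transaction_history(items: list, account_id: str, since: str) -> list:
    """Mimic the DynamoDB Query: return all items for one accountId
    whose sort key (ISO-8601 tx_timestamp) is at or after `since`,
    in ascending timestamp order, as a Query on this key schema would."""
    return sorted(
        (i for i in items
         if i["accountId"] == account_id and i["tx_timestamp"] >= since),
        key=lambda i: i["tx_timestamp"],
    )
```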
The Lambda function shown in Figure 1 (Streaming) will transform the document stream from Ion (the format used by QLDB) into JSON. It will then add an attribute (expire_timestamp) to each DynamoDB item so that we can expire old transaction records, if necessary, using DynamoDB TTL. The original transaction information will remain available in QLDB in case it is required for audit purposes.
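The transform step can be sketched as follows. The record layout and the 90-day retention window are assumptions for illustration; in the actual Lambda, the input arrives as Ion via Kinesis Data Streams rather than as a plain dict.

```python
from datetime import datetime, timedelta

RETENTION_DAYS = 90  # hypothetical retention window for DynamoDB TTL

def to_dynamodb_item(record: dict) -> dict:
    """Map one QLDB revision record (already converted from Ion to
    plain Python types) to the DynamoDB transaction-history item."""
    revision = record["payload"]["revision"]
    data, meta = revision["data"], revision["metadata"]
    tx_time = datetime.fromisoformat(meta["txTime"])
    return {
        "accountId": data["accountId"],   # partition key
        "tx_timestamp": meta["txTime"],   # sort key
        "balance": data["balance"],
        "txId": meta["txId"],
        # Epoch seconds; DynamoDB TTL removes the item after this time.
        "expire_timestamp": int(
            (tx_time + timedelta(days=RETENTION_DAYS)).timestamp()
        ),
    }
```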
This architecture can be expanded to support multiple virtual currencies and currency exchanges.
Analytics with Amazon S3, AWS Glue, Athena, and Redshift
Transaction information is a valuable source of data for analytics use cases. To enable these use cases, data stored in QLDB must first be exported to Amazon Simple Storage Service (Amazon S3). It can then be transformed and enriched using extract, transform, and load (ETL) tools such as AWS Glue, queried directly from Amazon S3 using Amazon Athena, or loaded into a data warehouse such as Amazon Redshift.
In this blog post, we explained how to build a serverless wallet service that can be used to provide a single account across different games. With this architecture, you can accommodate use cases such as keeping users’ account balances, querying their transaction history, and extracting data for analytics processing. Get started by downloading the sample code from here and deploying it to your own account.
Figure 3 provides a diagram of the complete solution:
Although not covered in this blog post, the QLDB workshops provide a detailed explanation of how to export data from QLDB into a data lake or stream it into a relational database.