Adrian here.

Starting with the 2008 RSA Conference, Rich and Chris Hoff presented each year on the then-current state of cloud services and predicted where they felt cloud computing was going. This year Mike helped Rich conclude the series with some new predictions, but more importantly they went back to assess the accuracy of previous prognostications. My takeaway is that their predictions for what cloud services would do, and the value they would provide, were pretty much spot on. But in most cases where a specific tool or product was identified as critical, they totally missed the mark. Wildly.

Trying to follow cloud services – Azure, AWS, and GCP, to name a few – is like running toward a rapidly expanding blast radius. Capabilities grow so fast, in depth and breadth, that you cannot keep pace. Part of our latest change to the Friday Summary is a weekly tools discussion to share the best of what we see. Ask IT folks or any development team: tools are a critical part of getting the job done, and Docker's adoption rate shows how much the right tool matters to productivity. Keep in mind that we are not making predictions here – we are keenly aware that we don't know which tools will be in use a year from now, much less three. But we do know many old-school platforms don't work in more agile environments. That's why it's essential to share some of the cool stuff we see, so you are aware of what's available and can be more productive.

You can subscribe to only the Friday Summary.

Top Posts for the Week
Amazon Seeing ‘Momentous’ Change of Guard as Public Cloud ‘Booms,’ Says JP Morgan
6 roadblocks to cloud security, and how to move past them
GitLab-Digital Ocean partnership to provide free hosting for continuous online code testing
How Continuous Delivery Challenges Security
Google Reimburses Cloud Clients After Massive Google Compute Engine Outage
What does shared responsibility in the cloud mean?
Microsoft takes an ‘apps for every platform’ approach with host of DevOps tools
Internals of App Service Certificate
Announcing the Preview of Storage Service Encryption for Data at Rest
Tool of the Week
This week’s tool is DynamoDB, a NoSQL database service native to AWS. A couple of years ago, when we were architecting our own services on Amazon, we began by comparing RDS and PostgreSQL, and even considered MySQL. After some initial prototyping we realized that a relational – or quasi-relational, for some MySQL variants – platform really did not offer us any advantages, but came with some limitations, most notably around the use of JSON data elements and the need to store very small records very quickly. We settled on DynamoDB. Call it confirmation bias, but the more we used it the more we appreciated its capabilities, especially around security.

While a full discussion of DynamoDB’s security capabilities – much less a full platform review – is way beyond the scope of this post, here are some of the security features we found attractive, each with a rough code sketch after the list:

Query Filters: You can use filter expressions to simulate masking and label-based controls over what data is returned to the user. The query itself can be identical for all users, with return values filtered after it runs. Filters typically compare elements returned by the query, but nothing stops you from filtering on other values associated with the user running it. And as with relational platforms, you can build your own functions to modify data or behavior depending on the results.
Fine-grained Authorization: Access policies can restrict access to specific functions, and limit the impact of those functions to specific tables, per user. As with query filters, you can limit the results a specific user sees, or take into account things like geolocation to dynamically alter which data a user can access, using a set of access policies defined outside the application.
IAM: The AWS platform offers a solid set of identity and access management capabilities, with the group- and role-based authorization you’d expect, all of which integrates with DynamoDB. It also offers temporary credentials for role delegation, and hooks to leverage Google and Facebook identities to control mobile access to the database. Again, we can punt on some IAM responsibilities for speed of deployment, or ratchet down access depending on how the user authenticated.
JSON Support: You can both store and query JSON documents in DynamoDB. Programmers will understand why this is important: it lets you store everything from data to code snippets in a semi-structured way. For security or DevOps folks, consider storing different versions of initialization and configuration scripts for different AMIs, or a list of defining characteristics (e.g., known email addresses, MAC addresses, IP addresses, mobile device IDs, corporate and personal accounts) for specific users.
Logging: As with most NoSQL platforms, logging is integrated. You can set what to log conditionally, and even alert on log conditions. We are just now prototyping log parsing with Lambda functions, streaming the results to S3 for cheap storage. This should be an easy way to enrich log data as well.
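
To make the query filter idea concrete, here is a minimal Python (boto3) sketch. The "events" table, its "account_id" partition key, and the numeric "sensitivity" attribute are all hypothetical – the point is that the key condition stays the same for every caller, and the filter expression trims results based on something tied to the user:

```python
import boto3
from boto3.dynamodb.conditions import Key, Attr

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("events")  # hypothetical table name

def query_events_for_user(account_id, user_clearance):
    # Same key condition for every user; the filter expression removes
    # items above the caller's clearance level after the query evaluates.
    response = table.query(
        KeyConditionExpression=Key("account_id").eq(account_id),
        FilterExpression=Attr("sensitivity").lte(user_clearance),
    )
    return response["Items"]
```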
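For fine-grained authorization, the mechanism is an IAM policy document. A sketch of one such policy, built here as a Python dict for illustration; the account, table ARN, and the choice of Facebook-federated identities are placeholders. The dynamodb:LeadingKeys condition restricts each user to items whose partition key matches their own identity:

```python
import json

# Illustrative fine-grained access policy: each federated user may only
# read or write items whose partition key equals their own Facebook ID.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:PutItem"],
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/events",
        "Condition": {
            "ForAllValues:StringEquals": {
                "dynamodb:LeadingKeys": ["${graph.facebook.com:id}"]
            }
        }
    }]
}
print(json.dumps(policy, indent=2))
```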
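And for the temporary credential angle, a sketch of exchanging a social identity token for short-lived AWS credentials through STS, then using them to reach DynamoDB. The role ARN and the token variable are placeholders; the access the session gets is whatever the role's policy (like the one above) allows:

```python
import boto3

google_id_token = "<OpenID Connect token obtained from Google sign-in>"  # placeholder

# Trade the identity provider's token for temporary AWS credentials.
sts = boto3.client("sts")
creds = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/mobile-dynamo-reader",  # placeholder role
    RoleSessionName="mobile-session",
    WebIdentityToken=google_id_token,
    DurationSeconds=900,
)["Credentials"]

# Use the short-lived credentials for DynamoDB access.
dynamodb = boto3.resource(
    "dynamodb",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```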
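JSON support in practice looks like this: a sketch of storing and reading back a nested document with boto3, against a hypothetical "user_profiles" table. The nested map round-trips natively, with no serialization layer in the application:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("user_profiles")  # hypothetical table name

# Store a semi-structured document as a nested map.
table.put_item(Item={
    "user_id": "adrian",
    "known_identifiers": {
        "emails": ["adrian@example.com"],
        "mac_addresses": ["00:1B:44:11:3A:B7"],
        "mobile_device_ids": ["A1B2-C3D4"],
    },
})

# Read it back and walk nested attributes like any other dictionary.
item = table.get_item(Key={"user_id": "adrian"})["Item"]
print(item["known_identifiers"]["emails"])
```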
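Finally, an outline of the Lambda-to-S3 idea mentioned above – not our actual prototype, just a sketch of the shape: a handler triggered by DynamoDB Streams that writes each batch of change records to a (placeholder) S3 bucket for cheap storage and later enrichment:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "my-dynamo-activity-logs"  # placeholder bucket name

def handler(event, context):
    # A DynamoDB Streams trigger delivers batches of change records;
    # keep the event type, keys, and timestamp, then park the batch in S3.
    records = [
        {
            "event": r["eventName"],  # INSERT, MODIFY, or REMOVE
            "keys": r["dynamodb"].get("Keys"),
            "time": r["dynamodb"].get("ApproximateCreationDateTime"),
        }
        for r in event["Records"]
    ]
    key = "dynamodb-activity/{}.json".format(context.aws_request_id)
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(records, default=str))
```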
DynamoDB has effective security feature parity with relational databases. As someone who has used relational platforms for the better part of 20 years, I can say that most of the features I want are available in one form or another, or can be simulated, in both. The real difference is the ease and speed with which I can leverage them in DynamoDB.

Securosis Blog Posts this Week
How iMessage distributes security to block “phantom devices”
Summary: April 14, 2016
Other Securosis News and Quotes
Rich quoted in Fortune
Adrian quoted on WAF management
Training and Events
We are running two classes at Black Hat USA:
Black Hat USA 2016 | Cloud Security Hands-On (CCSK-Plus)
Black Hat USA 2016 | Advanced Cloud Security and Applied SecDevOps
