Secrets Management: Features and Functions (updated)

By Adrian Lane

In this section we will discuss the core features of a secrets management platform. Every platform needs certain basic functions to address the core use cases: secure storage and disbursement of secrets, identity management, and API access, for starters. There are plenty of tools out there, many open source, and several bundled into other platforms. But when considering what you need from one of these platforms, the key thing to keep in mind is that most of them were originally developed to perform a single very specific task – such as injecting secrets into containers at runtime, integrating tightly with a Jenkins build server, or supplementing a cloud identity service. Those tools do one thing well, but typically do not address multiple use cases.

Now let’s take a closer look at the key features.

Core Features

Storing Secrets

Secrets management platforms are software applications designed to support other applications in a very important task: securely storing and passing secrets to the correct parties. The most important characteristic of a secrets management platform is that it must never leave secret information sitting around in clear text. Secure storage is job #1.

Almost every tool we reviewed provides one or more encrypted repositories – which several products call a ‘vault’ – to store secret information in. As you insert or update secrets in the repository, they are automatically encrypted prior to being written to storage. Shocking though it may be, at least one product you may come across does not actually encrypt secrets – instead storing them in locations its developers consider difficult to access. The good news is that most vaults use vetted implementations of well-known encryption algorithms to encrypt secrets. But it is worth vetting any implementation, with your regulatory and contractual requirements in mind, prior to selecting one for production use.

With the exception of select platforms which provide ‘ephemeral secrets’ (more on these later), all secret data is stored within these repositories for future use. Nothing is stored in clear text. How each platform associates secrets with a given user identifier, credential, or role varies widely. Each platform has its own way of managing secrets internally, but most use a unique identifier or key-value pair to identify each secret. Some store multiple versions of a secret so changes over time can be recalled if necessary for recovery or auditing, but the details are part of their secret sauce.
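The storage model described above – encrypt on insert, decrypt on retrieval, keep every version – can be sketched in a few lines. This is a hypothetical illustration: the `Vault` class and its methods are invented here, and the hash-based stream cipher is a toy stand-in for the vetted algorithms (such as AES-GCM) real products should use.

```python
import hashlib
import secrets as pysecrets

def _xor_stream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy cipher: derive a SHA-256 keystream in counter mode and XOR it in.
    # Illustrative only -- production vaults use vetted ciphers like AES-GCM.
    out = bytearray()
    for i in range(0, len(data), 32):
        pad = hashlib.sha256(key + nonce + (i // 32).to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[i:i + 32], pad))
    return bytes(out)

class Vault:
    """Versioned secret store that never keeps plaintext in its backing store."""
    def __init__(self):
        self._master_key = pysecrets.token_bytes(32)  # held in memory only
        self._store = {}  # path -> list of encrypted versions, oldest first

    def put(self, path: str, secret: bytes) -> int:
        nonce = pysecrets.token_bytes(16)             # unique per version
        blob = nonce + _xor_stream(self._master_key, nonce, secret)
        versions = self._store.setdefault(path, [])
        versions.append(blob)
        return len(versions)                          # 1-based version number

    def get(self, path: str, version: int = 0) -> bytes:
        # version 0 means "latest"; otherwise a 1-based historical version
        blob = self._store[path][-1 if version == 0 else version - 1]
        return _xor_stream(self._master_key, blob[:16], blob[16:])
```

Note that every version is written encrypted, and older versions remain retrievable for recovery or audit – the two properties described above.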

The repository structure varies widely between offerings. Some store data in simple text or JSON files. Some use key-value pairs in a NoSQL-style database. Others use a relational or Big Data database of your choice. A couple employ multiple repository types to increase isolation between secrets and/or use cases. Their repository architecture is seldom determined by strong security; more common drivers are low cost and ease of use for the product developers. And while a repository of any type can be secured, the choice of repository impacts scalability, how replication is performed, and how quickly you can find and provision secrets.

Another consideration is which data types a repository can handle. Most platforms we reviewed can handle any type of data you want to store: string values, text fields, N-tuple pairings, and binary data. Indexing is often performed automatically as you insert items, to speed lookup and retrieval later. Some of these platforms only handle strings, which simplifies their programmatic APIs but limits their usability. Again, products tailored to a particular use case may be unsuitable for other uses or across teams.

Identity and Access Management

Most secrets management platforms concede IAM to external Active Directory or LDAP services, which makes sense because most firms already have IAM infrastructure in place. Users authenticate to the directory store to gain access, and the server leverages existing roles to determine which functions and secrets the user is authorized to access. Most platforms are also able to use a third-party Cloud Identity Service or Privileged Access Management service, or to directly integrate with cloud-native directory services.

Note that a couple of the platforms we reviewed manage identity and secrets internally, rather than using an external identity store. This is not necessarily a bad thing: these products typically added secrets management to supplement password or key management, and internal management of identity is part of their security architecture.

Access and Usage

Most platforms provide one or more programming interfaces. The most common, to serve secrets in automated environments, is an access API. A small and simple set of API calls is provided to authenticate a session, insert a record, locate a secret, and share a secret with a specific user or service. More advanced solutions also offer API access to advanced or administrative functions. Command-line access is also common, leveraging the same basic functions in a command-driven UNIX/Linux environment. A handful of others also offer a graphical user interface, either directly or indirectly, sometimes through another open source project.
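The four core calls named above can be sketched as a minimal in-process facade. The `SecretsAPI` class and its method names are invented for illustration; real products expose comparable calls over REST or a CLI, backed by proper session and permission handling.

```python
import secrets as pysecrets

class SecretsAPI:
    """Hypothetical sketch of the four core calls: authenticate, insert,
    locate, and share. Not any specific product's API."""
    def __init__(self, users):
        self._users = users      # username -> password (demo only)
        self._tokens = {}        # session token -> username
        self._secrets = {}       # name -> (value, set of authorized users)

    def authenticate(self, user, password):
        if self._users.get(user) != password:
            raise PermissionError("bad credentials")
        token = pysecrets.token_urlsafe(16)   # opaque session token
        self._tokens[token] = user
        return token

    def insert(self, token, name, value):
        owner = self._tokens[token]
        self._secrets[name] = (value, {owner})

    def locate(self, token, name):
        value, allowed = self._secrets[name]
        if self._tokens[token] not in allowed:
            raise PermissionError("not authorized for this secret")
        return value

    def share(self, token, name, other_user):
        value, allowed = self._secrets[name]
        if self._tokens[token] not in allowed:
            raise PermissionError("not authorized for this secret")
        allowed.add(other_user)
```

In use, a service authenticates once, then inserts or retrieves secrets under its session; attempts to read a secret that has not been shared with the caller fail.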

Sharing Secrets

The most interesting aspect of a secrets management system is how it shares secrets with users, services, or applications. How do you securely provide a secret to its intended recipient? As with the repository discussed above, secrets in transit must be protected, which usually means encryption. And there are many different ways to pass secrets around. Let’s take a look at the common methods of secret passing.

  • Encrypted Network Communications: Authenticated services or users are passed secrets, often in clear text, within an encrypted session. Some use Secure Sockets Layer (SSL) for encrypted transport, which is not ideal, but thankfully most use current versions of Transport Layer Security (TLS), which also authenticates the recipient to the secrets management server.
  • PKI: Several secrets management platforms combine external identity management with a Public Key Infrastructure to validate recipients of secrets and transmit PKI encrypted payloads. The platform determines who will receive a secret, and encrypts the content with the recipient’s public key. This ensures that only the intended recipient can decrypt the secret, using their private key.
  • Shared Filesystems: With containers it is common to share secrets by placing them in a memory-only filesystem – tmpfs in UNIX parlance. This enables a secrets management server to provision secrets to all containers hosted on the same hardware. Access to this data is limited to applications on the same system. Because the data is stored only in memory, access is very fast and secrets disappear when the server is de-provisioned. The downside is that such secrets are stored in clear text, so great care must be taken to launch only authorized containers and to configure the ‘namespace’ to prevent unauthorized applications from reading secrets. If malicious code is introduced to the container, this model falls apart.
  • File Distribution: In this model a secrets management platform moves secrets into a file for consumption. Unlike the model above, not every application can access a file’s contents simply because it has access to the shared filesystem. By placing items into files you leverage access management rights to gate which users or processes can read the file’s contents. This is an increasingly common method for sharing secrets in cloud environments. Using Amazon AWS as an example, a secrets management tool will place a secret into an encrypted S3 bucket or service endpoint. Applications use the bucket’s Amazon Resource Name (ARN) and their assigned role to read its contents, which AWS automatically decrypts on the application’s behalf.
  • Policy Based Distribution: Some platforms, such as container orchestration systems, use policies to define where containers live and what resources they can access. Some secrets management platforms piggy-back off the native policy system, or embed their own policy engine, and direct secrets according to the policies. For example, containers belonging to a specific eCommerce function may be provisioned with secrets to connect to a third-party payment gateway, while no other container group can access that secret. Permissions are largely provided by the native orchestration settings.
  • Wrapping: Some commercial platforms and cloud vendors use symmetric key encryption natively, with a new and unique key provisioned when the service or agent is initialized. Similarly to the PKI scenario, secrets are encrypted – or wrapped – on demand with the recipient’s key, then transmitted as an encrypted value. The key is ephemeral, just as the cloud service would be, and discarded when the agent or service is terminated. This is common when moving secrets between agent instances of a secrets manager.
  • Passing or Injection: In some cases secrets are provided automatically. When launching a virtual server, the secret might be a configuration file provided at launch. Containers may be supplied a unique identity certificate, which grants access and privileges back to a secrets manager, or scoped more narrowly within a swarm or pod. This helps mitigate the risk of rogue code entering an environment and automatically gaining access to secrets.
  • Clear Text: Yes, unencrypted plaintext. While not recommended, people and systems sometimes still fail to protect secrets – especially after a user is authenticated. For most of you this is a non-starter, as it prevents you from ensuring that secrets stay secret. But you need to understand that this still happens; if you see it, look for a different product.
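The PKI and wrapping models above share the same core idea: encrypt the secret under a key only the recipient can use. The sketch below uses textbook RSA with tiny demo parameters, purely to make the public/private asymmetry visible; real platforms use 2048-bit or larger keys with OAEP padding from a vetted library, never hand-rolled arithmetic.

```python
# Textbook RSA with tiny demo parameters -- purely illustrative, never for
# production. The point: anyone can wrap with the recipient's PUBLIC key,
# but only the holder of the PRIVATE key can unwrap.
p, q = 61, 53
n = p * q        # public modulus (3233)
e = 17           # recipient's public exponent (published)
d = 2753         # recipient's private exponent (kept by the recipient)

def wrap(secret_int: int, public_key) -> int:
    e, n = public_key
    return pow(secret_int, e, n)     # encrypt with the recipient's public key

def unwrap(ciphertext: int, private_key) -> int:
    d, n = private_key
    return pow(ciphertext, d, n)     # only the private key recovers the secret

secret = 42                          # must be smaller than n in this toy scheme
wrapped = wrap(secret, (e, n))
```

The secrets management platform performs the `wrap` step after deciding who should receive the secret; the recipient alone performs `unwrap`.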
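The file distribution model relies on operating system access rights rather than obscurity. Here is a minimal POSIX sketch; the function name and paths are invented for illustration, and the key point is creating the file with owner-only permissions (mode 0600) atomically at open time, rather than writing it and tightening permissions afterward.

```python
import os
import stat
import tempfile

def write_secret_file(directory: str, name: str, secret: bytes) -> str:
    """Write a secret to disk readable only by the owning user (mode 0600),
    so filesystem permissions gate which processes can read it."""
    path = os.path.join(directory, name)
    # O_CREAT with an explicit 0o600 mode avoids a window where the file
    # briefly exists with looser default permissions.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "wb") as f:
        f.write(secret)
    return path

workdir = tempfile.mkdtemp()
path = write_secret_file(workdir, "db-credentials", b"user:pass")
```

A cloud variant of the same idea swaps the local directory for an encrypted bucket and the file mode for an IAM role, but the gating principle is identical.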

Advanced Features

As the need for secrets management evolved we began to see commercial secrets management products. These platforms are architected to support several of the major use cases discussed earlier, and typically offer more advanced features, such as deep log creation and integration options, tighter integration with IAM services, secret generation, and secret revocation. As this segment matures we are beginning to see more advanced feature sets and better service integration, so you need to write less glue code. Below is a list, in no particular order, of advanced features we have come across.

  • Administrative Roles: It may sound odd, but many secrets management platforms were designed as personal productivity tools, so a separate management API or function set is somewhat new in this market. Logging, storage, secret creation, recovery, and failover settings are becoming table stakes for corporate secrets management platforms, and should be accessible only through a management interface – not exposed to general system users.
  • Secrets Creation: Secrets management platforms are now capable of creating and issuing SSL/TLS certificates, passwords, identity tokens, encryption keys, and other useful items.
  • Revocation: Like some key management systems, secrets managers can enforce revocation by no longer allowing a secret to be accessed. Essentially a secret can carry a ‘sell by’ date, after which it is no longer valid for providing temporary access, and is often discarded from the secrets repository. In other cases this is achieved with ‘ephemeral secrets’, described below.
  • Ephemeral Secrets: Things like containers, servers, and IaaS/PaaS functions are ephemeral. Resiliency is provided by launching many instances of an application, simply replacing any which become unhealthy. This concept extends to security as well, with the idea that provisioned secrets can be just as ephemeral as a cloud server. With ephemeral secrets we generate new secrets for container classes or server instances as needed. If a secret is lost or a container fails, we generate a new secret on demand. This is useful for identity certificates, encryption keys, and similar types of secrets used between several services. These secrets are not stored long-term; instead the secrets manager keeps a dynamic list of which services have been issued which short-lived secrets.
  • Encryption as a Service: Some secrets management platforms encrypt payloads on request. A simple API call passes the payload in with a unique identifier of either the encryption key to use or the intended recipient; the secrets management platform serves as an encryption engine. This relieves developers from worrying about encryption libraries, random number generation, or other encryption esoterica.
  • Audit Logs: In this day and age if you want to sell security software to enterprises, you had better offer audit logs. More and more platforms offer log files today, and some even offer syslog and/or JSON formats. The quality of the content and filtering remain issues for many, but we have reached the point where most secrets management tools include logging capabilities.
  • Proxy Access: The line between Privileged Account Management (PAM) security and secrets management is beginning to blur. This capability essentially means that a secrets management service keeps access credentials secret, but provides a token (or role, in Amazon Web Services parlance) to authorize requesting entities.
  • Multi-party Management: Sometimes called Consensus Management or Split-key Authority, this feature requires two or more administrators to authorize certain management functions. In some cases the master key is itself divided into pieces, with each administrator given only one part of the key. In either case, approval can only be granted when consensus is reached.
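The lease model behind ephemeral secrets and time-limited revocation can be sketched briefly. The `EphemeralIssuer` class is hypothetical; real platforms tie leases into renewal, revocation, and audit logging, and the short-lived secrets themselves are never stored long-term, as described above.

```python
import time
import secrets as pysecrets

class EphemeralIssuer:
    """Hypothetical sketch: issue short-lived secrets and keep only a dynamic
    list of which service holds which lease, discarding them on expiry."""
    def __init__(self):
        self._leases = {}   # token -> (service_name, expires_at)

    def issue(self, service_name: str, ttl_seconds: float) -> str:
        token = pysecrets.token_urlsafe(24)   # freshly generated, never reused
        self._leases[token] = (service_name, time.monotonic() + ttl_seconds)
        return token

    def validate(self, token: str) -> bool:
        lease = self._leases.get(token)
        if lease is None or time.monotonic() > lease[1]:
            self._leases.pop(token, None)     # lapsed: discard the lease
            return False
        return True
```

If a container fails or a secret is lost, nothing needs recovering; the service simply requests a new lease.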
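Split-key authority can be illustrated with the simplest possible scheme: an n-of-n XOR split, where the master key is only recoverable when every administrator contributes their share. This is a sketch under that assumption; commercial products often use Shamir's secret sharing instead, which allows any k of n shares to reconstruct the key.

```python
import secrets as pysecrets

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(master_key: bytes, holders: int) -> list:
    """n-of-n split: random shares for all but one holder, with the last
    share chosen so the XOR of all shares equals the master key."""
    shares = [pysecrets.token_bytes(len(master_key)) for _ in range(holders - 1)]
    last = master_key
    for s in shares:
        last = _xor(last, s)
    return shares + [last]

def combine(shares: list) -> bytes:
    out = bytes(len(shares[0]))
    for s in shares:
        out = _xor(out, s)
    return out
```

Any subset smaller than the full set is indistinguishable from random data, so no single administrator, or pair of them, can act alone.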

We list all these features to help readers seeking to address specific use cases. Our goal is to help you understand the available capabilities and how they can address your needs while satisfying your internal IT security requirements. We also want to help you understand why certain products work the way they do, and to provide an idea of where the market is heading.

This is a long post covering complex ideas. We encourage you to leave comments, questions, and critiques; we will endeavor to answer them in both the comments field below as well as in the final paper.

Up next: deployment options.

