Cloud Intelligence is part of the LeanIX Cloud Native Suite. It discovers Cloud Components from AWS, Azure and GCP automatically and imports these into a LeanIX workspace to help customers increase agility, manage cloud-specific security and governance, and create transparency about cloud spend, especially for multi-cloud environments. The data from the Cloud Intelligence workspace can be aggregated and automatically synchronized into a LeanIX Enterprise Architecture workspace.
Why is there a separate workspace?
The scanned data is typically too granular for Enterprise Architecture use cases. Cloud Intelligence discovers every single Cloud Component, e.g. every single AWS EC2 instance, S3 bucket or Lambda function. This is helpful for use cases like Cloud Security or Cloud Cost Management, but too detailed for business users who want to understand the high-level architecture. Using two separate but integrated workspaces allows users to access information at the right level.
How does it work?
LeanIX uses Cloudockit as a strategic partner to scan information from the hyperscalers. The following diagram depicts the high-level architecture which heavily depends on the LeanIX Integration API.
How do I connect my Cloud Environment?
Cloudockit programmatically collects information about the cloud environment using the cloud provider's API, hence the privileges for the account need to be set up accordingly. If access to certain resources is lacking - e.g. to the Trusted Advisor - this information will not be included in the collected cloud data and hence not in the Cloud Intelligence workspace, but the other data will still be loaded.
For production use, we support two different modes:
- Cloudockit service hosted by LeanIX. In this case, no action on the customer side is required, but credentials with appropriate roles (e.g. the AWS SecurityAudit policy) need to be configured on the LeanIX side as documented below.
- For larger setups (>20 accounts or subscriptions), LeanIX can provide a Cloudockit Scan Agent as a VM. The customer runs the VM, e.g. in EC2, and can efficiently access their accounts without managing credentials for every single environment. Detailed information can be made available on demand.
Enter your credentials
First, navigate to the administration settings and open the "Cloud Native" section of the administration menu:
Next, add the credentials for each account as outlined below.
Once you are done, save the configuration with the button "Overwrite all configurations".
Configuration Properties
The details of the JSON properties are listed below.
| Provider | Property | Example | Description |
| --- | --- | --- | --- |
| All | id | 12345 | Please use the account/subscription/project id (see data.SubscriptionID below) as the id for the entry; this makes it easier to reference specific configurations. |
| All | name | aws.somename | An arbitrary name for the configuration. It is meant to identify different accounts easily and could e.g. contain the name of the component and/or team. |
| All | type | aws \| azure \| gcp | The cloud provider of the account: "aws", "azure" or "gcp". |
| AWS | data.AWSAccessKeyId | abc123 | The Access Key Id of the AWS user. |
| AWS | data.AWSSecretAccessKey | password123 | The Secret Access Key of the AWS user. |
| Azure | data.AppClientIdForAutomation | abc123 | The id of the AAD application. |
| Azure | data.AppClientKeyForAutomation | abc123 | The key of the AAD application. |
| Azure | data.TenantId | abc123 | The tenant of the AAD application (see https://www.cloudockit.com/whichtenanttouse/). |
| GCP | data.GCPServiceAccountJSONCredential | "{'type': 'service_account', ...}" | The JSON created when creating the account credentials. Ensure the value is a valid string, e.g. wrap the JSON in double quotes and use single quotes within the JSON key/value pairs. |
| All | data.SubscriptionID | 12345 | For AWS, the AWS Account Id to scan. For Azure, the Subscription Id to scan. For GCP, the GCP Project Id to scan. |
Hidden Secret Keys
Please note that the keys are no longer visible once the button "Overwrite all configurations" has been clicked. If you actively manage keys, a key can be replaced; however, the replacement will likewise not be visible after the configurations are overwritten.
AWS - User Setup
The setup is described here: https://www.cloudockit.com/knowledge-base/aws-role-and-keys-access-guide/ and https://www.cloudockit.com/documentation/aws-create-user-for-cloudockit/
As listed there, you need to enable access to the Billing information for the user. This is done by creating a new policy using the JSON block below, giving it a name such as LeanIXCloudScanBillingPolicyReader and assigning it to the user's group:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ce:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
In addition, you need to enable access to the Trusted Advisor as well using the JSON block below. For the name, use LeanIXCloudScanAdvisorPolicyReader, and assign it to the user's group:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "support:DescribeTrustedAdvisorCheckResult",
        "support:DescribeTrustedAdvisorChecks"
      ],
      "Resource": "*"
    }
  ]
}
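Optionally, you can verify the policies with a quick read-only check before entering the keys into the configuration. The following is a minimal sketch with boto3 (boto3 itself and exporting the keys locally as AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY are assumptions and not part of the LeanIX setup):

```python
import boto3
from datetime import date, timedelta

# Confirm the keys belong to the expected account (compare with SubscriptionID).
print("Account:", boto3.client("sts").get_caller_identity()["Account"])

# Cost Explorer and Support/Trusted Advisor are global services served from us-east-1.
end = date.today()
start = end - timedelta(days=30)
ce = boto3.client("ce", region_name="us-east-1")
ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
)
print("Cost Explorer access OK")

# Trusted Advisor requires a Business or Enterprise support plan; if this call
# fails, the scan still runs, but the Trusted Advisor data is skipped (see above).
support = boto3.client("support", region_name="us-east-1")
support.describe_trusted_advisor_checks(language="en")
print("Trusted Advisor access OK")
```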
In the end, the user's configuration should look like this:
{
  "configurations": [
    {
      "id": "12345",
      "name": "aws.somename",
      "type": "aws",
      "data": {
        "AWSAccessKeyId": "ABCD1234",
        "AWSSecretAccessKey": "ENTER THE SECRET ACCESS KEY HERE TO STORE IT ENCRYPTED",
        "SubscriptionID": "12345"
      },
      "active": true
    }
  ]
}
AWS - Tag Level Costs (Business Context)
In order to retrieve costs for the Business Context Fact Sheet, AWS has to be set up to capture tag-level costs. To do so, please see: https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/activate-built-in-tags.html
Please note that, as specified in this AWS documentation, the Master Account owner can activate the tags needed for underlying accounts to capture tag-level costs.
It can take up to 24 hours for the tags to be activated.
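Alternatively, the tags can be activated programmatically from the Master Account via the Cost Explorer API. A minimal sketch with boto3 (the tag key "Application" is only a placeholder for your own tag keys):

```python
import boto3

# Must run with credentials of the Master (management) account;
# member accounts cannot activate cost allocation tags themselves.
ce = boto3.client("ce", region_name="us-east-1")
ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[
        {"TagKey": "Application", "Status": "Active"},  # placeholder tag key
    ]
)
# Activation can still take up to 24 hours to be reflected in Cost Explorer.
```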
Tag Keys
Once this is set up, please communicate to your CSM which tag keys are to be extracted from the AWS Cost Explorer API; LeanIX will add them to the Cloudockit configuration so they are brought into the LeanIX workspace.
Azure - Account Setup
For basic connectivity information, please refer to the Cloudockit documentation:
https://www.cloudockit.com/knowledge-base/connect-your-azure-subscription/ and https://www.cloudockit.com/knowledge-base/how-to-create-an-azure-active-directory-application/
Reader permissions are sufficient - see https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#reader
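To check that the AAD application and its Reader role work before storing the key in LeanIX, a short test with the Azure SDK for Python could look like this (a sketch only; the package installation and the placeholder IDs are assumptions):

```python
# pip install azure-identity azure-mgmt-resource
from azure.identity import ClientSecretCredential
from azure.mgmt.resource import ResourceManagementClient

# Placeholders - use the same values you will enter in the configuration below.
tenant_id = "1111-2222-3333-4444-5555666"
client_id = "9988-7766-5544-3322-11223344"        # AppClientIdForAutomation
client_secret = "<app client key>"                # AppClientKeyForAutomation
subscription_id = "1234-5678-4321-1122-22334455"  # SubscriptionID

credential = ClientSecretCredential(tenant_id, client_id, client_secret)
client = ResourceManagementClient(credential, subscription_id)

# Reader permissions are sufficient to enumerate resource groups.
for rg in client.resource_groups.list():
    print(rg.name)
```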
Azure - Configuration
{
  "configurations": [
    {
      "id": "1234-5678-4321-1122-22334455",
      "name": "azure.somename",
      "type": "azure",
      "data": {
        "AppClientIdForAutomation": "9988-7766-5544-3322-11223344",
        "AppClientKeyForAutomation": "ENTER THE APP CLIENT KEY HERE",
        "TenantID": "1111-2222-3333-4444-5555666",
        "SubscriptionID": "1234-5678-4321-1122-22334455"
      },
      "active": true
    }
  ]
}
GCP JSON Token
GCP provides the service account key as a JSON file. The file content has to be formatted as a single-line string, which goes into the configuration below under the field AppClientIdForAutomation in the data section. For any questions, feel free to reach out to your Customer Success Manager for help with the formatting.
{
  "configurations": [
    {
      "id": "9876-5432-4321-99775533",
      "name": "gcp.somename",
      "type": "gcp",
      "data": {
        "AppClientIdForAutomation": "{'type': 'service_account', 'project_id': 'project-name-1234', ..., 'client_x509_cert_url': 'googleapis.com'}",
        "SubscriptionID": "project-name-1234"
      },
      "active": true
    }
  ]
}
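To produce the single-line string from the downloaded key file, a small helper like the following could be used (a sketch; the file name service-account.json and the single-quote convention are taken from the example above):

```python
import json

# Read the key file that GCP generated for the service account.
with open("service-account.json") as f:
    key = json.load(f)

# Compact, single-line JSON; swapping the inner double quotes for single quotes
# matches the format shown in the configuration example above.
single_line = json.dumps(key, separators=(",", ":")).replace('"', "'")
print(f'"{single_line}"')  # wrap the whole value in double quotes
```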
GCP - Service Account Setup
To connect GCP data, please follow these steps: https://www.cloudockit.com/knowledge-base/gcp-authentication-guide/
In addition to the guide above, the following changes need to be applied:
- Enable the "Cloud Security Command Center" and "Cloud Security Command Center API"; to do so, just search for this API in the GCP Console and "ENABLE" it. The Google Security Command Center itself needs to be enabled as well.
- To access billing information, please follow the guide from Google to set up a daily export of cloud billing data:
https://cloud.google.com/billing/docs/how-to/export-data-bigquery
Below is our recommendation for the name of the dataset. As the table name is generated automatically, please report the table name to your Customer Success Manager so the integration can reflect it accurately.
BigQuery Dataset: "leanix_billing"
BigQuery Table: Report to your Customer Success Manager
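Once the billing export is active, the generated table name can be looked up with the BigQuery client, for example (a sketch; google-cloud-bigquery and reusing the scan service account key are assumptions):

```python
from google.cloud import bigquery

# Authenticate with the same service account key that is used for the scan.
client = bigquery.Client.from_service_account_json("service-account.json")

# List the tables in the recommended dataset and report the generated
# table name to your Customer Success Manager.
for table in client.list_tables("leanix_billing"):
    print(table.table_id)
```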
Whitepaper
See https://www.leanix.net/en/download/a-guide-to-managing-cloud-tagging-policies for a deep dive on how to handle and improve Cloud Tagging
How are security & compliance violations detected?
There are three major ways to detect violations:
- From the hyperscaler directly: We scan data from services like Azure Advisor, AWS Trusted Advisor or Google Cloud Security Command Center to make them available centrally
- From Cloudockit: Cloudockit provides its own best-practice Compliance Rules and lets users configure their own custom rules, via the UI or in a format like the following:
{
  "JsonRule": "{\"condition\":\"AND\",
    \"rules\":[
      {
        \"id\":\"MicrosoftComputes\",
        \"field\":\"MicrosoftComputes\",
        \"type\":\"string\",
        \"operator\":\"cdk_not_contains\",
        \"value\":{
          \"condition\":\"AND\",
          \"rules\":[
            {
              \"id\":\"ProtectionPolicyResource|Name\",
              \"field\":\"ProtectionPolicyResource|Name\",
              \"type\":\"string\",
              \"operator\":\"is_empty\",
              \"value\":null
            },
            {
              \"id\":\"Vault|Name\",
              \"field\":\"Vault|Name\",
              \"type\":\"string\",
              \"operator\":\"is_empty\",
              \"value\":null
            }
          ],
          \"valid\":true
        }
      }
    ],
    \"valid\":true}",
  "CloudTypes": [
    "Azure"
  ],
  "Name": "CDK-Azure-VM-Disaster-Recovery",
  "Description": "No Disaster Recovery defined on Virtual Machine",
  "Criticity": 0,
  "Message_Hyperlink": "https://docs.microsoft.com/en-us/azure/site-recovery/azure-to-azure-tutorial-enable-replication"
}
- From a dedicated operational security tool via the Integration API. In our understanding, there are good arguments for combining a tool for security experts with Cloud Intelligence, as they address different granularities and use cases.
Additional information on Security and Violations
For more information, please visit: https://www.cloudockit.com/knowledge-base/compliance-rules/
How does Cloud Intelligence help to manage cloud spend?
Cloud Intelligence extracts Cloud Spend information directly from the hyperscalers. Details depend on the available data; e.g. AWS currently does not support extracting cost data on the level of individual Cloud Components, but only on higher levels like accounts. Cloud Intelligence aggregates cost data and syncs it to the EA workspace so that Enterprise Architects get an indicator for hotspots. Cost data is also presented over time using [LeanIX Metrics](https://docs.leanix.net/docs/metrics), which allows identifying trends.
Can I integrate with other sources, e.g. my private cloud environment?
Yes, all the data in Cloud Intelligence is accessible via the Integration API. Also, the data model is configurable - see Configuration Overview. However, please take into account that the mapping to the EA workspace (see above) might only work if you stick to the default data model. Please contact your Customer Success Manager for guidance.
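As an illustration, pushing additional Cloud Components from another source could look roughly like the sketch below (the endpoint paths follow the public Integration API documentation; the connector identifiers, type and data fields are assumptions and have to match an Integration API processor configured in your workspace):

```python
import requests

BASE = "https://<your-instance>.leanix.net"
API_TOKEN = "<api token>"

# Exchange the API token for a bearer token (standard LeanIX token flow).
token = requests.post(
    f"{BASE}/services/mtm/v1/oauth2/token",
    auth=("apitoken", API_TOKEN),
    data={"grant_type": "client_credentials"},
).json()["access_token"]

# Minimal LDIF payload; the connector identifiers and content are placeholders.
ldif = {
    "connectorType": "private-cloud-example",
    "connectorId": "private-cloud-example",
    "connectorVersion": "1.0.0",
    "lxVersion": "1.0.0",
    "description": "Example import of private cloud components",
    "processingDirection": "inbound",
    "processingMode": "partial",
    "content": [
        {
            "type": "CloudComponent",
            "id": "vm-0001",
            "data": {"name": "on-prem-vm-0001", "provider": "privateCloud"},
        }
    ],
}

# Start a synchronization run with the payload.
response = requests.post(
    f"{BASE}/services/integration-api/v1/synchronizationRuns?start=true",
    headers={"Authorization": f"Bearer {token}"},
    json=ldif,
)
print(response.status_code, response.text)
```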
Is it possible to add more hyperscalers?
If you want to connect to other hyperscalers than the current ones (AWS, Azure and GCP), please get in touch with your Customer Success Manager.