- Chapter Overview
- Immuta Architecture
- Installation Overview
- Install Immuta in an Air-Gapped Environment

- Chapter Overview
- Environment Variables
- Databricks Change Data Feed
- DBFS Access
- Delta Lake API
- Ephemeral Overrides
- External Metastores
- Hiding the Immuta Database in Databricks
- Limited Enforcement in Databricks
- Project UDFs Cache Settings
- Py4j Security Error
- Run spark-submit Jobs on Databricks
- S3 Access in Databricks
- Scala Cluster Security Details
- Security Configuration for Performance
- Spark Direct File Reads

- Configure Databricks SQL
- Configure Redshift
- Configure Azure Synapse
- Configure Google BigQuery

- Add a License Key

- Chapter Overview
- Data Sources in Immuta
- Create a Query-Backed Data Source
- Create an Object-Backed Data Source
- Bulk Create Snowflake Data Sources
- Create a Google BigQuery Data Source
- Create a dbt Cloud Data Source
- Local Policies in Immuta
- Write a Local Policy
- Manage Data Sources
- Subscribe to Data Sources

- Chapter Overview
- Projects and Purposes
- Value of Projects
- Create Purposes and Acknowledgement Statements
- Create a Project

- Derived Data Sources
- Create a Derived Data Source
- Use Project UDFs (Databricks)
- Policy Adjustments (Public Preview)
- HIPAA Expert Determination (Public Preview)
- Adjust a Policy (Public Preview)
- Use Expert Determination (Public Preview)

- Immuta Proof of Value (POV)
- Data Setup
- Schema Monitoring and Automatic Sensitive Data Discovery
- Separating Policy Definition from Role Definition - Dynamic Attributes
- Policy Boolean Logic
- Exception-Based Policy Authoring
- Hierarchical Tag-Based Policy Definitions
- Subscription Policies - Benefits of Attribute-Based Table GRANTs
- Purpose-Based Exceptions

- Query Your Data Guide