Enterprise-Wide Database Activity Monitoring (DAM) Without the Pain
It’s Time for Next-Generation Data Activity Monitoring (DAM)
Legacy Database Activity Monitoring (DAM) solutions are roughly 15 years old. They are expensive for the value they add, degrade performance, disrupt operations, cause additional downtime and complicate patching or upgrades to the database software and the underlying Operating System (OS). They also provide no end-user accountability when business and analytics applications connect using session pooling or shared service accounts, or when result sets are cached on application servers. The final nail in the coffin for old DAM tools is their lack of support for cloud data platforms (Snowflake, BigQuery, Azure SQL, AWS DaaS) and other newer forms of persistent data repositories such as Kafka, Spark and NoSQL.
Many organizations implemented these legacy DAM solutions to provide an independent, tamper-proof audit trail of all access to data in their core data repositories, data warehouses and transactional systems. What organizations actually want is a single-pane-of-glass view of all sensitive data access.
Legacy DAM solutions require core changes to the database and OS
These solutions typically require both kernel- and user-level code changes to the database platform and the underlying OS. They also generate large volumes of additional network traffic from the database while consuming precious CPU capacity. This additional layer of complexity delays security patches and software upgrades for the database application and/or the underlying OS.
For platforms like Teradata, the Database Query Log (DBQL) already provides even more detailed logging of all database queries and activity, automatically. DBQL captures a superset of what dedicated DAM products record, which is typically limited to the SQL commands sent to the database. Many other database platforms offer equivalent functionality, leading many organizations to question why they tolerate the significant cost, performance and operational impact of pure-play DAM products for virtually no additional risk reduction.
Unfortunately, organizations would still need to centralize the collection, standardization and event correlation of these disparate logs from each system, and then manage massive volumes of log data on activity that is 90% benign or very-low-risk data processing, with the same limited end-user context and accountability when analytics applications perform most data access.
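To make the burden concrete, here is a minimal sketch of what that centralized collection looks like: normalizing native log records (a DBQL-style row is used as the example; the field names, keywords and adapter are illustrative assumptions, not any vendor's actual schema) into a common event format, then filtering out benign read-only activity before correlation.

```python
import math
from dataclasses import dataclass

@dataclass
class AuditEvent:
    # Common schema for correlation; field names are illustrative only.
    source: str        # e.g. "teradata-dbql", "oracle-audit"
    user: str
    statement: str
    rows_returned: int

def from_dbql(row: dict) -> AuditEvent:
    # Hypothetical adapter mapping a DBQL-style log row to the common schema.
    return AuditEvent("teradata-dbql", row["UserName"],
                      row["QueryText"], row["NumResultRows"])

# Illustrative stand-in for real data classification.
SENSITIVE_KEYWORDS = ("ssn", "salary", "card_number")

def is_benign(event: AuditEvent) -> bool:
    """Treat read-only statements touching no classified columns as benign."""
    text = event.statement.lower().lstrip()
    if not text.startswith("select"):
        return False  # writes and DDL are always kept in the audit trail
    return not any(k in text for k in SENSITIVE_KEYWORDS)

def correlate(raw_rows: list) -> list:
    """Normalize raw log rows, then drop the benign majority of events."""
    events = [from_dbql(r) for r in raw_rows]
    return [e for e in events if not is_benign(e)]
```

Even in this toy form, every new platform needs its own adapter and classification rules, which is exactly the integration work legacy DAM leaves to the customer.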
There is a better way…
Imagine a solution that delivered ALL of the following:
- Application and Data Repository independent audit trail of all access to sensitive data
- Capture metadata and risk scoring for all data result sets returned by the database, with full DBA and/or end-user accountability
- Transparent to end users, applications, ETL and database operations
- Rapid, minimal-cost implementation for full DAM functionality
- Out-of-the-box support for all mainstream traditional on-premises data stores (Teradata, Oracle, MS SQL Server, DB2, Greenplum and many more)
- Out-of-the-box support for cloud data stores (Snowflake, AWS Redshift, BigQuery, Azure HDInsight, Azure SQL Data Warehouse, Databricks), in addition to the traditional on-premises platforms above
- Data flow, data lineage and audit trail for n-tier business applications and microservices architectures that access or process sensitive data including Lambda, Kafka and Spark
- Enforce centrally managed, policy-based, fine-grained row- or column-level access control, encryption, anonymization, data obfuscation, dynamic masking, geo-fencing, consent/preference management and the Right to Erasure for CCPA & GDPR compliance
Without any of these:
- No Database software agents to install, configure and maintain
- No dedicated expensive DAM hardware required for collecting or aggregating log data
- No database downtime or frequent technical problems to solve
- No additional CPU consumption or performance degradation on Database platforms
- No additional network components or traffic to/from Database platforms
- No Database Server OS kernel changes or reboots for patching or upgrades
- No single point of failure
SecuPi can be configured to monitor data access activity, with enhanced user context, between the application and database layers based on data classification and risk scoring. This avoids processing benign activity logs, which often exceed 90% of all database traffic, without missing other data access points.
An independent, configurable, tamper-proof audit trail of all access to any database is provided automatically, along with detailed metadata on every result set returned to an application or user: columns, row counts, per-query risk scoring and advanced user behavior analytics.
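As a rough sketch of how result-set metadata alone can drive a risk score, consider the toy model below. The column weights, volume factor and thresholds are invented for illustration; they are assumptions for this sketch, not SecuPi's actual scoring model.

```python
import math

# Illustrative sensitivity weights per column name (assumed, not a real policy).
SENSITIVE_COLUMNS = {"ssn": 10, "salary": 5, "email": 2}

def risk_score(columns: list, row_count: int) -> float:
    """Score grows with column sensitivity and with the volume of rows returned."""
    column_weight = sum(SENSITIVE_COLUMNS.get(c.lower(), 0) for c in columns)
    # A 10-row lookup is far less risky than a million-row export,
    # so scale by the order of magnitude of the result set.
    volume_factor = math.log10(max(row_count, 1) + 1)
    return column_weight * volume_factor

def classify(score: float) -> str:
    if score >= 20:
        return "high"
    if score > 0:
        return "medium"
    return "low"
```

Under this toy model, a small lookup of non-sensitive columns scores "low", while a bulk export of SSNs scores "high" — the kind of distinction that lets an audit pipeline surface the risky 10% instead of drowning in the benign 90%.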
Sound too good to be true?
Not at all . . .
Next-generation DAM functionality is a side benefit of SecuPi’s comprehensive data security and privacy compliance solution. SecuPi can be used solely for its enhanced, centrally managed, tamper-proof audit trail of all data access, whether through a database or another data repository or processing platform (Hadoop, Kafka, Lambda, NiFi, RDS, Snowflake, Redshift, BigQuery, Python, ETL, DBeaver, Toad, Jupyter and many more), but you don’t have to stop there.
The SecuPi Data Security Platform delivers a superset of DAM functionality by monitoring all sensitive data access and logging all selected SQL queries, database reads and writes, publish/subscribe messages, ETL processes and virtually any other data access or processing technology.
The SecuPi Central Policy Server, Overlays and Agents ensure that System Admins and DBAs cannot circumvent SecuPi data access controls.
Author: Les McMonagle, Chief Security Strategist at SecuPi.
Please contact SecuPi if you want to learn more or to arrange a demo.
See the post on LinkedIn.