Pricing
We help you understand which product is right for you.
DataGPT Classic, Pilot Plans
DataGPT offers a three-month Proof of Concept (POC) phase that renews into an ongoing license. At the end of the three-month period, the customer can opt out or the POC will automatically convert to an ongoing agreement. We can negotiate longer pilots on request.
Pilot Plus
3 months duration
- Includes 10 users
- $50 per additional user/mo.
- Onboarding of up to 1 schema
- 250 million rows of data
- 10 Users
- 24 months of data
- AI Analyst Mode
- Standard Support
Security and Legal
- Standard Data Security
- Standard Terms required
Pilot Premium
For companies with multiple use cases and more complex data environments.
3 months duration
- Includes 50 users
- $62.50 per additional user/mo.
- Onboarding of up to 2 schemas
- 500 million rows of data
- 50 Users
- 24 months of data
- AI Analyst Mode
- Core Analysis Types + Anomaly Detection
- Premium Support
Security and Legal
- Standard Data Security
- Standard Terms required
Pilot Enterprise
For companies with multiple use cases and more complex data environments, requiring SLAs.
3 months duration
- Enterprise Pilots are custom quoted based on:
  - User volume
  - Data volume (rows per month)
  - Historical length of data
  - Security requirements
  - Length of pilot
Security and Legal
- Enterprise Data Security
- Required for customers who need their own MSA or Custom Terms
Discounts
Discounts do not apply to Proofs of Concept (POCs).
- Annual upfront payment: 5%
- 2-year commit: 15%
Account Features
Plus / Premium / Enterprise
Access your Databoards
Databoards are the intuitive interface where end users can interact with their data via DataGPT's Chat (AI Analyst), Data Navigator, and Dynamic Benchmarking modes.
Construct your Data Model in the DataGPT Schema
See appendix for schema definition
Schema Control
Admins can view and edit the schema configuration including adding new metrics and dimensions and adding additional context for the LLM specific to your business and use case.
Databoard Embed
Add a Databoard into your own internal environments for end user interaction via an iframe.
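For illustration only, here is a minimal sketch of what embedding a Databoard via an iframe could look like. The embed URL, Databoard ID, and container element below are assumptions, not an official DataGPT embed API; the actual embed link comes from your DataGPT account.

```typescript
// Minimal sketch (browser TypeScript): embed a Databoard via an iframe.
// The URL shape and element IDs are illustrative assumptions.
const databoardEmbedUrl =
  "https://app.datagpt.com/embed/databoard/your-databoard-id"; // hypothetical

const frame = document.createElement("iframe");
frame.src = databoardEmbedUrl;
frame.width = "100%";
frame.height = "800";
frame.style.border = "0";

// Attach the Databoard to a container element in your internal tool or portal.
document.getElementById("databoard-container")?.appendChild(frame);
```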
Admin controls
Admins can add or remove users from their Account, modify the schemas and access all databoards within the Account.
DataGPT Multi-Team Control
DataGPT Premium and Enterprise plans provide logical multi-tenant Teams. DataGPT databoards can be partitioned into multiple tenant-specific environments called Teams. With Teams, each tenant's data is isolated and protected with access control, and is invisible to the other tenants that share the same DataGPT account.
Admin Databoard
An intuitive interface that gives you visibility into your team's activity - sessions, prompts, and more - so you can monitor usage and provide effective support.
Chat Mode (The AI Data Analyst)
The world's most advanced and accurate AI Data Analyst, enabling users to talk with their data and get analyst-grade answers to even complex questions, like "Why did revenue drop?"
Plus / Premium / Enterprise
Ask questions in natural language
Let AI suggest questions to ask
Optional autofill features when asking a question
This means you use the exact metric, dimension, or segment name that is used in the schema, which the AI Analyst surfaces via the autofill feature, to ensure clear alignment with the AI.
Filters
Apply a filter to give the Analyst direction on what subset of the data you want to ask questions about. Boolean logic enables you to filter in by any segment(s), or filter out by any segment(s).
For example, you could filter the data by the segment 'Country: USA', and the AI Analyst will recognize, as you ask new questions, that you are asking only in the context of the US data.
Standard Daily Summaries - Key Metric Performance
Each day the AI Analyst will provide a daily summary within the Chat Mode interface highlighting the key metric performance for up to 10 selected metrics.
Advanced Daily Summaries - Key Metric Drivers or Anomalies
Each day the AI Analyst will provide a daily summary within the Chat Mode interface either highlighting the key drivers of up to 5 key metrics or highlighting relevant data anomalies.
Data tab
A section in each response that lets you access the underlying data used to generate it - especially helpful for responses with many segments - offering pagination, sorting, and CSV export for easier navigation and analysis.
- Plus: up to 1,000 rows per CSV export
- Premium: up to 5,000 rows per CSV export
- Enterprise: up to 5,000 rows per CSV export
DataGPT Delivered
The AI Analyst will automatically send email notifications to users of a Databoard, providing analysis based on pre-configured prompts.
- Plus: 4 per month
- Premium: 10 per month
- Enterprise: 50 per month ($100 fee per additional 10 delivered)
Multiprompting
The AI Analyst will seamlessly process the assigned task along with the sequence of analytical functions in a single pass, delivering a clear and concise report.
- 20 per month
- 50 per month ($100 fee per additional 10 multiprompts)
Analysis Types & Question Types Supported by the AI Analyst
Plus / Premium / Enterprise
Core Analysis Types
Ask about the Data Itself
Ask what metrics or dimensions have been configured in the schema. Ask how the configured metrics are defined and calculated in the schema.
Descriptive Analysis
Get the basic value of your metrics. Example: "What is our total sales today?"
Metric and Dimension Analysis
Ask about how metrics perform across different segments or combinations of segments.
Ex: "How is our Revenue from VIP members in the UK performing across product categories?"
Derived Metrics
Expand your analysis possibilities by building upon other defined metrics in your databoard.
Ex: Calculate my conversion rate (won opportunities / total opportunities)
Trend Analysis
Ask about patterns in your data over time.
Ex: "Whatʼs the trend in Page Views over the last 30 days?"
Comparative Analysis
Compare metrics from one time period to another.
Ex: "What was our On Time Delivery Rate this Month vs Last Month?"
Correlation and Causation Analysis
Understand relationships between different metrics.
Ex: "Does an increase in marketing spend lead to higher sales?"
Impact Analysis
Understand the impact of specific segments on a metric.
Ex: "Which product category has the biggest impact on our total sales?"
Prescriptive Analysis (Key Driver Identification)
Identify key factors affecting your metrics and suggest actions. Ex: "What are the main drivers of our customer churn?"
Metric Statistics Analysis
Get extensive statistical information about the metric time series: variance, standard deviation, percentiles, min/max values, and time series rolling average for a given period. Ex: "What are the standard deviation and average values of our daily sales?"
Ask the AI Analyst to create a chart for you.
Questions about segments can be displayed as a Line chart, Bar chart, or Stacked chart. Questions about metrics can be displayed as a Time Series.
Segment Comparisons
Compare the performance of any two segments side by side regardless of start date.
Ex: How did Marketing Campaign 1, which started on January 1, compare to Marketing Campaign 2, which started on February 18, over a 30-day period? Or: Why did House of Dragons Season 1 have more Viewership in the first 30 days than Season 2?
Advanced Analysis Types
Anomaly Detection Analysis
Configure your AI analyst to monitor for important changes in your metrics at any level of drilldowns in your data. These could be based on thresholds, relative change, and top/bottom segments. Ex: "Are there any anomalies in our Revenue this month?"
Data Navigator Mode
Data Navigator provides a unified view for teams to explore their data freely. Rather than asking and waiting for the AI Analyst to generate a report of Revenue across every Country, simply click the 'Revenue' metric and 'Country' Dimension and Data Navigator will display an instant report sortable by value, impact, change in value, or % change.
Plus / Premium / Enterprise
Calendar Selector
Adjust your comparison period (CP) and observation period (OP) and your data navigator will update all calculations in real-time.
Key Metric Performance
Every day, Data Navigator displays each metric's performance, including current value, change in value, and % change.
Automated Key Metric Driver Analysis
Key Drivers Analysis is automatically triggered daily and displayed in the 'Key driver' tab within each Metric card.
Reports
Generate reports with a simple click by selecting a metric and dimensions or segments. For example, select the metric 'Revenue' and Dimension 'Product Category' to see the revenue generated across each product category.
Report Sorting
Sort Reports by % Change, Value, Change in Value, or Impact
Exhaustive Drill Down
Drill Down into any combination of segments for instant data exploration and multi-dimensional reports.
Automated Data Visualizations
Data Navigator includes several automatically generated visuals. Time series visuals are included for each metric's performance. Line, bar, and stacked charts are automatically generated for each segment.
Boolean Filters
Use the Boolean Filter on the Data Navigator to filter the entire navigator by any Segment(s) to focus on what matters most to you. Boolean logic enables you to filter in by any segment(s), or filter out by any segment(s).
Sharing Features
Copy and Share a hotlink to any location in the Data Navigator with another user
Model & Configuration
Plus / Premium / Enterprise
Schema Configuration
Construct your Data Model in the DataGPT Schema. An easy way to think about schemas: each distinct use case will often call for its own schema.
*See appendix for full schema definition
Tables and View Access
DataGPT will access only the data you make available, either by providing access to a single View or providing access to multiple tables that you want to make available to DataGPT.
Metric Configuration
Add your metric definitions and metric calculations in the Schema for the AI Analyst to use in its analysis.
Dimensions Configuration
Add the dimensions within the view or table that are relevant to the analysis.
Semantic Layer Customization
Add any synonyms that are unique to how your business talks about its data, or add clarity for the LLM by adding labels to metrics or segments with similar-sounding names to help it understand when to use each respective datapoint.
Custom Instructions
Add instructions for the LLM to follow when there are specific steps you want the Analyst to take for particular questions that might be unique to your business.
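To make the configuration steps above concrete, here is a purely illustrative sketch of the kinds of information a schema captures. DataGPT's schema is configured through its own UI rather than code, so the field names, structure, and example values below are assumptions for illustration only, not DataGPT's actual schema format.

```typescript
// Illustrative sketch only: a rough picture of what a schema configuration
// captures. The real schema is built in DataGPT's UI; these names and types
// are assumptions, not DataGPT's actual format.
interface SchemaSketch {
  tables: string[];                    // views or tables made available to DataGPT
  dimensions: string[];                // attributes used for segmentation
  metrics: Record<string, string>;     // metric name -> definition / calculation
  synonyms: Record<string, string[]>;  // semantic-layer vocabulary for the LLM
  customInstructions: string[];        // steps the Analyst should follow
}

const salesSchema: SchemaSketch = {
  tables: ["analytics.sales_opportunities_view"],
  dimensions: ["country", "product_category", "customer_tier"],
  metrics: {
    revenue: "SUM(amount) for won opportunities",
    // A derived metric built from other metrics, as in the Derived Metrics example.
    conversion_rate: "won_opportunities / total_opportunities",
  },
  synonyms: {
    revenue: ["sales", "bookings"],
    customer_tier: ["VIP level"],
  },
  customInstructions: [
    "When asked about 'pipeline', consider open opportunities only.",
  ],
};
```

Once the equivalent configuration is in place in the schema builder, DataGPT determines how to query and extract your data for analysis (see the appendix for the full schema definition).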
Data Connection
Plus / Premium / Enterprise
Connect to Google BigQuery
Connect to Snowflake
Connect to Amazon Redshift
Connect to Azure Synapse
Connect to PostgreSQL
Connect to Databricks
Connect to SQL Server
Connect to other SQL-based data platforms (supported on request)
Connect to Google Analytics
Security
Plus / Premium / Enterprise
Essential Data Security
The Customer Data Storage Layer is the only layer that stores customer data. This data is stored in an encrypted, shared, secure S3 bucket in AWS us-east-2.
Standard Data Security
The Customer Data Storage Layer is the only layer that stores customer data. This data is stored in an encrypted, shared, secure S3 bucket in AWS us-east-2, but can be located in a specific AWS Region upon request.
Enterprise Data Security
The Customer Data Storage Layer is the only layer that stores customer data. This data is stored in an encrypted, secure S3 bucket and EFS, which can be located in a customer-specific VPC in the AWS us-east-2 region or other regions. Requiring regions outside us-east-2 may impact your pricing.
SOC Type I
SOC Type II
Complete logical data isolation
Disk data encryption at rest
Data encryption in transit - TLS
Single Sign-On via Microsoft SSO and Google SSO
Usage Limits
Plus / Premium / Enterprise
Historical Length of Data
- Plus: 24 months
- Premium: 24 months
- Enterprise: custom
Max data size
- Plus: 250 million rows of data
- Premium: 500 million rows of data
- Enterprise: custom
Support
Plus / Premium / Enterprise
Knowledge Base including Video Tutorials and Support documents
Email Support
Support Hours
- Plus: 24x7
- Premium: 24x7
- Enterprise: 24x7
Response Times
Work hours are M-F, 9am to 6pm ET.
- Plus: work hours: 8 hrs; weekday after-hours: 1 day; weekends: 2 days
- Premium: work hours: 4 hrs; weekday after-hours: 1 day; weekends: 2 days
- Enterprise: work hours: 4 hrs; weekday after-hours: 1 day; weekends: 2 days
Assigned Customer Success Manager post onboarding for escalation and emergency communications
Onboarding and Training
Number of Schemas Onboarded by a DataGPT Engineer, including:
Plus / Premium / Enterprise
Schema configuration
- Plus: 1
- Premium: 2
- Enterprise: custom
LLM Fine-tuning
- Plus: 1
- Premium: 2
- Enterprise: custom
*Additional onboardings can be included for an add-on fee of $3,000 per new schema configuration.
Customer Data Team Training on Schema Management and Customization
User Live and Recorded Training Sessions
The User training sessions are provided once during the onboarding process and are not recurring annually.
- Plus: 2 training sessions
- Premium: 3 training sessions
- Enterprise: 5 training sessions
Slack or Teams Channel for communication throughout the onboarding process
Assigned Senior Support Staff
Direct access to senior support staff for more complex issues and advanced troubleshooting
Add-On Services
Additional Trainings: $250 per training
Purchase additional training to accommodate multiple departments and databoards with an optimal training experience.
Appendix
Schema Definition
Construct your data model in our schema builder with four key components: Tables, Dimensions, Metrics, and LLM instructions. Our flexible schema lets you map tables and define dimensions and metrics using a simple UI. Once set up, DataGPT automatically determines the most efficient way to query and extract your data for analysis. LLM instructions allow you to define the necessary semantics for different agents - such as Planner, DataAPI, Metadata, and Summary.
Multiple Schemas: You can produce multiple databoards from one schema, and in most cases an individual schema aligns with a distinct use case. However, here are the scenarios where multiple schemas are necessary:
- Complex Data Environments: If your organization operates across multiple distinct data environments that cannot be logically consolidated into a cohesive model.
- Diverse Data Sources and Structures: When dealing with a variety of data sources that have fundamentally different structures or formats, a schema per data source is required.
- Varying Departments with Customized Needs: If there is a need for significantly different analysis requirements within the same organization, utilizing multiple schemas can offer more tailored solutions to ensure those distinct needs are met (For example: some transactions need to be analyzed from a product point of view vs. an accounting lens and have different business rules applied.)