TL;DR
You don’t need one dashboard per customer, even if every customer has their own database.
By parameterizing the database connection at runtime, you can build one reusable dashboard template that dynamically routes queries to the correct customer database based on who’s logged in.
This works across Snowflake, PostgreSQL, MySQL, MongoDB, and more.
How it works, at a glance:
- Use runtime tokens (for example c9_dbname) in your datasource instead of hardcoding database names
- Assign token values per user or per group (UI or API)
- Embed the same dashboard for all tenants
- Choose:
- SSO Embed API for full interactivity and saved user state
- Secure URL Embed for fast, view-only dashboards
Result:
One dashboard. Hundreds or thousands of databases. Near-zero maintenance.
Table of Contents
- The Challenge: One Dashboard, Hundreds of Databases
- Architecture Overview
- Step 1: Create the Datasource with Runtime Parameters
- Step 2: Create Queries with Runtime Parameters
- Step 3: Build the Dashboard Template
- Step 4: Assign Runtime Parameter Values to Users
- Step 5: Choose Your Embedding Approach
- Step 6: Test Your Multi-Tenant Setup
- Real-World Example: 1,000+ Snowflake Databases
- Advanced: Parameterizing More Than Just Database
- Common Pitfalls and How to Avoid Them
- Performance Considerations
- Summary: Your Multi-Tenant Embedded Analytics Checklist
- Next Steps
- Frequently Asked Questions
- What is database-per-tenant embedded analytics?
- How is the database selected for each user?
- Do I need to duplicate dashboards for each customer?
- Is this secure? Can tenants see each other’s data?
- Can I parameterize more than just the database?
- What happens if a user has no database assigned?
- Does this work with Snowflake multi-tenant setups?
- How does this scale to 1,000+ tenants?
The Challenge: One Dashboard, Hundreds of Databases
You’re building a SaaS product. Each customer has their own database, same schema, separate data. Now product wants analytics embedded in the app.
The naive approach: build a separate dashboard for each customer. With 10 customers, that’s manageable. With 1,000+? Impossible.
This tutorial shows you how to build one dashboard template that dynamically connects to the right database based on who’s logged in. No duplicate dashboards. No maintenance nightmare. Just clean, scalable multi-tenant embedded analytics.
What you’ll learn:
- How to parameterize database connections at runtime
- Two embedding approaches: SSO API vs Secure URL (and when to use each)
- Setting up drilldowns and filters that work across all tenants
- Assigning database credentials per user or group
Architecture Overview
Here’s what we’re building: one dashboard template, one parameterized datasource, and per-user token assignments that route each query to the right tenant database at runtime.
The key insight: parameterized runtime tokens let you swap the database connection at query execution time, not dashboard design time.
Step 1: Create the Datasource with Runtime Parameters
First, create your datasource connection. But instead of hardcoding the database name, you’ll use a runtime token.
For Snowflake
Navigate to Datasources → New Datasource → Snowflake.
In the database field, use a runtime parameter token instead of a hardcoded database name. The syntax is $c9_TOKENNAME$.
For example, use $c9_database$ or $c9_dbname$ as the database name.
Full connection string example:
- Account: your-account.snowflakecomputing.com
- Database: $c9_database$ (this is the runtime token)
- Schema: public (or parameterize with $c9_schema$)
- Warehouse: COMPUTE_WH
- Username/Password: Your service account credentials
For PostgreSQL/MySQL
Same concept, parameterize the database name in the JDBC URL:
jdbc:postgresql://your-host:5432/$c9_database$
For MongoDB
mongodb://username:password@host:27017/$c9_database$
Why this works: The runtime token (e.g., c9_database) gets replaced at query execution time with the value assigned to the logged-in user. The datasource itself doesn’t “know” which database it’s connecting to until the query runs.
Step 2: Create Queries with Runtime Parameters
Now create your queries. These will use the parameterized datasource and can include additional runtime filters.
Key point: The database token at the datasource level is hidden; users never see it or interact with it. It’s automatically populated based on their user/group configuration.
Step 3: Build the Dashboard Template
Create your dashboard with all the visualizations, drilldowns, and filters your customers need.
Figure 3: A healthcare analytics dashboard template. This same dashboard serves all tenants; each sees only their own data based on their database assignment.
Dashboard Components
- Widgets: Add your charts, tables, KPIs
- Filters: Create user-facing filters (date ranges, categories, etc.)
- Drilldowns: Configure click-through navigation between widgets
Setting Up Drilldowns
Drilldowns let users click on a chart element to see detailed data. Configure these at the widget level:
- Click widget settings → Drilldowns
- Select the source field (e.g., “region”)
- Choose the target widget or dashboard
- Map the field to the target’s filter
Since you’re using runtime parameters, drilldowns work identically across all tenants. The underlying database is different, but the query structure is the same.
Filter Configuration
For user-facing filters (not the hidden database parameter):
- Add filters to your dashboard
- Link them to the runtime parameters in your queries
- Optionally, configure filter suggestions from a lookup dataset
Step 4: Assign Runtime Parameter Values to Users
Here’s where the magic happens. You assign the database token value to each user or group. You can do this through the UI or programmatically via API.
Option A: Via the UI (User-Level Assignment)
Navigate to Settings → Team → Users → Edit User
In the User Filters section:
- Mode: Query Parameter
- Field Name: c9_dbname (matches the token name in your datasource/query, without the dollar signs)
- Value: customer_a_production
Now when this user views the dashboard, all queries automatically route to customer_a_production.
Option B: Via the UI (Group-Level Assignment)
For larger deployments, assign at the group level:
Navigate to Settings → Team → Groups → Edit Group
Add the filter with the same configuration:
- Mode: Query Parameter
- Field Name: c9_dbname
- Value: customer_a_production
All users in this group inherit the database assignment. This is ideal when you have multiple users per tenant.
Option C: Via SSO API (Programmatic)
For fully automated deployments where you’re embedding analytics in your own application, pass the token value programmatically when creating the SSO session. This is the recommended approach for production SaaS applications.
Step 1: Create or update the user with their database assignment
curl -X POST https://www.knowi.com/sso/user/create \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "user=customer_a_user@example.com" \
  -d "ssoCustomerToken=YOUR_SSO_TOKEN" \
  -d "userGroups[]=Customer_A" \
  -d "refresh=true" \
  -d 'contentFilter=[{"fieldName":"c9_dbname","values":["customer_a_production"],"operator":"="}]'
Parameters explained:
- user: The email address for this embedded user
- ssoCustomerToken: Your Knowi SSO token (found in Settings → SSO)
- userGroups[]: Groups this user belongs to (controls dashboard access)
- refresh: Set to true to update existing user’s filters
- contentFilter: JSON array specifying runtime parameter values
Step 2: Create a session token for embedding
curl -X POST https://www.knowi.com/sso/session/create \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "user=customer_a_user@example.com" \
  -d "userToken=TOKEN_FROM_STEP_1"
The response is a session token you use in your embed code.
Full API workflow in Node.js:
const axios = require('axios');

async function createEmbedSession(userEmail, tenantDatabase) {
  // Step 1: Create/update user with their database filter
  const userResponse = await axios.post(
    'https://www.knowi.com/sso/user/create',
    new URLSearchParams({
      user: userEmail,
      ssoCustomerToken: process.env.KNOWI_SSO_TOKEN,
      'userGroups[]': 'Embedded_Users',
      refresh: 'true',
      contentFilter: JSON.stringify([
        { fieldName: 'c9_dbname', values: [tenantDatabase], operator: '=' }
      ])
    })
  );
  const userToken = userResponse.data;

  // Step 2: Create session token
  const sessionResponse = await axios.post(
    'https://www.knowi.com/sso/session/create',
    new URLSearchParams({
      user: userEmail,
      userToken: userToken
    })
  );

  return sessionResponse.data; // Use this in your embed code
}

// Usage: when a tenant logs into your app
app.get('/analytics', async (req, res) => {
  const sessionToken = await createEmbedSession(
    req.user.email,
    req.user.tenantDatabase // e.g., "acme_corp_db"
  );
  res.render('analytics', { sessionToken });
});
Step 5: Choose Your Embedding Approach
You have two main options for embedding the dashboard. Each has trade-offs.
Option 1: SSO Embed API (Full Interactivity)
Best for: Users who need to save custom views, create their own filters, or interact deeply with the analytics.
How it works:
- Your backend authenticates the user
- You call Knowi’s SSO API to create/retrieve a user token
- You create a session token
- Embed using the session token
Backend code (Node.js example):
const axios = require('axios');

async function getKnowiSession(userEmail, customerDatabase) {
  // Step 1: Create or get user token (form-encoded parameters, as in the curl example above)
  const userResponse = await axios.post(
    'https://www.knowi.com/sso/user/create',
    new URLSearchParams({
      user: userEmail,
      ssoCustomerToken: process.env.KNOWI_SSO_TOKEN,
      'userGroups[]': 'Embedded_Users',
      refresh: 'true',
      contentFilter: JSON.stringify([
        { fieldName: 'c9_dbname', values: [customerDatabase], operator: '=' }
      ])
    })
  );
  const userToken = userResponse.data;

  // Step 2: Create session token
  const sessionResponse = await axios.post(
    'https://www.knowi.com/sso/session/create',
    new URLSearchParams({
      user: userEmail,
      userToken: userToken
    })
  );

  return sessionResponse.data; // This is the session token
}
Frontend embed:
Knowi.render('#analytics-container', {
  type: 'single',
  token: sessionToken, // From your backend
  url: 'https://www.knowi.com/',
  view: {
    title: true,
    header: false,
    filter: true,
    dashListIcon: false
  }
}, function() {
  console.log('Dashboard loaded');
});
What persists: Any changes the user makes (custom filters, saved views) are stored against their Knowi user account.
Option 2: Secure URL Embed (View-Only)
Best for: Read-only dashboards where you don’t need user-specific persistence.
How it works:
- Generate an encrypted hash containing the content filters (including database)
- Embed using the secure URL with the hash
Key difference: No Knowi user account is created. The dashboard is view-only. Filter selections don’t persist between sessions.
Backend code:
const crypto = require('crypto');

function generateSecureHash(customerDatabase, ttlSeconds = 3600) {
  const params = {
    contentFilters: [
      { fieldName: 'c9_dbname', values: [customerDatabase], operator: '=' }
    ],
    _t: Date.now(),
    _ttl: ttlSeconds
  };
  const payload = JSON.stringify(params);

  // AES encrypt with your customer key (contact Knowi support for this)
  const cipher = crypto.createCipheriv(
    'aes-256-cbc',
    process.env.KNOWI_ENCRYPTION_KEY,
    process.env.KNOWI_ENCRYPTION_IV
  );
  let encrypted = cipher.update(payload, 'utf8', 'base64');
  encrypted += cipher.final('base64');
  return encrypted;
}
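For completeness, here is a usage sketch mirroring the SSO route from Step 4. The route path and req.user.tenantDatabase are assumptions about your own application, not Knowi APIs.
// Usage: generate the hash per request and hand it to the page that calls Knowi.render
// Assumes an Express app and a req.user populated by your own auth middleware
app.get('/analytics', (req, res) => {
  const secureHash = generateSecureHash(req.user.tenantDatabase); // e.g., "acme_corp_db"
  res.render('analytics', { secureHash });
});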
Frontend embed:
Knowi.render('#analytics-container', {
  type: 'secure',
  hash: secureHash, // From your backend
  dashboard: 'YOUR_DASHBOARD_SHARE_KEY',
  view: {
    title: true,
    header: false,
    filter: true
  }
}, function() {
  console.log('Dashboard loaded');
});
Comparison: SSO vs Secure URL
| Feature | SSO Embed API | Secure URL Embed |
|---|---|---|
| User persistence | Yes – changes saved to user account | No – view only |
| Self-service analytics | Yes | Limited |
| Complexity | Higher (user management) | Lower (just hash generation) |
| Performance | Slightly slower (session creation) | Faster |
| User tracking | Full audit trail | Anonymous |
| Best for | Power users, self-service | Read-only dashboards, public embeds |
Step 6: Test Your Multi-Tenant Setup
Before going live, verify the isolation:
Test 1: Cross-Tenant Data Isolation
- Log in as User A (assigned to customer_a_db)
- Verify you see only Customer A’s data
- Log in as User B (assigned to customer_b_db)
- Verify you see only Customer B’s data
- Confirm there’s no data leakage
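If you already automate user creation via the SSO API, you can script the setup for this check by creating one embed session per test tenant. A minimal sketch reusing createEmbedSession from Step 4; the test emails and database names are placeholders:
// Create one embed session per test tenant, then load the dashboard with each token
// Reuses createEmbedSession() from Step 4; emails and database names are placeholders
async function createIsolationTestSessions() {
  const sessionA = await createEmbedSession('test_user_a@example.com', 'customer_a_db');
  const sessionB = await createEmbedSession('test_user_b@example.com', 'customer_b_db');
  console.log('Customer A session:', sessionA);
  console.log('Customer B session:', sessionB);
  // Open the embedded dashboard with each token and confirm the data differs per tenant
}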
Test 2: Query Execution
Enable query logging to verify the correct database is being used:
- Open browser developer tools
- Watch the network requests
- Confirm the query is executing against the expected database
Test 3: Filter Persistence (SSO only)
- Apply a custom filter as User A
- Log out and log back in
- Verify the filter persists for User A
- Log in as User B
- Verify User B has their own filter state (or defaults)
Real-World Example: 1,000+ Snowflake Databases
One of our customers runs a B2B SaaS platform where each of their customers has a dedicated Snowflake database. Same schema across all 1,000+ databases. Different data.
Their setup:
- Datasource: Snowflake with $c9_database$ token
- Queries: 15 queries powering 8 dashboard widgets
- Embedding: SSO API integrated with their auth system
- User management: Automated via API when new customers onboard
The workflow:
- New customer signs up for their platform
- Their backend provisions a new Snowflake database
- Their backend calls Knowi SSO API to create a user group with database filter set to the new database name
- Customer logs in → sees their data in the embedded dashboard
Maintenance overhead: Near zero. They built the dashboard once. Adding a new customer = one API call.
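The customer in this example assigns the filter at the group level; as a rough per-user sketch of that single API call, using the /sso/user/create endpoint from Step 4 (the group name and parameter values are illustrative):
const axios = require('axios');

// Called from your provisioning flow once the tenant's Snowflake database exists
async function onboardTenantAnalytics(adminEmail, tenantDatabase) {
  // Same /sso/user/create call as Step 4: create the user and pin their database token
  await axios.post(
    'https://www.knowi.com/sso/user/create',
    new URLSearchParams({
      user: adminEmail,
      ssoCustomerToken: process.env.KNOWI_SSO_TOKEN,
      'userGroups[]': 'Embedded_Users',
      refresh: 'true',
      contentFilter: JSON.stringify([
        { fieldName: 'c9_dbname', values: [tenantDatabase], operator: '=' }
      ])
    })
  );
}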
Advanced: Parameterizing More Than Just Database
You can parameterize any part of the connection string:
Schema-Level Isolation
Use $c9_schema(public)$ as the schema name, where public is the default value used when no schema is assigned.
Useful when all customers share a database but have separate schemas.
Full Connection String
For PostgreSQL with fully dynamic host, port, and database:
jdbc:postgresql://$c9_host$:$c9_port$/$c9_database$
Each token (c9_host, c9_port, c9_database) can be assigned different values per user.
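To assign values for several tokens at once, the contentFilter array from Step 4 simply carries one entry per token. A sketch with placeholder values:
// One contentFilter entry per runtime token; field names omit the dollar signs, values are placeholders
const contentFilter = JSON.stringify([
  { fieldName: 'c9_host', values: ['tenant-a-db.internal'], operator: '=' },
  { fieldName: 'c9_port', values: ['5432'], operator: '=' },
  { fieldName: 'c9_database', values: ['customer_a_production'], operator: '=' }
]);
// Pass this as the contentFilter parameter to /sso/user/create, exactly as in Step 4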
Multiple Runtime Filters
Combine database isolation with row-level security in your queries:
SELECT * FROM orders
WHERE tenant_id = $c9_tenant_id$
AND region IN ($c9_regions${list|'|,}$)
Common Pitfalls and How to Avoid Them
Pitfall 1: Forgetting to Set Default Values
If a user doesn’t have a database filter assigned and you don’t set a default, queries will fail.
Fix: Either set a default value in the token (e.g., $c9_database(default_db)$) or ensure all users have the filter assigned before they access the dashboard.
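One way to enforce the second option is a guard in your embed route. A sketch against the Express route from Step 4, where req.user.tenantDatabase is assumed to be where your app stores the assignment:
// Refuse to create an embed session for users with no database assignment
// Assumes req.user.tenantDatabase is where your app stores the tenant's database name
app.get('/analytics', async (req, res) => {
  if (!req.user.tenantDatabase) {
    return res.status(403).send('No analytics database assigned to this account');
  }
  const sessionToken = await createEmbedSession(req.user.email, req.user.tenantDatabase);
  res.render('analytics', { sessionToken });
});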
Pitfall 2: Caching Issues
Query results may be cached. If you’re testing with different database values, you might see stale data.
Fix: Clear the cache when testing, or use the “Refresh” option on the dashboard.
Pitfall 3: SSO Token Expiry
SSO session tokens expire. If your embedded dashboard suddenly stops working, this might be why.
Fix: Implement token refresh logic in your backend. Call /sso/session/keepAlive to extend sessions, or create new sessions as needed.
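A rough sketch of the refresh side, assuming /sso/session/keepAlive accepts the session token as a form parameter; the parameter name here is an assumption, so confirm it against the SSO API docs:
const axios = require('axios');

// Periodically extend the embed session; the "token" field name is an assumption,
// check the SSO API docs for the exact parameter
async function keepSessionAlive(sessionToken) {
  await axios.post(
    'https://www.knowi.com/sso/session/keepAlive',
    new URLSearchParams({ token: sessionToken })
  );
}

// e.g., call every 10 minutes while the dashboard is open:
// setInterval(() => keepSessionAlive(currentSessionToken), 10 * 60 * 1000);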
Performance Considerations
Connection Pooling
With many databases, connection management matters. Knowi maintains connection pools per datasource configuration. Since your datasource uses a runtime parameter, connections are established per-database at query time.
Recommendation: For very large deployments (100+ databases), consider:
- Caching query results aggressively
- Using scheduled data refreshes instead of live queries
- Implementing a data warehouse that consolidates tenant data with proper isolation
Query Optimization
Since the same query runs across different databases, optimize once, benefit everywhere:
- Add indexes on filtered columns
- Use appropriate date partitioning
- Limit result sets with pagination
Summary: Your Multi-Tenant Embedded Analytics Checklist
- Create datasource with runtime parameter token (e.g., c9_database) in the database field
- Build queries using the parameterized datasource
- Design dashboard with filters and drilldowns
- Assign database values to users/groups (via UI or API)
- Choose embedding approach (SSO for full interactivity, Secure URL for view-only)
- Integrate embedding into your application
- Test cross-tenant data isolation
- Monitor and optimize performance
Next Steps
Ready to implement database-per-tenant embedded analytics?
- Start a free trial – Get hands-on with the platform
- Book a demo – Walk through your specific use case with our team
- Read the SSO API docs – Deep dive into the technical details
Have questions about implementing this for your SaaS product? Contact our team.
Frequently Asked Questions
What is database-per-tenant embedded analytics?
It’s a multi-tenant architecture where each customer has a separate database, but all customers share one dashboard template. The database is selected dynamically at runtime using user-specific parameters.
How is the database selected for each user?
The database name is passed as a runtime parameter token (for example c9_dbname) that gets replaced at query execution time based on the logged-in user or group assignment.
Do I need to duplicate dashboards for each customer?
No. You build the dashboard once. New customers only require assigning a database value, usually via a single API call.
Is this secure? Can tenants see each other’s data?
Yes, it’s secure when configured correctly:
- Database routing happens server-side
- Tokens are hidden from users
- Each user or group is explicitly assigned a database
- There is no cross-tenant query access unless misconfigured
Can I parameterize more than just the database?
Yes. You can parameterize:
- Schema
- Host
- Port
- Full connection string
- Row-level filters (for example tenant_id, region)
This enables hybrid models like database-per-tenant plus row-level security.
What happens if a user has no database assigned?
Queries will fail unless you:
- Set a default value for the token, or
- Ensure every embedded user is assigned a database before access
This is one of the most common setup mistakes.
Does this work with Snowflake multi-tenant setups?
Yes. This pattern is commonly used with hundreds or thousands of Snowflake databases that share the same schema. The same queries run everywhere.
How does this scale to 1,000+ tenants?
Very well:
- One dashboard
- One query set
- Automated onboarding via API
- Minimal operational overhead
The main considerations at scale are query optimization, caching, and connection management.